Who ordered the scrambled brains?


Ferguson: Black and White and Read All Over

Yesterday, a WAPO journalist filed this piece, documenting his personal abuse by the Ferguson PD. It’s far from the only such account.

Today, the same journalist filed this one, a glowing review of the Missouri Highway Patrol.

It’s interesting to see just how much the press will editorialize, select, and de-/re-contextualize to shape issues. The highway patrol coverage is so positive it comes off as blindly faithful—but they’ve only been on the job for one day, and this article focuses on but a single person. The FPD sucked, as did their militarization culture, but I’m sure there were decent folks in it. Likewise, I’m sure there are power-hungry, inept thugs in the highway patrol. But the press wants to craft their story about trends in the militarization of Main Street, police brutality, and racial profiling, so they keep the issues black and white.

It’s nothing new. They do this with gun control and the failures of the war on drugs and the backwardness of Russia and China, with varying degrees of success. Worse are entertainment outlets that dress up like the press (e.g. FNC), whose singular purpose is to editorialize to shape issues.

I think these trending law enforcement issues need all the pushing they can get, so I’m behind some editorialization. But in the end, all sides must be understood so that society’s reaction isn’t an overreaction.

The Law of Brittleness

No, I’m not referring to some French culinary legislation regulating the feel of that peanut confection, but rather to a property of—what else—software.

The idea of technical debt is critical to employee work-life balance and to customer relationships. However, many companies these days seem willing to carry that debt, obsessed with a “move fast/break things” culture in which any effort resembling code perfection is feverishly rejected. But technical debt is a strategic concern, so one would think all those business strategists in their fancy suits would want to address it. Without adequate recognition, the emphasis inadvertently shifts from “moving fast” to “breaking things”. I believe this lack of due attention is largely a matter of poor communication between engineers and stakeholders. Technical debt is often misrepresented as a temporary issue: “Give us a week to erase this technical debt.” On the other hand, saying “Each work item carries with it a 15% implementation tax to mitigate technical debt” is no more reassuring.

The term “technical debt” is an attempt to use business-friendly language to describe a less tangible quality of software: its capacity to be changed. Code that took a lot of shortcuts will be harder to change in the future, so it’s said to carry technical debt. Removing the shortcuts and doing things more “cleanly” (if also slightly more “perfectly”) makes the code easier to change in the future and therefore reduces its technical debt. However, while the metaphor nicely sums up the idea that work will slow down (the “interest” that must be paid when changing software that carries technical debt), it misrepresents the cost as a static sum that has already accrued. In reality, the cost of difficult-to-change code is proportional to the changes desired in the future, so you can only really assess technical debt in the context of proposed changes. In fact, technical debt has no cost at all if no changes will ever be made. The problem is, one can never be certain that changes won’t be needed. Especially in today’s rapidly evolving media and technology landscape, it’s far better to assume they will.

So it’s important for the language used to capture the business vulnerability associated with difficult-to-change software. While we could adopt a serious-sounding color-coded threat level system, I prefer a less theatrical, more tangible metaphor. I think of software change capacity as “software malleability” or “software brittleness”. The more brittle some software is, the more difficult it is to change, and therefore the more time changes require. Unfortunately, the Law of Brittleness (I just made that up) states that changes to software tend to make the software more brittle. Only consistent, focused effort can manage brittleness and keep software in a malleable state. This explains why brittleness increases particularly when changes are made under tight time constraints. Critically, as changes are made without adequate time, brittleness can compound such that the time cost of future changes increases exponentially.

That is a dangerous, unsustainable place to be. It means the business is less able to respond to changing demands, threatening its value, customer relationships, and brand. And it means that more and more pressure is directed onto engineering teams, who cannot fight the Law of Brittleness and therefore can only alleviate the pressure by eschewing work-life balance. Saying “we have a lot of technical debt” doesn’t convey “we are dead in the water unless the engineers work overtime.”
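To make the compounding concrete, here is a toy model of the Law of Brittleness. The rates are my own illustrative assumptions, not measurements from any real project; it contrasts a team that always rushes with one that pays a steady 15% cleanup tax on each change:

```python
# Toy model: every rushed change multiplies brittleness, so per-change
# cost grows geometrically; a steady cleanup tax keeps the software
# malleable and the cost flat. All rates are illustrative assumptions.

def total_cost(changes, base_cost=1.0, rush_factor=1.15, cleanup=False):
    brittleness = 1.0
    total = 0.0
    for _ in range(changes):
        total += base_cost * brittleness
        if cleanup:
            # pay the "15% implementation tax"; assume it fully offsets
            # any new brittleness introduced by the change
            total += base_cost * 0.15
        else:
            brittleness *= rush_factor  # rushed changes compound
    return total

print(f"20 rushed changes:     {total_cost(20):.1f} units of effort")
print(f"20 maintained changes: {total_cost(20, cleanup=True):.1f} units of effort")
```

After twenty changes, the always-rushing team has spent roughly four times the effort of the team that paid the tax, and the gap widens with every further change. That is the “dead in the water” scenario in miniature.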

Then who gets the final say in whether software is brittle or not? The engineers do, since they are the ones who know the software internals and will pay the price later on. They should be trusted to assess the brittleness of software, and therefore they should be allowed to mitigate it regularly. Anything else is unsustainable. “Won’t they just abuse the privilege and underestimate work just so they can sip cocktails on the beach?” Unlikely. And if stakeholders are unhappy with this structure in any way, they always reserve the means to address it (replace the engineering team, hire more engineers, outsource, or accept the unsustainability and adopt a “burn-and-churn” human resourcing strategy).

This also means that engineering teams are charged with assessing brittleness and raising it as a concern. Since it’s so intangible, assessing it can be hard. Something that gets the team pretty close is to rank the codebase (or parts of it) on a brittleness scale from 1 to 5. What this assessment really translates to is a gut feel for how comfortable they’d be making changes to that system. Ideally, engineers would feel no hesitation or discomfort at the notion of changing some module. But for time-impacted projects, there will invariably be areas that the team hopes will never need changing, and may exhibit violent involuntary spasms at the thought, due to the disastrous brittleness of the code. Assessing and raising these concerns doesn’t just benefit the business. It protects the engineers’ own work-life balance, which keeps their average hourly compensation from reducing to peanuts (brittle or not).
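A sketch of how that 1-to-5 survey might be tallied; the engineer names, module names, and scores below are invented for illustration:

```python
# Aggregate per-engineer gut-feel scores (1 = malleable, 5 = spasm-inducing)
# into a ranked list of modules to prioritize for cleanup.

from statistics import mean

survey = {
    "alice": {"billing": 5, "auth": 2, "search": 3},
    "bob":   {"billing": 4, "auth": 1, "search": 4},
    "carol": {"billing": 5, "auth": 2, "search": 2},
}

def rank_brittleness(survey):
    modules = {m for scores in survey.values() for m in scores}
    avg = {m: mean(s[m] for s in survey.values() if m in s) for m in modules}
    # most brittle first
    return sorted(avg.items(), key=lambda kv: kv[1], reverse=True)

for module, score in rank_brittleness(survey):
    flag = "  <- mitigate now" if score >= 4 else ""
    print(f"{module}: {score:.1f}{flag}")
```

Crude, but it turns scattered discomfort into something stakeholders can see and schedule against.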

Blocvox has officially launched!

A few years ago, I was inspired by the posts I was reading on Facebook, on Twitter, and on my friends’ blogs. It bothered me how difficult it was for great viewpoints to reach the larger audience they deserved–and how those who do have large audiences can be so out-of-touch (politicians) and sensationalistic (mainstream media). So I built a better discussion platform that allows us ordinary folk to paint for ourselves a clear picture of the world. Blocvox officially launched over the weekend!

Blocvox amplifies your voice by letting you align with cultural groups and causes, who in turn cooperate to promote their viewpoints to the world. But this vision of democratic communication is missing one ingredient: passionate, outspoken people! If you’re sick of the disconnect between what you know and what those in power say–and if you believe that we’re responsible for fixing the way we communicate as a society, because no one will do it for us–then Sign Up! Use the site and let me know how to continue improving it. The community is small now, but with your help this simple website can turn into a movement!

See the official launch announcement on the Blocvox Blog. You can also check out an illustrated tour to get a quick overview of how Blocvox works.

Homage to the Second Law of Thermodonutics

Update: Research into this topic continues. More recent thinking on the Second Law of Thermodynamics can be found here.

The Second Law of Thermodynamics is by far my favorite of the bunch. If I were to be stranded on a desserted island (one preferably laden with blueberry pie and blue velvet cake) with one of them, my first choice would be the Second. It’s the one that describes the tendency of heat to spread out rather than stay in a particular spot. It seems so simple you could almost ignore it for triviality, but it actually has huge physical and philosophical implications (even underlying our perception of time*), and underlies a lot of my worldview. I just realized I’ve never honored it by blogging about it (no, that’s not contradictory, thank you).

A cup of coffee cools down only because its heat tends to spread out. Likewise, a pot of water on a stove will boil and evaporate only because boiling and evaporating help the heat spread out. The Second Law is always there in the background, giving molecules a little nudge to remind them to “DISPERSE YOUR HEAT, INFERNAL HEATHENS!!”

Like the pot of water, the Earth is subject to a constant blast of heat from the Sun. Instead of boiling, atoms on Earth do other things to help dissipate heat. One thing is to randomly form compounds with neighboring atoms, because bonding actually cools the atoms down a little–they release a tiny amount of heat! (Electrons in covalent bonds reduce their kinetic energy, and electrons in ionic bonds reside at a lower energy level.) If this didn’t release heat, they wouldn’t spontaneously form a compound in the first place.

What I find most astounding about this is that if you extrapolate a few billion years of spontaneous bonding in an ever-constant drive to dissipate heat, life itself can originate. This isn’t a new idea and I’m not the first to have thought of it. First, those molecular compounds, subject to random bond formations, are probabilistically bound to form a compound that just happens to have the unique capacity to react with other atoms in the environment. Given heat and contact with those atoms, these compounds will reconfigure those atoms in different, random ways. Probabilistically, one such compound will eventually form that reacts with heat and certain other atoms to reconfigure them into a copy of itself. This first, spontaneously formed, self-replicating compound will have free rein over the environment to replicate at will, without competition.

Once it becomes the norm, it will also be driven to react in random ways with material in its environment. Probabilistically, it will form variants of itself that are more efficient at dissipating energy (by growing larger and incorporating more energy-releasing bonds, or by being more efficient at self-replication). These variants (possibly polymers or ribozymes) will survive and replicate at the expense of the earlier ones. Probabilistically, such compounds can construct other compounds that only aid in the survival and replication efforts of the primary compound. Perhaps one constructs compounds that draw in certain raw materials, or that protect the primary compound from destruction by competitor compounds.

That might sound far-fetched and overly complex, but it is all driven by the very simple dictum of the Second Law. Given time and chance, such a variation can spontaneously occur. And once it does, it can quickly take over an environment because of its superior fitness. Now the cycle resets at a higher level of complexity, leading to cellular life, organelles, simple organisms, and more complex ones. In each case, random chemical variation can create improvements that will dominate an environment. It may be shocking to consider, but there is no inherent moral meaning here. This is all an inevitable consequence of the tendency of matter to dissipate the energy being blasted upon it. The evolutionary cycle of increasing complexity can even extend beyond biological life, to explain group and societal behavior. Humans that are better able to set aside differences and cooperate to promote their general welfare will out-compete those that don’t, or that do so less effectively (such as those that didn’t articulate and defend their ideas as attractively as others on Blocvox *wink*).

At this point, it is the propagation of the collective entity, as defined by its memetic make-up, that serves as the foundation for another cycle of increasing complexity. Social groups that cooperate in stable partnerships will outlast those that do not. Again, all because of the simple need to dissipate that interminable heat as efficiently as possible. O mighty Second Law, we are here to serve you.**

A paper was published last year that asserts progress along these lines. I would love for the science behind this to be tightened up, and it’s an area I would love to study much deeper at some point (retirement?!). In the meantime, I’m content with the idea that we exist for no other reason than to more efficiently absorb heat from the Sun. The cycle above even explains the tendency of humans on a large scale to be skeptical of each other, as we compete to prove to the universe who’s best at dissipating heat. (Rumor has it, the winning group gets an expenses-paid vacation to Risa *doublewink*.) So it’s obvious what we need to do in order to achieve world peace: extinguish that meddling Sun. While we work on that, I suggest we resign ourselves to absorbing as much energy as possible.

 
 
 

…in the form of donuts.
 


* If a cool cup of coffee were actually sucking heat out of the room and heating up, you would conclude that you were actually moving backward through time. Well, you would if you had the mind of a theoretical physicist.

** This statement is included for comedic effect only. The Second Law of Thermodynamics is a physical principle and is not intended for use as an anthropomorphic deity. Stunts performed by professionals on a closed course. Batteries not included. Side effects include dry mouth, swelling, dizziness, and tizzyness.

Notes on UK government

Last year I threw together a description of the UK government for a friend whose work focus shifted there. It didn’t take much to nudge this cultural organization aficionado to do it, and the historical relationship and contemporary closeness between the US and the UK made it especially interesting. I added a smidge of analysis around some standout features, and contextualized for the American reader. I just revisited it and felt, in light of the surveillance revelations, it might be of interest to a wider American audience. Enjoy.

Download Notes on UK Government (PDF).

Happy New Year!

2013 was a woozy-doozy non-stop code-binging adventure, but I’m proud to finally say “Blocvox is alive!” In the end, I managed to chew all that I’d bitten off only with the tremendous support of my wonderful girlfriend. I couldn’t possibly be more humbled by, or thankful to her.

In a nutshell, Blocvox allows us ordinary folks to bring attention to the issues that matter to us, by combining our individual voices into powerful collective voices that are hard to ignore. It functions like the web’s town square, providing a convenient, democratic place to take a stand with your causes in order to tell their story and address other groups. I invite you all to sign up and weigh in on the world, because it’s just too important to leave it up to politicians, celebrities and the mainstream press.

I’m looking forward to 2014 and the next chapter of Blocvox! Happy new year to you all!

Speaking of which, the Americans voxed on Blocvox:


Happy New Year, world!

May tolerance and understanding of others allow us to leverage our differences to make the most of 2014 and the challenges it holds!

The puzzle of Japan

Few things fascinate me as much as sociology, especially when looked at over long timespans. My interest began while taking an evolutionary anthropology course at UCLA, which combined evolutionary processes with human behavior, and increased as my experience as a software developer trained me to critically identify generalizations and abstractions. I credit Dawkins’ memetics (though imperfect) for crystallizing this fascination.

Anyway, I just read an interesting article in this realm about the widespread drop in interest in sex among younger Japanese. It’s a compelling account of an apparent cultural existential crisis, in which the author surveys cultural and governmental opinions and then portrays several individual stories.

The all-or-nothing work culture for women—if you get married, your career is over—carries over to the men: if you get married, you have to solely bear the burden of income for your family, despite the exorbitant cost of living. So the disinterest in sex seems inevitable and it’s hard to blame them. I speculate that this lifestyle will not be as rewarding as the current youth think. It replaces the huge demands from society with a simpler, attainable self-serving ethos, all about having time to shop, go on vacation, earn money for yourself, etc. While not applicable to every individual, I think humans find longer-term satisfaction in contributing to something greater, such as family or society and I wonder how happy these people will be in old age. One could argue their professional life is a contribution to something bigger, but since that is involuntary and tied to selfish ends, I don’t think it counts (though jobs outside the high-salary limelight could count).

The desire to contribute to something bigger, however it comes about, actually promotes individual survival and quality of life, since it leads to strong group bonds and the benefits of cooperation. A society of outlaws or anarchists will have difficulty enduring, because they’d be fighting an uphill battle against those that, however it comes about, prefer cooperation. The recent Japanese shift away from procreation can also be framed within evolution, though it may appear ironic. A healthy organism comprises a set of internal organs that harmoniously promote each other and meet perceived environmental constraints. The heart benefits the brain benefits the skin, etc., just as various economic sectors and cultural movements contribute to each other to create a resilient, vibrant society. When external factors (appear to) change, internal systems can be thrown out of balance. In Japan, the elevation of sexual equality, I would guess from America and Europe, has altered the behavior of the Youth system, such that they no longer find old family customs attractive. But just as organs respond to environmental demands to promote the survival of the organism, I would guess a successive generation of Japanese (though perhaps smaller in number) will naturally identify and react to the deficiencies of the prior generation. “Look at all those unhappy old people who spent their whole lives serving themselves and are now dying alone. Sure, they may have sustained our economy, but there has to be a better way!” Like a pendulum that has reached its highest point, they will correct those deficiencies by effecting shifts in the culture.

This might not occur in the very next generation, but I do think it will eventually happen. The basic (more philosophical than scientific) idea is that whatever children are born will have parents who reject, to some degree, the recent cultural shift away from family. That rejection will likely be passed from the parents to the children. At the same time, those in society who accepted the all-or-nothing work culture and did not procreate will have no one to propagate the all-or-nothing ideal to. The principle here is that belief propagation through family causes societies to tend toward a sustainable culture. Observing a rebound in the value of family in future Japanese generations will be fascinating, and will exemplify how multiple human generations correct for each other to adapt their culture toward sustainability. Of course, such a model ignores the influence of non-family learning, as well as the social effects of increasing globalization.

The kink here is that these successive Japanese must identify strongly enough as Japanese. If they were to absolve themselves of that identity, and perhaps move to other places where their beliefs (balance between family and work equality) are already widely held, then the Japanese may fall into some more dire, erratic situation. Revolution? Severe economic depression? This all calls into question why the current generation, who prize equality, do not move to other countries where they can be treated equally but also pursue substantial romantic relationships. My only thought is a perceived or real language barrier. I have heard that Japanese are generally embarrassed about practicing English. This might in fact be one of those mutually-reinforcing ideas that allows the entirety of the current overall culture to endure.

I liked the static glimpse into another culture the article gave, but I found it puzzling and incomplete. (In fact, contributing to this kind of cross-cultural understanding is one of the reasons I’m so driven to develop and grow Blocvox.) The article left me wondering how this unequal, all-or-nothing work culture has remained as rigid as it has to date. I would think a loosening of standards would be required to attract, hire, and retain a workforce that is fed up with those impossible pressures. I wonder if this rigidity and sense of order might be a relic of a post-war identity crisis. I also wonder how the young and the old feel about the concerns each holds about the other. Of the youth, I wonder how they feel about the declining birth rate and their responsibility to the Japanese nation and culture to procreate. Of the old, I wonder how they feel about being responsible for bringing about this unexpected outcome. Hopefully one day, I’ll be able to ask them directly.

Shutdown plurality voting!

I’m on the Congress-hating bandwagon, but I’m also on the we-have-no-one-to-blame-but-ourselves bandwagon, up on the roof of it for a while, jumpin’ and shoutin’ (and tweetin’):

  • Government shuts down, due to
  • Ideological fundamentalism in Congress, due to
  • Ideological fundamentalists voted into Congress by the American people, due to
  • Dysfunctional, big money, two-party political system generating few election options, due to
  • Widespread use of simplistic Plurality Voting system based on single-mark ballots, due to
  • It being used when the country was founded and now being ingrained in our culture, due to
  • Single-mark ballots being easy to tabulate by hand.

Yes, in 2013, after landing rovers on Mars, sequencing the human genome, and creating a machine that can beat humans on Jeopardy!, Americans still use a simplistic system based on single-mark ballots and plurality voting because historically they were easier to count by hand!

Fortunately the Constitution (and local and state law) can be changed. ;) Do we want to move to more evolved, robust systems, such as Preferential Voting (ranked voting) or Proportional Representation (wherever possible), to create a more representative, satisfactory, efficient government? Or do we want to keep our ‘merican gladiator system that produces ideological meatheads who’d rather fight on camera than solve problems?

Read about it for yourself: proportional representation and single transferable vote (fancy for “preferential, ranked voting”).
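For the curious, here’s a minimal sketch of the single-winner form of preferential (instant-runoff) voting. The candidates and ballots are made up, and real elections need proper tie-breaking rules that this toy omits:

```python
# Instant-runoff voting: each round, count every ballot toward its
# highest-ranked surviving candidate; if no one has a majority,
# eliminate the weakest candidate and let their ballots transfer.

from collections import Counter

def instant_runoff(ballots):
    candidates = {c for b in ballots for c in b}
    while True:
        tallies = Counter(
            next(c for c in b if c in candidates)
            for b in ballots
            if any(c in candidates for c in b)  # skip exhausted ballots
        )
        total = sum(tallies.values())
        leader, votes = tallies.most_common(1)[0]
        if votes * 2 > total or len(tallies) == 1:
            return leader
        # eliminate the weakest; their ballots transfer next round
        candidates.remove(min(tallies, key=tallies.get))

# Plurality would elect "right" (most first-choice votes), but transfers
# from the eliminated "moderate" give "left" a true majority.
ballots = (
    [["right", "moderate"]] * 4 +
    [["left", "moderate"]] * 3 +
    [["moderate", "left"]] * 2
)
print(instant_runoff(ballots))  # -> left
```

Note how the single-mark plurality outcome and the preferential outcome differ on the very same electorate, which is the whole argument in nine ballots.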

NGO worth checking out: http://www.fairvote.org/

Aside from Congress and the American people, I also question Obama’s 2-week absence and the spotty coverage of the mainstream press (our vaunted Fourth Branch of government). Why hadn’t Obama, his minions, or the press been driving attention to concrete consequences, rather than abstractions like “Americans will be hurt”? As much as I care about Americans, I wanted to know the direct effects of The Shutdown™ on me. Over the last two weeks, I’d found very little, and only last night did the deluge of stories about what will and won’t be affected by The Shutdown™ come out. If only there had been an easy-to-use website for a normal guy like me to drive attention to that issue…

Upgrade your gray matter

I’ve wanted to release a useful piece of software as open source for a long time, but work (both employment and entrepreneurial) had always preoccupied me. As my startup Blocvox matured, I saw in it the possibility to fulfill this desire. Recently my task list has been clearing up, so I spent the necessary few hours isolating, documenting, and polishing some code for you, dear reader. Blocvox is fairly well decoupled and many parts are ripe for packaging, but one piece stood out as particularly useful and well-contained. I give you ScrambledBrains.GrayMatter.

Blocvox uses a command-query responsibility separation (CQRS) architecture to simplify both the domain model and the user interface code, and to enhance performance by computing derived values once at write-time rather than repeatedly at read-time. ScrambledBrains.GrayMatter plays a major role in connecting these two sides of the Blocvox brain, allowing them to remain simple and agnostic.

This is a custom Castle Windsor facility that allows easy registration of event handlers and high-performance, decoupled attachment at resolve-time. On the registration side, the package provides a strongly-typed fluent API and a reflection-friendly API. On the resolution side, the facility dynamically constructs, compiles, and caches the code needed to add event handler Delegate instances to events, removing the reflection performance hit. It also wraps the delegate to provide “just-in-time” resolution of the subscriber, avoiding the up-front resolution of huge dependency graphs that might never be used. This takes advantage of, and reinforces, the decoupled semantics of the event keyword.

Call me old-fashioned, but I find event to be the Way to do decoupled event-driven programming in C#. No reactive/observer/observable ceremony and threading black boxes. And no God-awful pub-sub/service bus God objects. (Unless you’re dealing with significantly disparate business units, pub-sub is doing it oh-so-painfully-unnecessarily-wrong. But it’s easy, right?) Despite its own warts, nothing is as idiomatic, well-understood, and portable as event for decoupling event providers from subscribers.
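GrayMatter itself is C# built on Castle Windsor, so what follows is only a rough, language-shifted sketch (in Python, with invented names like `EventHub` and `VoxPostedHandler`) of the just-in-time subscriber resolution idea: register a factory, and construct the subscriber only when an event it handles actually fires.

```python
# Sketch of just-in-time subscriber resolution: subscribers are registered
# by factory, and a factory is invoked (then cached) only on the first
# delivery of an event it handles, not when the application wires up.

class EventHub:
    def __init__(self):
        self._factories = {}  # event name -> list of subscriber factories
        self._resolved = {}   # factory -> resolved subscriber (cache)

    def subscribe(self, event, factory):
        self._factories.setdefault(event, []).append(factory)

    def publish(self, event, payload):
        for factory in self._factories.get(event, []):
            if factory not in self._resolved:
                # just-in-time resolution: build the subscriber (and its
                # dependency graph) only now that it is actually needed
                self._resolved[factory] = factory()
            self._resolved[factory].handle(payload)

class VoxPostedHandler:  # hypothetical subscriber, for illustration
    instances = 0
    def __init__(self):
        VoxPostedHandler.instances += 1
    def handle(self, payload):
        print(f"read model updated for vox {payload!r}")

hub = EventHub()
hub.subscribe("vox_posted", VoxPostedHandler)
assert VoxPostedHandler.instances == 0  # nothing resolved at wiring time
hub.publish("vox_posted", 42)
hub.publish("vox_posted", 43)
assert VoxPostedHandler.instances == 1  # resolved once, on first delivery
```

The real facility does this with compiled delegates attached to C# events rather than a dictionary dispatch, but the lifecycle benefit is the same: wiring is cheap, and unused subscribers cost nothing.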

It’s not a lot of code, but I’ve found it to be immensely valuable, primarily in keeping the cognitive load of the codebase low. I hope it is of use to others. I plan to maintain and improve it, but I don’t foresee drastic changes in the near future.

I leave you with these timeless words from a futuristic poet: “Upgrade your gray matter, ’cause one day it may matter.”

Decentralizing trust on the web

Update 2013-10-30: Yes, I am an idiot. This is all moot, as SSL’s single-point-of-failure is mitigated by cipher suites using perfect-forward secrecy. Carry on, nothing to see here.


I’d like to sketch out an idea for a practical, improved encryption mechanism on the web. As it stands, HTTPS relies on SSL key certificates, which are “endowed” with trustworthiness by certificate authorities. There are relatively few certificate authorities on the whole of the Internet. Because a compromise of those authorities means a nefarious agent can create and sign authentic-looking certificates and then perpetrate a man-in-the-middle attack on any so-protected web server, I contend the current state of web encryption concentrates trust in too few single points of failure.

I propose replacing/augmenting this centralized trust model with the decentralized one of asymmetric public-key cryptography. Rather than one-key-serves-all, in public-key cryptography each communicating pair has their own key set. As a practical requirement, I propose relying on HTTP or HTTPS as the transport, but encrypting all request and response bodies with the parties’ individual public keys. Ideally, support is built into the browser, but short of that (or in the interim) we can use browser extensions/add-ons to hook into request/response events and perform the encryption/decryption.

Browsers that support this would notify the web server with an HTTP request header, perhaps X-Accepts-Key, whose value is a supported public key’s fingerprint. This would allow the server to look up the supported public key via fingerprint. Such browsers could also send messages encrypted with the server’s public key and indicate this in the request with the header X-Content-Key specifying the server’s key fingerprint. Likewise, server responses would include X-Content-Key to indicate the user’s public key. These headers should be considered alongside other HTTP content negotiation parameters (influenced by request Accept* headers and specified in response Content* headers) in determining HTTP cacheability.

Web servers will have to retrieve the public key specified in the request headers. I do not propose an exact mechanism for this, but a simple approach would be to allow users to associate a public key with their “user account” (i.e. their unique security identity), either by POST-ing a web form over plain-old HTTPS, or perhaps in person at a corporate or field office! (I imagine a market for physical key delivery could crop up if the public demanded it… think armored trucks and banker’s boxes.) Likewise, the server will provide a public key to the user/user-agent; this keypair should be unique to the user, to provide enhanced security. (Users can check this among themselves by comparing the keys used in their individual communications with the server.)
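As a rough sketch of that server-side lookup, using stdlib hashing only: the key bytes and account name below are placeholders, and fingerprinting as a hex SHA-256 of the encoded key is just one plausible choice, not part of the proposal itself.

```python
# Server keeps a fingerprint -> (account, public key) map built from keys
# users registered with their accounts; the X-Accepts-Key request header
# selects which key to encrypt the response body with.

import hashlib

def fingerprint(public_key_bytes):
    # one plausible fingerprint scheme: hex SHA-256 of the encoded key
    return hashlib.sha256(public_key_bytes).hexdigest()

registered_keys = {}

def register_key(account, public_key_bytes):
    registered_keys[fingerprint(public_key_bytes)] = (account, public_key_bytes)

def key_for_request(headers):
    fp = headers.get("X-Accepts-Key")
    entry = registered_keys.get(fp)
    # None means the client sent no usable key: fall back to plain HTTPS
    return entry[1] if entry else None

alice_key = b"-----BEGIN PUBLIC KEY----- (placeholder) -----END PUBLIC KEY-----"
register_key("alice", alice_key)

request_headers = {"X-Accepts-Key": fingerprint(alice_key)}
assert key_for_request(request_headers) == alice_key
assert key_for_request({}) is None  # no header: plaintext HTTPS fallback
```

The same map handles the X-Content-Key header on incoming encrypted requests: the server looks up its own per-user private key by the fingerprint the client echoed back.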

Servers should also support the OPTIONS request and include something like X-Allows-Encryption: rfc4880. Both server and user agent should dynamically fall back to “plaintext” HTTPS when either side lacks support. In particular, due to the non-idempotency of certain HTTP methods, URLs of encrypted requests should first be OPTIONS-ed. Unfortunately, OPTIONS is not cacheable, but this overhead is a small price to pay when security is paramount. It would be nice to simply try the encrypted request and rely on the server to properly reject it with a 400 (which would indicate the need to retry without encryption), but it’s conceivable that the semantics of certain resources do not allow differentiation between plain and cipher text (PUT-ing or POST-ing binary data).

Ultimately, while not being the end-all of web security, this seems to me to add a “pretty good” layer of complexity to existing security conventions. Of course, I’m no security or cryptography expert, so I can’t assert the merit of this idea. But that doesn’t stop me from discussing and thinking about this important issue.


Update, 2013-10-03: Perhaps another alternative would be to make it easier for users to act as certificate authorities, and for websites to create unique per-user SSL certificates that the user can then sign. For example, a new user will log into a website protected by an SSL certificate signed by a third-party “trusted” authority. The website will then, perhaps in automated fashion, create a unique SSL certificate for that user and request that the user act as a certificate authority and sign it. Thereafter, the user will access the website via a unique subdomain, perhaps of the form https://<username>.example.com. While this leaves the initial certificate-signing stage protected by only a common (single point of failure) SSL certificate, it creates a proliferation of SSL certificates thereafter, and entities attempting cracking or coercion will have significant difficulty conducting mass surveillance.
