Who ordered the scrambled brains?

Documenting the only time in his life in which he has the time to document his life.

Happy New Year!

2013 was a woozy-doozy non-stop code-binging adventure, but I’m proud to finally say “Blocvox is alive!” In the end, I managed to chew all that I’d bitten off only with the tremendous support of my wonderful girlfriend. I couldn’t possibly be more humbled by, or thankful to, her.

In a nutshell, Blocvox allows us ordinary folks to bring attention to the issues that matter to us, by combining our individual voices into powerful collective voices that are hard to ignore. It functions like the web’s town square: a convenient, democratic place to take a stand with your causes, tell their story, and address other groups. I invite you all to sign up and weigh in on the world, because it’s just too important to leave to politicians, celebrities, and the mainstream press.

I’m looking forward to 2014 and the next chapter of Blocvox! Happy new year to you all!

Speaking of which, the >americans voxed on Blocvox:


Happy New Year, world!

May tolerance and understanding of others allow us to leverage our differences to make the most of 2014 and the challenges it holds!

The puzzle of Japan

Few things fascinate me as much as sociology, especially when looked at over long timespans. My interest began while taking an evolutionary anthropology course at UCLA, which applied evolutionary processes to human behavior, and it deepened as my experience as a software developer trained me to critically identify generalizations and abstractions. I credit Dawkins’ memetics (imperfect as it is) with crystallizing this fascination.

Anyway, I just read an interesting article in this realm about the widespread drop in interest in sex among younger Japanese. It’s a striking account of an apparent cultural existential crisis, in which the author surveys cultural and governmental opinions and then portrays several individual stories.

The all-or-nothing work culture for women—if you get married, your career is over—carries over to the men: if you get married, you have to solely bear the burden of income for your family, despite the exorbitant cost of living. So the disinterest in sex seems inevitable, and it’s hard to blame them. I speculate that this lifestyle will not be as rewarding as the current youth think. It replaces the huge demands from society with a simpler, attainable, self-serving ethos, all about having time to shop, go on vacation, earn money for yourself, etc. While not applicable to every individual, I think humans find longer-term satisfaction in contributing to something greater, such as family or society, and I wonder how happy these people will be in old age. One could argue their professional life is a contribution to something bigger, but since that is involuntary and tied to selfish ends, I don’t think it counts (though jobs outside the high-salary limelight could count).

The desire to contribute to something bigger, however it comes about, actually promotes individual survival and quality of life, since it leads to strong group bonds and the benefits of cooperation. A society of outlaws or anarchists will have difficulty enduring, because they’d be fighting an uphill battle against those that, however it comes about, prefer cooperation. The recent Japanese shift away from procreation can also be framed within evolution, though it may appear ironic. A healthy organism comprises a set of internal organs that harmoniously promote each other and meet perceived environmental constraints. The heart benefits the brain benefits the skin, etc., just as various economic sectors and cultural movements contribute to each other to create a resilient, vibrant society. When external factors (appear to) change, internal systems can be thrown out of balance. In Japan, the elevation of sexual equality, imported I would guess from America and Europe, has altered the behavior of the Youth system, such that they no longer find old family customs attractive. But just as organs respond to environmental demands to promote the survival of the organism, I would guess a successive generation of Japanese (though perhaps smaller in number) will naturally identify and react to the deficiencies of the prior generation. “Look at all those unhappy old people that spent their whole lives serving themselves and are now dying alone. Sure they may have sustained our economy, but there has to be a better way!” Like a pendulum that has reached its highest point, they will correct those deficiencies by effecting shifts in the culture.

This might not occur in the very next generation, but I do think it will eventually happen. The basic (more philosophical than scientific) idea is that whatever children are born will have parents who reject, to some degree, the recent cultural shift away from family. That rejection will likely be passed from the parents to the children. At the same time, those in society who accepted the all-or-nothing work culture and did not procreate will have no one to propagate the all-or-nothing ideal to. The principle here is that belief propagation through family causes societies to tend toward a sustainable culture. Observing a rebound in the value of family in future Japanese generations will be fascinating, and will exemplify how multiple human generations correct for each other to adapt their culture toward sustainability. Of course, such a model ignores the influence of non-family learning, as well as the social effect of increasing globalization.

The kink here is that these successive generations of Japanese must identify strongly enough as Japanese. If they were to shed that identity, and perhaps move to other places where their beliefs (a balance between family and work equality) are already widely held, then the Japanese may fall into some more dire, erratic situation. Revolution? Severe economic depression? This all calls into question why the current generation, who prize equality, do not move to other countries where they can be treated equally but also pursue substantial romantic relationships. My only thought is a perceived or real language barrier. I have heard that Japanese are generally embarrassed about practicing English. This might in fact be one of those mutually-reinforcing ideas that allows the entirety of the current overall culture to endure.

I liked the static glimpse into another culture the article gave, but I found it puzzling and incomplete. (In fact, contributing to this kind of cross-cultural understanding is one of the reasons I’m so driven to develop and grow Blocvox.) The article left me wondering how this unequal, all-or-nothing work culture has remained as rigid as it has to date. I would think a loosening of standards would be required to attract, hire, and retain a workforce that is fed up with those impossible pressures. I wonder if this rigidity and sense of order might be a relic of a post-war identity crisis. I also wonder how the young and the old each feel about the concerns the other holds about them. Of the youth, I wonder how they feel about the declining birth rate and their responsibility toward the Japanese nation and culture to procreate. Of the old, I wonder how they feel about being responsible for bringing about this unexpected outcome. Hopefully one day I’ll be able to ask them directly.

Shutdown plurality voting!

I’m on the Congress-hating bandwagon, but I’m also on the we-have-no-one-to-blame-but-ourselves bandwagon, up on the roof of it for a while, jumpin’ and shoutin’ (and tweetin’):

  • Government shuts down, due to
  • Ideological fundamentalism in Congress, due to
  • Ideological fundamentalists voted into Congress by the American people, due to
  • Dysfunctional, big money, two-party political system generating few election options, due to
  • Widespread use of simplistic Plurality Voting system based on single-mark ballots, due to
  • It being used when the country was founded and now being ingrained in our culture, due to
  • Single-mark ballots being easy to tabulate by hand.

Yes, in 2013, after landing rovers on Mars, sequencing the human genome, and creating a machine that can beat humans on Jeopardy!, Americans still use a simplistic system based on single-mark ballots and plurality voting because historically they were easier to count by hand!

Fortunately the Constitution (and local and state law) can be changed. ;) Do we want to move to more evolved, robust systems, such as Preferential Voting (ranked voting) or Proportional Representation (wherever possible), to create a more representative, satisfactory, efficient government? Or do we want to keep our ‘merican gladiator system that produces ideological meatheads who’d rather fight on camera than solve problems?

Read about it for yourself: proportional representation and single transferable vote (fancy for “preferential, ranked voting”).
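If the mechanics of ranked ballots sound abstract, here’s a minimal sketch of instant-runoff counting (in C#, because that’s what I write all day). The ballots and candidates are made up, and real elections add tie-breaking and exhausted-ballot rules that I’m skipping:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A ballot is simply the voter's candidates in order of preference.
var ballots = new List<List<string>>
{
    new() { "A", "B", "C" },
    new() { "B", "A", "C" },
    new() { "C", "B", "A" },
    new() { "A", "C", "B" },
    new() { "B", "C", "A" },
};

Console.WriteLine(RunInstantRunoff(ballots, new HashSet<string> { "A", "B", "C" }));

static string RunInstantRunoff(List<List<string>> ballots, HashSet<string> remaining)
{
    while (true)
    {
        // Tally each ballot toward its highest-ranked candidate still in the race.
        var tally = remaining.ToDictionary(c => c, _ => 0);
        foreach (var ballot in ballots)
        {
            var choice = ballot.FirstOrDefault(c => remaining.Contains(c));
            if (choice != null) tally[choice]++;
        }

        var leader = tally.OrderByDescending(kv => kv.Value).First();
        if (leader.Value * 2 > tally.Values.Sum() || remaining.Count == 1)
            return leader.Key; // outright majority (or last candidate standing) wins

        // No majority yet: eliminate the weakest candidate and recount.
        remaining.Remove(tally.OrderBy(kv => kv.Value).First().Key);
    }
}
```

The upshot: a vote for a long-shot first choice isn’t wasted, because when that candidate is eliminated the ballot simply counts toward its next surviving choice.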

NGO worth checking out: http://www.fairvote.org/

Aside from Congress and the American people, I also question Obama’s 2-week absence and the spotty coverage of the mainstream press (our vaunted Fourth Branch of government). Why hadn’t Obama, his minions, or the press been driving attention to concrete consequences, rather than abstractions like “Americans will be hurt”? As much as I care about Americans, I wanted to know the direct effects of The Shutdown™ on me. Over the last two weeks, I’d found very little, and only last night did the deluge of stories about what will and won’t be affected by The Shutdown™ come out. If only there had been an easy-to-use website for a normal guy like me to drive attention to that issue…

Upgrade your gray matter

I’ve wanted to properly release a useful piece of software as open source for a long time, but work (both employment and entrepreneurial) had always preoccupied me. As my startup Blocvox matured, I saw in it the possibility to fulfill this desire. Recently my task list has been clearing up, so I spent the necessary few hours isolating, documenting, and polishing some code for you, dear reader. Blocvox is fairly well decoupled and many parts are ripe for packaging, but there was one piece that stood out as particularly useful and well-contained. I give you, Blocvox.GrayMatter.

Blocvox uses a command query responsibility segregation (CQRS) architecture to simplify both the domain model and the user interface code, and to enhance performance by computing derived values once at write-time rather than repeatedly at read-time. Blocvox.GrayMatter plays a major role in connecting these two sides of the Blocvox brain, allowing them to remain simple and agnostic of each other.
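If you haven’t bumped into CQRS before, here’s a rough sketch of the shape it takes. The names below (VoxService, VoxPublished, VoxListDenormalizer) are purely illustrative, not actual Blocvox types:

```csharp
using System;
using System.Collections.Generic;

// Illustrative names only -- not actual Blocvox types.
public class VoxPublished : EventArgs
{
    public Guid VoxId { get; init; }
    public string Text { get; init; }
}

// Write side: the domain model validates the command and announces what happened.
public class VoxService
{
    public event EventHandler<VoxPublished> Published;

    public void PublishVox(Guid voxId, string text)
    {
        if (string.IsNullOrWhiteSpace(text))
            throw new ArgumentException("A vox needs some content.", nameof(text));

        // ...persist to the write store here...
        Published?.Invoke(this, new VoxPublished { VoxId = voxId, Text = text });
    }
}

// Read side: a subscriber precomputes whatever the UI needs, once, at write time.
public class VoxListDenormalizer
{
    private readonly List<string> _recentVoxes = new();

    public void Handle(object sender, VoxPublished e) =>
        _recentVoxes.Insert(0, e.Text); // derived view is ready to be queried cheaply
}
```

The write side validates commands and announces facts; the read side precomputes whatever the UI needs. GrayMatter’s job is wiring the two together without either knowing about the other.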

This is a custom Castle Windsor facility that allows easy registration of event handlers, and high-performance, decoupled attachment of them at resolve-time. On the registration side, the package provides a strongly-typed fluent API and a reflection-friendly API. On the resolution side, the facility dynamically constructs, compiles, and caches the code needed to add event handler Delegate instances to events, removing the reflection performance hit. It also wraps the delegate to provide “just-in-time” resolution of the subscriber, to avoid resolving huge dependency graphs up front that might never even be used. This takes advantage of, and reinforces, the decoupled semantics of the event keyword.
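The facility’s API is tailored to Blocvox, so rather than paste it here, here’s a standalone sketch (not the actual GrayMatter source) of the central trick: building and caching a compiled “attacher” per event with expression trees, so the reflection cost is paid once instead of on every resolve:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq.Expressions;
using System.Reflection;

// Sketch of the core idea: instead of calling EventInfo.AddEventHandler (reflection)
// every time, compile a tiny attacher delegate once per event and cache it.
public static class EventAttacherCache
{
    private static readonly ConcurrentDictionary<EventInfo, Action<object, Delegate>> Cache = new();

    public static void Attach(object publisher, string eventName, Delegate handler)
    {
        var evt = publisher.GetType().GetEvent(eventName)
                  ?? throw new ArgumentException($"No event '{eventName}' on {publisher.GetType()}.");

        var attacher = Cache.GetOrAdd(evt, e =>
        {
            // Builds: (object target, Delegate d) => ((TPublisher)target).Event += (THandler)d;
            var target = Expression.Parameter(typeof(object), "target");
            var del = Expression.Parameter(typeof(Delegate), "handler");
            var call = Expression.Call(
                Expression.Convert(target, e.DeclaringType),
                e.GetAddMethod(),
                Expression.Convert(del, e.EventHandlerType));
            return Expression.Lambda<Action<object, Delegate>>(call, target, del).Compile();
        });

        attacher(publisher, handler); // no further reflection on this hot path
    }
}
```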

Call me old-fashioned, but I find event to be the Way to do decoupled event-driven programming in C#. No reactive/observer/observable ceremony and threading black boxes. And no God-awful pub-sub/service bus God objects. (Unless you’re dealing with significantly disparate business units, pub-sub is doing it oh-so-painfully-unnecessarily-wrong. But it’s easy, right?) Despite its own warts, nothing is as idiomatic, well-understood, and portable as event for decoupling event providers from subscribers.

It’s not a lot of code, but I’ve found it to be immensely valuable, primarily in keeping the cognitive load of the codebase low. I hope it is of use to others. I plan to maintain and improve it, but I don’t foresee drastic changes in the near future.

I leave you with these timeless words from a futuristic poet: “Upgrade your gray matter, ’cause one day it may matter.”

Decentralizing trust on the web

Update 2013-10-30: Yes, I am an idiot. This is all moot, as SSL’s single point of failure is mitigated by cipher suites using perfect forward secrecy. Carry on, nothing to see here.


I’d like to sketch out an idea for a practical, improved encryption mechanism on the web. As it stands, HTTPS relies on SSL certificates, which are “endowed” with trustworthiness by certificate authorities. There are relatively few certificate authorities on the whole of the Internet. Because a compromise of one of those authorities means a nefarious agent can create and sign their own authentic-looking certificates and then perpetrate a man-in-the-middle attack on any so-protected web server, I contend the current state of web encryption depends on too few points of failure.

I propose replacing/augmenting this centralized trust model with the decentralized one of asymmetric public-key cryptography. Rather than one-key-serves-all, in public-key cryptography each communicating pair has their own key set. As a practical requirement, I propose relying on HTTP or HTTPS as the transport, but encrypting all request and response bodies with the parties’ individual public keys. Ideally, support is built into the browser, but short of that (or in the interim) we can use browser extensions/add-ons to hook into request/response events and perform the encryption/decryption.

Browsers that support this would notify the web server with an HTTP request header, perhaps X-Accepts-Key with a value of a supported public key’s fingerprint. This would allow the server to look up the supported public key via its fingerprint. Such browsers could also send request bodies encrypted with the server’s public key and indicate this in the request with the header X-Content-Key specifying the server’s key fingerprint. Likewise, server responses would include X-Content-Key to indicate the user’s public key. These headers should be considered alongside other HTTP content negotiation parameters (influenced by request Accept* headers and specified in response Content* headers) in determining HTTP cacheability.
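To make that concrete, here’s a hypothetical client-side sketch. The header names come from the proposal above; the fingerprint values and the EncryptForServer/DecryptFromServer helpers are placeholders for a real OpenPGP implementation:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical sketch of the proposed headers in use on the client side.
class EncryptedBodyClientSketch
{
    private static readonly HttpClient Http = new();

    public static async Task<byte[]> PostEncryptedAsync(Uri uri, byte[] plaintextBody)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, uri)
        {
            Content = new ByteArrayContent(EncryptForServer(plaintextBody))
        };

        // "I can decrypt responses encrypted to this key of mine." (placeholder fingerprint)
        request.Headers.TryAddWithoutValidation("X-Accepts-Key", "a1b2c3d4e5f60718");
        // "This request body is encrypted to the following server key." (placeholder fingerprint)
        request.Headers.TryAddWithoutValidation("X-Content-Key", "0f1e2d3c4b5a6978");

        var response = await Http.SendAsync(request);
        return response.Headers.Contains("X-Content-Key")
            ? DecryptFromServer(await response.Content.ReadAsByteArrayAsync())
            : await response.Content.ReadAsByteArrayAsync(); // server fell back to plain HTTPS

        // Placeholders -- real code would call into an OpenPGP library.
        static byte[] EncryptForServer(byte[] data) => data;
        static byte[] DecryptFromServer(byte[] data) => data;
    }
}
```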

Web servers will have to retrieve the public key specified in the request headers. I do not propose the exact mechanism for this, but a simple approach would be to allow users to associate a public key with their “user account” (i.e. their unique security identity), either by POST-ing a web form over plain-old HTTPS—or perhaps in person at a corporate or field office! (I imagine a market for physical key delivery could crop up if the public demanded it… think armored trucks and banker’s boxes.) Likewise, the server will provide a public key to the user/user-agent; this keypair should be unique to the user to provide enhanced security. (Users can check this among themselves by comparing the keys used in their individual communications with the server.)

Servers should also support the OPTIONS request and include something like X-Allows-Encryption: rfc4880. Both server and user agent should dynamically fall back to “plaintext” HTTPS when either side lacks support. In particular, due to the non-idempotency of certain HTTP methods, URLs of encrypted requests should first be OPTIONS-ed. Unfortunately, OPTIONS responses are not cacheable, but this overhead is a small price to pay when security is paramount. It would be nice to simply try the encrypted request and rely on the server to properly reject it with a 400 (which would indicate the need to retry without encryption), but it’s conceivable that the semantics of certain resources do not allow differentiation between plain and cipher text (PUT-ing or POST-ing binary data, for example).
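Continuing the hypothetical sketch, the preflight might look something like this, with the X-Allows-Encryption header taken from the proposal above:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

static class EncryptionPreflight
{
    // Only send an encrypted body if the server's OPTIONS response advertises support.
    public static async Task<bool> ServerSupportsEncryptedBodiesAsync(HttpClient http, Uri uri)
    {
        var response = await http.SendAsync(new HttpRequestMessage(HttpMethod.Options, uri));
        return response.Headers.TryGetValues("X-Allows-Encryption", out var values)
               && values.Contains("rfc4880", StringComparer.OrdinalIgnoreCase);
    }
}
```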

Ultimately, while not the be-all and end-all of web security, this seems to me to add a “pretty good” layer of difficulty for attackers on top of existing security conventions. Of course, I’m no security or cryptography expert, so I can’t assert the merit of this idea. But that doesn’t stop me from discussing and thinking about this important issue.


Update, 2013-10-03: Perhaps another alternative would be to make it easier for users to act as certificate authorities, and for websites to create unique SSL certificates per-user that the user can then sign. For example, a new website user will log into a website that is protected with an SSL certificate signed by a third-party “trusted” authority. The website will then, perhaps in automated fashion, create a unique SSL certificate for that user and request that the user act as a certificate authority and sign the certificate. Thereafter, the user will access the website via a unique subdomain, perhaps of the form https://<username>.example.com. While leaving the initial certificate-signing stage protected by only a common (single point of failure) SSL certificate, this does create a proliferation of SSL certificates thereafter, and cracking/forceful entities will have significant difficulty conducting mass surveillance.

Sexism in tech

This is certainly a hot-button issue. I’ve seen an increased focus and willingness to acknowledge and address this issue throughout the tech community, but many still deny that it exists. Much has been said of the inertia of male privilege and the meritocratic ideals of the tech industry, both of which are invariably characterized sociologically, as some metaphysical Force that operates on a level separate from the individual. At the same time, the sociological/systemic problem continues to be defined as the aggregate of interpersonal—not societal—failures (an authority figure turning a blind eye to degrading behavior, men telling insensitive or sexist jokes, etc.). This interpersonal dimension is not consciously discussed, leading to the kind of stall in resolving group inequity that seems so common. The outlines of many inter-group conflicts are drawn in broad sociological terms, which motivates political action to achieve a negotiated level of structural equity. But if the focus never shifts to interpersonal equity, progress can stall culturally and the problem can persist and stagnate. For example, because of affirmative action, equal opportunity laws, and desegregation, many today believe that race relations have achieved a satisfactory state, but most members of the affected minorities would disagree.

An article hit Hacker News today about the sexist bullying experienced by one female high school student in her computer science class. This obviously spills out of the tech field into high school culture, but generated a lot of discussion on Hacker News regardless. One comment called for men to simply accept that women have subjectively different experiences than they do. I agree, but the questions remain: why haven’t men already done this, and how do we progress from there?

I have observed that many men who have difficulty comprehending or accepting that women experience the industry so differently than they do either A) over-generalize from an exceptional interaction, or B) follow those who have over-generalized. By “A” I mean that men can rely on confirmation bias to cement their impression of the female experience based on a few choice interactions, in order to create an intellectually convenient worldview. For example, confirmation bias can allow a random chat with a well-adjusted, confident woman who appears impervious to tech sexism to dispel, for many years, any notion in that man’s mind that sexism exists in the industry. Thereafter, contradictory signals can themselves be dismissed as the exceptions and, because of cognitive dissonance, can even serve to reinforce the misconceptions. (It should be noted that even though a woman might appear impervious, she may not actually be.)

By “B” I mean that men with no relevant direct interactions with women (not uncommon given their low numbers) may follow the lead of the people with whom they associate, who are by definition men. So any confirmation bias of those men then spreads to them.

In considering such interpersonal breakdowns, what is not often recognized is that individual women have unique experiences. They are affected to varying degrees and in various ways by prejudice and ostracism. As a male, rather than tip-toe around or ignore the issue with a female colleague—allowing the assumption of the most intellectually convenient possibility—I’ve found the best heuristic for recognizing your potential to participate in and perpetuate a toxic environment is to earnestly sense, or inquire about, the nature of her individual past experience. (You may also share your own relevant experiences, if any.) Such a dialogue can help establish a common foundation and framework for maximizing the team and progressing the industry.

I believe the widespread focus on the direct, open, and individual treatment of interpersonal relationships (and moving away from the macroscopic one-experience-fits-all mentality, which lacks common sense and is susceptible to confirmation bias) is an important next step for evolving stalled relations between social groups in general.

Mobile SEO

This “SEO Cheat Sheet” has been making the rounds today. Pretty good stuff, but the second suggestion about mobile development should, at the least, be given a big asterisk if not removed entirely.

When differentiating content based on user-agent and including the “Vary: user-agent” header in responses, you are effectively disabling HTTP caching. Because of the huge number of user agent strings, neither the server’s output cache nor any CDN/intermediary cache will be effective at reducing request processing load. This is a very poor trade-off, and typically unacceptable.

If you must serve dynamic content based on user agent, the third option on the cheat sheet is probably better: use rel=canonical with separate URLs per device class. On each request, the server would still sniff the device class from the user agent string, but if the sniffed class does not match the one designated by the URL, the server 302-redirects (temporarily) to the device-specific URL; otherwise, it serves the appropriate HTML. This requires a little more programming effort, but it is usually worth it to get both caching and SEO.

I consider it a must for the server to take into account an override cookie when sniffing the device class, which the user can set through UI in the site header/footer. Also, I abhor having URLs for the same essential content vary by domain or path (it goes against the spirit of HTTP content negotiation), so I distinguish them with a simple “?lite” query parameter.
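Here’s a sketch of the decision logic I have in mind. The sniffing itself is stubbed out (that’s the part you’d hand to a real device-detection library), and the cookie values are just ones I made up for illustration:

```csharp
using System;

// Decision order: explicit override cookie first, then user-agent sniffing,
// then a temporary redirect to the canonical URL for the detected class
// ("?lite" marking the lightweight variant).
enum DeviceClass { Desktop, Mobile }

static class DeviceClassRouting
{
    // Returns the URL to 302-redirect to, or null if the requested URL already matches.
    public static string ResolveRedirect(string path, string query, string userAgent, string overrideCookie)
    {
        bool urlIsLite = query?.Contains("lite", StringComparison.OrdinalIgnoreCase) == true;

        var wanted = overrideCookie switch
        {
            "lite" => DeviceClass.Mobile,   // user explicitly asked for the lite site
            "full" => DeviceClass.Desktop,  // user explicitly asked for the full site
            _ => SniffFromUserAgent(userAgent),
        };

        if (wanted == DeviceClass.Mobile && !urlIsLite) return path + "?lite";
        if (wanted == DeviceClass.Desktop && urlIsLite) return path;
        return null; // serve this URL (and emit rel=canonical for the other variant)
    }

    // Placeholder for a real device-detection library.
    private static DeviceClass SniffFromUserAgent(string ua) =>
        ua != null && ua.Contains("Mobi", StringComparison.OrdinalIgnoreCase)
            ? DeviceClass.Mobile
            : DeviceClass.Desktop;
}
```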

Gravitational Insanity

I’m introducing a new category of brain scramblings, Physics Insanity. I’m not sure how much I’ll actually post in it, but I have on several occasions composed my thoughts about fascinating physical properties of the universe, so I might as well share them here. My point of view is less mathematical and more that of a layman seeking a broader, deeper understanding of whatever this thing that we’re part of is. Expect to see the phrase “my mind was just blown” frequently.


My mind was just blown. It turns out that if a body (like the Sun) is in motion, its gravity doesn’t pull directly back toward where the body appears to be, but toward a point ahead of that. The further away you are from the body, the further ahead of its apparent position its gravity pulls.

Background: I think at least for this concept, it’s best to think of gravity not as a field surrounding a body (like the Sun), but as a type of radiation that continuously emanates from a body. As this radiation passes through surrounding objects, they are pulled in a certain direction by it. Two things are interesting about this. First, gravitational interaction between two bodies happens indirectly, through “gravitational radiation”. Second, gravity is not an instantaneous effect, but one that travels through space, just like radiation from a nuclear meltdown or light from a bulb. Gravitational radiation travels at the speed of light, and when the mass or location of a body changes, the corresponding changes to its gravity also travel away from the body at the speed of light. For example, if the Sun were to suddenly double in mass, the increased gravitational pull would not be instantaneously felt by the various planets in the Solar System. Instead, Mercury would feel the corresponding increase in gravitational pull about 3 minutes later (the same moment hypothetical Mercurians would first see any visible change in the Sun), and the Earth about 8 minutes later, because that’s how long it takes the corresponding increase in gravitational radiation to reach them.
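For the record, the delay is just distance divided by the speed of light; a quick back-of-the-envelope check of those numbers:

```latex
t = \frac{d}{c}, \qquad
t_{\text{Earth}} \approx \frac{1.5 \times 10^{11}\,\mathrm{m}}{3.0 \times 10^{8}\,\mathrm{m/s}}
\approx 500\,\mathrm{s} \approx 8.3\ \text{minutes}, \qquad
t_{\text{Mercury}} \approx 0.39 \times t_{\text{Earth}} \approx 3.2\ \text{minutes}
```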

Ok, that’s neat, gravity radiates. Makes sense given what we’ve been told about nothing (matter, energy, and mere information about physical changes) being able to travel faster than the speed of light. But there’s more to the story, so much more, dear reader. In the mundane, normal scenario of the Sun just drifting along at constant speed and direction, the direction of the gravitational pull that is radiating from it changes as it travels further away from the Sun! When you’re really close to the Sun, the gravitational radiation that passes through you pulls you straight back toward the Sun. But further away, for example at 92 million miles where Earth orbits, the gravitational radiation will have gradually changed so that it’s not pulling you straight back, but rather at a slight angle, toward the point in space the Sun will have drifted to during those 8 minutes! After traveling away from the Sun for roughly another 35 minutes, the same gravitational radiation will reach Jupiter and will have changed even more, such that it pulls toward wherever the Sun should have drifted by that time. As gravitational radiation travels away from a constantly moving body, the direction that it pulls in continuously changes so that it’s always pulling toward that body’s present expected location, not simply straight back to the location it was in when the radiation left it!!!!!1111!!1 o_O
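As I understand it, the slightly more formal statement goes like this: if the pull you feel now, at time t, left the Sun at the earlier (retarded) time t_r, then for a Sun drifting at constant velocity v the pull points toward the retarded position extrapolated forward, which is exactly where the Sun is right now:

```latex
t_r = t - \frac{\lvert \mathbf{r} - \mathbf{r}_s(t_r) \rvert}{c}
\qquad\qquad
\text{pull direction} \;\parallel\;
\bigl[\mathbf{r}_s(t_r) + \mathbf{v}\,(t - t_r)\bigr] - \mathbf{r}
\;=\; \mathbf{r}_s(t) - \mathbf{r}
\quad (\text{constant } \mathbf{v})
```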

If I explained that well, our minds are now in equivalent states of blown-ness. It’s awesome to think gravity has this dynamic nature. A crazy way to think about it is that even though we see the Sun in one location (where it was 8 minutes ago), we’re pulled by its gravity toward a different location (where it actually is right now). Apparently, the law of conservation of momentum requires that gravity behave this way (I don’t understand this point conceptually yet). Also apparently, if the Earth were always being pulled straight back toward the old location of the Sun, and likewise all inter-planetary gravitational interactions were pulling straight back toward “outdated” locations, their orbits wouldn’t be stable and the planets would fling each other out of the Solar System (assuming they would ever be able to settle into orbits, or even coalesce into existence, in the first place). Another crazy thing: each of us radiates weak gravity that, when it leaves us, is imbued with a sense of what direction we were heading in at that moment!

(I should note that gravitational radiation pulls toward wherever a body should be only as long as its speed and direction remain unchanged. If either abruptly changes, the gravitational radiation already traveling through space does not magically “know” that it needs to recalibrate to the body’s new heading. Also, what I dub “gravitational radiation” is just my way of capturing the essence of its nature linguistically. I don’t understand the intricacies of quantum gravity or anything else that attempts to explain gravity more deeply.)

A cheesy visualization: at a given moment, gravity can be thought of as spherical arrangements of miniature electric fans floating away from a moving body, capable of blowing on anything they pass by. From a massive stationary rock, the set of fans “launched” at any given moment will always be pointed straight back at the rock, no matter how far away they float. But if the rock were drifting at a constant speed and direction when the fans were launched, each one of them would slowly change the direction it blows in as it travels away from the rock, such that it is always pointing to where the rock has drifted. The rock’s very motion puts a slight spin on the fans when they launch. The faster the rock is travelling, the stronger that initial spin; the slower it’s travelling, the weaker the spin. Such is gravity.

BON. KERS.

So, when’s our first gulag opening?

The British government (aka the US’s very own Mini-Me) has taken to intimidating the *friends and family* of disruptive journalists! This appalling, abusive violation of free speech resembles the very worst from repressive democracies like Russia!

It’s disheartening that there seems to be so little chatter about this issue on social networks. Is that a ramification of the incentive structures of the networks? Or of people’s belief that a magical force (an army of unicorns?) will keep life as we know it from ever changing? Or something else?

Should web designers know how to code?

UX expert Josh Seiden recently posted his affirmative thoughts about this question. I wanted to follow up with some thoughts of my own.

To me, this question is like asking whether a writer should know about the publishing business. If you’re “web-designing” solely as an artistic, self-directed endeavor, then “no” (but then we might prefer the term web art as opposed to web design). If you’re web-designing as part of anything larger, then “as much as possible, yes.”

As a developer and (novice) designer, I strive for pragmatism in both pursuits. Having both skillsets supports this. It’s pretty self-evident that the more of a system one understands, the more concerns and constraints one can simultaneously address and balance when working in that system. Understanding the relationship between CSS and HTML, their document-oriented roots, and the interpretation differences between browsers better allows one to create flexible, idiomatic, comprehensible, and performant CSS while also supporting accessibility, SEO, and DOM and server performance.

I’ve read that the shift toward web design minimalism is about respecting the user. I agree, but I argue that it’s also about respecting the development process. Complex or pixel-perfect designs are very fragile and costly, wear on the development team, and hinder the business, usually with no marginal value over simpler designs.

Designing for web with strong knowledge of CSS is respectful of the process and environment of that work.
