Tuesday, October 2, 2012

Complex Units of Measure

I embrace the Stevie's Home approach to life, and my return home from work usually looks more like a cathartic reconciliation between Carnivorous Vulgaris and Accelleratii Incredibus than a rational re-entry to domestic life. On Monday, immobilized by my three-pronged, patent-pending tickle attack, Eli cried out, "Dad!! I exercise EVERY DAY. Why can't I escape?"

It's a wonderful question and a brilliant insight into the metadata of life. Why can't a 4-year-old beat his father at the ol' bear-hug/tickle-match? If the Olympics were an actual competition of sport, it's the sort of question you might hear in comparing the competence of two athletes. In fitness, we are near rivals except in the subfields affected by mass: Eli would need at least 3x his current mass to pose even a credible challenge. Yet in agility, core muscle strength, speed and dexterity, we are relatively equal when evaluated proportional to our weight. He is limited by experience: he lacks muscle memory, endurance and accuracy, but in young athletes these can be mitigated by a greater supply of energy, irrational optimism and emotional charge.

If we were of comparable mass, it'd be a tough call to guess the tickler vs the tickled.

Competence, like most complex units of measure (risk, efficiency, aptitude), is domain specific. The most important attribute of a competent marksman is safety; speed and accuracy are secondary. Neither speed nor accuracy becomes a positive attribute of the unit until safety is realized. Likewise, speed is irrelevant until accuracy is high.

You could simply call this efficiency: how fast we can do things well. The fast attribute is irrelevant until the well attribute measures favorably. Thus we could attempt to measure the relative competence of anyone at anything by evaluating for efficiency. I suspect that the domain-specific nature of most of what we do in our lives makes this kind of abstraction difficult, at best.

That is all. I've been stewing over this particular introspection for some time. It hasn't altered my view of the world, though it has changed the way I view it--and with any luck, I hope to better quantify that view in the future.


Tuesday, September 25, 2012

Communicating and Why I Quit Email (Episode 2)

A few months have passed since I quit email, and (pleasantly enough [to me]) little has changed. I intervened twice: once to respond to a Craigslist buyer and complete a sale, and once to respond to a critical email about my recent book (hint: it seems to be even more of a stinker than my first). In the first week, I continued checking my personal email a few times a day--worried that I might have missed something valuable. By the second and third weeks, I had begun checking only daily or every other day. By the end of the first month, I stopped checking entirely.

Perhaps the most delightful part of this experiment is the knowledge that if people really want to talk to you, they will automatically route around perceived obstacles in communication. My immediate family immediately switched to GChat, SMS or semaphore.

But the world can suffer no dream too much before crushing it. Problems abide.

  1. Authentication is a nightmare. We are forced to self-identify as Homo sapiens by email, then by CAPTCHA, and then by the email-click-the-link-to-really-prove-you-exist dance. Account recovery is entirely email dependent. A few bright spots exist if you've already used email to establish an OAuth account for single sign-on (like a Google account for cross-authentication with StackOverflow); but, for most of the web, authentication is email's butler.
  2. The market provides no single, viable alternative platform for sharing links. Certainly, there are atrocities like Facebook and Google+, the failed Delicious and its failed alternatives; the closest approximation I've found, Kippit, meets the need but suffers from the email barrier to entry--which is just too great to convert 99% of friends and family. Sign-up is the death knell of an integration service, but feel free to do better. Having failed to convert anyone to Kippit, I'm declaring link sharing a complete loss: the only end-to-end failure of the experiment so far.
  3. The competition for push/pull/any notifications has yet to produce even a second-place victor. If you want to list something on eBay or Craigslist and you don't want to use email, you've got a Sisyphean climb up the wrong side of a Pythagorean equation. Sure, there's Boxcar and all of the failed Open Pushes of the web, but they've either reinvented spam or vaporized into the ether of the venture capital bubble surrounding us. Projects like If This Then That inspire some hope that at least ideological convergence on a new way of thinking is possible, but if you're not short selling Optimism, perhaps you should be.
  4. Email contacts are really just hyperlinks, but they're the worst sort of hyperlink. I can still follow a link to my first Geocities page through the Internet Wayback Machine and touch it, were I so masochistically inclined. A dead email address? Less valuable than the bytes it occupies. I've been collecting photos of the good ole days, but forget about trying to share them with former comrades--those old email addresses are a one-way trip.
None of these issues is enough to deter me from my pursuit of the Questing Beast. I plan to permanently delete my email account at the end of the year (terminating my auto-vacation responder, which directs inquiries to my last post on this subject).

Kwin Kramer's The Cloud I'd Like to See planted the seed of inspiration for Episode 3 of this series: How to Communicate in a Post-Email World and/or How I Learned to Stop Listening and Love the Shockwave.

Sunday, July 8, 2012

Communicating and Why I Quit Email (Episode 1)

In the early 90s, the constraints imposed by dial-up speeds made email my primary interface for interacting with the Internet. Services like Juno made it even easier to connect, subscribe to listservs, and surf the self-curated content that was the Web--in my inbox. Had I been born a little earlier than 1980, or had I a more computer-sciency bias imposed by my parents or peers, I might just as easily have started with Usenet. But. I did not, and email was my entry point to online communications.

In the early days of AOL, they implemented a fantastic feature in their proprietary email client: when sending email between AOL account holders, you could see the read status of the message and, if the message was unread, you could remotely delete it. These two features have been rabble-rousing in my head as I approach the first month of my hiatus from personal email.

That is correct. I quit email.

For me, the problem with IMAP/POP3/SMTP (Exchange/ActiveSync, BlackBerry, iCloud, etc)... the problem with email is that it doesn't solve any real (actual or imagined) problem well. Email provides this thin service:
  1. The Sender is allowed to send any arbitrary text message (permitted length not always known/guaranteed, UI rendering of content not known/guaranteed), with optional binary attachment(s) (sometimes permitted; availability to the Recipient depends, unknowably, on the client), to a valid Recipient (string'@'string'.'string).
  2. If the Sender sends a valid message to an invalid Recipient, or if a trapped error occurs during delivery, the mail exchanger will return an error.
  3. The Recipient is guaranteed to receive all valid messages sent to the Recipient's address, if the delivery attempt succeeded.
That's it. While various clients (GMail, Thunderbird, etc) have (sometimes greatly) improved upon components of email's core weaknesses, none has succeeded in being more than a really good band-aid on an untenable platform. I have used email to try to solve notifications, scheduling, reminders, planning, archiving, bartering, trading and so, so on. Email has failed to solve no problem quite so spectacularly as it has failed to solve the problem of communication.
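
To appreciate just how thin that contract is, consider that the entire notion of a "valid Recipient" in item 1 reduces to a shape. A deliberately naive sketch (a hypothetical helper, purely illustrative--real address validation per RFC 5322 is vastly messier):

    // Roughly all that "valid Recipient (string'@'string'.'string)" demands.
    function looksLikeRecipient(address) {
      return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(address);
    }

    looksLikeRecipient('someone@example.com'); // true
    looksLikeRecipient('not-an-address');      // false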

Ground -1. One-on-one physical, verbal, direct, close-proximity communication with my wife. I get:
  • To Start a dialog
  • Immediate, visual cues that what has been said has been heard, acknowledged and understood
  • Immediate opportunity to correct failures in hearing, acknowledgement or understanding of a given item
  • To End a dialog
Simply by speaking one-on-one, we can quickly define the problem ("What to order for dinner?"), explore the problem ("What did we eat last night? What are we in the mood for? What will the kids eat?"), and solve the problem ("Pizza."). At every exchange in the cycle, we know whether we've been heard, whether we've resolved the problem, and when we're done.

Anyone who manages or co-manages an organization larger than one person implicitly knows the right tools for communicating with their team about daily chores: dinner, shopping, house/car/lawn maintenance, changing diapers vs. feeding the baby. No sane person ever selects email as the tool for this conversation. Yet the same people, with the same implicit understanding about the nature of communicating, will (without thinking) choose email to communicate about the organization of office lunches or product failures.

As inferior a tool as email is for communicating simple tasks, its failure rate begins parabolizing as the complexity of the subject matter increases. Would you pick email out of the toolbox to approach the "I forgot our anniversary" subject? Natural language communication makes it possible to seamlessly transition from apology, to night on the couch, to reckless (and expensive) expression of love, to night on the couch, to sincere and heartfelt apology, to forgiveness.

Communication itself is inherently volatile. Under optimal conditions, I routinely fail. Spectacularly. Our individual capacity to communicate--and our maturity in the component skills: reading subtlety and subtext, expressing nuance, employing empathy, listening, speaking, not-speaking--varies wildly by age, time of day, coffee levels and mood. Choosing the right communication tool doesn't give your skillset a shot of human growth hormone, but choosing the wrong tool can be the equivalent of voluntarily amputating your pitching arm, and choosing email as the tool is akin to amputating both arms and batting with your mouth.

To my mind, Google Wave's ultimate failure rested not on the poor execution of the platform itself but on the non-existent explanation of the problem Wave was intended to solve. Natural communication incorporates all of our senses with real-time, tactile feedback. Any attempt to bring communication online must preserve most of what we get out of the box from Nature, and it must incorporate many of the new tools we have implemented in the digital space.

Email does not do this, nor can email do this. I do not believe that standards can solve the issue for email either. In fact, I cannot envision a solution at all.

However. I can stop participating in the problem. By selecting email out of my digital gene pool, I am forced to find tools that actually solve my communications issues.

  • Yes, it does mean the proliferation of narrowly focused tools.
  • Yes, it's inconvenient for 100% of the people with whom I communicate regularly. 
  • Yes, no one but me likes this idea.
  • Yes, next to no one will stop trying to use email to communicate as a result of this extroversion.
  • Yes, my spam folder is overflowing with "assisted suicide" offers as a direct result of this decision.
  • Yes, I admit that even though I have stopped sending email--I have still checked it once or twice to be sure I am not really insane.
Still, I have not missed an important event or failed to follow through on a task as the result of not using email; and I have 10% more of my day to work on things I care about.

We will see what month #2 has to offer.

Saturday, June 16, 2012

Your Hyperbole is Too Large

A confluence of events has me pondering the kernel of an idea. Prometheus decapitated anticipation, and even the experience of watching the film rivaled the Astor Theater's recent fail. Then I read James's post, Your Coding Philosophies are Irrelevant, which triggered a thought.

Last night, I let Netflix streaming do the driving and watched two ("Top 10 for Christopher") films: Rammbock and Pontypool. Both are minimalistic horror films which slowly build tension and suspense and develop real characters (of the two, I thought Pontypool pushed more boundaries and delivered more reward). Both do this with a fraction of the budget of Prometheus, few special effects, and a bias toward keeping the creatures out of view. Just as with the original Alien, what you can't see is almost always scarier than what you can.

It's impossible to know what disciplines of theater, photography, writing, production, cinematography, composition and filmmaking (if any) the three teams employed; but just as James accurately notes--it doesn't matter. Prometheus stole 3 hours of my life, while Rammbock and Pontypool returned value. We can't peer inside the black box of the films' construction to divine what philosophies drove the choices the directors made, but if I had to hazard a guess, I'd say Mr. Scott is more religious about adhering to various filmmaking idioms and paradigms, while the others are not.

It's pure speculation, of course, and perhaps not worth significant reflection; however, consider these films as software. As horror films, they have a primary function: to scare us. As science fiction films, they also have a nice-to-have function: to probe intellectual, moral and ethical boundaries. Prometheus is to iTunes what Pontypool is to Spotify.

Something in James's conclusion resonates with me: "It's not the behind-the-scenes, pseudo-engineering theories that matter. An app needs to work and be relatively stable and bug free, but there are many ways to reach that point... And it might just be that too much of this kind of thinking is turning you into an obsessive architect of abstract code, not the builder of things people want."

Clearly, the best prime directive is to build things people want to use. When I ask for a horror film, no matter what you do under the hood, if you give me a Prometheus--you're wrong.

Thursday, May 24, 2012

Physical Books (and Such)

I don't read as many books as I used to, and I wonder why.

Before:

Growing up in Mississippi, the son of a minister, I found that books were the Swiss Army knife of childhood survival. I had to read as much as possible to understand the overwhelming guilt that is the reformed Presbyterian's constant companion, and to wield the irrationally large words required to defend that guilt as a good, godly, honorable, desirable thing and to prosecute it as an equally detestable, abhorrent one.

Books were also one of the best respites from the theological gauntlet, which never seemed to end. I used to be just as comfortable in 10 lb. tomes like the New York Public Library Desk Reference, dictionaries of cultural literacy, or an early 1970s edition of Encyclopedia Britannica as I was in Tolkien, Hemingway, Dahl, Herbert, Clavell or Bradbury. If it was in the home or school library, I probably read it at least twice.

This continued until I returned from Iraq, when I was 24. Up to this point, I had also spent nearly as much of my time consuming every other form of media. I had collected upwards of 1,000 CDs--more than 30,000 tracks in my whatever-was-before-iTunes collection. I had 700 DVDs of movies and shows I had bought and watched at least twice--and I had watched at least 3 times that amount between renting and going to the theater.

Then:

Abruptly, the whole pattern of my consumption shifted. Starting at age 25 and with increasing severity, I developed an impatience. If consumption did not begin yielding quick returns on investment--be it pleasure, edification or even frustration and irritation--I terminated it. I found myself no longer willing to risk my time (now a scarcer resource) on anything which wasn't easily converted into value. Fewer books, fewer albums, fewer films.

Now:

I read one physical book last year, Game of Thrones. Similarly, my consumption of other media is way down. While this might seem like a loss, I'm not yet sure. I still read as much as or more than I ever have. I've format shifted, in a way. Now, I follow 150 or more content feeds from blogs across the spectrum, various syndicated streams from news sites and whatever falls into the else bucket.

Of course, one of the very best value-added features of reading a novel or watching a great film is the transformative nature of the experience. You read King Rat, and (if you're like me) you begin to root for the protagonist as Clavell carefully scopes the action and the drama to lure you into liking this particular abomination. You are pulled all the way through the rabbit hole, and then he shreds the veil. You learn something you didn't know about yourself, something uncomfortable, something challenging. And that's good. A book can expose you for who you really are but weren't able to grok by yourself.

This cathartic scab-ripping is important, at least to me; but it's much harder to come by when consuming short articles, essays and blurbs. It requires internal combustion to drive it. Vigilance. Doubt.

I still haven't read Crime and Punishment, and I know in my gut that I'm a weaker person without it; yet it continues to collect dust as it eyes me passive-aggressively from my bookshelf.

Maybe it's impulsiveness that keeps me jumping from feed to feed, and maybe it's unhandled optimism that allows me to persist the notion that this behavior is OK. Still, I'm not certain.

It is different.

Wednesday, April 25, 2012

The Cost of Urgency

Sandstorms are spectacular creatures. From less-than-ether, they arise, demolish and vanish.

If you've ever been in a sandstorm, you'll remember the natural instincts to _save_ everything you can. Collect valuables and retreat into enclosed areas. Preserve as much as possible.

If you've ever been in more than one sandstorm, you'll remember the quick duck-and-cover response. Followed by the long wait.

In a sandstorm, the only

Friday, January 27, 2012

How to Subclass an Array (Not Really)

Update: I don't recommend following the advice in this post. The 5-minute-ago version of myself was hasty, arrogant, foolish and a bad dinner guest. I recommend the recently updated version of this post from the 1-minute-ago version of myself. He seems to be better mannered and more even-keeled--I like him. However, if you're interested in what advice I used to be capable of delivering, please read on.

If you've ever written more than a few lines of JavaScript, you've probably instantiated and iterated an Array or two. As you've done this, you may have even thought to yourself, "Gee willikers, wouldn't it be keen to hook a contains() method onto the Array prototype to simplify that awful indexOf(val) !== -1 shenanigans?"

You may have even tried to simply extend the native Array prototype. Life would have been grand until you needed your JavaScript to work in IE.
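
That attempt--sketched here for illustration (this isn't code from the original post)--looks innocent enough:

    // The tempting first move: bolt contains() onto the native prototype.
    Array.prototype.contains = function (value) {
      return this.indexOf(value) !== -1;
    };

    // Two problems: old IE (before version 9) has no Array.prototype.indexOf,
    // and the assigned 'contains' property is enumerable, so it leaks into
    // every for-in loop over every array on the page.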

From there, if you're at all like me, you might have thought: "This is friggin' JavaScript! Everything is mutable; everything is extensible! I'll just subclass the native Array type." The subsequent journey through Google would then lead you to believe that the problem of subclassing Array is so hard, so fraught with peril, so dangerous that no one could ever do it and do it right.

And if you followed any of the advice that Herr Google had to offer, you would be right. And then you watched Douglas Crockford's On JavaScript series and hit Part III: Function the Ultimate. Suddenly, it becomes much, much simpler.
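
Something along these lines--a sketch of the pattern, not the post's original listing:

    // A factory in the spirit of Crockford's 'Function the Ultimate':
    // start with a genuine Array, augment the instance directly, return it.
    // No 'this', no 'new', no prototype surgery.
    function array() {
      // Convert 'arguments' into a real Array, so push, pop, splice,
      // indexing and length all behave natively.
      var self = Array.prototype.slice.apply(arguments);

      // Augment this one instance; the global Array.prototype is untouched.
      // A hand-rolled loop sidesteps the missing indexOf in old IE.
      self.contains = function (value) {
        for (var i = 0; i < self.length; i += 1) {
          if (self[i] === value) {
            return true;
          }
        }
        return false;
      };

      return self;
    }

    var xs = array('a', 'b', 'c');
    xs.push('d');         // native Array behavior intact
    xs.length;            // 4
    xs.contains('b');     // true
    xs instanceof Array;  // true--it is an Array, just augmented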

We now have an array() function which operates just like a plain vanilla JavaScript Array (push, pop, splice and all intact), addressable by index, with a properly functioning length property...and which contains a contains() method.

To do this, we didn't need 'this' or 'new', and we wouldn't have to touch 'prototype' at all if we wanted to interpret 'arguments' differently (though Array.prototype.slice.apply is probably the most efficient way to do it).

This prototypal class model is, in my opinion, much easier to implement, read and debug than any of the blood-, sweat- and tear-infused models suggested around the 'tubes.

As always, your mileage may vary.

Tuesday, January 24, 2012

Out with the Old, In with the New (Profile)

At the launch meeting for our new team blog, Object Reference Not Set, we pledged to enforce only one rule: 

"Complete editorial freedom..."

This rule didn't even make it past the first post. On the chopping block? My old Google+ profile.

May it, like the phoenix, rise from the ashes into something...different.

Rest in peace, old faithful.

"My father killed himself with a potato gun (in front of my mother but outside the pantry) when I was six years old. The potato gun didn't scare us; he often cut himself with bad poetry or fresh salmon. You don't think about potatoes in their most mortal sense when you're a disaffected, drug-addicted youth. We buried him in the backyard pool, which we filled with plankton. Sometimes, when the moon's just right, you can still see the flies swarming over his grave.

"The cannibalism of the years that followed seems only natural when compared with the vegan gluttony of our youths. My sisters and I began working as shoe salesmen for local choir boys, but only the family dog really had a talent for it. We hired and fired Egyptian slaves, but Scandal had other games for us to play. The first sister died from salmonella poisoning at Disney Land. It was Vogue and en vogue that caught the nation's attention.

"School was compulsory, of course; and those of us that remained largely ignored it. We studied particle physics in the garage and practiced fencing on the roof. The second sister died of dark matter exposure, and we burned her on a funeral pyre of our own making. The seventh sister was immaculately conceived that year. In these early years, we invested heavily in Swedish bubble gum and nineteenth century guns. It was a time of self reflection and turbulent bathing."

Performance! Determinism. Vs?

David Ungar (co-creator of Self, the forefather of my favorite language, JavaScript) gave an interesting talk at SPLASH 2011, which is nicely summarized here (post-talk interview here).

There are two predicate assumptions in this line of thinking. First, that in the end, we will always value performance over everything else (data integrity, reliability, determinism). Second, that we must sacrifice some level of determinism if we want to increase performance.

The JavaScript example is a good one.

"Why would you write critical business logic in an untyped language?"

"Because, by sacrificing static typing, you can achieve performance that is otherwise impossible to attain."

Of course, JavaScript itself isn't even the Wild West. It's serial; it's single-threaded. It poses few of the suggested risks of operating in a thousand-core environment, where (if we are to believe Mr. Ungar) we must stare into the abyss.

The line of thinking is provocative, and it's led me to wonder if it doesn't speak to something more basic: we must constantly challenge our assumptions about what is the most performant way to program.

Recently, Node.js entered the tweet-o-sphere and has garnered enough attention that even Microsoft has built support for Node into IIS and their Azure platform. Are they mad?
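
For the uninitiated, the draw is Node's evented, non-blocking style: one thread, one event loop, and callbacks instead of waits. A minimal sketch (a toy server, not Microsoft's integration or anyone's production code):

    // A minimal, evented HTTP server. Nothing here blocks the single
    // event loop; each request arrives as a callback.
    var http = require('http');

    http.createServer(function (request, response) {
      response.writeHead(200, { 'Content-Type': 'text/plain' });
      response.end('Hello from the event loop\n');
    }).listen(8080);

    console.log('Listening on http://localhost:8080/');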

In explaining the relevance of Node to broader technology groups, Brett McLaughlin notes:

"There's another web pattern at work here, and it's probably far more important than whether you use Node or not, and how evented your web applications are. It's simply this: use different solutions for different problems. Even better, use the right solution for a particular problem, regardless of whether that's the solution you've been using for all your other problems."

That, to me, is the most elegant subversion of Mr. Ungar's thesis. We may have to abandon determinism to solve certain problems, but we can also abandon it in a deterministic way.

Friday, January 20, 2012

On Competence

Years ago, one of my dearest friends, Jennifer Elms, and I were discussing a short bit of prose that I had written. She had just finished reading it, and I had asked for her opinion.

"I think I'm a competent enough writer," I said.
"Oh, I wouldn't call you competent," she said, "but this is better than some of what people call 'good'."

I was stunned.

"If I'm not competent, who is?"

She paused and reflected for a moment. "Joseph Conrad was competent. Try Lord Jim."

'Stunned' quickly ceased to scope to my reaction. I read Lord Jim, and I met the unknown. It took some time to find my hubris, identify it and begin extracting it, but the journey was one of the better investments of my life.

In the last few years as a software developer, I have reflected more than a little on this exploration. I know better than to call myself a competent developer--I know better than to attempt objective qualifications on my abilities.

Crockford muses that software development is the constant struggle for perfection. He's right, but it's also much more than that. Software is poetry. Development is a discipline. Despite their flaws, languages are tools which we leverage against the world and whose understanding changes us. Code is a means of communicating designs and intentions between humans that must also be translatable by the compiler into something flawless and (as a consequence of this requirement) something altogether unintelligible.

Like that of every other developer I know, the code I write today is the product of the Muse. Unequivocal genius. Quite unlike the code I wrote yesterday, which was unmitigated toxic waste.

As a writer, after talking to Jennifer, I used to set the bar at 'competence'. As a developer, that's simply not good enough. The bar has to be set at perfection. For someone who's still not competent, the disparity between here and there is significant.

Fortunately, we get a lifetime to make the journey.