Friday, January 27, 2012

How to Subclass an Array (Not Really)

Update: I don't recommend following the advice in this post. The 5-minute-ago version of my self was hasty, arrogant, foolish and a bad dinner guest. I recommend the recently updated version of this post from the 1-minute-ago version of my self. He seems to be more well-mannered and even-keeled--I like him. However, if you're interested in what advice I used to be capable of delivering, please go on.

If you've ever written more than a few lines of JavaScript, you've probably instantiated and iterated an Array or two. As you've done this, you may have even thought to yourself, "Gee willikers, wouldn't it be keen to hook a contains() method onto the Array prototype to simplify those awful indexOf(val) !== -1 shenanigans?"

You may have even tried to simply extend the native Array prototype. Life would have been grand until you needed your JavaScript to work in IE.
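To be concrete, the naive version amounts to a one-liner on the native prototype. Here's a sketch of the general idea (not code from any particular library):

// Bolt contains() onto the native prototype: every array on the page
// now inherits it. The property is enumerable, so it shows up in
// careless for...in loops--and old IE doesn't even give you
// Array.prototype.indexOf to lean on.
Array.prototype.contains = function (val) {
    return this.indexOf(val) !== -1;
};

[1, 2, 3].contains(2); // true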

From there, if you're at all like me, you might have thought: "This is friggin' JavaScript! Everything is mutable; everything is extensible! I'll just subclass the native Array type." The subsequent journey through Google would then lead you to believe that the problem of subclassing Array is so hard, so fraught with peril, so dangerous that no one could ever do it and do it right.

And if you followed any of the advice that Herr Google had to offer, you would be right. And then you watch Douglas Crockford's On JavaScript series and hit Part III: Function the Ultimate. Suddenly, it becomes much, much simpler.
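A minimal sketch of the idea (the variable names and sample values here are mine; the shape is what the next couple of paragraphs describe):

function array() {
    // Build a real Array out of the arguments, so push, pop, splice,
    // index access and length all behave exactly as they do natively.
    var self = Array.prototype.slice.apply(arguments);

    // Augment this one instance; the native Array prototype stays untouched.
    self.contains = function (val) {
        return self.indexOf(val) !== -1;
    };

    return self;
}

var list = array('a', 'b', 'c');
list.push('d');
list.length;          // 4
list[2];              // 'c'
list.contains('b');   // true
list.contains('z');   // false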

We now have an array() function which operates just like a plain-vanilla JavaScript Array (push, pop, splice and all intact), addressable by index with a properly functioning length property...and which contains a contains() method.

To do this, we didn't need 'this' or 'new', and we never had to touch 'prototype'--not even if we wanted to interpret 'arguments' differently (though Array.prototype.slice.apply is probably the most efficient way to do it).

This prototypical class model is, in my opinion, much easier to implement, read and debug than any of the blood-, sweat- and tear-infused models suggested around the 'tubes.

As always, your mileage may vary.

Tuesday, January 24, 2012

Out with the Old, In with the New (Profile)

At the launch meeting for our new team blog, Object Reference Not Set, we pledged to enforce only one rule: 

"Complete editorial freedom..."

This rule didn't even make it past the first post. On the chopping block? My old Google+ profile.

May it, like the phoenix, rise from the ashes into something...different.

Rest in peace, old faithful.

"My father killed himself with a potato gun (in front of my mother but outside the pantry) when I was six years old. The potato gun didn't scare us; he often cut himself with bad poetry or fresh salmon. You don't think about potatoes in their most mortal sense when you're a disaffected, drug-addicted youth. We buried him in the backyard pool, which we filled with plankton. Sometimes, when the moon's just right, you can still see the flies swarming over his grave.

"The cannibalism of the years that followed seems only natural when compared with the vegan gluttony of our youths. My sisters and I began working as shoe salesmen for local choir boys, but only the family dog really had a talent for it. We hired and fired Egyptian slaves, but Scandal had other games for us to play. The first sister died from salmonella poisoning at Disney Land. It was Vogue and en vogue that caught the nation's attention.

"School was compulsory, of course; and those of us that remained largely ignored it. We studied particle physics in the garage and practiced fencing on the roof. The second sister died of dark matter exposure, and we burned her on a funeral pyre of our own making. The seventh sister was immaculately conceived that year. In these early years, we invested heavily in Swedish bubble gum and nineteenth century guns. It was a time of self reflection and turbulent bathing."

Performance! Determinism. Vs?

David Ungar (co-creator of Self, the forefather to my favorite language, JavaScript) gave an interesting talk at SPLASH 2011, which is nicely summarized here (post-talk interview here).

There are two underlying assumptions in this line of thinking. First, that in the end, we will always value performance over everything else (data integrity, reliability, determinism). Second, that we must sacrifice some level of determinism if we want to increase performance.

The JavaScript example is a good one.

"Why would you write critical business logic in an untyped language?"

"Because, by sacrificing static typing, you can achieve performance that is otherwise impossible to attain."

Of course, JavaScript itself isn't even the Wild West. It's serial; it's single-threaded. It poses few of the suggested risks of operating in a thousand-core environment, where (if we are to believe Mr. Ungar) we must stare into the abyss.

The line of thinking is provocative, and it's led me to wonder if it doesn't speak to something more basic: we must constantly challenge our assumptions about what is the most performant way to program.

Recently, Node.js entered the tweet-o-sphere and has garnered enough attention that even Microsoft has built support for Node into IIS and their Azure platform. Are they mad?

In explaining the relevance of Node to broader technology groups, Brett McLaughlin notes:

"There's another web pattern at work here, and it's probably far more important than whether you use Node or not, and how evented your web applications are. It's simply this: use different solutions for different problems. Even better, use the right solution for a particular problem, regardless of whether that's the solution you've been using for all your other problems."

That, to me, is the most elegant subversion of Mr. Ungar's thesis. We may have to abandon determinism to solve certain problems, but we can also abandon it in a deterministic way.

Friday, January 20, 2012

On Competence

Years ago, one of my dearest friends, Jennifer Elms, and I were discussing a short bit of prose that I had written. She had just finished reading it, and I had asked for her opinion.

"I think I'm a competent enough writer," I said.
"Oh, I wouldn't call you competent," she said, "but this is better than some of what people call 'good'."

I was stunned.

"If I'm not competent, who is?"

She paused and reflected for a moment. "Joseph Conrad was competent. Try Lord Jim."

'Stunned' quickly ceased to scope to my reaction. I read Lord Jim, and I met the unknown. It took some time to find my hubris, identify it and begin extracting it, but the journey was one of the better investments of my life.

In the last few years as a software developer, I have reflected more than a little on this exploration. I know better than to call myself a competent developer--I know better than to attempt objective assessments of my own abilities.

Crockford muses that software development is the constant struggle for perfection. He's right, but it's also much more than that. Software is poetry. Development is a discipline. Despite their flaws, languages are tools that we leverage against the world and whose understanding, in turn, changes us. Code is a means of communicating designs and intentions between humans that must also be translatable by the compiler into something flawless and (as a consequence of this requirement) something altogether unintelligible.

Like every other developer I know, I find that the code I write today is the product of the Muse. Unequivocal genius. Quite unlike the code I wrote yesterday, which was unmitigated toxic waste.

As a writer, after talking to Jennifer, I used to set the bar at 'competence'. As a developer, that's simply not good enough. The bar has to be set at perfection. For someone who's still not competent, the disparity between here and there is significant.

Fortunately, we get a lifetime to make the journey.