Thursday, January 31, 2013

Stealing a Trillion Dollars an Hour in Pure JavaScript

The Heist

Part 1: The Mark

By now, I think it fairly safe to say that no one seriously disputes the fact that Industrial Protectionism infringement is no different than all other forms of theft (either literally or figuratively). Copyright infringement, patent infringement, trademark violations? Theft. Pure and simple. Sharing knowledge and culture is fundamentally no different than breaking into someone's home and stealing their chia pets.

Except that it's even worse. When a person is robbed of a tangible asset, they (generally) lose only the asset; however, when Industrial Protectionism infringement occurs, the monopoly is damaged much more. In fact, the collateral damage to the whole industry can be quite profound: the Betamax nearly killed the corn industry.

Our forefathers wisely predicted the irreparable harm that would befall innovators if not afforded indefinite, government granted monopolies over their creations: just look at the pace at which public domain books are robbing the publishing industry of its value.

So it's intuitively obvious why Industrial Protectionism rights must be defended at times even more vigilantly than our Amendment rights. Mechanics can't be allowed to listen to the radio any more than Girl Scouts can be allowed to sing "Happy Birthday"; neither could we allow churches to gather on that Super Day to watch that largish sporting event any more than we could let YouTube show Dr. King's "I have a Dream". These inflict immediate, permanent, scarring injuries upon the monopolies who have poured their very essence into their creations, ex nihilo.

Part 2: The Payout

This is intuitively obvious. Why are we even talking about it?

Because this presents the potential for the biggest heist in the history of human heists: steal the GDP of the world (the net, the gross, the whole shebang) in less than a week. Let's steal $1,000,000,000,000 an hour in pure JavaScript.

We know that sharing knowledge and culture is theft. It doesn't matter what we do with the stolen property: we can store it, we can throw it away, we can give it back immediately--it matters not. Theft is theft, plus the cascading effect to the entire infrastructure of the owner can vary from $22,500 - $4,500,000/sharing of knowledge and culture. Based on current case law, few juries have awarded much higher than $80,000/sharing of knowledge and culture, but our intent to share knowledge and culture is willful to the extreme, with the intent to encourage second- and third-party sharing of knowledge and culture, without remorse. Surely we could steal closer to the $4 million mark.

Stealing an MP3 is roughly equivalent to stealing a US Dollar in immediate, tangible losses to the copyright owner. To net our trillion by stealing The Beatles' Love Me Do, we would need 250,000 downloads/hour (assuming we net $1/song + $4,000,000/damages). Even though our sharing of knowledge and culture is willful, explicit, broadcast and encouraged...MP3s are still large enough that, even at the higher damages, this probably won't scale to the volume we need per hour.
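
The MP3 arithmetic works out as a quick sketch (the $1/song and $4,000,000/infringement figures are the assumptions stated above):

```javascript
var TARGET_PER_HOUR = 1e12;   // one trillion USD
var netPerSong = 1;           // immediate, tangible loss per MP3
var damagesPerSong = 4000000; // optimistic statutory damages

// Each download nets the song's value plus the statutory damages.
var downloadsPerHour = Math.round(TARGET_PER_HOUR / (netPerSong + damagesPerSong));
console.log(downloadsPerHour); // 250000
```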

The estate of Dr. King charged the US government $856,160 for the rights to Dr. King's image and 409 of his words to inscribe on the memorial in D.C. King's I Have a Dream speech consists of 1660 words, the value of which is increased by the film, the delivery, the presence and the power of the man. Stealing this video is roughly equivalent to excavating Dr. King's tomb, bringing his body into the house of his estate, stealing the good silver and then burning the estate to the ground; but based on the value of the word count alone, let's assume this would yield a mere $3,424,640/sharing of knowledge and culture--pessimistically less than stealing from The Beatles (and the video file size is larger/sharing of knowledge and culture).

There is another avenue yet. The eBook. Let's pick something that has been rescued from the public domain: Minority Report. eBooks average $10/book and they are tiny compared to videos and music. We can easily get our shared knowledge and culture compressed down to around 20 kilobytes--smaller than half of the JavaScript files required to render this page, most of which load in around 200 ms.

Anything we repeatedly download from any given server will be immediately fetched from the local cache (less the round-trip to verify no file changes have occurred); therefore, repeated downloads will complete in 50 ms or less. Assuming the average attention span of the user of the page holds constant at around 4.5 seconds, that gives us the potential for 87 infringements/user/page load. At this rate, we could effectively bag our target with 2,874 visitors an hour.
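
The cache arithmetic can be sketched directly; every figure here is an assumption taken from the text (a 200 ms first fetch, 50 ms cached re-fetches, a 4.5 second attention span, $4,000,000 in damages per infringement):

```javascript
var TARGET_PER_HOUR = 1e12;          // one trillion USD
var DAMAGES_PER_INFRINGEMENT = 4000000;

var attentionSpanMs = 4500;  // average attention span per page load
var firstFetchMs = 200;      // initial download of the ~20 KB eBook
var cachedFetchMs = 50;      // cache-validated re-downloads

// One full fetch, then cached re-fetches for the rest of the attention span.
var infringementsPerVisitor =
  1 + Math.floor((attentionSpanMs - firstFetchMs) / cachedFetchMs);

var visitorsPerHour = Math.ceil(
  TARGET_PER_HOUR / (infringementsPerVisitor * DAMAGES_PER_INFRINGEMENT));

console.log(infringementsPerVisitor); // 87
console.log(visitorsPerHour);         // 2874
```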

Part 3: The Plan

If you've made it this far, it's quite possible that the entire continent of Africa is in near financial ruin. Parts of Europe may also be crumbling before your very eyes. 

The Greatest Heist Ever Conceived

Sunday, January 13, 2013

Good Reads

The part of the day I don't spend eating, sleeping or enjoying/surviving my family, I spend in roughly equal parts reading and writing. While my reading habits have evolved, the relative quality of what I end up reading has varied very little over the years. Most of what I read is just... ok. I think it's nearly always possible to profit from everything read (assuming some discretion is applied upfront)--even if the effect is merely the intellectual equivalent of eating celery.

Some of it though--some of it, is just spectacular. So good, I think it worth isolating and sharing. Before I do, allow my wind to extend just another paragraph or so to explain "why" I think these links are worth sharing. From my perspective, the links that will follow represent illuminating content by competent authors--authors whom you should follow, who consistently (because they are competent) continue to generate content of the same excellence and whose advice and opinions can be trusted to be worth thinking about.

As a last aside, what do I mean by "competent"?

I don't intend for the following two observations to be terribly polarizing: competency is both a diffuse term and a complex unit of measure; but I won't worry too much about the impact of the next one. To my mind, in any given specialty (software engineering, baseball, philosophy, marine biology, etc.), only a relative few individuals will emerge and be recognized as competent. I doubt it possible to draft a definition for competence which could accurately apply to all distinct domains (much less the combinations we practice in reality); yet, for the purpose of this recommendation, let's call it so:

com·pe·tence  [kom-pi-tuhns]
possession of excellence in required skill, knowledge, qualification [domain]; comprehensive understanding of the components and sub-components of the domain; ability to communicate both specific and abstract components of the domain to non-competent peers; ability to train others in the domain and bring them to competency: Nikola Tesla was one of the few competent inventors of his generation.

So without further ado, a few things worth reading:
  1. From Reg Braithwaite's excellent experiment, Homoiconic, his recent essay Practical Applications of Partial Application. It's. Just. Fantastic. He takes functional programming down to the bones and reconstructs the organism. This is not just a great article on software engineering; it's the template for great articles on software engineering.
    1. His standard blog, raganwald's posterous, is also excellent.
  2. From +Douglas Crockford, his recent talk, Monads and Gonads, is wonderful. I wish I could have been there. Afterwards, when you think you've understood what he demonstrated, grab the code from the lecture and try to implement it. If that proves challenging in the least, revisit #1.
    1. Nearly all of Crockford's videos are worth watching, but he can also write: see his Satan Comes to Dinner.
  3. From James Hague's programming in the 21st century, his Hopefully More Controversial Programming Opinions, which includes this gem: "You shouldn't be allowed to write a library for use by other people until you have ten years of programming under your belt." While I can't know his competency as a developer, because he rarely if ever shows actual code, as a writer about software engineering, he's a wellspring.
That's all--a small fraction of what I read, most of which you can see on my Netvibes profile, somecallmechief.

Thursday, January 3, 2013

JavaScript (aka Marlboro Country)

Sometimes JavaScript land just feels like Marlboro Country. The sales pitch certainly feels similar:

Come to the hottest part of the United States. It's popular. It's everywhere. It's the future. Now: cover yourself in leather. Don't ask why. It's part of the feature set. Next: the temperature is 120 degrees, and you need to start a fire. It's not supposed to make sense, that's the point. You'll need to do this at noon every day. After about 10 days, you'll find yourself assaulting anyone _not_ starting noonday fires.

C'mon?! What's the worst that could happen?

Imagine: right now, you could be riding across a brush fire.. or maybe it's a dust storm.. or maybe.. who knows? Everything's mutable! Ride across anything you want. Of course, you are limited to desert colors (obviously).

Lest you be confused, I love JavaScript. While the language as specified is littered with nightmarish design flaws and obstacles reminiscent of trench warfare and carpet bombing (let's call them WATs), JavaScript's Good Parts are so good you might be forgiven for forgetting about that elephant in the bathroom (aka IE). Yet.

Yet and yet, the WATs do also rise. And they rise.

A few weeks ago, I began to set ink to digital paper to record the recipe for my newfound WAT. There is a delightful tool for recording code demonstrations, The Code Player. While the demos presented are quite nice, I lost all 5 of my first drafts to bugs in the system. Still, you should keep an eye on it. I considered reverting to my old faithful, JSFiddle, which is fantastic for mocking up and testing code but less suited for driving a walkthrough. In the end, syntax highlighting in the blog will have to suffice.

The municipal "we" begin with a simple function:

function doVerbOnNoun(aThing) {

  //In theory, if(aThing) should be equivalent to if(true == aThing)
  if (aThing) {
    console.log('Verbing on ' + aThing);
  } else {
    console.log('Not verbing on ' + aThing);
  }
}

A naïve developer, like myself, would assume that our guarding if(truthy) would apply the loose, truthy type coercions we have been trained to avoid; and an undisciplined tester, like myself from a few moments ago, might almost be forgiven for missing the lawn for the dandelion. The following assertions execute as naïvely expected:

doVerbOnNoun('aThing'); //We verbed
doVerbOnNoun(''); //Verb, we did not

doVerbOnNoun(5); //We verbed
doVerbOnNoun(NaN); //Verb, we did not

doVerbOnNoun(true); //We verbed
doVerbOnNoun(false); //Verb, we did not

//Null and undefined:
doVerbOnNoun(null); //Verb, we did not
doVerbOnNoun(undefined); //Verb, we did not

But if the republic of we would but pause and stare but briefly into the abyss, it becomes immediately apparent that loose, truthy type-coerced evaluation is not happening here.

doVerbOnNoun('a string of some sort'); //We verbed
console.assert(true == 'a string of some sort', "'true' != 'a string of some sort'"); //Assertion fails

doVerbOnNoun(5); //We verbed
console.assert(true == 5, "'true' != 5"); //Assertion fails

doVerbOnNoun('0'); //We verbed
console.assert(true == '0', "'true' != '0'"); //Assertion fails

To my mind, this is irrational, unsettling and in possible violation of the laws of gravity; yet, it had never occurred to me that an if('gimme shelter') statement would ever not be met as truthy. In fact, my mind was so far out of touch with the rules of the matrix, that I spent a few fevered seconds struggling with this paradox:

doVerbOnNoun({}); //We verbed
console.assert({} == true, "{} != true"); //Assertion fails
console.assert({} == false, "{} != false"); //Assertion fails
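
A brief detour into the Abstract Equality Comparison algorithm (ES5 §11.9.3) dissolves the paradox; this sketch traces the conversions the spec performs:

```javascript
// Why ({} == true) and ({} == false) are BOTH false:
// The algorithm never compares an object to a boolean directly. Instead:
//   1. The boolean is converted to a number: true -> 1, false -> 0.
//   2. The object is converted to a primitive: String({}) -> "[object Object]".
//   3. That string is converted to a number: Number("[object Object]") -> NaN.
//   4. NaN is unequal to everything, including itself.
var objAsPrimitive = String({});          // "[object Object]"
var objAsNumber = Number(objAsPrimitive); // NaN

console.log(isNaN(objAsNumber));  // true
console.log({} == true);          // false
console.log({} == false);         // false
```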

Clearly, loose truthy evaluation is not part of the algorithm of the single parameter if() statement. The multinational we can exercise some futility by instrumenting doVerbOnNoun with some diagnostics:

function doVerbOnNoun(aThing) {

  if (aThing) {
    console.log('Verbing on ' + aThing);
  } else {
    console.log('Not verbing on ' + aThing);
  }

  console.log(!aThing, 'evaluating (!' + aThing + ')');
  console.log(false != aThing, 'evaluating (' + aThing + ' != false)');
  console.log(false != (false == aThing), 'evaluating (false != (false == ' + aThing + '))');
  console.log(true == aThing, 'evaluating (' + aThing + ' == true)');
}

But none of the assertions align with the evaluation of the if() statement 100% of the time. So what is actually happening? Pure, unbiased, unmitigated, unabashed madness--that's what. Section 12.5 of the ECMAScript specification defines the if() statement as:

"The production IfStatement : if ( Expression ) Statement is evaluated as follows:
  1. Let exprRef be the result of evaluating Expression.
  2. If ToBoolean(GetValue(exprRef)) is false, return (normal, empty, empty).
  3. Return the result of evaluating Statement."
What of ToBoolean, you ask (jaws clenched in fear, rage and agony)? It is sufficiently (albeit imperfectly) expressed as:

var ToBoolean = function (val) {
  return (val !== false &&
          val !== 0 &&
          val !== '' &&
          val !== null &&
          val !== undefined &&
          (typeof val !== 'number' || !isNaN(val)));
};

At least to the existential "us", this revelation is almost laughably ironic. The if() statement isn't loose or truthy and doesn't coerce type whatsoever; rather, it simply runs whatever it is offered through ToBoolean().
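
As a sanity check, the hand-rolled ToBoolean (repeated here so the snippet stands alone) agrees with the engine's own Boolean() cast, which is the conversion if() actually applies, for every value this post has exercised:

```javascript
// The post's sketch of the spec's ToBoolean, repeated for self-containment.
var ToBoolean = function (val) {
  return (val !== false &&
          val !== 0 &&
          val !== '' &&
          val !== null &&
          val !== undefined &&
          (typeof val !== 'number' || !isNaN(val)));
};

// Every value exercised above; the sketch should match Boolean() on each.
var samples = ['aThing', '', 5, NaN, true, false, null, undefined, '0', {}];
var allAgree = samples.every(function (val) {
  return ToBoolean(val) === Boolean(val);
});
console.log(allAgree); // true
```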

So ends this rather unsatisfying diversion into JavaScript. Please don't let this sully your feelings on the language. Restore your faith and go watch +Douglas Crockford's latest, most excellent talk on Monads and Gonads. It's worth every minute of your time.