Category / World Wide Wait

HTTP 2.0: It’s about time… May 25, 2014 at 6:22 pm

Time is money.

I’ve spent a lot of time thinking about protocols old and new in the last few years. Granted, I am not an expert in this field, but I’ve been there and done that a number of times in my career. One of my most recent forays into the fray included using ZeroMQ as a foundation for a much higher-efficiency transport of data between the “ultimate” client and final server. (Blah, blah, blah.) I know that the fundamentals of this project were sound. A huge problem was momentum. REST (in peace, please!) and the REST-style API is hugely inefficient, but “the world” is hyper enamoured with this style of programming at the moment because it helps developers “enjoy” a form of scale without too much thinking. (I dig Amazon’s Kinesis even if it is “extra inefficient” about the way it transfers data.)

I recently spent some time at a company operating a retail web API. I wasn’t around long enough to fully ingest their system, but what I saw wasn’t hideous. In the grand scheme of things, the exceptionally inefficient way of doing things hadn’t made a huge impact (in most cases) on the perceived performance. (Let me be clear on this. I was impressed with the general way the API was constructed AND the APIs were generally successful, if not “perfect.” What system is?) It would be hubris to say that I could have done better. The key was that the APIs WERE mostly effective in helping to generate (highly likely) many hundreds of millions of dollars in revenue. Given that kind of metric, who really gives a flying fig about ultimate performance? If the performance is good enough to drive revenue for a Fortune 500 company and not piss off too many customers, does it really matter? Funny thing is: yeah, ~I~ do think it matters.

Why am I even bothering to write this? I think the world is just about ready to go full circle again. This is my conclusion based on a number of things and my own “prejudice” towards (what I consider more) efficiency.

Back before “the world as we know it now” there “were” these “things” called DCOM and CORBA. Both kinda suck(ed), but I also think that both were reasonable attempts to accomplish what ~I~ see REST-style APIs doing. ~I~ think that the biggest issue with both was that ~most~ people couldn’t see past solving the problem at hand. Some form of metadata-driven programming has to exist for the interop. What I keep on seeing is the push and pull between “pure” meta-style programming and getting it done. I can cite a number of examples of this that I have seen. In all of the cases, moving declarations out of the implementation has been the hardest “thing” for many of the developers I have worked with to understand. I believe that this is one of the reasons why I continue to hear developers “hate on” DCOM and CORBA. Know what I think is funny? WCF and SCA are “just” variations on the principle of the theme. (This is my opinion, hate on me all you want.)

So, “in the beginning” there was binary, and it was slow. As the computer industry kept creating faster and faster computers with more and more memory, the idea of trading transfer efficiency for ease of understanding gained a ton of momentum. (Trust me, I am going to finally get to a point about HTTP.) I mean, the idea of transferring data as text is neat-o when it only takes a few microseconds. We have come a ~long~ way from the days of needing to work with 300 baud transfer rates. Um, yeah, that was bad, and the amount of data we transfer now JUST in handshakes and establishing protocols would bring something that slow to its knees. XML is/was awesome and XML sucks. Having structured unstructured data could be really cool. But… the same (fundamental type of) issues that occur with binary data transfer occur with unstructured XML. The basic idea of a DTD or XSD is to describe exactly the kind of contract that the very unstructured nature of XML was intended to alleviate. (I am glossing over some of the other ideas that made XML a cool thing, like that browsers could ignore tags and that some form of structure could be added to what was effectively unstructured data transferred via HTML.) A well-structured XML file is ~far~ from the “human readable” ideal behind one of the core tenets of XML. Yeah, my opinion too. Hate all you want. I LIKE many aspects of XML encodings (XAML is both awesome and shite), but they can be exceptionally difficult to read and the DOM (for XML, not XAML) is… yuck.
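To put a rough number on the verbosity gripe, here is a quick sketch comparing the same made-up record serialized as XML and as JSON. The record, its field names, and values are purely illustrative:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical record, invented for illustration.
record = {"sku": "A-1001", "qty": 3, "price": 19.99}

# The same data as XML: every field name is paid for twice (open + close tag).
root = ET.Element("item")
for key, value in record.items():
    ET.SubElement(root, key).text = str(value)
xml_bytes = ET.tostring(root)  # b'<item><sku>A-1001</sku>...</item>'

json_bytes = json.dumps(record).encode("utf-8")

print(len(xml_bytes), len(json_bytes))  # the XML form is noticeably larger
```

That double-paid tag name on every field is a big part of why a “well structured” XML file balloons past the human-readable ideal.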

Let’s pretend that 50 (gazillion) other examples of encoding data as text don’t exist and say that JSON “is the bomb.” (Ok, I am going to welch on that and say I like Lua encoding a ~little~ better, but JSON has more momentum in the web world from my point of view.)

So one of the base ideas is that JSON is the serialized form of a living and breathing JavaScript object. JSON also “pretends” to be human readable and seriously fails (IMO) for anything more than trivial encodings.
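A small sketch of that “serialized form of a living object” idea, and where the round trip quietly gives up even for simple values (the objects here are made up for illustration):

```python
import json

# A "living" object often holds things JSON has no syntax for at all.
try:
    json.dumps({"tags": {"new", "sale"}})
except TypeError as err:
    print("sets don't serialize:", err)

# Tuples survive serialization only by quietly becoming lists on the way back.
round_tripped = json.loads(json.dumps({"point": (3, 4)}))
print(round_tripped)  # {'point': [3, 4]}
```

The serialized text only ever captures a subset of the live object graph; everything else either errors out or gets flattened.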

So all these systems get designed with this basic idea that everything on the wire is text. Oh, and while we are at it, it is 7-bit text encoding as well. Yeah, let’s make something that is already “bad” use only 7/8ths of the bandwidth. It doesn’t matter… what is a few microseconds between friends? Oh, and if THAT wasn’t bad enough… when you need to transfer anything that is REALLY binary in nature, then you likely must encode THAT in a subset using only 6 bits of the already hobbled encoding. Gack. base64 makes me want to puke (but I accept that it is required, and my gag reflex has long been suppressed even if I whine about it).
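The base64 tax is easy to demonstrate; a minimal sketch (the payload is arbitrary binary data, chosen just to make the point):

```python
import base64

# 1 KiB of genuinely binary data: every byte value, repeated.
payload = bytes(range(256)) * 4

# base64 carries only 6 bits of payload per 8-bit output character.
encoded = base64.b64encode(payload)
print(len(payload), "->", len(encoded))  # 1024 -> 1368
```

Four output bytes for every three input bytes: a fixed ~33% surcharge stacked on top of the text encoding being complained about above.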

You say it doesn’t matter? You probably don’t work at a company that is potentially transferring yottabytes of data between computers during the lifetimes of its systems. (I swear that when 32-bit computing made 4 GB of memory addressable, it was hard-ish to imagine needing that much memory when typical hard drives of the time had not reached that milestone.) I’d say it matters enough that some companies KNOW the benefits of reducing the bandwidth required. Google’s Protocol Buffers are a real manifestation of this (IMO).
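I can’t reproduce Protocol Buffers in a blog snippet, but a sketch of the same idea (a fixed binary layout versus JSON text) shows where the bandwidth goes. The telemetry record and its u32/f64/u64 layout are invented for illustration, not anything protobuf actually emits:

```python
import json
import struct

# Hypothetical telemetry sample; names and fields are illustrative only.
sample = {"sensor_id": 4221, "temp_c": 21.5, "ts": 1400000000}

text = json.dumps(sample).encode("utf-8")

# The same three values as a fixed binary layout: u32, f64, u64.
binary = struct.pack("<IdQ", sample["sensor_id"], sample["temp_c"], sample["ts"])

print(len(text), len(binary))  # 53 vs 20 bytes per record
```

Better than 2.5x smaller before compression even enters the picture, and multiplied across every record on the wire for the life of the system.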

Now I am going to finally mention why the ideas in HTTP/2 make me all aflutter. “Compressed” headers, long-term connections, internal multiplexing of the IP stream, “magically” compressed frames, and binary framing. If you can read between the lines of this jumble of a blog post, you might make the connection that I think HTTP/2 is going to help fix some of these issues. The flow control idea would be a massive help in some of the systems I have had the opportunity to work with. Think embedded systems that have HTTP servers “built in” but still use alternate, non-HTTP connections for the bulk of the data transfer. (You know who you are!) About the only thing I loathe (because of every OS/CPU ~I~ have worked on) is the network byte ordering of the protocols. I suppose I just HAD to dislike at least one thing.
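The binary framing plus the network byte ordering I’m grumbling about can be sketched in a few lines. This follows my reading of the 9-octet frame header in the HTTP/2 spec drafts (24-bit length, 8-bit type, 8-bit flags, 31-bit stream identifier, all big-endian), so treat it as an illustration, not a conformant implementation:

```python
import struct

def pack_frame_header(length, frame_type, flags, stream_id):
    # 24-bit length: keep the low 3 bytes of a big-endian u32,
    # then type, flags, and the 31-bit stream id, all network order.
    return struct.pack(">I", length & 0xFFFFFF)[1:] + struct.pack(
        ">BBI", frame_type, flags, stream_id & 0x7FFFFFFF
    )

# A DATA frame (type 0x0) carrying a 16-byte payload on stream 1:
header = pack_frame_header(16, 0x0, 0x0, 1)
print(header.hex())  # 000010000000000001
```

Nine bytes of fixed, machine-parseable header instead of a line-oriented text preamble — that is the whole flavor of the change.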

I fully suspect that a number of folks who think binary is evil are likely going to hate on HTTP/2. I mean, just look at the Huffman encoding and the UTF-8-esque length encoding used instead of only string literals for name/value pairs. What sacrilege.
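For the curious: that length encoding is, as I read the HPACK draft, a prefix-integer scheme, where small values fit in the bits left over in the first byte and larger ones spill into continuation bytes. A sketch:

```python
def hpack_encode_int(value, prefix_bits):
    # Sketch of HPACK-style prefix integer encoding: values below the
    # prefix limit fit in one byte; larger values set the prefix to the
    # limit and continue in 7-bit groups with the high bit as a flag.
    limit = (1 << prefix_bits) - 1
    if value < limit:
        return bytes([value])
    out = [limit]
    value -= limit
    while value >= 128:
        out.append((value % 128) + 128)  # continuation bit set
        value //= 128
    out.append(value)
    return bytes(out)

print(hpack_encode_int(10, 5).hex())    # 0a — fits in the 5-bit prefix
print(hpack_encode_int(1337, 5).hex())  # 1f9a0a
```

The 1337 case matches the worked example in the HPACK spec, which is a decent sanity check on the sketch.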

However, at the end of the day I also suspect that the VAST majority of developers won’t directly encounter this, as the tools they use will largely hide the details. The benefits to the RESTful APIs of the world could be massive. Finally making the concession that a server might have information that a client “wants” at some time in the future “breaks” the “HTTP is stateless” mantra. I think we passed that Rubicon many, many years ago.

It’s hard to imagine that Chrome (my current browser of choice) and IE won’t support HTTP/2 nearly instantly (SPDY, anyone?). I don’t really follow the “browser wars” anymore. I can’t imagine any “serious” shop not making HTTP/2 a priority for its servers. The changes aren’t (IMO) as aggressive as what is required to support IPv6 (why aren’t we there yet!? — rhetorical: ‘people’ suck and it won’t truly happen until it is too painful to ignore anymore). It is entirely plausible that HTTP/2 ~could~ be something that just happens completely between the browser and server without much ado. Then again, nothing comes for 100% free.

It’s about time.