At the start of the month, I travelled back to my native Perth, Western Australia to attend linux.conf.au. LCA, as it’s commonly known, has been the largest free and open source software conference in Australia and New Zealand for several years, attracting a wide range of attendees both from the region and further afield. Its name is now something of a misnomer, as it covers more than just Linux — more general software and hardware talks are also encouraged.
One interesting talk I attended was ‘HTTP/2.0 and You’ by Mark Nottingham, the chair of the IETF HTTPBIS working group. The group is responsible for the forthcoming HTTP 2.0 standard, which makes Mark uniquely placed to introduce both the protocol itself and the rationale behind the many changes it makes.
Now HTTP 2.0 is a very significant change to the protocol, far more so than the previous transitions to HTTP 1.0 and 1.1. Based on Google's work on SPDY, it replaces HTTP's current text-based protocol with a multiplexed binary protocol while maintaining HTTP's transactional semantics.
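To make that contrast concrete, here is a minimal Python sketch of the two wire formats: a plaintext HTTP/1.1 request you could type over telnet, next to the kind of binary frame header the new protocol uses instead. The field layout below follows the framing HTTP 2.0 eventually settled on (a 9-byte header: 24-bit length, 8-bit type, 8-bit flags, 31-bit stream identifier); the drafts under discussion at the time differed in detail, so treat the exact widths and the example values as illustrative.

```python
import struct

# A plaintext HTTP/1.1 request: just lines of text, trivially
# readable and easy to compose by hand over telnet.
http1_request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "\r\n"
)

def frame_header(length, frame_type, flags, stream_id):
    """Pack a binary frame header: 24-bit payload length, 8-bit frame
    type, 8-bit flags, and a 31-bit stream identifier (the high bit is
    reserved). Multiple streams share one TCP connection, so every
    frame carries the id of the stream it belongs to."""
    return (struct.pack(">I", length)[1:]           # low 3 bytes: length
            + bytes([frame_type, flags])            # type and flags
            + struct.pack(">I", stream_id & 0x7FFFFFFF))  # stream id

# A HEADERS-style frame announcing a 64-byte payload on stream 1
# (type and flag values here are purely illustrative).
header = frame_header(64, 0x1, 0x4, 1)
print(len(header), header.hex())  # 9 opaque bytes, not readable text
```

The request stays semantically the same; only the framing changes, which is why existing tools that expect to read the wire as text need updating.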
In his presentation, Mark laid out the pros and cons very even-handedly. The loss of simple debugging via telnet and straightforward packet sniffing is likely to be mitigated through updated and new tools that can handle both the binary nature of HTTP 2.0 and the multiplexed streams, such as new plugins for Wireshark. There are significant advantages for mobile users and sites that require users to load many assets in parallel, as the use of a single TCP connection should noticeably reduce latency.
One aspect of the protocol that seems likely to be controversial, based both on the discussion during the talk and subsequent hallway conversations, is TLS (née SSL). TLS will not be required by HTTP 2.0 itself, but Firefox and Chrome have both stated they will mandate its use, which will effectively require Web sites to install and maintain valid certificates. This seems likely to require either smoother UI flows for accepting self-signed certificates or free certificates that are more easily available: the long tail of small sites on the Web is unlikely to want to pay for certificates, and the current vendors of free certificates tend to have overly complicated provisioning processes.
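For what it's worth, the self-signed route is technically a one-liner with `openssl`; the friction lies in getting browsers to trust the result, not in creating it. The hostname below is a placeholder, not a real site:

```shell
# Generate a self-signed certificate and private key, valid for one
# year, without a passphrase (-nodes). "example.test" is a placeholder.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout key.pem -out cert.pem -days 365 \
    -subj "/CN=example.test"
```

A visitor's browser will still warn on this certificate because no trusted authority vouches for it, which is exactly the UI-flow problem described above.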
Overall, it was a terrific, thought-provoking talk that sparked conversation beyond the event itself. The video is below and well worth a watch for anyone involved in Web development. Beyond describing the new protocol, Mark touched on many issues that will need to be worked on even after HTTP 2.0 is finalized, such as the role of load balancers in a world of longer-lived, more complex connections. Change is coming, and the promulgation of a new HTTP standard is likely to be only the beginning.