Enter HTTP/2. It offers many improvements over HTTP/1.1, and we support it.
Our servers run nginx, which supports HTTP/2. We use Cloudflare for protection, which also supports HTTP/2. And once Amazon announced that CloudFront supports HTTP/2, we immediately enabled it on all our distributions. So if your package includes delivery using CloudFront, you benefit immediately. The good news is that this all works seamlessly: the pages still look the same, the videos still look the same, they just load faster. Most major browsers have supported HTTP/2 since the end of 2015.
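On nginx, turning on HTTP/2 is a one-flag change to the listen directive. A minimal sketch, assuming nginx 1.9.5 or later built with TLS support; the domain and certificate paths here are placeholders:

```nginx
# Hypothetical server block; domain and certificate paths are placeholders.
# The "http2" flag on the listen directive enables HTTP/2 (nginx 1.9.5+).
server {
    listen 443 ssl http2;
    server_name example.com;

    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;
}
```

Browsers only speak HTTP/2 over TLS, which is why the flag sits on the SSL listener.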
HTTP/2 uses the available connection much more efficiently than HTTP/1.1. It is designed to be fully multiplexed, so requests load in parallel over a single connection. This matters increasingly with modern sites. In the early days of the web, HTTP/1.1 was fine: its limitation of effectively allowing only one outstanding request per TCP connection was not much of an issue when a page consisted merely of text and perhaps some images, so few requests were made. As the web has developed, however, sites have become increasingly complex, loading many types of asset, including videos (particularly relevant to our video hosting platform!).
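The difference multiplexing makes can be sketched with a toy scheduling model. This is illustrative only: the asset timings and the six-connection browser limit are assumptions, and real transfers share bandwidth between streams.

```python
# Toy model (not a benchmark): compare page-load time when requests
# queue one-at-a-time per connection (HTTP/1.1 style) versus all
# streaming in parallel over one connection (HTTP/2 style).
asset_ms = [50] * 24  # hypothetical: 24 assets, 50 ms each

def http1_total(assets, connections=6):
    """Greedy scheduling: each connection serves one request at a time."""
    lanes = [0] * connections
    for cost in assets:
        i = lanes.index(min(lanes))  # next connection to become free
        lanes[i] += cost
    return max(lanes)

def http2_total(assets):
    """All requests stream in parallel over one connection, so the page
    is bound by the slowest asset (bandwidth contention ignored)."""
    return max(assets)

print(http1_total(asset_ms))  # → 200: requests queue behind each other
print(http2_total(asset_ms))  # → 50: requests no longer wait in line
```

Even this crude model shows why head-of-line queuing hurts asset-heavy pages: the more requests a page makes, the deeper the queues get.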
HTTP/2 is based on a protocol Google pioneered called SPDY (pronounced “speedy”), after they saw that HTTP/1.1 needed improving. We supported SPDY too on our nginx servers. However, HTTP/2 introduced further improvements to prevent exploits such as CRIME that were found with SPDY’s method of compression. Google have now dropped support for SPDY in favour of HTTP/2.
HTTP/2 also dramatically reduces the size of the headers. Headers (cookies, the referer and so on) are essentially overhead, but they must be sent with every request. The more assets a page loads, the more header data needs sending along with them. Compressing the headers particularly benefits mobile devices, where latency is often much higher and connection speeds lower.
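The saving comes largely from the fact that the same headers repeat on every request, so a shared table lets later requests send a short index instead of the full bytes. The toy below illustrates that indexing idea; it is a deliberate simplification of HTTP/2’s real header compression (HPACK), and the header values are hypothetical.

```python
# Simplified sketch of why indexing repeated headers saves bytes.
# Not the real HPACK algorithm; header values are made up.
headers = {
    "user-agent": "Mozilla/5.0 (compatible; ExampleBrowser/1.0)",
    "cookie": "session=abc123; prefs=dark-mode",
    "referer": "https://example.com/videos/intro",
    "accept": "text/html,application/xhtml+xml",
}

def raw_size(hdrs):
    """Bytes to send headers verbatim (name + ': ' + value + CRLF),
    as HTTP/1.1 must do on every single request."""
    return sum(len(k) + 2 + len(v) + 2 for k, v in hdrs.items())

def indexed_size(hdrs, table):
    """Bytes when a header field already in the shared table costs
    one index byte instead of being resent in full."""
    total = 0
    for k, v in hdrs.items():
        if table.get(k) == v:
            total += 1                    # already known: send an index
        else:
            total += len(k) + 2 + len(v) + 2
            table[k] = v                  # remember it for later requests
    return total

table = {}
first = indexed_size(headers, table)   # first request pays full cost
repeat = indexed_size(headers, table)  # later requests send indices only
print(first, repeat)
```

On a page loading dozens of assets, almost every request after the first gets the cheap “repeat” cost, which is exactly where the compression pays off.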
Finally, HTTP/2 brings server push. Currently our Vidbeo online video platform does not make use of that particular feature; however, it is an interesting area. The idea is for the server to predict what the client will need before it is requested and “push” that data to them, avoiding the round-trip delay of the client actually requesting it. Of course, it could actually harm performance if the prediction was wrong.
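As a sketch of how a server hints at what to push: implementations such as Cloudflare’s read a preload Link header from the origin response and push the named asset alongside the page. The fragment below is hypothetical; the location and asset path are placeholders.

```nginx
# Hypothetical: ask an HTTP/2-push-aware front end (e.g. Cloudflare)
# to push the player stylesheet alongside the page it serves.
location = /watch {
    add_header Link "</assets/player.css>; rel=preload; as=style";
}
```

The server is guessing here, which is the trade-off mentioned above: push the wrong asset (or one the client already has cached) and bandwidth is wasted rather than saved.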
If you want to find out more about our professional online video platform, you can email us at [email protected] or simply fill out the contact form with your requirements.
Updated: September 20, 2016