Vulnerabilities in HTTP/2 Provide Invaluable Insight

Finding vulnerabilities in HTTP/2 makes the new standard more secure.

HTTP is the foundation of the web. It’s the technical protocol that facilitates communication between your web browser and the sites you visit. It’s also nearly 20 years old—that’s ancient by internet standards.

In May 2015, work was finalized on HTTP/2, the first major upgrade to the protocol in over a decade. It brings with it huge advances in technology that will make the web faster, more efficient, and just plain better.

Since HTTP/2 was finalized last year, a lot of groundwork for implementing this new standard has been laid: All the major web browsers now support the protocol, as do most major web servers. Even CDNs and network appliances, commonly used with larger sites and networks, now support it.
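You can see that support for yourself. Below is a minimal sketch in Go, whose standard HTTP client speaks HTTP/2 automatically over TLS; the URL is only an example, and any HTTPS site that supports the protocol will do.

package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// Go's default client negotiates HTTP/2 over TLS (via ALPN) and
	// falls back to HTTP/1.1 when the server doesn't support it.
	resp, err := http.Get("https://www.google.com/")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// resp.Proto reports the protocol actually used, e.g. "HTTP/2.0" or "HTTP/1.1".
	fmt.Println("negotiated protocol:", resp.Proto)
}

Run it against a site that hasn't enabled HTTP/2 and it simply prints HTTP/1.1 – the fallback the new protocol was designed to allow.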

That means HTTP/2 is ready for prime-time, right?

Not quite.

New research presented by Imperva, a cybersecurity company, has found a number of vulnerabilities in HTTP/2 implementations that should give us pause. The research, conducted by Imperva’s Defense Center, found four major vulnerabilities in the HTTP/2 implementations of the most popular web servers, including Apache, Microsoft’s IIS, and NGINX.

HTTP/2 is already in use by 18.5% of the Alexa Top Million websites[1]. But there is a long way between early adopters like Google and Facebook and deployment across the entire web.

Frankly, there is a reason that the entire web does not immediately implement a new technology as soon as it becomes available – problems are expected.

Implementing new protocols means making new mistakes (and making old mistakes in new ways), and the code has not had a chance to be properly battle-tested under real-world conditions. These factors mean new software likely contains at least a few vulnerabilities.

The Curious Process of Implementing Standards

It took over three years and 17 drafts to finalize the HTTP/2 specification, so why are we already finding problems with it?

Because even a bullet-proof standard is only as good as the implementations of it.

Internet standards are only guidelines and rules on how things should be done – not actual code that can do them. From a technical standpoint, it would be almost impossible for internet standards to provide actual working code, due to the number of programming languages out there, not to mention how that code would integrate into existing systems and the legal implications of distribution.

This means the successful implementation of new technologies like HTTP/2 is contingent on developers understanding the standard. The HTTP/2 specification is nearly 100 pages long – and those are dense pages – so it’s not surprising that mistakes happen.

Now think about the fact that hundreds (maybe even thousands) of developers around the world are all individually working on implementations for their own software. All it takes is for one major product to get something wrong, and suddenly millions of users are at risk.

Due to the complexity of the web, “vendors alone cannot make a new protocol secure, it takes the full strength of the security industry to harden the extended attack surface.”[2] The push and pull of new capabilities – and the new vulnerabilities that come with them – is part of the natural evolution of technologies. When we hear about vulnerabilities, our immediate reaction is that something must be wrong. But in reality, the discovery of these problems is evidence that the web ecosystem is working – that vendors aren’t releasing software and products without independent review, and that new standards are constantly studied, critiqued, and improved.

Vulnerabilities in HTTP/2 and New Technology

“New technology brings new risk,” Imperva stated in its official report. “As with any new technology, HTTP/2 suffers from creating new extended attack surfaces for attackers to target.”

Despite being an improvement, HTTP/2 still isn’t perfect. Imperva found four server-side vulnerabilities in the HTTP/2 implementations of major web server software. These vulnerabilities affected some of HTTP/2’s core improvements, including multiplexing and HPACK header compression.
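To make the HPACK piece concrete, here is a minimal sketch using Go’s golang.org/x/net/http2/hpack package (the header values are invented for illustration). It compresses a few headers and decodes them again; note that the decoder takes an upper bound on its dynamic table size, the kind of memory limit that matters when header compression is turned against a server.

package main

import (
	"bytes"
	"fmt"
	"log"

	"golang.org/x/net/http2/hpack"
)

func main() {
	// Encode a few headers into HPACK's compact binary representation.
	var buf bytes.Buffer
	enc := hpack.NewEncoder(&buf)
	headers := []hpack.HeaderField{
		{Name: ":method", Value: "GET"},
		{Name: ":path", Value: "/index.html"},
		{Name: "user-agent", Value: "example-client/1.0"},
	}
	for _, f := range headers {
		if err := enc.WriteField(f); err != nil {
			log.Fatal(err)
		}
	}
	fmt.Printf("encoded %d header bytes\n", buf.Len())

	// Decode them back. The 4096-byte argument caps the decoder's dynamic
	// table, bounding how much memory compressed headers can consume.
	dec := hpack.NewDecoder(4096, func(f hpack.HeaderField) {
		fmt.Printf("%s: %s\n", f.Name, f.Value)
	})
	if _, err := dec.Write(buf.Bytes()); err != nil {
		log.Fatal(err)
	}
}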

These vulnerabilities were found so early on because they were the most obvious. In fact, the HTTP/2 standard specifically warned about the potential for improper implementation – which told Imperva’s team exactly where to start looking.

All four of these vulnerabilities overwhelm the target server in different ways. They can considerably slow a website’s performance or take it offline entirely. However, they cannot compromise data stored on the server.

The Slow Read vulnerability keeps requests open that the server can’t complete, tying up resources and making them unavailable to other visitors. The three other attacks, which don’t have formal names, cause similar unresponsiveness and can be used to DDoS a server or fill its available memory, causing it to crash.
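The broad defense against this class of attack is to bound what any single client can consume. The sketch below is not Imperva’s fix, just an illustration in Go using the golang.org/x/net/http2 package: it caps connection timeouts and the number of multiplexed streams per connection, with made-up limits and placeholder certificate paths.

package main

import (
	"log"
	"net/http"
	"time"

	"golang.org/x/net/http2"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("hello over " + r.Proto + "\n"))
	})

	srv := &http.Server{
		Addr:    ":8443",
		Handler: mux,
		// Bound how long a slow or stalled client can hold a connection.
		ReadTimeout:  10 * time.Second,
		WriteTimeout: 20 * time.Second,
		IdleTimeout:  60 * time.Second,
	}

	// Apply HTTP/2-specific limits on top of the general server timeouts.
	if err := http2.ConfigureServer(srv, &http2.Server{
		MaxConcurrentStreams: 100,              // cap multiplexed streams per connection
		IdleTimeout:          60 * time.Second, // drop idle HTTP/2 connections
	}); err != nil {
		log.Fatal(err)
	}

	// cert.pem and key.pem are placeholders for a real certificate and key.
	log.Fatal(srv.ListenAndServeTLS("cert.pem", "key.pem"))
}

Limits like these don’t make implementation bugs go away, but they narrow how far a single misbehaving client can push a server.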

Luckily, there’s no reason to worry – Imperva worked with the vendors, and the affected software has already been patched. These vulnerabilities pose no threat to the internet – their significance lies in the insight they offer into how new technology is adopted, not in their capacity to do harm.

There’s (Usually) Nothing to Fear

Don’t be scared the next time you hear about a vulnerability. It’s true that many vulnerabilities have the capacity to cause real damage, and malicious computer attacks cost the global economy billions of dollars a year. However, many vulnerabilities are patched through responsible disclosure before the public even hears about them, and others, while technically dangerous, have never been documented in use.

But every vulnerability, the truly harmful and the harmless alike, shows us that the technology community is working well.

 

[1] http://isthewebhttp2yet.com/measurements/adoption.html

[2] Imperva Defense Center, HTTP/2 research report, p. 2.