whoAMI: Discovering and exploiting a large-scale AMI name confusion attack

Seth Art

fwd:cloudsec North America 2025 · Day 1 · Track 1 - Crystal

Seth Art, a security researcher at Datadog with 15 years of penetration testing experience, presents a comprehensive examination of the **whoAMI** attack, a name confusion vulnerability affecting **Amazon Machine Images (AMIs)** in AWS. The attack allows an adversary to publish a malicious AMI that matches a victim's search criteria, causing the victim's infrastructure-as-code to unknowingly deploy the attacker's image instead of the legitimate one. The result is remote code execution in the victim's AWS account. Art estimates that if launched today, this attack could compromise thousands of AWS accounts. The talk covers the vulnerability's history dating back to 2009, a live demonstration of the attack chain, real-world findings including a vulnerability in Cirrus CI's self-hosted runner infrastructure, and the new AWS **Allowed AMIs** feature that serves as the primary defense.
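The core of the attack is that many AMI lookups filter only on a name pattern and take the most recently created match, without pinning the owner account. The following sketch is a self-contained simulation of that lookup logic (the image records, account IDs, and `latest_ami` helper are illustrative, not the speaker's code): without an owner restriction, an attacker-published image with a lookalike name and a newer creation date wins the search.

```python
from fnmatch import fnmatch

# Hypothetical catalog of public AMIs (fields mirror DescribeImages output).
AMIS = [
    {"ImageId": "ami-legit", "Name": "amzn2-ami-hvm-2.0.20240101-x86_64-gp2",
     "OwnerId": "137112412989", "CreationDate": "2024-01-01T00:00:00Z"},
    {"ImageId": "ami-evil", "Name": "amzn2-ami-hvm-2.0.20991231-x86_64-gp2",
     "OwnerId": "999999999999", "CreationDate": "2025-06-01T00:00:00Z"},
]

def latest_ami(name_pattern, owners=None):
    """Mimic an AMI lookup: wildcard name filter, newest CreationDate wins."""
    candidates = [a for a in AMIS
                  if fnmatch(a["Name"], name_pattern)
                  and (owners is None or a["OwnerId"] in owners)]
    if not candidates:
        return None
    return max(candidates, key=lambda a: a["CreationDate"])["ImageId"]

# Vulnerable: no owner filter, so the attacker's lookalike image is selected.
print(latest_ami("amzn2-ami-hvm-*-x86_64-gp2"))
# Safe: restricting owners to a trusted account ID pins the legitimate image.
print(latest_ami("amzn2-ami-hvm-*-x86_64-gp2", owners=["137112412989"]))
```

The fix mirrors the defense the talk describes: always constrain AMI searches to trusted owner accounts, which the AWS Allowed AMIs feature now enforces account-wide.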

AI review

A masterclass in supply chain attack research. Seth Art takes a deceptively simple vulnerability -- missing owner filters in AMI lookups -- and turns it into a fully weaponized attack chain with real-world impact against thousands of AWS accounts, including AWS's own internal systems. The historical depth, live demos, coordinated disclosure with Cirrus CI, and the Cloud Image Investigator tool make this the complete package.
