CipherSteal: Stealing Input Data from TEE-Shielded Neural Networks with Ciphertext Side Channels

Yuanyuan Yuan, Zhibo Liu, Sen Deng, Yanzuo Chen, Shuai Wang, Yinqian Zhang

IEEE Symposium on Security and Privacy 2025 · Day 3 · Hardware Security

Deep neural networks (DNNs) increasingly process sensitive user data, from facial photos and biometric scans to highly personal genetic information, making robust privacy protection more important than ever. Trusted Execution Environments (TEEs) have emerged as a cornerstone technology for safeguarding such data, promising hardware-backed isolation in which even a malicious host operating system cannot inspect the contents of protected memory. The talk "CipherSteal: Stealing Input Data from TEE-Shielded Neural Networks with Ciphertext Side Channels," presented by Yanzuo Chen at IEEE S&P, unveils a critical vulnerability that undermines these assurances: sensitive input data can be exfiltrated from TEE-protected DNNs through ciphertext side channels.
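The talk's title refers to ciphertext side channels, which arise because some TEEs (notably AMD SEV) encrypt guest memory deterministically: the same plaintext written to the same physical address always yields the same ciphertext, so a host that can read ciphertext learns when a protected value repeats. The toy simulation below, a minimal sketch with entirely hypothetical names and not the paper's actual attack code, illustrates this equality-leakage principle:

```python
# Toy illustration of the ciphertext side-channel principle (hypothetical
# simulation; the real attack targets hardware memory encryption).
import hashlib

def encrypt_block(plaintext: bytes, address: int) -> bytes:
    """Stand-in for address-tweaked deterministic encryption.

    Real hardware uses AES with a physical-address tweak (XEX-like);
    a keyed hash suffices to model the determinism property.
    """
    key = b"secret-memory-encryption-key"
    return hashlib.sha256(
        key + address.to_bytes(8, "little") + plaintext
    ).digest()[:16]

ADDR = 0x1000  # one memory block inside the protected region

# The victim repeatedly writes a secret-dependent value to the same address.
secret_bits = [1, 0, 1, 1, 0]
ciphertexts = [encrypt_block(bytes([b]) * 16, ADDR) for b in secret_bits]

# The attacker sees only ciphertexts, but ciphertext equality reveals
# plaintext equality -- enough to recover the secret pattern.
reference = ciphertexts[0]
recovered = [1 if c == reference else 0 for c in ciphertexts]
print(recovered)  # matches secret_bits up to relabeling
```

Because no decryption is needed, this leakage survives even when the attacker never obtains any key material; it depends only on read access to encrypted memory and the determinism of the encryption.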

Watch on YouTube