A new side-channel attack dubbed PIXHELL can be used to target air-gapped computers by breaching the “audio gap,” exfiltrating sensitive information through the noise generated by the pixels on an LCD screen.
“Malware in the air-gap and audio-gap computers generates crafted pixel patterns that produce noise in the frequency range of 0 – 22 kHz,” Dr. Mordechai Guri, the head of the Offensive Cyber Research Lab in the Department of Software and Information Systems Engineering at the Ben Gurion University of the Negev in Israel, said in a newly published paper.
“The malicious code exploits the sound generated by coils and capacitors to control the frequencies emanating from the screen. Acoustic signals can encode and transmit sensitive information.”
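To make the mechanism concrete, here is a minimal sketch of how a transmitter might choose a stripe pattern for a target tone. The scan-rate model (rows drawn top-to-bottom at the refresh rate, with power draw toggling at each black/white transition) and all function names and values are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical helper: pick the row-stripe period that would make the LCD's
# driving circuitry toggle at a desired audible frequency.
# Assumption (illustrative, not from the paper): the panel scans
# height * refresh_hz rows per second, and a bitmap alternating every
# k rows toggles power draw at that rate divided by 2k.

def stripe_period_for_tone(target_hz, height=1080, refresh_hz=60):
    """Return (k, actual_hz): rows per stripe and the tone it would yield."""
    rows_per_second = height * refresh_hz          # rows scanned each second
    k = max(1, round(rows_per_second / (2 * target_hz)))
    actual_hz = rows_per_second / (2 * k)          # nearest achievable tone
    return k, actual_hz

def bitmap_rows(height, k):
    """Row pattern: 0 = black row, 1 = white row, alternating every k rows."""
    return [(r // k) % 2 for r in range(height)]
```

For a 1080-row screen at 60 Hz, a 5 kHz target maps to stripes six rows wide, producing a tone near 5.4 kHz under this simplified model.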
The attack is notable in that it doesn’t require any specialized audio hardware, loudspeaker, or internal speaker on the compromised computer, instead relying on the LCD screen to generate acoustic signals.
Air-gapping is a crucial security measure designed to safeguard mission-critical environments against potential security threats by physically and logically isolating them from external networks (i.e., the internet). This is typically accomplished by disconnecting network cables, disabling wireless interfaces, and disabling USB connections.
That said, such defenses could be circumvented by means of a rogue insider or a compromise of the hardware or software supply chain. Another scenario could involve an unsuspecting employee plugging in an infected USB drive, deploying malware capable of triggering a covert data exfiltration channel.
“Phishing, malicious insiders, or other social engineering techniques may be employed to trick individuals with access to the air-gapped system into taking actions that compromise security, such as clicking on malicious links or downloading infected files,” Dr. Guri said.
“Attackers may also use software supply chain attacks by targeting software application dependencies or third-party libraries. By compromising these dependencies, they can introduce vulnerabilities or malicious code that may go unnoticed during development and testing.”
Like the recently demonstrated RAMBO attack, PIXHELL makes use of the malware deployed on the compromised host to create an acoustic channel for leaking information from audio-gapped systems.
This is made possible by the fact that LCD screens contain inductors and capacitors as part of their internal components and power supply, causing them to vibrate at audible frequencies when electricity passes through the coils and produce a high-pitched noise, a phenomenon known as coil whine.
Specifically, changes in power consumption can induce mechanical vibrations or piezoelectric effects in capacitors, producing audible noise. A crucial aspect that affects the consumption pattern is the number of pixels that are lit and their distribution across the screen, as white pixels require more power to display than dark pixels.
“Also, when alternating current (AC) passes through the screen capacitors, they vibrate at specific frequencies,” Dr. Guri said. “The acoustic emanations are generated by the internal electric part of the LCD screen. Its characteristics are affected by the actual bitmap, pattern, and intensity of pixels projected on the screen.”
“By carefully controlling the pixel patterns shown on our screen, our technique generates certain acoustic waves at specific frequencies from LCD screens.”
An attacker could therefore leverage the technique to exfiltrate data modulated into acoustic signals and transmitted to a nearby Windows or Android device, which can subsequently demodulate the packets and extract the information.
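The transmit/receive chain described above can be sketched as a simple binary frequency-shift keying (FSK) scheme. The choice of FSK, the two carrier frequencies, and the symbol duration here are illustrative assumptions; the paper's actual modulation parameters may differ. On the attacker side the tone generation would be driven by pixel patterns rather than a speaker.

```python
import math

# Assumed, illustrative parameters (not from the paper).
F0, F1 = 4_000, 5_000      # Hz carriers for bit 0 / bit 1
FS = 44_100                # receiver sampling rate, Hz
BIT_SEC = 0.05             # duration of one symbol, seconds

def modulate(bits):
    """Map a bit string to tone bursts (here as raw audio samples)."""
    n = int(FS * BIT_SEC)
    out = []
    for b in bits:
        f = F1 if b == "1" else F0
        out += [math.sin(2 * math.pi * f * i / FS) for i in range(n)]
    return out

def goertzel_power(samples, freq):
    """Single-bin DFT power at freq (standard Goertzel algorithm)."""
    coeff = 2 * math.cos(2 * math.pi * freq / FS)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def demodulate(samples):
    """Recover bits by comparing the two carriers' energies per symbol."""
    n = int(FS * BIT_SEC)
    bits = ""
    for i in range(0, len(samples) - n + 1, n):
        win = samples[i:i + n]
        bits += "1" if goertzel_power(win, F1) > goertzel_power(win, F0) else "0"
    return bits
```

A nearby receiving device would run the demodulation side continuously on its microphone input, framing symbols and reassembling the leaked bytes.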
That said, it bears noting that the power and quality of the emanated acoustic signal depend on the specific screen structure, its internal power supply, and the locations of the coils and capacitors, among other factors.
Another important thing to highlight is that the PIXHELL attack, by default, is visible to users looking at the LCD screen, given that it involves displaying a bitmap pattern comprising alternate black-and-white rows.
“To remain covert, attackers may use a strategy that transmits while the user is absent,” Dr. Guri said. “For example, in a so-called ‘overnight attack,’ the covert channel is maintained during off-hours, reducing the risk of being revealed and exposed.”
The attack, however, could be transformed into a stealthy one during working hours by reducing the pixel colors to very low values prior to transmission — i.e., using RGB levels of (1,1,1), (3,3,3), (7,7,7), and (15,15,15) — thereby giving the impression to the user that the screen is black.
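The near-black stealth variant can be illustrated as follows. The RGB levels are those quoted above; the bitmap layout and function names are assumptions for illustration.

```python
# Hypothetical illustration of the stealth variant: the same alternating-row
# stripe pattern, but "on" rows use very low RGB values so the screen still
# looks black to a casual observer while the power-draw toggling persists.

DIM_LEVELS = [(1, 1, 1), (3, 3, 3), (7, 7, 7), (15, 15, 15)]  # from the study

def stealth_bitmap(width, height, k, level=(3, 3, 3)):
    """Alternate k fully black rows with k near-black rows."""
    black_row = [(0, 0, 0)] * width
    dim_row = [level] * width
    return [dim_row if (r // k) % 2 else black_row for r in range(height)]
```

The trade-off noted below follows directly: lower pixel intensities mean smaller swings in power consumption, and therefore a weaker acoustic signal.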
But doing so has the side effect of “significantly” reducing the sound levels produced. Nor is the approach foolproof, as a user can still make out anomalous patterns if they look “carefully” at the screen.
This is not the first time audio-gap restrictions have been surmounted in an experimental setup. Prior studies undertaken by Dr. Guri and others have employed sounds generated by computer fans (Fansmitter), hard disk drives (Diskfiltration), CD/DVD drives (CD-LEAK), power supply units (POWER-SUPPLaY), and inkjet printers (Inkfiltration).
As countermeasures, it’s recommended to use an acoustic jammer to neutralize the transmission, monitor the audio spectrum for unusual or uncommon signals, limit physical access to authorized personnel, prohibit the use of smartphones, and use an external camera for detecting unusual modulated screen patterns.
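The spectrum-monitoring countermeasure can be sketched as a detector that flags sustained narrowband tones, which stand out against broadband office noise. The threshold, window size, and function names here are illustrative assumptions.

```python
import cmath
import math

# Hypothetical sketch of acoustic-spectrum monitoring: flag frequency bins
# whose energy dwarfs the average bin, a crude indicator of an artificial
# carrier tone rather than ordinary broadband ambient noise.

def dft_magnitudes(samples):
    """Naive DFT magnitude spectrum (adequate for short monitoring windows)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def narrowband_alert(samples, fs, ratio=10.0):
    """Return the frequencies (Hz) of suspiciously dominant narrowband bins."""
    mags = dft_magnitudes(samples)
    avg = sum(mags) / len(mags)
    return [k * fs / len(samples)
            for k, m in enumerate(mags) if k > 0 and m > ratio * avg]
```

A production monitor would use an FFT over sliding windows and require a tone to persist across several windows before alerting, to avoid false positives from transient sounds.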