Cybersecurity stories are like buses: the one you’re waiting for doesn’t come along for ages, then two arrive at once.
The specialist subject that suddenly popped up twice this week is: resonance.
On Monday, we wrote about Janet Jackson’s 1989 song Rhythm Nation, and how it inadvertently turned into a proof-of-concept for a Windows-crashing exploit that was reported way back in 2005.
That story was publicised only recently, as a bit of weird historical fun, and with an equal sense of fun, MITRE assigned it an official CVE bug number (confusingly, however, with a 2022 datestamp, because that’s when the number was issued, not when the flaw was first noticed).
In that case, something about the beat and mix of frequencies in the song is alleged to have troubled the disk drives in a certain vendor’s Windows laptops, matching the natural vibrational frequencies of the old-school hard disks…
…to the point that the resonance effects produced enough vibration to crash the disk, which crashed the driver, which crashed Windows.
Apparently, even nearby laptops with the same model of disk could be R&Bed to the point of failure, bringing down the operating system remotely.
The solution, apparently, involved adding some sort of band-pass filter (band as in “range of frequencies”, not as in “group of musicians”) that chopped out the resonance and the overload, but left the audio well-defined enough to sound normal.
Two buses at once
Well, guess what?
At around the same time that the Rhythm Nation story broke, a researcher at Ben-Gurion University of the Negev in Israel published a research paper about resonance problems in mobile phone gyroscopes.
Modern phone gyroscopes don’t have spinning flywheels housed in gimbals, like the balancing gyroscope toys you may have seen or even owned as a youngster, but are based on etched silicon nanostructures with tiny vibrating elements that detect rotation by means of the Coriolis effect.
Mordechai Guri’s paper is entitled GAIROSCOPE: Injecting Data from Air-Gapped Computers to Nearby Gyroscopes, and the title pretty much summarises the story.
By the way, if you’re wondering why the keywords Ben-Gurion University and airgap ring a bell, it’s because academics there routinely have absurd amounts of fun as regular contributors to the field of how to manage data leakage into and out of secure areas.
Maintaining an airgap
So-called airgapped networks are commonly used for tasks such as developing anti-malware software, researching cybersecurity exploits, handling secret or confidential documents safely, and keeping nuclear research facilities free from outside interference.
The name means literally what it says: there’s no physical connection between the two parts of the network.
So, if you optimistically assume that alternative networking hardware such as Wi-Fi and Bluetooth is properly controlled, data can only move between “inside” and “outside” in a way that requires human intervention that can be robustly regulated, monitored, supervised, signed off, logged, and so on.
But what about a corrupt insider who wants to break the rules and steal protected data in a way that their managers and their security team are unlikely to spot?
Ben-Gurion University researchers have come up with many weird but workable data exfiltration tricks over the years, along with techniques for detecting or preventing them, often giving them really funky names…
…such as LANTENNA, where innocent-looking network packets on the wires connecting up the trusted side of the network actually produce faint radio waves that can be detected by a collaborator outside the secure lab with a hidden antenna and a software defined radio receiver.
Or fan speeds used to send covert sound signals, in a trick dubbed the FANSMITTER.
Or using capacitors on a motherboard to act as tiny stand-in speakers in a computer with its own loudspeaker deliberately removed.
Or adding meaning to the amount of red tint on the screen from second to second, and many other abstruse airbridging tricks.
The trouble with sound
Exfiltrating data via a loudspeaker is easy enough (computer modems and acoustic couplers were doing it more than 50 years ago), but there are two problems here: [1] the sounds themselves squawking out of speakers on the trusted side of an airgapped network are a bit of a giveaway, and [2] you need an undetected, unregulated microphone on the untrusted side of the network to pick up the noises and record them surreptitiously.
Problem [1] was overcome by the discovery that many if not most computer speakers can actually produce so-called ultrasonic sounds, with frequencies high enough (typically 17,000 hertz or above) that few humans can hear them.
At the same time, a typical mobile phone microphone can pick up ultrasonic sounds on the other side of the airgap, thus providing a covert audio channel.
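To give you a feel for how little it takes to generate such a tone, here’s a minimal sketch of our own, using the browser’s Web Audio API. (It’s purely illustrative: the attack code described in Guri’s paper runs as a native program on the airgapped computer, not as a web page.)

    // Play a short near-ultrasonic tone through an ordinary speaker.
    // The 19,000 Hz figure follows Guri's example; most adults can't hear it.
    // Note: browsers only let audio start after a user gesture (a click, say).
    const ctx = new AudioContext();

    function playTone(frequencyHz: number, durationMs: number): void {
      const osc = ctx.createOscillator();
      osc.type = "sine";                  // pure tone, no audible harmonics
      osc.frequency.value = frequencyHz;
      osc.connect(ctx.destination);
      osc.start();
      osc.stop(ctx.currentTime + durationMs / 1000);
    }

    playTone(19000, 500);  // half a second of "silence", as far as most ears are concerned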
But problem [2] was thwarted, at least in part, by the fact that most modern mobile phones or tablets have easily verified configuration settings to control microphone use.
So, phones that are pre-rigged to violate “no recording devices allowed” policies can fairly easily be spotted by a supervisory check before they’re allowed into the secure area.
(In other words, there’s a real chance of being caught with a “live mic” if your phone is configured in an obviously non-compliant condition, which could result in getting arrested or worse.)
As you’ll have figured from the title of Guri’s paper, however, it turns out that the gyroscope chip in most modern mobile phones – the chip that detects when you’ve turned the screen sideways or picked the device up – can be used as a very rudimentary microphone.
Greatly simplified, the GAIROSCOPE data exfiltration system involves exposing a known mobile phone to a range of ultrasonic frequencies (in Guri’s example, these were just above 19,000 hertz, too high for almost anyone to hear) and finding a precise frequency that provokes detectably abnormal resonance in the gyroscope chip.
Once you’ve found one or more resonant frequencies safely out of human hearing range, you’ve effectively got yourself both ends of a covert data signalling channel based on frequencies that can inaudibly be generated at one end, and reliably be detected, without using a regular microphone, at the other.
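Very loosely, the transmitter side of that calibration step might look something like the sketch below, which reuses the playTone() helper from the earlier sketch. (The frequency band, step size and dwell time are illustrative guesses of ours, not figures from the paper; the phone on the other side logs its gyroscope readings at the same time, as in the next sketch, and the step that produces the biggest jitter wins.)

    // Step through candidate frequencies just above 19 kHz, dwelling on each one.
    // Logging the time of each step lets you line it up with the phone's gyroscope log.
    async function sweep(startHz: number, endHz: number, stepHz: number, dwellMs: number) {
      for (let f = startHz; f <= endHz; f += stepHz) {
        console.log(`${new Date().toISOString()}  playing ${f} Hz`);
        playTone(f, dwellMs);                               // helper from the sketch above
        await new Promise((r) => setTimeout(r, dwellMs));   // wait for the tone to finish
      }
    }

    sweep(19000, 19600, 50, 2000);  // 19.0 kHz to 19.6 kHz in 50 Hz steps, 2 seconds each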
The reason for targeting the gyroscope is that most mobile phones treat the gyroscope signal as uncontroversial from a privacy or security point of view, and allow apps (on Android, this even includes, by default, the Chrome browser) to read out the gyroscope X, Y and Z position readings without any special permissions.
This means a mobile device that has apparently been configured into “no eavesdropping” mode could, after all, be receiving secret, inaudible data via a covert audio channel.
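What might the receiving end look like? Here’s a deliberately simplified sketch of our own (not code from the paper): a web page that merely logs the phone’s rotation-rate jitter, which Chrome on Android will hand over without any permission prompt, as described above. A real receiver would then have to turn spikes in that jitter into bits.

    // Log raw gyroscope jitter via the devicemotion event; in Chrome on Android
    // no permission prompt is needed. Decoding the jitter into bits is left out.
    window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
      const r = event.rotationRate;        // angular velocity, in degrees per second
      if (!r) return;
      const jitter = Math.hypot(r.alpha ?? 0, r.beta ?? 0, r.gamma ?? 0);
      console.log(`gyro jitter: ${jitter.toFixed(3)} deg/s`);
    });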
Don’t get too excited.
Data rates generally seem to be about 1 bit per second, which makes 50-year-old computer modems seem fast…
…but data such as secret keys or passwords are often only a few hundred or a few thousand bits long, and even 1 bit/sec could be enough to leak them across an otherwise healthy airgap in a few minutes or hours.
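To put that in perspective: at 1 bit per second, a 256-bit encryption key trickles out in under five minutes, and even 4096 bits of key material takes little more than an hour.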
What to do?
The obvious “cure” for this sort of trick is to ban mobile phones entirely from your secure areas, a precaution that you should expect in any serious airgapped network area.
In less-secure areas where airgaps are nevertheless used, but mobile phones are allowed (subject to specific verified settings) as an operational convenience, the invention of GAIROSCOPE changes the rules.
From now on, you will want to verify that users have turned off their “motion detection” system settings, in addition to blocking access to the microphone, Wi-Fi, Bluetooth and other features already well-known for the data leakage risks they bring.
Lastly, if you’re really worried, you could simply disconnect internal speakers in any computers on the secure side of the network…
…or use an active frequency filter, just like that unnamed laptop vendor did to block the rogue Rhythm Nation signals.
(Guri’s paper shows a simple analog electrical circuit to cut off audio frequencies above a chosen value.)
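If you’d like a feel for the principle in software terms, here’s a crude Web Audio analogue of our own (the paper’s circuit is analog hardware, and the 16,000 hertz cutoff below is our illustrative choice, not a figure from the paper): route every audio source through a low-pass filter so that nothing above the cutoff ever reaches the speakers.

    // Route audio through a low-pass filter so near-ultrasonic tones never reach the speakers.
    const ac = new AudioContext();
    const lowpass = ac.createBiquadFilter();
    lowpass.type = "lowpass";
    lowpass.frequency.value = 16000;     // pass ordinary audio, attenuate anything higher
    lowpass.connect(ac.destination);

    // Connect sources to the filter instead of directly to the speakers, for example:
    const player = document.querySelector("audio");
    if (player) {
      ac.createMediaElementSource(player).connect(lowpass);
    }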