As if we don’t have enough to worry about, now there’s spyware for your brain. Or, there could be. Researchers at Oxford, Geneva, and Berkeley have created a proof of concept for using commercially available brain-computer interfaces to discover private facts about today’s gamers.
The attack is deceptively simple and relies on technology that has been deployed commercially for years. In 2009, a company called NeuroSky partnered with Star Wars and toy distributor Uncle Milton to create The Force Trainer, a game where young Jedi move a sphere up a tower just by concentrating on it. The basic methodology is to use electroencephalography (EEG) to detect a brain wave pattern consistent with task-relevance. Users must train the device so that it can recognize when they are thinking about a given task. But then merely concentrating on that task—for instance, moving a ball—can signal a processor to manifest the activity digitally or, if coupled with a motor like the fan in The Force Trainer, in actuality.
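NeuroSky's actual signal processing is proprietary, but the general idea can be sketched in a few lines. The code below is purely illustrative: it assumes a simple ratio of beta-band (13–30 Hz) to alpha-band (8–13 Hz) power as the "concentration" signal, which is a crude stand-in for whatever the real headset computes, and a user-calibrated threshold that gates the fan.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    # Sum of spectral power within the [lo, hi) Hz band.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return float(psd[(freqs >= lo) & (freqs < hi)].sum())

def attention_score(signal, fs=256):
    # Crude proxy for concentration: beta activity tends to rise and
    # alpha activity to drop during focused attention, so the ratio
    # increases when the user is "thinking about the task."
    beta = band_power(signal, fs, 13, 30)
    alpha = band_power(signal, fs, 8, 13)
    return beta / (alpha + 1e-12)

def fan_speed(signal, fs=256, threshold=1.0, max_speed=100):
    # Spin the Force Trainer-style fan only while the score clears
    # the threshold set during the user's training phase.
    return max_speed if attention_score(signal, fs) > threshold else 0
```

On this toy model, a second of beta-dominated EEG would spin the fan and a second of alpha-dominated (relaxed) EEG would not; a real device would smooth the score over time and calibrate the threshold per user.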
Today a variety of games leverage the same mechanism to permit users to control some or all of a game with headgear that detects brain activity. There are even app stores where third-party developers can write games for NeuroSky and other EEG devices.
And that’s where the trouble comes in. In a recent paper entitled On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces, researchers demonstrate how malicious software might surreptitiously probe the user’s brain for information—bank affiliation, date of birth, even a PIN—that could be used for identity theft. By showing the user a series of pictures of bank logos, for instance, the researchers could begin to zero in on the name of the user’s own bank. The study is a proof of concept; the success rate was rather modest (10–42%). But the researchers speculated convincingly on how the technique might be refined.
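The paper’s actual attack classifies P300 event-related potentials, a positive voltage deflection roughly 300 ms after a stimulus the subject recognizes. The sketch below is a deliberately simplified toy version of the guess-by-stimulus idea, with simulated per-trial amplitudes standing in for real EEG epochs: flash each candidate bank logo repeatedly, average across trials to beat single-trial noise, and guess the candidate that evoked the largest mean response. The bank names, trial counts, and noise levels are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_epochs(is_target, n_trials=200):
    # Non-targets: noise around 0 uV. The recognized (target) stimulus
    # gets a small simulated P300-like bump on top of the same noise.
    bump = 2.0 if is_target else 0.0
    return rng.normal(loc=bump, scale=4.0, size=n_trials)

def rank_candidates(epochs_by_stimulus):
    # Average each stimulus's epochs, then rank by mean amplitude.
    # The top-ranked stimulus is the attacker's best guess.
    means = {s: float(np.mean(a)) for s, a in epochs_by_stimulus.items()}
    return sorted(means, key=means.get, reverse=True)

# Hypothetical candidate set; "Bank C" plays the user's real bank.
banks = ["Bank A", "Bank B", "Bank C", "Bank D"]
epochs = {b: simulate_epochs(is_target=(b == "Bank C")) for b in banks}
guess = rank_candidates(epochs)[0]
```

Even this toy version shows why the paper’s modest single-trial accuracy is not reassuring: averaging over repeated presentations steadily lifts the target out of the noise, which is exactly the kind of refinement the researchers anticipate.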
There are surely simpler ways to get at people’s personal information. But the prospect of brain spyware, like the looming specter of the domestic use of drones, could churn the collective imagination. Thus, we might expect a state attorney general or two to investigate should the practice surface even once in the wild. And I bet even the most ardent supporters of “open” platforms that admit of third-party innovation might agree that brain app stores should take on a heavily curatorial role. (I do.)
There are other issues. Gaming systems average one per household. The most popular games in America are massively multiplayer and hence online. If EEG represents the future of gaming, then, at any given time, millions of Americans could be hooked up to a NeuroSky or Emotiv Systems console while simultaneously connected to the Internet. In theory, this would allow for a kind of brain AMBER alert. A little girl goes missing and the government shows everyone a picture, noting any brainwaves that suggest recognition and pairing them with an IP address. (For more on the intersection of neuroscience and criminal investigation, see, e.g., Nita Farahany, Incriminating Thoughts. For more on the burgeoning field of “neurosecurity,” see Tamara Denning et al., Neurosecurity: Security and Privacy for Neural Devices.)
Whatever the applications, the mere possibility that any actor other than a trusted game developer or medical provider could leverage EEG for a secondary purpose suggests the law will have something to say about this technology as it matures. I presented and commented on this research at a Center for Law and the Biosciences journal club last week; the video should eventually find its way to the Stanford Law School YouTube channel. I’m also eager to hear your thoughts.