The U.S. military has repeatedly tested a brain implant that allows a human operator to control up to three flying drones at once, using only their thoughts.
Well, in theory. The tests, overseen by the Defense Advanced Research Projects Agency (DARPA), were computer simulations. And while they might eventually lead to actual mind control for flying robots, the technology is still in its infancy.
The mind-controlled drone trials took place in Pittsburgh between June 2016 and January 2017, according to DARPA. “Using a bidirectional neural interface, a volunteer named Nathan Copeland was able to simultaneously steer a simulated lead aircraft and maintain formation of two simulated unmanned support aircraft in a flight simulator,” Tim Kilbride, a DARPA spokesperson, told The Daily Beast.
Test subject Copeland, who is partially paralyzed, never actually steered a real drone using only his thoughts. Instead, he channeled his thoughts through a medical implant embedded in his skull that recorded his brain's electrical activity, a more invasive cousin of the electroencephalogram (EEG) readings doctors use to diagnose epilepsy. The implant interfaced with a computer simulation of a drone navigating an obstacle course in the company of two robotic wingmen.
“Nathan’s task was to exercise vertical and lateral control to fly the lead aircraft through a series of hoops positioned in the center of the screen, while also maintaining/correcting the lateral course of the two support aircraft through their own hoops positioned in shifting locations at the top of the screen,” Kilbride said via email.
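The control scheme Kilbride outlines (full vertical and lateral authority over the lead aircraft, but lateral-only corrections for the two support craft) might look something like the sketch below in simulation. To be clear, this is purely illustrative: every name in it is invented here, and DARPA has not published its software.

```python
# Purely illustrative sketch of the simulated control task described above.
# All names are invented; this is not DARPA's code.
from dataclasses import dataclass

@dataclass
class Aircraft:
    x: float = 0.0  # lateral position
    y: float = 0.0  # vertical position

def apply_commands(lead, supports, lead_cmd, support_cmds):
    """Apply one tick of decoded thought-commands to the formation."""
    dx, dy = lead_cmd
    lead.x += dx  # lead aircraft: lateral control
    lead.y += dy  # lead aircraft: vertical control
    for craft, lateral in zip(supports, support_cmds):
        craft.x += lateral  # support aircraft: lateral corrections only

lead = Aircraft()
supports = [Aircraft(x=-5.0), Aircraft(x=5.0)]
apply_commands(lead, supports, lead_cmd=(0.5, 1.0), support_cmds=[0.2, -0.2])
print(lead, supports)
```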
The technology is promising, and could one day lead to a direct interface between human operators and robots. That’s right—mind control for drones. But there are limits. Vaguely controlling one drone is possible today. Directly controlling several drones, and with greater fidelity and full two-way communication, is beyond the reach of current tech.
Thought-controlled drones have been in development for years. In February 2015, DARPA announced that a volunteer named Jan Scheuermann, a quadriplegic, had flown a simulated F-35 stealth fighter using only her thoughts.
A year later, in April 2016, 16 people at the University of Florida donned EEG headsets and used their brainwaves to steer drones along a 10-yard indoor course. Amber Hawthorne, a sophomore majoring in electrical engineering, took first place in that race.
DARPA’s Pittsburgh experiments took the technology a step further. Not only did Copeland send signals to the drones—the drones sent signals back. “The signals from those aircraft can be delivered directly back to the brain so that the brain of that user can also perceive the environment,” Justin Sanchez, the director of DARPA’s Biological Technologies Office, said at the agency’s 60th anniversary event in Maryland in early September.
Special code in Copeland’s brain implant translated specific thoughts (“turn left,” for example) into commands the drone could understand. “This training process assists the computer with detecting brain patterns that correspond to specific cognitive commands,” Chris Crawford, the lead developer for the University of Florida event, told local public radio. “In our system, we train [in] a neutral state—users are relaxing, calm, not blinking—and [in] a push state. Imagine pushing an object forward.”
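What Crawford describes is the standard training step for a brain-computer interface: record labeled windows of brain activity in a “neutral” state and a “push” state, reduce each window to features, and fit a classifier that can later recognize those patterns live. The sketch below shows the general shape of such a trainer; it stands in synthetic random data for real EEG recordings and a simple off-the-shelf classifier for whatever the DARPA and Florida teams actually used.

```python
# Hypothetical sketch of the neutral-vs-push training step Crawford describes.
# Real systems would use recorded EEG windows, not random stand-in data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def band_power_features(window):
    """Crude per-channel log signal power for one EEG window."""
    return np.log(np.mean(window ** 2, axis=1))

# Synthetic stand-ins: 40 windows per state, 8 channels x 128 samples each.
neutral = [rng.normal(0.0, 1.0, (8, 128)) for _ in range(40)]  # relaxed, calm
push = [rng.normal(0.0, 1.5, (8, 128)) for _ in range(40)]     # imagined push

X = np.array([band_power_features(w) for w in neutral + push])
y = np.array([0] * len(neutral) + [1] * len(push))  # 0 = neutral, 1 = push

clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Once trained, the same classifier runs on live windows of brain activity, and its output becomes the “push” command sent to the drone.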
The drone in turn can scan its environment, detect an obstacle to its right, and beam back to the operator its own recommendation. “I should go right,” for instance.
In a separate phase of the DARPA experiment, a test subject using an implant like Copeland’s received this kind of directional signal back from a single drone, Kilbride said. The implant translated the drone’s recommendation into “haptic perception”: a kind of impression corresponding to the physical space around the user.
In other words, the test subject felt what the drone saw at the same time that the drone responded to the subject’s commands. The haptic perception prompted the test subject in that separate experiment to steer the drone left or right to keep it on course. “It’s taken a number of years to try and figure this out,” Sanchez said.
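Taken together, the two directions form a closed loop: the drone senses its surroundings and sends back a recommendation, the implant renders that recommendation as a haptic cue, and the operator responds with a corrective command. Here is a minimal, entirely hypothetical sketch of one tick of that loop, with every name invented for illustration:

```python
# Entirely hypothetical sketch of one tick of the bidirectional loop described
# above; none of these names come from DARPA.
def haptic_cue(recommendation):
    """Render the drone's directional recommendation as a haptic impression."""
    cues = {"go left": "pulse on left side", "go right": "pulse on right side"}
    return cues.get(recommendation, "no cue")

def operator_response(cue):
    """The operator feels the cue and issues a corrective thought-command."""
    responses = {"pulse on left side": "steer left",
                 "pulse on right side": "steer right"}
    return responses.get(cue, "hold course")

recommendation = "go right"  # e.g., the drone has sensed an obstacle ahead
print(operator_response(haptic_cue(recommendation)))  # -> "steer right"
```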
But there are clear limits to the technology. With today's tech, it's only possible for a user to communicate with one drone at a time, Juan Gilbert, the University of Florida professor who oversaw the 2016 mind-controlled drone race, told The Daily Beast. “I am not aware of any new advances that would allow a single person to control multiple drones at the same time.”
And the communication, in both directions, is limited to vague directional commands. Go left. Go right. The technology isn’t nearly ready to, say, beam a drone’s video stream directly into a user’s brain. “High-resolution electro-neural interface with read and write capabilities in 3-D is a long ways away,” Daniel Palanker, an expert in prostheses at Stanford University, told The Daily Beast.
“I don’t know of any existing technology that can put images in your brain,” Gilbert said.
If and when the tech is ever ready for a high-fidelity, man-machine interface, it’ll probably come at a high cost to the human user. That kind of implant is “likely going to be invasive,” Palanker said.
But if it reliably works, the drone-brain interface could have profound implications. “This is the future,” Christopher Jacobson, who attended the University of Florida drone race, told public radio. “I believe we will be able to soon control many things with these kinds of devices.”