This Flying Drone’s New AI Could Lead to Self-Driving Cars

FLUID FLIGHT

It utilizes a liquid neural network to navigate completely new environments.

There’s probably no more exciting piece of aerial technology than drones. Not only can they fly into hazardous environments to do things like fight wildfires, but they’re also great platforms to test other emerging technologies like 3D infrastructure printing and even brain-computer interfaces.

Engineers have even developed drones that can fly and navigate environments entirely on their own, with no human operator required. While autonomous drones have been used for a variety of applications, like watering and seeding crops on farms and mapping land and waterways, they've been fairly limited so far by one big issue: they can't really handle new environments. That means a self-flying drone needs to train on a specific area and terrain before it can navigate that area on its own.

That’s why MIT researchers published a study in the journal Science Robotics on Wednesday showcasing a new type of autonomous drone that utilizes an advanced neural network to fly. The authors say that the new design allows the drone to make better decisions when flying through completely new environments—and could have future applications in self-driving cars, search and rescue operations, wildlife monitoring, or even diagnosing medical issues.

“Our experiments demonstrate that we can effectively teach a drone to locate an object in a forest during summer, and then deploy the model in winter, with vastly different surroundings, or even in urban settings with varied tasks such as seeking and following,” MIT professor Daniela Rus, who co-authored the study, said in a statement.

The study’s authors built the drone’s control system around a liquid neural network. Not only can these networks recognize familiar environments, but they also continuously adapt to changes and new elements they encounter. That allows the drone to make decisions as it traverses areas like forests and cities, and to adjust to new sensory inputs such as noises and obstacles.
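To give a rough sense of what makes these networks "liquid," here is a minimal sketch of a liquid time-constant (LTC) style update, the kind of cell these networks are built from. Everything here is illustrative: the weight names, layer sizes, Euler integration step, and random inputs are assumptions for the sketch, not the study's actual trained architecture or vision pipeline. The key idea is that each neuron's effective time constant depends on the current input, so the network's dynamics keep responding to new stimuli after training.

```python
import numpy as np

def ltc_step(x, u, W_in, W_rec, b, tau, A, dt=0.05):
    """One Euler-integration step of an LTC-style cell (illustrative).

    x:   hidden state vector
    u:   current sensory input vector
    tau: base time constants; A: per-neuron target states
    """
    # Input-dependent nonlinearity; this term modulates each neuron's
    # dynamics, which is what lets the cell adapt to what it senses.
    f = np.tanh(W_in @ u + W_rec @ x + b)
    # dx/dt = -(1/tau + f) * x + f * A
    dx = -(1.0 / tau + f) * x + f * A
    return x + dt * dx

# Toy setup: 4 input features, 8 hidden neurons (sizes are arbitrary).
rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8
W_in = rng.normal(size=(n_hidden, n_in))
W_rec = rng.normal(size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)
tau = np.ones(n_hidden)   # base time constants
A = np.ones(n_hidden)     # per-neuron bias states
x = np.zeros(n_hidden)    # hidden state

# Feed a short stream of stand-in sensor readings; the hidden state
# evolves continuously rather than being recomputed from scratch.
for t in range(100):
    u = rng.normal(size=n_in)
    x = ltc_step(x, u, W_in, W_rec, b, tau, A)

print(x.shape)  # (8,)
```

Because the decay term `1/tau + f` stays non-negative here, the state remains bounded even as the inputs change, which is one reason these continuous-time cells behave stably on unfamiliar data streams.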

This gives drone operators the flexibility to use their devices in practically any environment “without any additional training,” Rus explained.

The model was initially trained on flight data collected by a drone flown by a human pilot. In experiments, the authors deployed the system on a quadrotor drone and tested it by having it track moving targets and maneuver around obstacles.

This ability to make good decisions based on limited data is crucial. It could allow for future autonomous drones to be deployed more efficiently and safely. For example, a fire department could use the same drone to fight house fires as it does wildfires without having to train it in new environments each time. Businesses could also use these drones to deliver packages no matter where they are. Search and rescue teams could deploy drones to find lost hikers in large parks where the terrain can vary wildly.

When it comes to self-driving vehicles, the system could help autonomous cars adapt to streets they were never trained on. That would be a big step up from current semi-autonomous systems like Chevrolet’s Super Cruise, which requires cars to be on specific, pre-mapped roadways before it can take over driving.

However, MIT researcher and co-author of the study Ramin Hasani noted in a statement that there’s “still so much room left for future research” and testing before they can safely deploy the technology. So don’t expect self-driving drones to be zooming around your town any time soon.

The study is no doubt an exciting development in the world of neural networks and autonomous robotics, though. When you consider the wide range of applications for the liquid neural network system—self-driving cars, aerial package deliveries, firefighting drones—you’ll find that the sky is truly the limit.

Got a tip? Send it to The Daily Beast here.