Researcher Making Drones Smarter With Artificial Intelligence
Recent advances in artificial intelligence (AI) have opened the door to an unprecedented range of uses for unmanned aerial vehicles (UAVs). As part of an NSF-funded project, a researcher is making drones smarter, team-oriented, and situationally aware so that UAVs can track themselves and each other across a variety of situations. Groups of drones can now work together in networks for purposes such as traffic control, smart agriculture, surveillance and security systems, law enforcement, public safety, and much more.
Present drone systems, however, lack crucial capabilities, such as the ability to properly recognize and respond to environmental and behavioral factors, says Abolfazl Razi, an assistant professor in Northern Arizona University's School of Informatics, Computing, and Cyber Systems (SICCS).
That’s why Razi is working to make drones smarter and more autonomous. Through computer programming, he believes, drones can be developed to demonstrate situational awareness, identify malfunctioning, suspicious, or invading UAVs, and make adjustments on the fly. The director of NAU’s Wireless Networking and Smart Health (WiNeSH) Lab, Razi has received a $480,000 grant from the National Science Foundation for his project titled “Proactive Inverse Learning of Network Topology for Predictive Communication among Unmanned Vehicles.”
An important component of the project deals with adversity. “A drone that has joined a mission may show an anomaly and violate the set regulations for the mission. Instead of following the pre-planned motion trajectory, it may go dangerously close to other drones, for example,” he said.
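The anomaly Razi describes — a drone drifting from its pre-planned trajectory — can be illustrated with a minimal sketch. The function names and the 5-meter tolerance below are illustrative assumptions, not the project's actual algorithm: we simply flag a drone whose observed positions stray beyond a tolerance from its planned waypoints.

```python
import math

# Hypothetical illustration (not the project's published method): flag a
# drone whose observed 2D positions deviate too far from its planned path.

SAFE_DEVIATION_M = 5.0  # assumed tolerance in meters

def deviation(planned, observed):
    """Euclidean distance between a planned and an observed position."""
    return math.dist(planned, observed)

def is_anomalous(planned_path, observed_path, tolerance=SAFE_DEVIATION_M):
    """Return True if any observed point strays beyond the tolerance."""
    return any(deviation(p, o) > tolerance
               for p, o in zip(planned_path, observed_path))

planned = [(0, 0), (10, 0), (20, 0)]
on_course = [(0.5, 0.2), (10.1, -0.3), (19.8, 0.4)]
off_course = [(0.5, 0.2), (10.1, -0.3), (20.0, 9.0)]

print(is_anomalous(planned, on_course))   # False
print(is_anomalous(planned, off_course))  # True
```

A real system would compare full state (position, velocity, heading) against mission rules, but the threshold check captures the basic idea of detecting a drone that "goes dangerously close" to where it should not be.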
Razi says critical missions could involve forest fires, traffic accidents, search and rescue, or military operations. “If someone tries to penetrate your mission by sending in their own drones and making problems either on purpose or by accident, we want the UAVs to find the intrusion and cope with the situation.”
The study is intended to make drones more independent of human control and observation by having them communicate with other unmanned aircraft in their area, behave like teammates, and distinguish intruding or enemy drones.
The research will include a human-inspired method of proactive learning from limited experience, exposing the software to various conditions, and will involve reverse engineering of the UAVs’ decision support systems (DSS).
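A toy stand-in for learning another vehicle's decision rules from observation can make the idea concrete. The sketch below is a hypothetical illustration, not the project's method: it fits a first-order Markov model of a neighboring drone's heading changes from an observed sequence, then predicts the most likely next heading.

```python
from collections import Counter, defaultdict

# Hypothetical sketch (not the project's published algorithm): infer a
# neighboring drone's behavior from observed heading transitions, then
# predict its most likely next heading.

def fit_transitions(headings):
    """Count observed heading-to-heading transitions."""
    model = defaultdict(Counter)
    for prev, nxt in zip(headings, headings[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, current):
    """Most frequently observed successor of the current heading."""
    if current not in model:
        return current  # no data: assume the drone keeps its heading
    return model[current].most_common(1)[0][0]

observed = ["N", "N", "E", "N", "E", "E", "S"]
model = fit_transitions(observed)
print(predict_next(model, "N"))  # 'E' (seen twice after 'N', vs. 'N' once)
```

Predicting a neighbor's next action from its observed history is the essence of "predictive communication": if each UAV can anticipate its teammates' responses, the network can coordinate before messages arrive.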
“This approach serves AI-enabled networking by incorporating the predicted responses into system protocols,” Razi said. The three-year project is expected to benefit U.S. government agencies, organizations, and researchers, with the ultimate goal of developing better systems of multiple autonomous drones.