Microsoft research team creates unique drone training simulator

By Luke Geiver | May 10, 2017

Ashish Kapoor is working to create a better environment for unmanned aircraft systems with his team of Microsoft researchers. Through the team's Aerial Informatics and Robotics Platform, they are working to give drone developers, payload and sensor makers, and even hobby drone pilots the ability to fly, or crash, in a realistic yet fully simulated environment.

Kapoor and his team began working on the simulation system due to a need from the artificial intelligence community. Robots, drones and computer systems that function in the real world have done so with the help of machine learning, a process that combines sensory data with algorithms so a robot or drone can learn from, and act appropriately on, previously acquired data. “Data driven techniques such as machine learning or perception require a lot of real world training data,” Kapoor said. “These robots are going to have some kind of module that will rely on machine learning, however, the problem is that most of these are data intensive. Your flying robot needs to collect a lot of flying data,” he added. “In most cases, it won’t have enough data to start the process of safely flying autonomously.”

The Microsoft team has created an open source platform that allows collaborators to essentially acquire the necessary data needed for basic machine learning before a drone or robot ever enters the real world.

Through the software developed by the team, a user can get a realistic experience of flying a drone and can crash as many times as they want without hurting anyone or damaging any parts. Users who care about machine learning data can run endless simulations to compile usable data sets that inform future operations.

For 30 years, researchers have been working to create systems that allow sensors or robots to judge depth. Accomplishing this simple-sounding yet complex task is crucial for aerial and land robots to navigate autonomously in the real world. One of the most successful approaches, and one of the most unscalable and time-intensive, involves acquiring and cataloging basic sensory data and feeding it into a system. A drone can be carried around, for instance, to capture images, with the distance from the drone to each imaged object recorded alongside. The approach works, as do other radio- or frequency-based methods, but it is not as safe or scalable as using a simulated environment. With the Microsoft system, a drone developer can place a drone in the simulated world, let it interact, and leave with massive data sets of images and distance correlations. Those data sets help the robot or drone perform the reinforced calculations that enable it to act safely and appropriately in the future, based on data captured from its past.
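The image-and-distance data collection described above can be sketched in miniature. The snippet below is an illustrative assumption, not the platform's actual API: a mock simulator stands in for the simulated world, emitting pairs of a simple image cue (here, the apparent size of an object) and the ground-truth distance the simulator knows exactly. A model fit to thousands of such pairs can then estimate depth from the image cue alone.

```python
import random

# Hypothetical stand-in for a simulator flight: returns (image_feature,
# true_distance) pairs. In a real pipeline the feature would be a camera
# frame rendered by the simulated world, and the distance would come from
# the simulator's ground truth rather than from hand-carried measurements.
def simulated_flight_sample(rng):
    distance = rng.uniform(1.0, 50.0)                      # meters to object
    apparent_size = 100.0 / distance + rng.gauss(0, 0.05)  # toy image cue
    return apparent_size, distance

def collect_dataset(n, seed=0):
    rng = random.Random(seed)
    return [simulated_flight_sample(rng) for _ in range(n)]

# Fit distance ~ k / apparent_size by least squares on (1/feature, distance).
def fit_depth_model(data):
    xs = [1.0 / f for f, _ in data]
    ys = [d for _, d in data]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

data = collect_dataset(5000)          # thousands of simulated flights, no risk
k = fit_depth_model(data)

def predict(feature):
    return k / feature

# Evaluate on a held-out sample from a fresh simulated flight.
feature, true_d = simulated_flight_sample(random.Random(123))
print(k, predict(feature), true_d)
```

The point of the sketch is the workflow, not the toy model: the simulator supplies both the sensory input and the ground-truth label at scale, which is exactly what carrying a physical drone around cannot do safely.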

Enabling a drone to perform reinforcement learning based on its environment is beneficial but difficult to do. If a drone can perform this task, it will be able to fly autonomously much more safely and efficiently, and it won’t have to be shut down for software upgrades, Kapoor said. The simulated-world method speeds up that data library build-out.
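Reinforcement learning of the kind described above can be illustrated with a deliberately tiny example. The environment and training loop below are assumptions for illustration only (a one-dimensional corridor, not Microsoft's simulator): the drone advances one cell per step, chooses a low or high altitude, and crashes if it flies low over the obstacle cell. Because crashes happen in simulation, the agent can fail thousands of times at no cost while tabular Q-learning converges on a safe policy.

```python
import random

# Toy corridor (an illustrative assumption, not the platform's API):
# cells 0..4, goal at cell 5, an obstacle at cell 2 that a low pass hits.
OBSTACLE, GOAL = 2, 5

def step(pos, action):
    # action 0 = fly low (cheap), action 1 = fly high (costs extra energy)
    if action == 0 and pos == OBSTACLE:
        return None, -10.0           # crash: episode ends with a penalty
    new_pos = pos + 1
    reward = -1.0 if action == 1 else -0.1
    if new_pos == GOAL:
        return None, reward + 10.0   # reached the goal safely
    return new_pos, reward

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=1):
    rng = random.Random(seed)
    q = {(p, a): 0.0 for p in range(GOAL) for a in (0, 1)}
    for _ in range(episodes):
        pos = 0
        while pos is not None:
            # epsilon-greedy: mostly exploit, sometimes explore (and crash)
            if rng.random() < eps:
                a = rng.choice((0, 1))
            else:
                a = max((0, 1), key=lambda x: q[(pos, x)])
            nxt, r = step(pos, a)
            target = r if nxt is None else r + gamma * max(q[(nxt, 0)], q[(nxt, 1)])
            q[(pos, a)] += alpha * (target - q[(pos, a)])
            pos = nxt
    return q

q = train()
policy = [max((0, 1), key=lambda a: q[(p, a)]) for p in range(GOAL)]
print(policy)  # the trained agent flies high over the obstacle cell only
```

Every crash during training is a simulated one, which is the article's central argument: the expensive, dangerous trial-and-error phase of learning happens before the drone ever enters the real world.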

To enhance the simulation work, the Microsoft team has also released it as an open source framework. If others have ideas for enhancing it or improving the code that makes it all possible, Kapoor and his team are open to outside input.

To date, the team has created the simulated flight environment based on a quadrotor in combination with photorealistic rendering technologies that, according to Microsoft, can accurately capture subtle details like shadows and reflections. Other platforms can be run in the system by changing the code to match the specs of a particular platform. The team has also made the system compatible with robot operating systems and other frameworks.

A beta version is available on GitHub under an open source license.