In the summer of 2010, I interned at NASA Langley Research Center in the Langley Aerospace Research Summer Scholars Program.
Our team established an Autonomous Vehicle Lab for testing unmanned aerial vehicles, both indoors and outdoors.
Overview
I worked in the Laser Remote Sensing Branch of the Engineering Directorate under mentor Garry D. Qualls. There were nine interns besides me - here's the full list, alphabetically:
- Brianna Conrad, Massachusetts Institute of Technology
- Avik Dayal, University of Virginia
- Michael Donnelly, Christopher Newport University
- Jake Forsberg, Boise State University
- Amanda Huff, Western Kentucky University
- Jacqueline Kory, Vassar College
- Leonardo Le, University of Minnesota
- Duncan Miller, University of Michigan
- Stephen Pace, Virginia Tech
- Elizabeth Semelsberger, Christopher Newport University
Our project's abstract
Autonomous Vehicle Laboratory for "Sense and Avoid" Research
As autonomous, unmanned aerial vehicles begin to operate regularly in the National Airspace System, the ability to safely test the coordination and control of multiple vehicles will be increasingly important. This team has been working to establish an autonomous vehicle testing facility that will allow complex, multi-vehicle tests to be run both indoors and outdoors. Indoors, a commercial motion capture system is used to track vehicles in a 20'x20'x8' volume with sub-millimeter accuracy. This tracking information is transmitted to navigation controllers, a flight management system, and real-time visual displays. All data packets sent over the network are recorded, and the system can play back any test for further analysis. Outdoors, a differential GPS system replaces the functionality of the motion capture system, allowing the same tests to be conducted as indoors, but on a much larger scale.
Presently, two quadrotor helicopters and one wheeled ground vehicle operate routinely in the volume. The navigation controllers implement Proportional-Integral-Derivative (PID) control algorithms and collision avoidance capabilities for each vehicle. Virtual, moving points in the volume are generated by the flight management system for the vehicles to track and follow. This allows the creation of specific flight paths and the efficient evaluation of navigation control algorithms. Data from actual vehicles, virtual vehicles, and vehicles that are part of hardware-in-the-loop simulations are merged into a common simulation environment using FlightGear, an open source flight simulator. Evaluating the reactions of both air and ground vehicles in a simulated environment reduces time and cost, while allowing the user to log, replay, and explore critical events with greater precision. This testing facility will allow NASA researchers and aerospace contractors to address sense and avoid problems associated with autonomous multi-vehicle flight control in a safe and flexible manner.
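Since the abstract mentions PID control, here's a minimal, illustrative sketch of a single-axis PID position loop in Python. The gains, the 50 Hz update rate, and the toy vehicle model are made up for the example; they aren't the values from our actual flight code.

```python
class PID:
    """One-axis PID loop (illustrative sketch, not the lab's actual flight code)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """Return a control effort that pushes the measurement toward the setpoint."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


if __name__ == "__main__":
    # Toy demo: drive a simulated 1-D altitude toward a 1.5 m setpoint.
    pid = PID(kp=0.8, ki=0.05, kd=1.2, dt=0.02)   # 50 Hz loop; gains are illustrative
    altitude, velocity = 0.0, 0.0
    for _ in range(500):
        thrust = pid.update(setpoint=1.5, measurement=altitude)
        velocity += thrust * 0.02                  # crude double-integrator "vehicle"
        altitude += velocity * 0.02
    print(f"final altitude: {altitude:.2f} m")
```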
Articles and other media
In the media
On my blog
Videos
Most of the summer was spent developing all the pieces of software and hardware needed to get our autonomous vehicle facility up and running, but by the end, we were flying quadcopters! (Captions are below their corresponding videos.)
Credit for these videos goes to one of my labmates, Jake Forsberg.
Object tracking for human interaction with autonomous quadcopter: Here, the flying quadcopter is changing its yaw and altitude to match the other object in the flight volume (at first, another copter's protective foam frame; later, the entertaining hat we constructed). The cameras you see in the background track the little retro-reflective markers that we place on objects we want to track -- this kind of motion capture system is often used to acquire human movement for animation in movies and video games. In the camera software, groups of markers can be selected as representing an object, so that the object is recognized any time that specific arrangement of markers is seen. Our control software uses the position and orientation data from the camera software and sends commands to the copter via Wi-Fi.
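If you're curious how those pieces fit together in code, here's a rough, hypothetical sketch of that yaw/altitude-matching loop. The get_pose and send_command functions are stand-ins I made up for the motion capture stream and the Wi-Fi command link; our real interfaces looked different, and the gains are illustrative.

```python
import math
import time

def match_target(get_pose, send_command, rate_hz=30.0):
    """Match a tracked target's altitude and yaw (illustrative sketch only).

    get_pose(name) -> {"z": meters, "yaw": radians} and send_command(dict)
    are hypothetical stand-ins for the camera software and the Wi-Fi link.
    """
    dt = 1.0 / rate_hz
    while True:
        copter = get_pose("copter")
        target = get_pose("target")
        # Wrap the yaw error into [-pi, pi] so the copter turns the short way around.
        yaw_err = (target["yaw"] - copter["yaw"] + math.pi) % (2 * math.pi) - math.pi
        send_command({
            "climb_rate": 0.8 * (target["z"] - copter["z"]),  # proportional altitude match
            "yaw_rate": 1.2 * yaw_err,                        # proportional yaw match
        })
        time.sleep(dt)
```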
Autonomous sense and avoid with AR.Drone quadcopter: The flying copter is attempting to maintain a certain position in the flight volume. When another tracked object gets too close, the copter moves away to avoid it. We improved our algorithm between the first and second halves of this video. Presently, only objects tracked by the cameras are avoided, since we have yet to put local sensors on the copters (the obstacle avoidance is done using global information from the camera system about all the objects' locations).
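The avoidance idea can be fairly simple when you have global position data for everything in the volume. Here's an illustrative sketch (not our actual algorithm): if any tracked object comes within a keep-out radius, the copter's goal point is pushed directly away from it. The 1.0 m and 1.5 m distances are made-up values.

```python
import math

def avoid(setpoint, copter_pos, obstacles, keep_out=1.0, push=1.5):
    """Return an adjusted (x, y) setpoint that steers clear of nearby obstacles."""
    sx, sy = setpoint
    cx, cy = copter_pos
    for ox, oy in obstacles:
        dx, dy = cx - ox, cy - oy
        dist = math.hypot(dx, dy)
        if 0.0 < dist < keep_out:
            # Move the goal away from the intruder, along the line between them.
            sx = cx + push * dx / dist
            sy = cy + push * dy / dist
    return sx, sy

# Example: an obstacle 0.4 m away pushes the goal well behind the copter.
print(avoid(setpoint=(0.0, 0.0), copter_pos=(0.1, 0.0), obstacles=[(0.5, 0.0)]))
```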
Autonomous quadcopter tracking and following a ground vehicle: The flying copter is attempting to maintain a position above the truck. The truck was driven manually by one of my labmates, though eventually, it'll be autonomous, too.
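Conceptually, the "follow the truck" behavior just means the copter's position setpoint is the truck's tracked position plus a fixed height offset -- something like this made-up snippet (the 1.0 m hover height and the pose values are illustrative):

```python
def setpoint_above(truck_pose, hover_height=1.0):
    """Hover hover_height meters above the tracked ground vehicle (illustrative)."""
    return truck_pose["x"], truck_pose["y"], truck_pose["z"] + hover_height

# e.g. a hypothetical motion capture sample for the truck:
print(setpoint_above({"x": 2.0, "y": -1.5, "z": 0.2}))   # -> (2.0, -1.5, 1.2)
```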
Virtual flight boundaries with the AR.Drone and the Vicon motion capture system: As a safety precaution, we implemented virtual boundaries in our flight volume. Even if the copter is commanded to fly to a point beyond one of the virtual walls, it won't cross them.
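In code, a virtual boundary can be as simple as clamping every commanded setpoint to the flight volume before it reaches the navigation controller. This sketch assumes a volume measured in meters and centered on the origin; the actual dimensions and margins we used differed.

```python
# Illustrative bounds only, roughly echoing a 20' x 20' x 8' volume in meters.
BOUNDS = {"x": (-3.0, 3.0), "y": (-3.0, 3.0), "z": (0.0, 2.4)}

def clamp_to_volume(x, y, z, bounds=BOUNDS):
    """Return the nearest point inside the flight volume."""
    cx = min(max(x, bounds["x"][0]), bounds["x"][1])
    cy = min(max(y, bounds["y"][0]), bounds["y"][1])
    cz = min(max(z, bounds["z"][0]), bounds["z"][1])
    return cx, cy, cz

# A setpoint beyond the virtual walls gets held inside them.
print(clamp_to_volume(5.0, 0.0, 3.0))   # -> (3.0, 0.0, 2.4)
```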
Hardware-in-the-loop simulation: Some of my labmates built hardware-in-the-loop simulations with a truck and with a plane. Essentially, a simulated environment emulates sensor and state data for the real vehicle, which then responds as if it were in the simulated world.
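A stripped-down way to picture hardware-in-the-loop: the simulation integrates the world, the real controller hardware only ever sees simulated sensor values, and its commands feed straight back into the simulation. In this made-up sketch the "hardware" is just a Python function standing in for a vehicle's speed controller; the gain, time step, and target speed are arbitrary.

```python
def controller(sensed_speed, target_speed=5.0):
    """Stand-in for the real vehicle's controller hardware (illustrative)."""
    return 0.5 * (target_speed - sensed_speed)   # throttle command

speed, dt = 0.0, 0.1
for step in range(50):
    throttle = controller(sensed_speed=speed)    # hardware reacts to simulated sensors
    speed += throttle * dt                       # simulation integrates the command
print(f"simulated speed after 5 s: {speed:.2f} m/s")
```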