
As drones and their components get smaller, more efficient, and more capable, we’ve seen an increasing amount of research towards getting these things flying by themselves in semi-structured environments without relying on external localization. The University of Pennsylvania has done some amazing work in this area, as has DARPA’s Fast Lightweight Autonomy program.

At NASA’s Jet Propulsion Laboratory, they’ve been working on small drone autonomy for the past few years as part of a Google-funded project. The focus is on high-speed dynamic maneuvering, in the context of flying a drone as fast as possible around an indoor race course using only on-board hardware. For the project’s final demo, JPL raced their autonomous drones through an obstacle course against a professional human racing drone pilot.




The AI-powered drone is fully autonomous, meaning that there's no external localization or off-board computer control. A Qualcomm Snapdragon Flight board handles real-time flight control. The drone navigates using a 3D map of the course that it constructs itself with its two wide field-of-view cameras: one pointing forwards and the other pointing downwards, together giving a 250-degree-plus field of view with a persistent horizon. The two cameras generate a depth map from motion stereo, and in flight, the cameras plus an IMU localize against the map and perform visual-inertial odometry for motion tracking.
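The motion-stereo step rests on the standard pinhole-camera stereo relation, with the "baseline" supplied by the drone's own motion between frames rather than by a fixed two-camera rig. A minimal sketch of that relation follows; the focal length, baseline, and disparity values are illustrative assumptions, not JPL's actual parameters.

```python
# Depth from motion stereo: two views of the same scene, where the
# baseline B is the distance the camera travelled between frames
# (known from odometry) and d is the pixel disparity of a feature.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole-camera stereo relation: depth = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example (illustrative numbers): a feature shifts 5 px between two
# frames taken 0.10 m apart, seen through a 500 px focal length.
print(depth_from_disparity(500.0, 0.10, 5.0))  # ≈ 10 m
```

Note the trade-off this implies: distant features produce tiny disparities, so fast forward motion (a longer baseline between frames) actually helps depth accuracy, while the motion blur it causes hurts feature tracking.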


While the drones are capable of straight-line speeds of over 120 km/h, JPL's warehouse isn't quite large enough for them to go flat out, sadly. The constrained track proved especially tricky for the professional human drone racing pilot, Ken Loo, who was mentally fatigued by the density of the course. Once Loo learned the course, though, he could complete it in an average of just over 11 seconds, while the autonomous drone took an average of 3 seconds longer. The time difference mostly came down to aggression: while the autonomous drone was smoother and more consistent (flying nearly the same time every lap), Loo accelerated and decelerated more quickly, and was able to dynamically improvise maneuvers and shortcuts that the autonomous system couldn't.

The project’s manager at JPL is Rob Reid, who helped develop that nifty robotic space hedgehog back in 2015. We spoke with Reid to find out why the heck they let a human win this race, and how they’re going to stop that from ever happening again.

IEEE Spectrum: Can you describe the drone autonomy research that JPL has been involved in that led to this demonstration?

Rob Reid: JPL has been researching camera-based navigation techniques for spacecraft and micro aerial vehicles (drones) for decades. Since 2013, it has collaborated with Google on Project Tango, and over the last two years, it has integrated Tango into a drone to demonstrate novel navigation algorithms. The team has explored various trajectory optimization techniques that account for effects such as aerodynamics and camera motion blur.

JPL's AI-powered racing drone. Photo: NASA/JPL

Why was a drone race an ideal way for you to demonstrate progress in this area?

The goal was to demonstrate high-performance autonomous flight among obstacles—an indoor drone race provides a complex track full of obstacles, along with a compelling reason to fly fast through them!

Were you expecting that the human pilot would win?

I wasn’t surprised by the outcome; we were confident that our drone system was going to be competitive; however, we weren’t sure who was going to learn an optimal trajectory (i.e. racing line) the fastest! With only one afternoon of flying, Ken was able to shave seconds off his lap time much faster than our algorithms could. In the weeks since, we have sped up our optimization approach considerably.

What are the limitations of the hardware that the drones are using to navigate, and how did that affect their performance in the race?

The biggest performance limitation for fast indoor flight comes from the shutter speed of the onboard cameras that are used to track the drone’s motion—flying too fast while too close to the ground, or rolling or pitching too quickly can cause the image to blur and the drone to become lost. We addressed this in two ways: First, by using two wide field-of-view cameras—by pointing one forwards and the other downwards, the >250-degree field-of-view allows the drone to always see the horizon. Second, we adjusted trajectories to cap rotation rates and speed-to-height ratio.
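The speed-to-height cap Reid mentions follows from simple geometry: for a downward-facing camera, how fast a ground feature sweeps across the image depends on the ratio of speed to altitude. A back-of-the-envelope sketch, assuming a pinhole camera; the focal length, exposure time, and blur budget below are illustrative assumptions, not JPL's flight parameters.

```python
# A ground feature seen by a downward-facing camera at height h moves
# across the image at roughly f * v / h pixels per second. Keeping the
# blur under b_max pixels during an exposure of t_exp seconds caps the
# speed at v_max = b_max * h / (f * t_exp) -- a fixed speed-to-height
# ratio, which is why flying fast close to the ground loses tracking.

def max_speed(height_m: float, focal_px: float, exposure_s: float,
              max_blur_px: float = 1.0) -> float:
    """Blur-limited ground speed for a downward-facing camera."""
    return max_blur_px * height_m / (focal_px * exposure_s)

# Example: 500 px focal length, 2 ms exposure, 1 px blur budget.
print(max_speed(2.0, 500.0, 0.002))  # 2.0 m/s at 2 m altitude
print(max_speed(4.0, 500.0, 0.002))  # 4.0 m/s at 4 m altitude
```

The allowed speed scales linearly with altitude, which matches the speed-to-height cap in the trajectories; a shorter exposure raises the cap but costs light indoors.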

What will it take before drones like these are competitive with human expert pilots in structured environments?

For a typical drone race, the hardware is ready to beat human experts: Our drones are "race spec" and can pull a few g's. We couldn't yet fly a nighttime race, however, or a track with lots of visual repetition.

Are you continuing this project? If so, what can we look forward to?

The work is ongoing; unfortunately, I can't say much about what's next! But you can look forward to drones with the ability to sense obstacles and update their own trajectories online.

This area of robotics is progressing rapidly; technologies like event-based cameras could potentially mitigate the motion-blur problem and enable even more dynamic autonomous maneuvers. And Reid is definitely right that drone hardware is poised to surpass human performance, although that's the case with robotics in general: we're at the point where, with a few exceptions, robotics is much more of a software challenge than a hardware challenge. This doesn't mean that it's necessarily any easier to solve, though, and we're excited to see how JPL's drones evolve.



