
As drones and their components get smaller, more efficient, and more capable, we’ve seen an increasing amount of research towards getting these things flying by themselves in semi-structured environments without relying on external localization. The University of Pennsylvania has done some amazing work in this area, as has DARPA’s Fast Lightweight Autonomy program.

At NASA’s Jet Propulsion Laboratory, they’ve been working on small drone autonomy for the past few years as part of a Google-funded project. The focus is on high-speed dynamic maneuvering, in the context of flying a drone as fast as possible around an indoor race course using only on-board hardware. For the project’s final demo, JPL raced their autonomous drones through an obstacle course against a professional human racing drone pilot.

The AI-powered drone is fully autonomous, meaning that there's no external localization or off-board computer control. A Qualcomm Snapdragon Flight board is used for real-time flight control. The drone has a 3D map of the course that it constructs itself using its two wide field-of-view cameras: one pointing forwards and the other pointing downwards, resulting in a 250-degree-plus FOV with a persistent horizon. The two cameras generate a depth map from motion stereo, and in flight, the cameras plus an IMU localize to the map and perform visual-inertial odometry for motion tracking.
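The core idea of motion stereo is that the drone's own movement between two frames supplies the stereo baseline, with the IMU/odometry estimating how far it moved. A minimal sketch of the underlying pinhole relation (all numbers illustrative, not JPL's actual parameters):

```python
def depth_from_motion_stereo(focal_px: float, baseline_m: float,
                             disparity_px: float) -> float:
    """Depth of a tracked feature from two views separated by a known
    translation (the motion-stereo baseline, e.g. from IMU integration).

    Classic pinhole relation: Z = f * B / d, where f is the focal length
    in pixels, B the baseline in meters, and d the feature's disparity
    in pixels between the two frames.
    """
    if disparity_px <= 0.0:
        raise ValueError("feature must show positive disparity")
    return focal_px * baseline_m / disparity_px

# Example: 400 px focal length, drone moved 0.5 m between frames,
# feature shifted 20 px in the image -> point is 10 m away.
z = depth_from_motion_stereo(400.0, 0.5, 20.0)
```

Repeating this for many tracked features yields the depth map that the drone fuses into its 3D course map; the visual-inertial odometry then localizes against that map in flight.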

While the drones are capable of straight-line speeds of over 120 km/h, JPL’s warehouse isn’t quite large enough for them to go flat out, sadly. The constrained track proved especially tricky for the professional human drone racing pilot Ken Loo, who was mentally fatigued by the density of the track. Once Loo learned the course, though, he could complete it in an average of just over 11 seconds, while the autonomous drone took an average of 3 seconds longer. The time difference mostly came from aggression—while the autonomous drone was smoother and more consistent (flying nearly the same time every lap), Loo accelerated and decelerated more quickly, and was able to dynamically improvise maneuvers and shortcuts that the autonomous system couldn’t.

The project’s manager at JPL is Rob Reid, who helped develop that nifty robotic space hedgehog back in 2015. We spoke with Reid to find out why the heck they let a human win this race, and how they’re going to stop that from ever happening again.

IEEE Spectrum: Can you describe the drone autonomy research that JPL has been involved in that led to this demonstration?

Rob Reid: JPL has been researching camera-based navigation techniques for spacecraft and micro aerial vehicles (drones) for decades. Since 2013, it has collaborated with Google on Project Tango, and over the last two years, it has integrated Tango into a drone to demonstrate novel navigation algorithms. The team has explored various trajectory optimization techniques that account for effects such as aerodynamics and camera motion blur.

JPL's AI-powered racing drone. Photo: NASA JPL

Why was a drone race an ideal way for you to demonstrate progress in this area?

The goal was to demonstrate high-performance autonomous flight among obstacles—an indoor drone race provides a complex track full of obstacles, along with a compelling reason to fly fast through them!

Were you expecting that the human pilot would win?

I wasn’t surprised by the outcome; we were confident that our drone system was going to be competitive; however, we weren’t sure who was going to learn an optimal trajectory (i.e. racing line) the fastest! With only one afternoon of flying, Ken was able to shave seconds off his lap time much faster than our algorithms could. In the weeks since, we have sped up our optimization approach considerably.

What are the limitations of the hardware that the drones are using to navigate, and how did that affect their performance in the race?

The biggest performance limitation for fast indoor flight comes from the shutter speed of the onboard cameras that are used to track the drone’s motion—flying too fast while too close to the ground, or rolling or pitching too quickly can cause the image to blur and the drone to become lost. We addressed this in two ways: First, by using two wide field-of-view cameras—by pointing one forwards and the other downwards, the >250-degree field-of-view allows the drone to always see the horizon. Second, we adjusted trajectories to cap rotation rates and speed-to-height ratio.

What will it take before drones like these are competitive with human expert pilots in structured environments?

For a typical drone race, the hardware is ready to beat human experts: Our drones are “race spec” and can pull a few g’s. We couldn’t fly a nighttime race, though, or one on a track with lots of visual repetition.

Are you continuing this project? If so, what can we look forward to?

The work is ongoing; unfortunately, I can’t say much about what’s next! But you can look forward to drones with the ability to sense obstacles and update their own trajectories online.


This area of robotics is progressing rapidly; things like event-based cameras could potentially solve the issue of motion blur to some extent and enable even more dynamic autonomous maneuvers. And Reid is definitely right that drone hardware is poised to surpass human performance, although that’s the case with robotics in general—we’re at the point where, with a few exceptions, robotics is much more of a software challenge than a hardware challenge. This doesn’t mean that it’s necessarily any easier to solve, though, and we’re excited to see how JPL’s drones evolve.


link: https://spectrum.ieee.org/automaton/robotics/drones/jpl-ai-powered-racing-drone?utm_source=roboticsnews&utm_campaign=roboticsnews-12-19-17&utm_medium=email

Posted by GreatMind, 2017.12.20