I am happy to announce today that my team achieved the first ever closed-loop autonomous flight using an event camera for state estimation! Watch the video here!
This achievement is the product of several years of research, and I am very proud of the result. Thanks to the event camera, our quadrotor can “see” at high speed, even in dark environments.
The algorithm running onboard the quadrotor is largely based on my recent paper, Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization, which we extended to use standard frames as an additional sensing modality in a follow-up paper: Hybrid, Frame and Event based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors. The resulting pipeline performs on par with state-of-the-art visual-inertial odometry pipelines in normal conditions (good lighting, moderate-speed motions) by exploiting the standard images, and can seamlessly use the event camera in more challenging situations (dark / HDR scenes, high-speed motions).
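To give a flavor of the hybrid idea, here is a minimal, purely illustrative sketch of how a front-end might decide which sensing modality to feed the keyframe-based optimizer: frames when lighting and motion are benign, events otherwise. The function name and thresholds are my own hypothetical assumptions, not the actual implementation from the papers.

```python
# Hypothetical sketch of the hybrid modality-selection idea:
# rely on standard frames in easy conditions, and fall back to the
# event camera when the scene is dark/HDR or the motion is fast.
# Thresholds are illustrative placeholders, not values from the papers.

def select_modalities(mean_brightness, angular_rate_rad_s,
                      dark_thresh=0.15, rate_thresh=3.0):
    """Return which feature tracks to pass to the optimization back-end."""
    # Frames are useful only when the image is well exposed and motion
    # blur is limited (i.e., rotation rate is below a threshold).
    use_frames = (mean_brightness > dark_thresh
                  and angular_rate_rad_s < rate_thresh)
    # Events remain available in all conditions.
    use_events = True
    return {"frames": use_frames, "events": use_events}
```

For example, a bright scene with slow motion would use both modalities, while a dark scene would drop the frame tracks and rely on events alone.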
Expect to see more soon!