Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios
In this paper, we present the first state estimation pipeline that leverages the complementary advantages of a standard camera and an event camera by fusing, in a tightly-coupled manner, events, standard frames, and inertial measurements. Furthermore, we use our pipeline to demonstrate the first autonomous quadrotor flight using an event camera for state estimation, unlocking flight scenarios that were not reachable with traditional visual-inertial odometry, such as low-light environments and high-dynamic-range scenes.

Published in IEEE Robotics and Automation Letters (RA-L)