My paper EMVS: Event-Based Multi-View Stereo - 3D Reconstruction with an Event Camera in Real-Time about semi-dense 3D reconstruction with an event camera has been accepted to the International Journal of Computer Vision!
This work is the first to show that event cameras can be used to provide accurate, semi-dense 3D maps of a given environment, without explicitly trying to solve data association. You can watch the video here!
I am happy to announce today that my team achieved the first ever closed-loop autonomous flight using an event camera for state estimation! Watch the video here!
This achievement is the product of several years of research, and I am very proud of the result. Thanks to the event camera, our quadrotor can “see” at high speed, even in dark environments.
The algorithm running onboard the quadrotor is largely based on my recent paper: Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization, which we extended to use standard frames as an additional sensing modality in a following paper: Hybrid, Frame and Event based Visual Inertial Odometry for Robust, Autonomous Navigation of Quadrotors.
Our paper Real-time Visual-Inertial Odometry for Event Cameras using Keyframe-based Nonlinear Optimization about visual-inertial odometry using an event camera has been accepted at BMVC’17 for oral presentation (acceptance rate: 5.6%)!
You can watch the video here!
Our paper EVO: A Geometric Approach to Event-based 6-DOF Parallel Tracking and Mapping in Real-time has been accepted for publication in the Robotics and Automation Letters (RA-L), and for presentation at ICRA’17!
Our paper EMVS: Event-based Multi-View Stereo received the BMVC’16 Best Industry Paper Award!