R2LIVE

A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping

Our preprint paper: we have corrected some typos and errors in the previous version of our paper; the amended paper can be accessed here. While amending the paper, I would like to thank narutojxl (焦小亮), who found my errors and provided his corrections.

Our related video: our related video is now available on YouTube (click the image below to open):

video

R2LIVE is a robust, real-time, tightly-coupled multi-sensor fusion framework that fuses measurements from LiDAR, inertial, and visual camera sensors to achieve robust and accurate state estimation. By taking advantage of measurements from all individual sensors, our algorithm is robust to various visual-failure and LiDAR-degenerated scenarios, and it runs in real time on an on-board computation platform, as shown by extensive experiments conducted in indoor, outdoor, and mixed environments of different scales.

The reconstructed 3D map of the HKU main building is shown in (d), and detailed point clouds with the corresponding panorama images are shown in (a) and (b). (c) shows that our algorithm closes the loop by itself (returning to the starting point) without any additional processing (e.g. loop closure). In (e), we merge our map with a satellite image to further examine the accuracy of our system.

1. Prerequisites

1.1 Ubuntu and ROS

Ubuntu 64-bit 16.04 or 18.04. ROS Kinetic or Melodic. Follow ROS Installation and install its additional ROS packages:

    sudo apt-get install ros-XXX-cv-bridge ros-XXX-tf ros-XXX-message-filters ros-XXX-image-transport

NOTICE: remember to replace "XXX" in the above command with your ROS distribution; for example, if you use ROS Kinetic, the command should be:

    sudo apt-get install ros-kinetic-cv-bridge ros-kinetic-tf ros-kinetic-message-filters ros-kinetic-image-transport
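
If you are not sure which ROS distribution is installed, the environment itself can tell you. A quick check, assuming the standard ROS setup script has already been sourced:

    echo $ROS_DISTRO      # prints e.g. "kinetic" or "melodic"
    rosversion -d         # the same information via the rosversion tool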

1.2. Ceres Solver

Follow Ceres Installation.

1.3. livox_ros_driver

Follow livox_ros_driver Installation.
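
If you prefer building the driver manually, the sketch below follows the usual pattern of cloning it into its own catkin workspace; the workspace name ws_livox and the upstream URL are assumptions taken from the driver's documentation, so defer to the official instructions if they differ:

    # the Livox SDK must be installed first (see the driver's README)
    mkdir -p ~/ws_livox/src && cd ~/ws_livox/src
    git clone https://github.com/Livox-SDK/livox_ros_driver.git
    cd ..
    catkin_make
    source devel/setup.bash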

2. Build

Clone the repository and catkin_make:

    cd ~/catkin_ws/src
    git clone https://github.com/hku-mars/r2live.git
    cd ../
    catkin_make
    source ~/catkin_ws/devel/setup.bash

3. Run our examples

Download our recorded rosbag and then run:

    roslaunch r2live demo.launch
    rosbag play YOUR_DOWNLOADED.bag
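
If nothing shows up in RViz, it can help to confirm that the bag is actually publishing data while it plays. A small diagnostic sketch; the /livox/* topic names are the ones used by the Livox driver and in the discussion below, and may differ for your own recordings:

    rostopic list                # the LiDAR, image and IMU topics should appear
    rostopic hz /livox/lidar     # LiDAR frames should arrive at a steady rate
    rostopic hz /livox/imu       # the IMU should stream much faster than the LiDAR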

4. Acknowledgments

Our repository contains two main subsystems: the LiDAR-inertial and visual-inertial subsystems are developed based on FAST-LIO and VINS-Mono, respectively. In addition, our implementation uses code from ikd-Tree, BALM, and loam-livox.

5. License

The source code is released under the GPLv2 license.

We are still working on improving the performance and reliability of our code. For any technical issues, please contact me via email: Jiarong Lin < [email protected] >.

For commercial use, please contact Dr. Fu Zhang < [email protected] >.

Owner
HKU-Mars-Lab
Mechatronics and Robotic Systems (MaRS) Laboratory
Comments
  • Launch error

    Hello,

    I am able to build the project with the OpenCV 4 changes, and the build is successful. But when I run

    roslaunch r2live demo.launch

    It is crashing and giving the following error:

    Multi thread started /home/adit/catkin_ws/src/r2live/r2live/src/./fast_lio/fast_lio.hpp 791
    [Ros_parameter]: fast_lio/dense_map_enable ==> 1
    [Ros_parameter]: fast_lio/lidar_time_delay ==> 0
    [Ros_parameter]: fast_lio/max_iteration ==> 4
    [Ros_parameter]: fast_lio/fov_degree ==> 360
    [Ros_parameter]: fast_lio/filter_size_corner ==> 0.4
    [Ros_parameter]: fast_lio/filter_size_surf ==> 0.4
    [Ros_parameter]: fast_lio/filter_size_surf_z ==> 0.4
    [Ros_parameter]: fast_lio/filter_size_map ==> 0.4
    [Ros_parameter]: fast_lio/cube_side_length ==> 1e+09
    [Ros_parameter]: fast_lio/maximum_pt_kdtree_dis ==> 0.5
    [Ros_parameter]: fast_lio/maximum_res_dis ==> 0.3
    [Ros_parameter]: fast_lio/planar_check_dis ==> 0.1
    [Ros_parameter]: fast_lio/long_rang_pt_dis ==> 50
    [Ros_parameter]: fast_lio/publish_feature_map ==> 0
    /home/adit/catkin_ws/src/r2live/r2live/src/./fast_lio/fast_lio.hpp 816
    /home/adit/catkin_ws/src/r2live/r2live/src/./fast_lio/fast_lio.hpp 829
    /home/adit/catkin_ws/src/r2live/r2live/src/./fast_lio/fast_lio.hpp 831
    [ WARN] [1625127750.923615584]: waiting for image and imu...

    ~~~~/home/adit/catkin_ws/src/r2live/r2live/ doesn't exist

    [r2live-4] process has died [pid 33098, exit code -11, cmd /home/adit/catkin_ws/devel/lib/r2live/r2live __name:=r2live __log:=/home/adit/.ros/log/7a2ab4f8-da45-11eb-a931-3df647905bdb/r2live-4.log]. log file: /home/adit/.ros/log/7a2ab4f8-da45-11eb-a931-3df647905bdb/r2live-4*.log
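
    Exit code -11 means the r2live node died with a segmentation fault rather than a launch-file problem. One way to get a backtrace is to leave the rest of the launch file running and start the crashing binary by hand under gdb; the sketch below reuses the binary path from the log above, and the remapped node name is just an example:

        gdb --args /home/adit/catkin_ws/devel/lib/r2live/r2live __name:=r2live
        (gdb) run      # reproduce the crash
        (gdb) bt       # print the backtrace once it dies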

  • Hardware schema

    Does anyone have a schematic of how to connect all the hardware together? In particular, I am not sure how to connect the camera and synchronize it with the other sensors ...

  • issue

    ERROR: cannot launch node of type [feature_tracker/feature_tracker]: Cannot locate node of type [feature_tracker] in package [feature_tracker]. Make sure file exists in package path and permission is set to executable (chmod +x)
    ERROR: cannot launch node of type [r2live/lio_feat_extract]: Cannot locate node of type [lio_feat_extract] in package [r2live]. Make sure file exists in package path and permission is set to executable (chmod +x)
    ERROR: cannot launch node of type [r2live/r2live]: Cannot locate node of type [r2live] in package [r2live]. Make sure file exists in package path and permission is set to executable (chmod +x)

    How do I resolve this? thanks
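
    These messages usually mean the executables were never built or the freshly built workspace is not on the ROS package path. A checklist sketch, assuming the ~/catkin_ws workspace from the build instructions above:

        cd ~/catkin_ws
        catkin_make                      # the build must finish without errors
        source devel/setup.bash          # make the newly built packages visible
        rospack find feature_tracker     # should print a path inside this workspace
        ls devel/lib/r2live/             # r2live and lio_feat_extract should be listed here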

  • Requirements (Software + Hardware).

    Hey guys! Great work.

    Could you please indicate your requirements (at least which LiDAR, Ubuntu, and ROS versions)? It would be helpful for preparing the hardware in the meantime.

    Thanks!

  • Experiencing large drift while mapping with livox avia

    Hello,

    Thanks for sharing such amazing work. I am experiencing large drift when running demo.launch on a live stream. I have changed the calibration params in the config file.

    For livox, I am using the following driver (https://github.com/ziv-lin/livox_ros_driver_for_R2LIVE)

    The following is the set of errors I am getting:

    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 400, dt = 0.105697
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.55778 -9.90081 -10.1357
    state_inout |[673.69852] | (-12.992, -47.860, 54.054) | (18.039, -1.217, -9.484) | (5.558, -9.901, 0.000) | (-0.150, -1.614, 0.984) | (-1.413, 3.676, -6.752)
    state_in |[673.69852] | (-12.992, -47.860, 54.054) | (18.039, -1.217, -9.484) | (5.558, -9.901, -10.136) | (-0.150, -1.614, 0.984) | (-1.413, 3.676, -6.752)
    /opt/catkin_ws/src/r2live/src/fast_lio/IMU_Processing.cpp, 135, check_state fail !!!! 5.31588 -10.1107 -0.17032

    Every time after the following error, I get a large drift:

    [ INFO] [1632737692.276809736]: big translation
    [ WARN] [1632737692.277026159]: failure detection!
    [ WARN] [1632737692.281113827]: system reboot!
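
    On a live stream, drift of this kind is often accompanied by gaps or drops in the sensor streams (the dt = 0.105697 in the log above is a much larger step than a high-rate IMU normally produces). A quick sanity check on the input rates while the system is running; the /livox/* topic names are assumptions based on the default driver:

        rostopic hz /livox/imu       # should report a steady, high rate with little jitter
        rostopic hz /livox/lidar     # frames should also arrive without long gaps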

  • R2Live + external IMU

    Good day! I have a running environment with FAST-LIO2 using an external IMU, thanks for that! Would it be possible in the future to use an external IMU with r2live as well? It would be great for the MID40 and MID70.

    Thanks.

  • What is the unit of acceleration in the IMU topic?

    What is the unit of linear acceleration in the IMU topic?

    In livox_ros_driver, the unit of acceleration in the IMU topic is [g] by default (https://github.com/Livox-SDK/livox_ros_driver/issues/63). But in your sample bag file (harbor.bag), the unit of acceleration in your IMU topic (/livox/imu) looks like [m/s^2] to me.

    Best regards,
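
    One way to check which convention a particular bag uses is to look at a raw sample while the sensor is stationary: gravity appears as roughly 1.0 on one axis if the data is in [g], or roughly 9.8 if it is already in [m/s^2]. A quick inspection sketch using the /livox/imu topic mentioned above:

        rostopic echo -n 1 /livox/imu/linear_acceleration
        # values near 1.0  -> the driver is publishing in [g]
        # values near 9.8  -> the data is already in [m/s^2]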

  • lio_feat_extract died when using Livox Mid-70

    Dear developers, thanks for your amazing work on this multi sensor algorithm.

    I'm currently trying to run R2LIVE on my devices: a Livox Mid-70 and a Zed2i (camera and built-in IMU). I've customized filr_cam.yaml to adapt to my devices and renamed the topics to match the ones provided in your rosbag. When playing my own recorded rosbag, the camera and IMU data connect successfully (screenshot from 2021-12-17 12-43-42).

    But my LiDAR data cannot be linked (screenshot from 2021-12-17 12-44-01).

    It shows this error in the terminal (screenshot from 2021-12-16 19-47-17).

    I checked the log file, but it is blank. So far I have no idea how to connect my LiDAR (Mid-70) data. Could you help me with this problem? Thanks in advance!
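
    When the camera and IMU connect but the LiDAR does not, a common cause is a mismatch between the point-cloud topic name or message type in the bag and the one the launch file subscribes to. A few checks that may narrow it down; the topic name /livox/lidar is only an example of what the launch file may expect:

        rosbag info YOUR_RECORDED.bag    # list the topics and message types in the bag
        rostopic info /livox/lidar       # while the bag plays: confirm the type and the subscribers
        rostopic hz /livox/lidar         # confirm LiDAR frames are actually being published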

  • LiDAR incoming frame too old, need to be drop!!!

    Hi. Your code currently uses ros::Time ct(ros::Time::now()); for the MID40 on line 163 of feature_extract.cpp.

    This makes it impossible to re-run pre-recorded bags, as the following errors show up for every LiDAR message:

    LiDAR incoming frame too old, need to be drop!!!
    Failed to find match for field 'normal_x'.
    Failed to find match for field 'normal_y'.
    Failed to find match for field 'normal_z'.
    Failed to find match for field 'curvature'.

    Is there any possibility, or could you help me, to implement a mid_handler that can receive CustomMsg so that the timestamps are synchronized with the camera and IMU?

    Thanks!
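
    As a workaround that avoids changing the code, ROS can be told to take its clock from the bag so that ros::Time::now() follows the recorded timestamps instead of wall time. This is a standard ROS mechanism rather than anything specific to this repository, and it may not address the 'normal_x'/'curvature' field warnings, which point to a point-cloud type mismatch:

        # terminal 1
        roscore
        # terminal 2: make all nodes use simulated time
        rosparam set use_sim_time true
        roslaunch r2live demo.launch
        # terminal 3: --clock publishes the recorded time on /clock
        rosbag play --clock YOUR_PRERECORDED.bag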

  • Is it compatible with opencv 4?

    Thank you for sharing your amazing work. I am trying to catkin_make r2live, but it is failing, and I guess the error is related to the OpenCV version. Currently I am using OpenCV 4.4.

  • Great! But how can I upgrade my Mid-40 😢? I have an external IMU.

    It's really great, but can you please tell me how to integrate the Mid-40 with an external IMU? I did this according to your algorithm a month ago, based on the original LOAM version (although your algorithm has been upgraded many times since), and I want to upgrade now. It is too extravagant for me to buy another LiDAR with a built-in IMU; I have a nine-axis IMU. How can I upgrade? Thanks, from the poor one in the northeast of the mainland. Thank you again!

This robot localisation package for lidar-map based localisation using multi-sensor state estimation.

A ROS-based NDT localizer with multi-sensor state estimation This repo is a ROS based multi-sensor robot localisation. An NDT localizer is loosely-cou

Dec 15, 2022
RP-VIO: Robust Plane-based Visual-Inertial Odometry for Dynamic Environments (Code & Dataset)

RP-VIO: Robust Plane-based Visual-Inertial Odometry for Dynamic Environments RP-VIO is a monocular visual-inertial odometry (VIO) system that uses onl

Jan 6, 2023
SSL_SLAM2: Lightweight 3-D Localization and Mapping for Solid-State LiDAR (mapping and localization separated) ICRA 2021

SSL_SLAM2 Lightweight 3-D Localization and Mapping for Solid-State LiDAR (Intel Realsense L515 as an example) This repo is an extension work of SSL_SL

Dec 27, 2022
A Modular Framework for LiDAR-based Lifelong Mapping

LT-mapper News July 2021 A preprint manuscript is available (download the preprint). LT-SLAM module is released.

Dec 30, 2022
Yggdrasil Decision Forests (YDF) is a collection of state-of-the-art algorithms for the training, serving and interpretation of Decision Forest models.

Yggdrasil Decision Forests (YDF) is a collection of state-of-the-art algorithms for the training, serving and interpretation of Decision Forest models. The library is developed in C++ and available in C++, CLI (command-line-interface, i.e. shell commands) and in TensorFlow under the name TensorFlow Decision Forests (TF-DF).

Jan 9, 2023
A toolkit for making real world machine learning and data analysis applications in C++

dlib C++ library Dlib is a modern C++ toolkit containing machine learning algorithms and tools for creating complex software in C++ to solve real worl

Dec 31, 2022
A RGB-D SLAM system for structural scenes, which makes use of point-line-plane features and the Manhattan World assumption.

This repo proposes a RGB-D SLAM system specifically designed for structured environments and aimed at improved tracking and mapping accuracy by relying on geometric features that are extracted from the surrounding.

Jan 2, 2023
Time series anomaly detection for Ruby

AnomalyDetection.rb Time series AnomalyDetection for Ruby Learn how it works Installation Add this line to your application’s Gemfile: gem 'anomaly

Nov 16, 2022
Caffe: a fast open framework for deep learning.

Caffe Caffe is a deep learning framework made with expression, speed, and modularity in mind. It is developed by Berkeley AI Research (BAIR)/The Berke

Jan 1, 2023
[RSS 2021] An End-to-End Differentiable Framework for Contact-Aware Robot Design

DiffHand This repository contains the implementation for the paper An End-to-End Differentiable Framework for Contact-Aware Robot Design (RSS 2021). I

Jan 4, 2023
Ingescape - Model-based framework for broker-free distributed software environments

Ingescape - Model-based framework for broker-free distributed software environments Overview Scope and Goals Ownership and License Dependencies with o

Jan 5, 2023
Machine Learning Framework for Operating Systems - Brings ML to Linux kernel

Machine Learning Framework for Operating Systems - Brings ML to Linux kernel

Nov 24, 2022
R3live - A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

R3LIVE A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package News [Dec 31, 2021] Release of cod

Jan 4, 2023
RRxIO - Robust Radar Visual/Thermal Inertial Odometry: Robust and accurate state estimation even in challenging visual conditions.

RRxIO - Robust Radar Visual/Thermal Inertial Odometry RRxIO offers robust and accurate state estimation even in challenging visual conditions. RRxIO c

Dec 20, 2022
Tightly coupled GNSS-Visual-Inertial system for locally smooth and globally consistent state estimation in complex environment.

GVINS GVINS: Tightly Coupled GNSS-Visual-Inertial Fusion for Smooth and Consistent State Estimation. paper link Authors: Shaozu CAO, Xiuyuan LU and Sh

Dec 30, 2022
VID-Fusion: Robust Visual-Inertial-Dynamics Odometry for Accurate External Force Estimation

VID-Fusion VID-Fusion: Robust Visual-Inertial-Dynamics Odometry for Accurate External Force Estimation Authors: Ziming Ding , Tiankai Yang, Kunyi Zhan

Nov 18, 2022
A real-time, direct and tightly-coupled LiDAR-Inertial SLAM for high velocities with spinning LiDARs

LIMO-Velo [Alpha] [16 February 2022] The project is on alpha stage, so be sure to open Issues and Discussions and give all the feedback you can!

Dec 28, 2022
LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping

LVI-SAM This repository contains code for a lidar-visual-inertial odometry and mapping system, which combines the advantages of LIO-SAM and Vins-Mono

Jan 8, 2023
Fuses IMU readings with a complementary filter to achieve accurate pitch and roll readings.

SimpleFusion A library that fuses accelerometer and gyroscope readings quickly and easily with a complementary filter. Overview This library combines

Aug 22, 2022