How to simulate a Livox MID-360 using Gazebo

With Gazebo Classic having reached End-of-Life in January 2025, many users are forced to upgrade to modern Gazebo (formerly known as Ignition). Unfortunately, in the case of Livox Lidars, this also means that the vendor-provided Lidar simulation plugin Livox-SDK/livox_laser_simulation can no longer be used, as it only supports Gazebo Classic. In this article I’ll take a look at the necessary steps to simulate a Livox MID-360 Lidar scanner using modern Gazebo’s gpu_lidar plugin.

Note: The content of this article applies to Gazebo Harmonic 8.6.0. Same or similar principles apply to other versions of Gazebo, though the specific implementation details may differ.

Update: Since the Livox MID-360 scanner contains an Inertial Measurement Unit (IMU), this article was expanded to include information on how to obtain simulated IMU data.

To ensure that Gazebo’s sensor plugins are loaded on Gazebo startup, the following plugin declarations need to be added under the <world> tag in the SDF file describing your simulation environment.

<sdf version="1.8">
  <world name="flat">
    <!-- Gazebo plugin for simulating Lidar sensor. -->
    <plugin
        filename="gz-sim-sensors-system"
        name="gz::sim::systems::Sensors">
        <render_engine>ogre2</render_engine>
    </plugin>
    <!-- Gazebo plugin for simulating IMU sensor. -->
    <plugin
        filename="gz-sim-imu-system"
        name="gz::sim::systems::Imu">
    </plugin>
    <!-- ... -->
  </world>
</sdf>

Lidar

If no specific frame describing the spatial relationship between Lidar and the rest of the robot exists, necessary descriptions need to be added to the robot’s URDF (or SDF) model.

<robot>
  <!-- ... -->
  <link name="body_link"/>
  <!-- ... -->
  <link name="laser_link"/>
  <joint name="body_laser_joint" type="fixed">
    <parent link="body_link"/>
    <child link="laser_link"/>
    <origin xyz="0.10 0 0.05" rpy="0 0 0"/>
  </joint>
  <!-- ... -->
</robot>

URDF (and SDF) allow the attachment of mesh files to robot elements, which enables Gazebo (and also Rviz2) to show an actual 3D model of the Lidar instead of a simple coordinate-frame visual. The MID-360 mesh file (mid-360.stl) can be downloaded from the vendor’s product page and needs to be stored in the ROS robot description package (e.g. l4xz_description/meshes).

It is imperative that those mesh files are installed during the package install process:

install(DIRECTORY meshes DESTINATION share/${PROJECT_NAME})

By adding a <visual> element, the correct mesh file can be loaded for visualisation by both Gazebo and Rviz2. (Hint: use the filename="file://$(find PKG_NAME)/..." syntax, as it is supported by both Gazebo and Rviz2.)

-<link name="laser_link"/>
+<link name="laser_link">
+    <visual>
+        <origin xyz="0 0 0" rpy="${pi/2} 0 ${pi}"/>
+        <geometry>
+            <mesh filename="file://$(find l4xz_description)/meshes/mid-360.stl" scale="0.001 0.001 0.001"/>
+        </geometry>
+        <material name="Gray">
+            <color rgba="0.5 0.5 0.5 1.0"/>
+        </material>
+    </visual>
+</link>

After starting the Gazebo simulation, the MID-360’s 3D visualisation is rendered on top of the robot.

Livox MID-360 Gazebo

Adding a <sensor name="gpu_lidar" type="gpu_lidar"> tag to the robot description instructs Gazebo to load the Lidar sensor plugin. A host of parameters can be used to describe the desired Lidar behaviour, such as horizontal and vertical field of view, Lidar range, update rate, etc.

<gazebo reference="laser_link">
  <sensor name="gpu_lidar" type="gpu_lidar">
    <pose relative_to="laser_link">0 0 0 0 0 0</pose>
    <topic>/model/l4xz/scan</topic>
    <update_rate>10</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>      <!-- MID-360 0.5° resolution -->
          <resolution>1</resolution>
          <!-- MID-360 HFOV = 360° -->
          <min_angle>-3.14159</min_angle>
          <max_angle>3.14159</max_angle>
        </horizontal>
        <vertical>
          <samples>40</samples>       <!-- MID-360 point cloud density -->
          <resolution>1</resolution>
          <!-- MID-360 VFOV = 59° (-7° to +52°) -->
          <min_angle>-0.1222</min_angle>
          <max_angle>0.9076</max_angle>
        </vertical>
      </scan>
      <range>
        <min>0.1</min>                  <!-- MID-360 min. detection range -->
        <max>40.0</max>                 <!-- MID-360 max. detection range -->
        <resolution>0.01</resolution>
      </range>
    </ray>
    <always_on>1</always_on>
    <visualize>true</visualize>
  </sensor>
</gazebo>
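The horizontal scan parameters above can be sanity-checked against the MID-360 datasheet figures (360° HFOV at 0.5° angular resolution) with a few lines of plain Python. This is a throwaway sketch for verifying the numbers, not part of the simulation setup:

```python
import math

# MID-360 datasheet figures (horizontal scan).
hfov_deg = 360.0        # full horizontal field of view
resolution_deg = 0.5    # horizontal angular resolution

# Number of horizontal samples per revolution.
samples = int(hfov_deg / resolution_deg)

# SDF expects the angle limits in radians.
min_angle = math.radians(-hfov_deg / 2.0)
max_angle = math.radians(hfov_deg / 2.0)

print(samples)                                    # → 720
print(round(min_angle, 5), round(max_angle, 5))   # → -3.14159 3.14159
```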

After starting the simulation verify that actual Lidar data is published using gz topic --list.

$ gz topic --list | grep scan
/model/l4xz/scan
/model/l4xz/scan/points

The gpu_lidar plugin publishes two different message types: a 2D Lidar scan (gz.msgs.LaserScan) under /model/l4xz/scan and a 3D point cloud (gz.msgs.PointCloudPacked) under /model/l4xz/scan/points. (Hint: this information can be retrieved using gz topic --info --topic ....)
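To give an idea of what a packed point cloud looks like on the wire: both gz.msgs.PointCloudPacked and sensor_msgs/msg/PointCloud2 carry the points as a flat byte buffer described by a list of field descriptors (name, offset, datatype) plus a point_step. The sketch below assumes the common layout of three float32 fields x, y, z at offsets 0, 4 and 8; in real code the offsets and point_step must always be read from the message’s field list:

```python
import struct

def unpack_xyz(data: bytes, point_step: int):
    """Yield (x, y, z) tuples from a packed point buffer.

    Assumes float32 x/y/z at offsets 0/4/8 within each point record;
    real code should derive offsets from the message's field descriptors.
    """
    for base in range(0, len(data), point_step):
        yield struct.unpack_from('<fff', data, base)

# Build a tiny example buffer with two points (point_step = 16,
# i.e. 4 bytes of padding or extra fields per point).
buf = b''.join(struct.pack('<ffff', *p) for p in [(1.0, 2.0, 3.0, 0.0),
                                                  (4.0, 5.0, 6.0, 0.0)])
print(list(unpack_xyz(buf, point_step=16)))
# → [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
```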

In order to have the Lidar data available within ROS it needs to be “bridged” from the Gazebo message transport system to the ROS message transport system. This can be achieved by using the ros_gz_bridge package.

import os

from ament_index_python.packages import get_package_share_directory
from launch_ros.actions import Node

pkg_l4xz_gz_bringup = get_package_share_directory('l4xz_gz_bringup')
gazebo_ros_bridge = Node(
    package='ros_gz_bridge',
    executable='parameter_bridge',
    output='screen',
    parameters=[
        {'config_file': os.path.join(pkg_l4xz_gz_bringup, 'config', 'ros_gz_bridge.yaml')},
    ]
)

Types and topic names of bridged messages can be specified in ros_gz_bridge.yaml. In the listing below /model/l4xz/scan/points of Gazebo message type gz.msgs.PointCloudPacked is translated to /l4xz/scan of ROS message type sensor_msgs/msg/PointCloud2.

---
- ros_topic_name: "/l4xz/scan"
  gz_topic_name: "/model/l4xz/scan/points"
  ros_type_name: "sensor_msgs/msg/PointCloud2"
  gz_type_name: "gz.msgs.PointCloudPacked"
  direction: GZ_TO_ROS

The correct configuration of the bridge can be verified using ros2 topic:

$ ros2 topic info /l4xz/scan
Type: sensor_msgs/msg/PointCloud2
Publisher count: 1
Subscription count: 1

However, upon checking the frame_id field of the /l4xz/scan message, the field does not contain the expected laser_link frame but l4xz/base_link/gpu_lidar instead. This is due to the fact that, at the moment, you cannot manually set the frame of a Gazebo message; instead, Gazebo self-assigns a frame_id based on the link hierarchy from robot to sensor.

$ ros2 topic echo /l4xz/scan | grep frame_id
  frame_id: l4xz/base_link/gpu_lidar

As a solution, a static transform publisher establishing the spatial relation between l4xz/base_link/gpu_lidar and laser_link needs to be launched.

gazebo_lidar_tf_node = Node(
    package='tf2_ros',
    executable='static_transform_publisher',
    arguments = ['0', '0', '0', '0', '0', '0', 'laser_link', 'l4xz/base_link/gpu_lidar'],
)

With this final step it is now possible to sensibly access the simulated Livox MID-360 data in ROS, as done in the screenshot below using Rviz2.

Livox MID-360 Rviz2

IMU

Since Lidar and IMU data share the same coordinate frame, the spatial relationship of the IMU can be defined in a similarly straightforward manner by adding the following description to the robot model:

<robot>
  <!-- ... -->
  <link name="imu_link" />
  <!-- ... -->
  <joint name="laser_imu_joint" type="fixed">
    <parent link="laser_link"/>
    <child link="imu_link"/>
    <origin xyz="0 0 0" rpy="0 0 0"/>
  </joint>
  <!-- ... -->
</robot>

Adding a <sensor name="imu_sensor" type="imu"> tag to the robot description instructs Gazebo to load the IMU sensor plugin.

<gazebo reference="imu_link">
  <sensor name="imu_sensor" type="imu">
    <always_on>1</always_on>
    <update_rate>100</update_rate>
    <visualize>true</visualize>
    <topic>/model/l4xz/imu</topic>
  </sensor>
</gazebo>
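By default the simulated IMU returns ideal, noise-free values. If more realistic data is desired (e.g. to test a filter), SDFormat allows Gaussian noise to be configured per axis inside the sensor description. The sketch below shows the structure for one axis only; the stddev value is an illustrative placeholder, not a Livox datasheet figure, and the y/z axes as well as the linear_acceleration block follow the same pattern:

```xml
<sensor name="imu_sensor" type="imu">
  <!-- ... -->
  <imu>
    <angular_velocity>
      <x>
        <noise type="gaussian">
          <mean>0.0</mean>
          <stddev>0.009</stddev> <!-- rad/s, illustrative value -->
        </noise>
      </x>
      <!-- <y> and <z> analogous -->
    </angular_velocity>
    <!-- <linear_acceleration> analogous, stddev in m/s² -->
  </imu>
</sensor>
```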

After starting the simulation verify that actual IMU data is published using gz topic --list.

$ gz topic --list | grep imu
/model/l4xz/imu

In order to have the IMU data available within ROS it needs to be “bridged” from the Gazebo message transport system to the ROS message transport system. This can be achieved by using the ros_gz_bridge package (as was done for bridging Lidar scan data). Types and topic names of bridged messages can be specified in ros_gz_bridge.yaml. In the listing below /model/l4xz/imu of Gazebo message type gz.msgs.IMU is translated to /l4xz/imu of ROS message type sensor_msgs/msg/Imu.

---
- ros_topic_name: "/l4xz/imu"
  gz_topic_name: "/model/l4xz/imu"
  ros_type_name: "sensor_msgs/msg/Imu"
  gz_type_name: "gz.msgs.IMU"
  direction: GZ_TO_ROS

The correct configuration of the bridge can be verified using ros2 topic:

$ ros2 topic info /l4xz/imu
Type: sensor_msgs/msg/Imu
Publisher count: 1
Subscription count: 1

However, upon checking the frame_id field of the /l4xz/imu message, the field does not contain the expected imu_link frame but l4xz/base_link/imu_sensor instead. This is due to the fact that, at the moment, you cannot manually set the frame of a Gazebo message; instead, Gazebo self-assigns a frame_id based on the link hierarchy from robot to sensor.

$ ros2 topic echo /l4xz/imu | grep frame_id
  frame_id: l4xz/base_link/imu_sensor

As a solution, a static transform publisher establishing the spatial relation between l4xz/base_link/imu_sensor and imu_link needs to be launched.

gazebo_imu_tf_node = Node(
    package='tf2_ros',
    executable='static_transform_publisher',
    arguments = ['0', '0', '0', '0', '0', '0', 'imu_link', 'l4xz/base_link/imu_sensor'],
)

With this last step, both Lidar scan and IMU data from a simulated Livox MID-360 Lidar sensor are available for further processing inside your ROS 2 based robotics system.
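As a small example of such further processing: sensor_msgs/msg/Imu reports orientation as a quaternion, which is often converted to roll/pitch/yaw for inspection. The sketch below shows the standard conversion in plain Python, without any ROS dependencies; the x/y/z/w parameters mirror the message’s quaternion components:

```python
import math

def quaternion_to_euler(x: float, y: float, z: float, w: float):
    """Convert an orientation quaternion to (roll, pitch, yaw) in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to guard against numerical noise pushing the argument outside [-1, 1].
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# Identity quaternion corresponds to zero rotation.
print(quaternion_to_euler(0.0, 0.0, 0.0, 1.0))  # → (0.0, 0.0, 0.0)
```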



Alexander Entinger is a highly experienced embedded engineer with a focus on robotic systems. Drawing on hard-won expertise in designing embedded systems for real-world applications, he helps companies developing robotic systems, or components for robotic systems, achieve their desired business outcomes.