
ROS with Raspberry Pi: Investigating the Core Issue of Slow Streaming Performance

By Sebastian Günther

Posted in Robots, Ros, Raspberry_pi

Building a moving robot typically leads to adding visual sensors that enable the robot to inspect its surroundings and navigate. Visual sensors encompass ultrasonic sensors and laser scanners for distance measurements, LIDAR for a 360-degree laser scan image, cameras that provide RGB images, and sensors that provide complex point clouds. ROS supports all of these sensors: attach the correct plugin in RViz and Gazebo, start the hardware sensor, publish the correct topic, and subscribe to its data.

In my robot project, the goal is to stream image data from a mobile robot, connected via WiFi, to my Linux workstation, connected via Ethernet. The camera of my choice is the Intel Realsense D435 because of its great form factor and its official ROS plugin. Once the ROS software and configuration were done, streaming of images on the topic /camera/color/image_raw and of pointcloud data on /camera/depth/color/points could start. However, I was surprised by the bad performance. While images were published at about 17 FPS on the robot, only 6 FPS arrived at the workstation. And for the pointcloud data, the rate dropped from 12 FPS to 3 FPS. This is not fast enough to see the robot's surroundings in real time, and not adequate for navigation.

It took me more than two months to investigate this problem, find and solve several issues, and finally reach 30 FPS for both types of data. Follow along to learn a lot about optimizing the usage of the D435 camera, image and pointcloud configuration in ROS, and network performance.

Note: The technical environment is Ubuntu 20.04 with ROS Noetic 1.15.11.

Hardware & Software Overview

To detail the context of this article, here is the concrete hardware and software that I'm using in my robot project.

  • Mobile Robot
    • Raspberry Pi4B 4GB RAM
    • Ubuntu 20.04 Server LTS (headless)
    • ROS Noetic 1.15.11
  • Linux Workstation
    • Intel Celeron N3450 @ 4x 2.2 GHz, 6GB RAM
    • Ubuntu 20.04 focal
    • ROS Noetic 1.15.11

On the mobile robot, I also use the Realsense SDK and the Realsense ROS packages - in different versions, because they differ in performance. Details are explained in the next sections.

Baseline Performance

With Realsense Camera SDK 2.47 and Realsense ROS1 2.3.2, I could get the camera node started and stream images. On the mobile robot I started the ROS nodes, and then on my Linux workstation I measured the number of received messages with the command rostopic hz.

The baseline, then, is this: 7 FPS for /camera/color/image_raw and 3 FPS for /camera/depth/color/points.
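
For context, rostopic hz derives its figure from the arrival times of the most recent messages. The same sliding-window calculation can be sketched in plain Python (an illustration of the principle, not the actual rostopic source):

```python
from collections import deque

class RateEstimator:
    """Estimate message frequency from arrival timestamps,
    similar in spirit to what `rostopic hz` reports."""

    def __init__(self, window_size=100):
        self.times = deque(maxlen=window_size)  # recent arrival times in seconds

    def record(self, stamp):
        self.times.append(stamp)

    def hz(self):
        # Average rate over the window: (n - 1) intervals span n timestamps.
        if len(self.times) < 2:
            return 0.0
        span = self.times[-1] - self.times[0]
        return (len(self.times) - 1) / span if span > 0 else 0.0

# Example: messages arriving every 1/30 s should measure ~30 Hz.
est = RateEstimator()
for i in range(60):
    est.record(i / 30.0)
print(round(est.hz(), 1))  # → 30.0
```

Feeding it the receive timestamps of /camera/color/image_raw would reproduce the numbers measured below.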

From here on, I systematically tried different configuration parameters for the ROS Realsense node, as well as different versions of the camera SDK and the Realsense ROS package. The results of these measurements are explained in the next sections.

ROS1: ROS Camera Parameter Configurations

In the first attempt, these versions of the SDK and library were used:

Realsense SDK v2.47
Realsense ROS 2.3.2

Configuring different parameters of the ROS node leads to these results.

| Parameter | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|
| depth & color width | not set | not set | 640 | 640 | 640 | 640 | 640 | 640 |
| depth & color height | not set | not set | 480 | 480 | 480 | 480 | 480 | 480 |
| depth_fps | not set | not set | 5 | 5 | 5 | 6 | 6 | 6 |
| color_fps | not set | not set | 5 | 5 | 5 | 6 | 6 | 6 |
| initial_reset | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | TRUE | |
| enable_sync | TRUE | | | | | | | |
| align_depth | | | | | | | | |
| filters | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud | pointcloud |
| texture_stream | any | color | any | any | color | color | color | color |
| ordered_pc | yes | | | | | | | |
| **Topic Hz (receiver)** | | | | | | | | |
| /color/image_raw | 6.5 | no data | 1.5 | 1.5 | no data | 6 | 6 | 6 |
| /depth/color/points | no data | no data | no data | no data | no data | 3 | 1 | 3 |
| /depth/image-rect | 7 | 6.5 | no data | no data | 6 | 6 | 6 | 6 |

These first results showed that no stream got better than 7 FPS, and pointcloud data was only received at all when its frame rate was explicitly set to 6 FPS.
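
For reference, parameter combinations like those in the table are passed to the launch file of the ROS1 wrapper. A sketch of one 640x480 run is shown below; the argument names follow the conventions of the realsense-ros 2.x rs_camera.launch file and may differ slightly between versions (in particular, the table's "texture_stream" corresponds to the wrapper's pointcloud_texture_stream argument):

```xml
<launch>
  <!-- One 640x480 @ 6 FPS test configuration (sketch). -->
  <include file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="color_width"   value="640"/>
    <arg name="color_height"  value="480"/>
    <arg name="color_fps"     value="6"/>
    <arg name="depth_width"   value="640"/>
    <arg name="depth_height"  value="480"/>
    <arg name="depth_fps"     value="6"/>
    <arg name="initial_reset" value="true"/>
    <arg name="filters"       value="pointcloud"/>
    <arg name="pointcloud_texture_stream" value="RS2_STREAM_COLOR"/>
  </include>
</launch>
```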

So I opened a GitHub issue about ROS1 performance and continued the experiments.

In other issues, I read that downgrading the Realsense SDK helped to increase performance. I tried a downgraded SDK version for both ROS1 and ROS2.

ROS1: Decreasing Realsense SDK

In the next attempt, I downgraded the SDK.

Realsense SDK v2.41
Realsense ROS 2.2.21

This time, I also measured the topic frequency on the sender side.

| Parameter | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| depth/color width | 640 | 640 | 640 | |
| depth/color height | 480 | 480 | 480 | |
| depth_fps | 5 | 30 | 30 | |
| color_fps | 5 | 30 | 30 | |
| initial_reset | FALSE | TRUE | TRUE | |
| enable_sync | FALSE | FALSE | FALSE | |
| align_depth | FALSE | TRUE | TRUE | |
| filters | pointcloud | - | - | |
| texture_stream | | | | |
| **Topic Hz (sender)** | | | | |
| color/image_raw | 17 | 15 | 28 | 26 |
| depth/color/points | no data | 12 | 16 | 0 |
| depth/image-rect | 30 | 25 | 17 | |
| /camera/aligned_depth_to_color/image_raw | no data | no data | 8 | 15 |
| **Topic Hz (receiver)** | | | | |
| color/image_raw | 6 | 6 | 11 | 10 |
| depth/color/points | 2 | 2 | 3 | |
| depth/image-rect | 9 | 9 | 5 | 5 |

These results show that a stable 28 FPS for the topic /camera/color/image_raw and 16 FPS for /camera/depth/color/points are possible - on the sender side! However, the workstation, which receives this data and is connected via Ethernet, gets only about a third of these frames.
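
Measuring on both sides needs no extra tooling, only the standard ROS network setup. A sketch of the procedure (the robot's IP address is a placeholder to replace; it assumes roscore runs on the robot):

```shell
# On the robot: start the camera node (roscore running on the robot).
roslaunch realsense2_camera rs_camera.launch filters:=pointcloud

# On the robot: local, sender-side publish rate.
rostopic hz /camera/color/image_raw

# On the workstation: point at the robot's ROS master, then measure
# the receiver-side rate of the same topics.
export ROS_MASTER_URI=http://<robot-ip>:11311
rostopic hz /camera/color/image_raw /camera/depth/color/points
```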

ROS2: Decreasing Realsense SDK

The next measurement in ROS2 was also done with a downgraded version of the SDK.

Realsense SDK v2.41
Realsense ROS 3.13

These are the parameters that I used, and the results.

| Parameter | 1 | 2 | 3 |
|---|---|---|---|
| depth/color width | 640 | | |
| depth/color height | 480 | | |
| depth_fps | | | |
| color_fps | | | |
| initial_reset | | | |
| enable_sync | | | |
| align_depth | | | |
| filters | pointcloud | | |
| texture_stream | | | |
| infra1 | disabled | | |
| infra2 | disabled | | |
| **Topic Hz (sender)** | | | |
| color/image_raw | 19 | 16 | 16 |
| depth/color/points | no data | no data | 13 |
| depth/image-rect | 30 | 30 | 28 |
| **Topic Hz (receiver)** | | | |
| color/image_raw | 2.5 | 2 | 2 |
| depth/color/points | no data | 1 | |
| depth/image-rect | 4.5 | 4.5 | 4 |

A maximum of 19 FPS for images, and 13 FPS for point clouds - that's worse than in ROS1. Also, the severe framerate drop on the receiver side prevents any use for visualization or navigation.
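
The equivalent measurement in ROS2 uses the ros2 CLI (run on either side once the camera node is up):

```shell
# Measure the publish rate of the ROS2 topics:
ros2 topic hz /camera/color/image_raw
ros2 topic hz /camera/depth/color/points
```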

ROS Performance Considerations

Following up on these results, I made a general search on the internet about ROS1 and ROS2 performance, to get a feeling for where performance gaps typically originate.

Overall, this research led me to check my local WiFi. From the Raspberry Pi 4 itself, I ran a simple online speed test.

Using a WLAN connection:

Testing download speed................................................................................
Download: 4.22 Mbit/s
Testing upload speed......................................................................................................
Upload: 6.23 Mbit/s

Using an ethernet cable:

Testing download speed................................................................................
Download: 106.04 Mbit/s
Testing upload speed......................................................................................................
Upload: 42.32 Mbit/s

I was surprised! The upload speed limit of about 6 Mbit/s could explain the severe performance drop: images are simply not sent fast enough. Assuming about 1 MB per image, getting a 30 Hz image topic frequency on the receiver side would require an upload speed of at least 30 MB/s, i.e. 240 Mbit/s. Clearly, this is not possible with my current WLAN speed.
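
The estimate can be checked with simple arithmetic. The sketch below assumes uncompressed 640x480 rgb8 color frames (3 bytes per pixel) and compares the bandwidth a 30 FPS raw stream needs with the frame rate the measured 6.23 Mbit/s uplink can actually sustain:

```python
# Back-of-the-envelope check: can the measured WLAN uplink carry the stream?
width, height, bytes_per_pixel = 640, 480, 3  # assumed rgb8 frame geometry
target_fps = 30
measured_uplink_mbit = 6.23  # speedtest result from the Raspberry Pi over WLAN

bytes_per_frame = width * height * bytes_per_pixel      # 921,600 bytes ~ 0.9 MB
required_mbit = bytes_per_frame * target_fps * 8 / 1e6  # bits/s -> Mbit/s

# Frames per second the uplink can actually carry for this frame size:
achievable_fps = measured_uplink_mbit * 1e6 / 8 / bytes_per_frame

print(f"required: {required_mbit:.0f} Mbit/s, achievable: {achievable_fps:.2f} FPS")
```

Even ignoring all protocol overhead, such an uplink cannot carry a single uncompressed frame per second, which makes the WLAN the prime suspect for the receiver-side drop.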

And from here on, I took a totally new direction for my investigation: Optimizing the WLAN speed of my Raspberry Pi. And this is the story of another article.

Conclusion

Visual sensors are essential for robot navigation. Among the many available data formats, this article investigated the topics /camera/color/image_raw and /camera/depth/color/points, which carry raw image data and point cloud data for depth information. The particular setup - a Raspberry Pi 4 with the Realsense D435 camera, connected via WiFi to a Linux workstation - showed a severe performance drop for these topics on the receiver side. By systematically trying different versions of the Realsense SDK with both ROS1 and ROS2, and by trying different startup parameters for the Realsense node, I could improve the publication rates on the sender, but the receiver capped at about 11 FPS for raw images and 3 FPS for pointclouds. Finally, I measured the raw upload speed of my Raspberry Pi 4: it is only about 6 Mbit/s. How to improve that is covered in my article Improving Image Streaming Performance.