Planet ROS
Planet ROS - http://planet.ros.org
ROS Discourse General: ROS 2 Rust Meeting: January 2026
The next ROS 2 Rust Meeting will be held on Mon, Jan 12, 2026 at 2:00 PM UTC
The meeting room will be at https://meet.google.com/rxr-pvcv-hmu
In the unlikely event that the room needs to change, we will update this thread with the new info!
1 post - 1 participant
ROS Discourse General: Easier Protobuf and ROS 2 Integration
For anyone integrating ROS 2 with Protobuf-based systems, we at the RAI Institute want to highlight one of our open-source tools: proto2ros!
proto2ros generates ROS 2 message definitions and bi-directional conversion code directly from .proto files, reducing boilerplate and simplifying integration between Protobuf-based systems and ROS 2 nodes.
Some highlights:
- Automatic ROS 2 message generation from Protobuf
- C++ and Python conversion utilities
- Support for Protobuf v2 and v3
It is currently available for both Humble and Jazzy and can be installed with
apt install ros-<distro>-proto2ros
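To illustrate the kind of boilerplate proto2ros is designed to eliminate, here is a minimal pure-Python sketch of a hand-written field-by-field conversion. The Temperature types and field names below are made up for illustration; real proto2ros output operates on actual generated protobuf and rosidl message classes.

```python
from dataclasses import dataclass

# Hypothetical stand-ins for a protobuf message and a ROS 2 message.
@dataclass
class TemperatureProto:  # what protoc might generate from a .proto file
    celsius: float
    sensor_id: str

@dataclass
class TemperatureMsg:  # what rosidl might generate from a .msg file
    data: float
    frame_id: str

def proto_to_ros(p: TemperatureProto) -> TemperatureMsg:
    """Hand-written field-by-field conversion -- the boilerplate
    that proto2ros generates automatically from the .proto file."""
    return TemperatureMsg(data=p.celsius, frame_id=p.sensor_id)

def ros_to_proto(m: TemperatureMsg) -> TemperatureProto:
    return TemperatureProto(celsius=m.data, sensor_id=m.frame_id)

msg = proto_to_ros(TemperatureProto(celsius=21.5, sensor_id="gripper"))
```

Writing and maintaining such converters by hand for every message type is exactly the repetitive work the generator takes over.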
Check out the full repo here: https://github.com/bdaiinstitute/proto2ros
Thanks to everyone who has contributed to this project including @hidmic @khughes1 @jbarry !
As always, feedback and contributions are welcome!
The RAI Institute
1 post - 1 participant
ROS Discourse General: ROSCon Review Continued | Cloud Robotics WG Meeting 2026-01-14
Please come and join us for the next meeting on Wed, Jan 14, 2026, 4:00–5:00 PM UTC, where we plan to dive deeper into the ROSCon talks collected during the last session. By examining the talks in more detail, we can highlight any that are relevant to Logging & Observability, the current focus of the group, and pull out interesting tips to release as part of a blog post.
The details for the talks have been gathered into the Links/Notes column of this document. Please feel free to read through the notes and videos ahead of the meeting, if you’re interested.
The link for the next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications, or keep an eye on the Cloud Robotics Hub.
Hopefully we will see you there!
1 post - 1 participant
ROS Discourse General: Goodbye RQt, Hello RQml [NEW RELEASE]

Greetings fellow roboticists,
During our transition to ROS 2 and the build of our new robot Athena, we encountered quite a few issues, both in ROS 2 and its middleware and in rqt.
For instance, when testing our manipulator, we noticed that the ControllerManager plugin in rqt freezes completely after around 20 seconds when used over WiFi.
This is not the only issue, but that’s also not the point of this post.
You could chime in and say, “Hey, you could’ve fixed that and made a PR”, and you would be right; we did, in several instances.
But I’m not a fan of using Python for UI, and this presented the perfect opportunity to demonstrate how easy it is to create a nice ROS interface using my QML ROS 2 module.
So, instead, I’ve spent that time quickly developing a modern alternative, fixing all the issues that bothered me in rqt.
Hello RQml 
Please note that this is still in beta and not all plugins exist yet.
You are very welcome to point me to the ones that you think would be great to have, or even implement them yourself and make a PR!
Currently, the following plugins are available:
- ActionCaller: Interface for calling ROS 2 Actions.
- Console: A log viewer for ROS 2 messages.
- ControllerManager: Manage and switch ROS 2 controllers.
- ImageView: View camera streams and images.
- JointTrajectoryController: Interface for sending joint trajectory commands.
- MessagePublisher: Publish custom ROS 2 messages.
- RobotSteering: Teleoperation tool for mobile robots.
- ServiceCaller: Interface for calling ROS 2 Services.
Notably, the ImageView now also uses transparency for depth image values that are not valid (instead of using black, which also represents very close values).
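As a generic illustration of that depth-transparency idea (this is not RQml’s actual rendering code), a NumPy sketch might map invalid pixels to alpha 0 and normalize the valid range to opaque grayscale:

```python
import numpy as np

def depth_to_rgba(depth: np.ndarray) -> np.ndarray:
    """Render a float depth image as RGBA, making invalid pixels
    (NaN or non-positive) fully transparent instead of black."""
    valid = np.isfinite(depth) & (depth > 0)
    rgba = np.zeros(depth.shape + (4,), dtype=np.uint8)
    if valid.any():
        d_min, d_max = depth[valid].min(), depth[valid].max()
        scale = float(d_max - d_min) or 1.0  # avoid division by zero
        gray = np.zeros(depth.shape, dtype=np.uint8)
        gray[valid] = ((depth[valid] - d_min) / scale * 255).astype(np.uint8)
        rgba[..., :3] = gray[..., None]  # broadcast gray into R, G, B
    rgba[..., 3] = np.where(valid, 255, 0)  # alpha: opaque only where valid
    return rgba
```

With this mapping, black is reserved for genuinely near pixels, while missing measurements simply let the background show through.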
As always, I hope this is of interest to you, and I would love to hear from you if you build something cool with this!
If it isn’t, my little turtle buddy will be very disappointed, because he already considered you a special friend!
3 posts - 2 participants
ROS Discourse General: New packages for Humble Hawksbill 2026-01-07
Package Updates for Humble
Added Packages [27]:
- ros-humble-ardrone-sdk: 2.0.3-1
- ros-humble-ardrone-sdk-dbgsym: 2.0.3-1
- ros-humble-ardrone-sumo: 2.0.3-1
- ros-humble-ardrone-sumo-dbgsym: 2.0.3-1
- ros-humble-cloudini-lib: 0.11.1-2
- ros-humble-cloudini-lib-dbgsym: 0.11.1-2
- ros-humble-cloudini-ros: 0.11.1-2
- ros-humble-cloudini-ros-dbgsym: 0.11.1-2
- ros-humble-event-camera-tools: 3.1.1-1
- ros-humble-event-camera-tools-dbgsym: 3.1.1-1
- ros-humble-fibar-lib: 1.0.2-1
- ros-humble-frequency-cam: 3.1.0-1
- ros-humble-frequency-cam-dbgsym: 3.1.0-1
- ros-humble-hitch-estimation-apriltag-array: 0.0.1-1
- ros-humble-mavros-examples: 2.14.0-1
- ros-humble-mujoco-vendor: 0.0.6-1
- ros-humble-mujoco-vendor-dbgsym: 0.0.6-1
- ros-humble-olive-interfaces: 0.1.0-1
- ros-humble-olive-interfaces-dbgsym: 0.1.0-1
- ros-humble-persist-parameter-server: 1.0.4-1
- ros-humble-persist-parameter-server-dbgsym: 1.0.4-1
- ros-humble-pointcloud-to-ply: 0.0.1-1
- ros-humble-qml6-ros2-plugin: 0.25.121-1
- ros-humble-qml6-ros2-plugin-dbgsym: 0.25.121-1
- ros-humble-yasmin-editor: 4.2.2-1
- ros-humble-yasmin-factory: 4.2.2-1
- ros-humble-yasmin-factory-dbgsym: 4.2.2-1
Updated Packages [390]:
- ros-humble-ackermann-steering-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-ackermann-steering-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-admittance-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-admittance-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-apriltag-detector: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-detector-dbgsym: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-detector-mit: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-detector-mit-dbgsym: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-detector-umich: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-detector-umich-dbgsym: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-draw: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-draw-dbgsym: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-tools: 3.0.3-1 → 3.1.0-1
- ros-humble-apriltag-tools-dbgsym: 3.0.3-1 → 3.1.0-1
- ros-humble-aruco-opencv: 2.3.1-1 → 2.4.1-1
- ros-humble-aruco-opencv-dbgsym: 2.3.1-1 → 2.4.1-1
- ros-humble-aruco-opencv-msgs: 2.3.1-1 → 2.4.1-1
- ros-humble-aruco-opencv-msgs-dbgsym: 2.3.1-1 → 2.4.1-1
- ros-humble-automatika-ros-sugar: 0.4.1-1 → 0.4.2-1
- ros-humble-automatika-ros-sugar-dbgsym: 0.4.1-1 → 0.4.2-1
- ros-humble-autoware-internal-debug-msgs: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-debug-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-localization-msgs: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-localization-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-metric-msgs: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-metric-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-msgs: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-perception-msgs: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-perception-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-planning-msgs: 1.12.0-2 → 1.12.1-1
- ros-humble-autoware-internal-planning-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
- ros-humble-behaviortree-cpp: 4.7.1-1 → 4.8.3-1
- ros-humble-behaviortree-cpp-dbgsym: 4.7.1-1 → 4.8.3-1
- ros-humble-beluga: 2.0.2-1 → 2.1.0-1
- ros-humble-beluga-amcl: 2.0.2-1 → 2.1.0-1
- ros-humble-beluga-amcl-dbgsym: 2.0.2-1 → 2.1.0-1
- ros-humble-beluga-ros: 2.0.2-1 → 2.1.0-1
- ros-humble-bicycle-steering-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-bicycle-steering-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-camera-calibration: 3.0.8-1 → 3.0.9-1
- ros-humble-camera-ros: 0.5.0-1 → 0.5.2-1
- ros-humble-camera-ros-dbgsym: 0.5.0-1 → 0.5.2-1
- ros-humble-clearpath-common: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-control: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-customization: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-description: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-generator-common: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-generator-common-dbgsym: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-manipulators: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-manipulators-description: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-mounts-description: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-platform-description: 1.3.7-1 → 1.3.8-1
- ros-humble-clearpath-sensors-description: 1.3.7-1 → 1.3.8-1
- ros-humble-control-toolbox: 3.6.2-1 → 3.6.3-1
- ros-humble-control-toolbox-dbgsym: 3.6.2-1 → 3.6.3-1
- ros-humble-controller-interface: 2.52.2-1 → 2.53.0-1
- ros-humble-controller-interface-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-controller-manager: 2.52.2-1 → 2.53.0-1
- ros-humble-controller-manager-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-controller-manager-msgs: 2.52.2-1 → 2.53.0-1
- ros-humble-controller-manager-msgs-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-depth-image-proc: 3.0.8-1 → 3.0.9-1
- ros-humble-depth-image-proc-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-depthai: 2.30.0-1 → 2.31.0-1
- ros-humble-depthai-bridge: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-bridge-dbgsym: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-dbgsym: 2.30.0-1 → 2.31.0-1
- ros-humble-depthai-descriptions: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-examples: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-examples-dbgsym: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-filters: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-filters-dbgsym: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-ros: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-ros-driver: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-ros-driver-dbgsym: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-ros-msgs: 2.11.2-1 → 2.12.1-1
- ros-humble-depthai-ros-msgs-dbgsym: 2.11.2-1 → 2.12.1-1
- ros-humble-diff-drive-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-diff-drive-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-dynamixel-hardware-interface: 1.4.16-1 → 1.5.0-2
- ros-humble-dynamixel-hardware-interface-dbgsym: 1.4.16-1 → 1.5.0-2
- ros-humble-effort-controllers: 2.50.2-1 → 2.52.0-1
- ros-humble-effort-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-event-camera-codecs: 2.0.1-1 → 3.0.0-1
- ros-humble-event-camera-codecs-dbgsym: 2.0.1-1 → 3.0.0-1
- ros-humble-event-camera-msgs: 2.0.0-1 → 2.0.1-1
- ros-humble-event-camera-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
- ros-humble-event-camera-py: 2.0.1-1 → 3.0.0-1
- ros-humble-event-camera-renderer: 2.0.1-1 → 3.0.0-1
- ros-humble-event-camera-renderer-dbgsym: 2.0.1-1 → 3.0.0-1
- ros-humble-examples-tf2-py: 0.25.17-1 → 0.25.18-1
- ros-humble-fastcdr: 1.0.24-2 → 1.0.29-1
- ros-humble-fastcdr-dbgsym: 1.0.24-2 → 1.0.29-1
- ros-humble-fastrtps: 2.6.10-1 → 2.6.11-1
- ros-humble-fastrtps-cmake-module: 2.2.3-1 → 2.2.4-1
- ros-humble-fastrtps-dbgsym: 2.6.10-1 → 2.6.11-1
- ros-humble-force-torque-sensor-broadcaster: 2.50.2-1 → 2.52.0-1
- ros-humble-force-torque-sensor-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-forward-command-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-forward-command-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-generate-parameter-library: 0.5.0-1 → 0.6.0-1
- ros-humble-generate-parameter-library-py: 0.5.0-1 → 0.6.0-1
- ros-humble-geometry2: 0.25.17-1 → 0.25.18-1
- ros-humble-gpio-controllers: 2.50.2-1 → 2.52.0-1
- ros-humble-gpio-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-gripper-controllers: 2.50.2-1 → 2.52.0-1
- ros-humble-gripper-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-hardware-interface: 2.52.2-1 → 2.53.0-1
- ros-humble-hardware-interface-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-hardware-interface-testing: 2.52.2-1 → 2.53.0-1
- ros-humble-hardware-interface-testing-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-husarion-components-description: 0.0.2-1 → 0.1.0-1
- ros-humble-image-pipeline: 3.0.8-1 → 3.0.9-1
- ros-humble-image-proc: 3.0.8-1 → 3.0.9-1
- ros-humble-image-proc-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-image-publisher: 3.0.8-1 → 3.0.9-1
- ros-humble-image-publisher-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-image-rotate: 3.0.8-1 → 3.0.9-1
- ros-humble-image-rotate-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-image-view: 3.0.8-1 → 3.0.9-1
- ros-humble-image-view-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-imu-sensor-broadcaster: 2.50.2-1 → 2.52.0-1
- ros-humble-imu-sensor-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-joint-limits: 2.52.2-1 → 2.53.0-1
- ros-humble-joint-limits-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-joint-state-broadcaster: 2.50.2-1 → 2.52.0-1
- ros-humble-joint-state-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-joint-trajectory-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-joint-trajectory-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-kitti-metrics-eval: 2.2.1-1 → 2.4.0-1
- ros-humble-kitti-metrics-eval-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-kompass: 0.3.2-1 → 0.3.3-1
- ros-humble-kompass-interfaces: 0.3.2-1 → 0.3.3-1
- ros-humble-kompass-interfaces-dbgsym: 0.3.2-1 → 0.3.3-1
- ros-humble-launch-pal: 0.19.0-1 → 0.20.0-1
- ros-humble-libmavconn: 2.12.0-1 → 2.14.0-1
- ros-humble-libmavconn-dbgsym: 2.12.0-1 → 2.14.0-1
- ros-humble-mapviz: 2.5.10-1 → 2.6.0-1
- ros-humble-mapviz-dbgsym: 2.5.10-1 → 2.6.0-1
- ros-humble-mapviz-interfaces: 2.5.10-1 → 2.6.0-1
- ros-humble-mapviz-interfaces-dbgsym: 2.5.10-1 → 2.6.0-1
- ros-humble-mapviz-plugins: 2.5.10-1 → 2.6.0-1
- ros-humble-mapviz-plugins-dbgsym: 2.5.10-1 → 2.6.0-1
- ros-humble-mavlink: 2025.9.9-1 → 2025.12.12-1
- ros-humble-mavros: 2.12.0-1 → 2.14.0-1
- ros-humble-mavros-dbgsym: 2.12.0-1 → 2.14.0-1
- ros-humble-mavros-extras: 2.12.0-1 → 2.14.0-1
- ros-humble-mavros-extras-dbgsym: 2.12.0-1 → 2.14.0-1
- ros-humble-mavros-msgs: 2.12.0-1 → 2.14.0-1
- ros-humble-mavros-msgs-dbgsym: 2.12.0-1 → 2.14.0-1
- ros-humble-mecanum-drive-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-mecanum-drive-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-metavision-driver: 2.0.1-1 → 3.0.0-1
- ros-humble-metavision-driver-dbgsym: 2.0.1-1 → 3.0.0-1
- ros-humble-mola: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-bridge-ros2: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-bridge-ros2-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-demos: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-gnss-to-markers: 0.1.0-1 → 0.1.2-1
- ros-humble-mola-gnss-to-markers-dbgsym: 0.1.0-1 → 0.1.2-1
- ros-humble-mola-imu-preintegration: 1.14.0-1 → 1.14.1-1
- ros-humble-mola-imu-preintegration-dbgsym: 1.14.0-1 → 1.14.1-1
- ros-humble-mola-input-euroc-dataset: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-euroc-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-kitti-dataset: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-kitti-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-kitti360-dataset: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-kitti360-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-lidar-bin-dataset: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-lidar-bin-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-mulran-dataset: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-mulran-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-paris-luco-dataset: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-paris-luco-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-rawlog: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-rawlog-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-rosbag2: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-rosbag2-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-video: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-input-video-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-kernel: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-kernel-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-launcher: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-launcher-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-lidar-odometry: 1.2.2-1 → 1.3.1-1
- ros-humble-mola-lidar-odometry-dbgsym: 1.2.2-1 → 1.3.1-1
- ros-humble-mola-metric-maps: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-metric-maps-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-msgs: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-msgs-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-pose-list: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-pose-list-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-relocalization: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-relocalization-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-traj-tools: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-traj-tools-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-viz: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-viz-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-yaml: 2.2.1-1 → 2.4.0-1
- ros-humble-mola-yaml-dbgsym: 2.2.1-1 → 2.4.0-1
- ros-humble-mp2p-icp: 2.1.1-1 → 2.2.0-1
- ros-humble-mp2p-icp-dbgsym: 2.1.1-1 → 2.2.0-1
- ros-humble-mrpt-apps: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-apps-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libapps: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libapps-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libbase: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libbase-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libgui: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libgui-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libhwdrivers: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libhwdrivers-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libmaps: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libmaps-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libmath: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libmath-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libnav: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libnav-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libobs: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libobs-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libopengl: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libopengl-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libposes: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libposes-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libros-bridge: 3.0.2-1 → 3.1.1-1
- ros-humble-mrpt-libros-bridge-dbgsym: 3.0.2-1 → 3.1.1-1
- ros-humble-mrpt-libslam: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libslam-dbgsym: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-libtclap: 2.15.1-2 → 2.15.4-1
- ros-humble-mrpt-path-planning: 0.2.3-1 → 0.2.4-1
- ros-humble-mrpt-path-planning-dbgsym: 0.2.3-1 → 0.2.4-1
- ros-humble-multires-image: 2.5.10-1 → 2.6.0-1
- ros-humble-multires-image-dbgsym: 2.5.10-1 → 2.6.0-1
- ros-humble-mvsim: 0.14.2-1 → 0.15.0-1
- ros-humble-mvsim-dbgsym: 0.14.2-1 → 0.15.0-1
- ros-humble-parameter-traits: 0.5.0-1 → 0.6.0-1
- ros-humble-pid-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-pid-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-plotjuggler: 3.13.2-1 → 3.15.0-1
- ros-humble-plotjuggler-dbgsym: 3.13.2-1 → 3.15.0-1
- ros-humble-plotjuggler-ros: 2.3.1-1 → 2.3.1-2
- ros-humble-plotjuggler-ros-dbgsym: 2.3.1-1 → 2.3.1-2
- ros-humble-pluginlib: 5.1.2-1 → 5.1.3-1
- ros-humble-pluginlib-dbgsym: 5.1.2-1 → 5.1.3-1
- ros-humble-pose-broadcaster: 2.50.2-1 → 2.52.0-1
- ros-humble-pose-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-position-controllers: 2.50.2-1 → 2.52.0-1
- ros-humble-position-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-python-mrpt: 2.15.1-1 → 2.15.3-1
- ros-humble-range-sensor-broadcaster: 2.50.2-1 → 2.52.0-1
- ros-humble-range-sensor-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-rclcpp: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-action: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-action-dbgsym: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-components: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-components-dbgsym: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-dbgsym: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-lifecycle: 16.0.16-1 → 16.0.17-1
- ros-humble-rclcpp-lifecycle-dbgsym: 16.0.16-1 → 16.0.17-1
- ros-humble-rcutils: 5.1.7-1 → 5.1.8-1
- ros-humble-rcutils-dbgsym: 5.1.7-1 → 5.1.8-1
- ros-humble-realtime-tools: 2.14.1-1 → 2.15.0-1
- ros-humble-realtime-tools-dbgsym: 2.14.1-1 → 2.15.0-1
- ros-humble-rko-lio: 0.1.6-1 → 0.2.0-1
- ros-humble-rko-lio-dbgsym: 0.1.6-1 → 0.2.0-1
- ros-humble-robotraconteur: 1.2.6-1 → 1.2.7-1
- ros-humble-robotraconteur-dbgsym: 1.2.6-1 → 1.2.7-1
- ros-humble-ros-babel-fish: 0.25.2-1 → 0.25.120-1
- ros-humble-ros-babel-fish-dbgsym: 0.25.2-1 → 0.25.120-1
- ros-humble-ros-babel-fish-test-msgs: 0.25.2-1 → 0.25.120-1
- ros-humble-ros-babel-fish-test-msgs-dbgsym: 0.25.2-1 → 0.25.120-1
- ros-humble-ros2-control: 2.52.2-1 → 2.53.0-1
- ros-humble-ros2-control-test-assets: 2.52.2-1 → 2.53.0-1
- ros-humble-ros2-controllers: 2.50.2-1 → 2.52.0-1
- ros-humble-ros2-controllers-test-nodes: 2.50.2-1 → 2.52.0-1
- ros-humble-ros2cli-common-extensions: 0.1.1-4 → 0.1.2-1
- ros-humble-ros2controlcli: 2.52.2-1 → 2.53.0-1
- ros-humble-ros2plugin: 5.1.2-1 → 5.1.3-1
- ros-humble-rosbag2rawlog: 3.0.2-1 → 3.1.1-1
- ros-humble-rosbag2rawlog-dbgsym: 3.0.2-1 → 3.1.1-1
- ros-humble-rosidl-adapter: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-cli: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-cmake: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-generator-c: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-generator-cpp: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-parser: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-runtime-c: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-runtime-c-dbgsym: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-runtime-cpp: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-typesupport-fastrtps-c: 2.2.3-1 → 2.2.4-1
- ros-humble-rosidl-typesupport-fastrtps-c-dbgsym: 2.2.3-1 → 2.2.4-1
- ros-humble-rosidl-typesupport-fastrtps-cpp: 2.2.3-1 → 2.2.4-1
- ros-humble-rosidl-typesupport-fastrtps-cpp-dbgsym: 2.2.3-1 → 2.2.4-1
- ros-humble-rosidl-typesupport-interface: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-typesupport-introspection-c: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-typesupport-introspection-c-dbgsym: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-typesupport-introspection-cpp: 3.1.7-1 → 3.1.8-1
- ros-humble-rosidl-typesupport-introspection-cpp-dbgsym: 3.1.7-1 → 3.1.8-1
- ros-humble-rqt-controller-manager: 2.52.2-1 → 2.53.0-1
- ros-humble-rqt-joint-trajectory-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-rviz-assimp-vendor: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-common: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-common-dbgsym: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-default-plugins: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-default-plugins-dbgsym: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-ogre-vendor: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-ogre-vendor-dbgsym: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-rendering: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-rendering-dbgsym: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-rendering-tests: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz-visual-testing-framework: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz2: 11.2.23-1 → 11.2.25-1
- ros-humble-rviz2-dbgsym: 11.2.23-1 → 11.2.25-1
- ros-humble-septentrio-gnss-driver: 1.4.5-1 → 1.4.6-1
- ros-humble-septentrio-gnss-driver-dbgsym: 1.4.5-1 → 1.4.6-1
- ros-humble-simple-launch: 1.11.0-1 → 1.11.1-1
- ros-humble-slider-publisher: 2.4.1-1 → 2.4.2-1
- ros-humble-steering-controllers-library: 2.50.2-1 → 2.52.0-1
- ros-humble-steering-controllers-library-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-stereo-image-proc: 3.0.8-1 → 3.0.9-1
- ros-humble-stereo-image-proc-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-tcb-span: 1.0.2-2 → 1.2.0-1
- ros-humble-tf2: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-bullet: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-dbgsym: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-eigen: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-eigen-kdl: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-eigen-kdl-dbgsym: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-geometry-msgs: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-kdl: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-msgs: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-msgs-dbgsym: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-py: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-py-dbgsym: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-ros: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-ros-dbgsym: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-ros-py: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-sensor-msgs: 0.25.17-1 → 0.25.18-1
- ros-humble-tf2-tools: 0.25.17-1 → 0.25.18-1
- ros-humble-tile-map: 2.5.10-1 → 2.6.0-1
- ros-humble-tile-map-dbgsym: 2.5.10-1 → 2.6.0-1
- ros-humble-tl-expected: 1.0.2-2 → 1.2.0-1
- ros-humble-tracetools-image-pipeline: 3.0.8-1 → 3.0.9-1
- ros-humble-tracetools-image-pipeline-dbgsym: 3.0.8-1 → 3.0.9-1
- ros-humble-transmission-interface: 2.52.2-1 → 2.53.0-1
- ros-humble-transmission-interface-dbgsym: 2.52.2-1 → 2.53.0-1
- ros-humble-tricycle-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-tricycle-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-tricycle-steering-controller: 2.50.2-1 → 2.52.0-1
- ros-humble-tricycle-steering-controller-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-turtlebot3: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-bringup: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-cartographer: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-description: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-example: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-navigation2: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-node: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-node-dbgsym: 2.3.3-1 → 2.3.6-1
- ros-humble-turtlebot3-teleop: 2.3.3-1 → 2.3.6-1
- ros-humble-ur: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-bringup: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-calibration: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-calibration-dbgsym: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-client-library: 2.6.0-1 → 2.6.1-1
- ros-humble-ur-client-library-dbgsym: 2.6.0-1 → 2.6.1-1
- ros-humble-ur-controllers: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-controllers-dbgsym: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-dashboard-msgs: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-dashboard-msgs-dbgsym: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-description: 2.8.0-1 → 2.9.0-1
- ros-humble-ur-moveit-config: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-robot-driver: 2.10.0-1 → 2.11.0-1
- ros-humble-ur-robot-driver-dbgsym: 2.10.0-1 → 2.11.0-1
- ros-humble-vector-pursuit-controller: 1.0.1-1 → 1.0.2-2
- ros-humble-vector-pursuit-controller-dbgsym: 1.0.1-1 → 1.0.2-2
- ros-humble-velocity-controllers: 2.50.2-1 → 2.52.0-1
- ros-humble-velocity-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
- ros-humble-yasmin: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-dbgsym: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-demos: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-demos-dbgsym: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-msgs: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-msgs-dbgsym: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-ros: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-ros-dbgsym: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-viewer: 3.5.1-1 → 4.2.2-1
- ros-humble-yasmin-viewer-dbgsym: 3.5.1-1 → 4.2.2-1
- ros-humble-zmqpp-vendor: 0.0.2-1 → 0.1.0-3
- ros-humble-zmqpp-vendor-dbgsym: 0.0.2-1 → 0.1.0-3
Removed Packages [7]:
- ros-humble-feetech-ros2-driver
- ros-humble-feetech-ros2-driver-dbgsym
- ros-humble-generate-parameter-library-example
- ros-humble-generate-parameter-library-example-dbgsym
- ros-humble-generate-parameter-library-example-external
- ros-humble-generate-parameter-library-example-external-dbgsym
- ros-humble-generate-parameter-module-example
Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:
- Adam Serafin
- Automatika Robotics
- Autoware
- Bence Magyar
- Berkay Karaman
- Bernd Pfrommer
- Chris Lalancette
- Christian Rauch
- Davide Faconti
- Felix Exner
- Fictionlab
- Gerardo Puga
- Haroon Rasheed
- Husarion
- Ivan Paunovic
- Jacob Perron
- Jeremie Deray
- John Wason
- Jordan Palacios
- Jose Luis Blanco-Claraco
- Jose-Luis Blanco-Claraco
- José Luis Blanco-Claraco
- Kostubh Khandelwal
- Luis Camero
- M. Fatih Cırıt
- Markus Bader
- Masaya Kataoka
- Meher Malladi
- Michel Hidalgo
- Miguel Ángel González Santamarta
- Olivier Kermorgant
- Pyo
- Raul Sanchez Mateos
- Ryohsuke Mitsudome
- Sai Kishor Kothakota
- Samuel Hafner
- Shane Loretz
- Southwest Research Institute
- Stefan Fabian
- Steven! Ragnarök
- Temkei Kem
- Tibor Dome
- Tomoya Fujita
- Tyler Weaver
- Vincent Rabaud
- Vladimir Ermakov
- Víctor Mayoral-Vilches
- Yukihiro Saito
- bmagyar
- li9i
- miguel
- victor
1 post - 1 participant
ROS Industrial: ROSCon 2025 & RIC-AP Summit 2025 Blog Series: Singapore’s Defining Week for Open-Source Robotics
As we look back on 2025, this blog is a recap of one of the most impactful weeks for open-source robotics in the Asia-Pacific region.
On 30 October, the RIC-AP Summit expanded beyond conference halls into the real world with a series of curated site tours across Singapore. These tours showcased how ROS and Open-RMF are not just concepts but living deployments across manufacturing, healthcare, and smart infrastructure.
If the Summit sessions were about vision and strategy, the tours were about seeing robotics in motion—from factory floors to hospitals, airports, and digital districts.
Importantly, the tours brought together participants from different companies and countries, reflecting the truly international nature of the ROS-Industrial community and the collaborative spirit of Asia Pacific’s robotics ecosystem.
1. ROS in Manufacturing: SIMTech & ARTC + Black Sesame Technologies, Singapore Polytechnic
SIMTech & ARTC
Spotlight on smart manufacturing innovations.
Demonstrations of autonomous material handling and intelligent inspection systems.
ROS-powered robotics showing how open-source frameworks are shaping industrial transformation.
Reinforced Singapore’s role as a hub for advanced automation and digitalisation.
Singapore Polytechnic – Robotics, Automation and Control (RAC) Hub
Cutting-edge RAC Hub at the School of Electrical and Electronic Engineering.
Co-location labs with industry partners like ShenHao and JP Neura.
Demonstrations of collaborative and inspection robotics powered by ROS.
Clear example of academia-industry collaboration driving automation and intelligent control systems.
2. RMF Deployment in Healthcare & Reconfigurable Robotics: CHART, SUTD
CHART – Centre for Healthcare Assistive & Robotics Technology (CGH)
Demonstration of RoMi-H (Robotic Middleware for Healthcare), built on Open-RMF.
Multi-fleet interoperability enabling ROS and non-ROS robots to work seamlessly in hospitals.
Integration with lifts, automatic doors, and enterprise systems for streamlined operations.
Showcased how robotics enhance patient care and operational efficiency in smart hospitals.
SUTD – Reconfigurable Robotics Showcase
Outdoor mosquito-catching robot “Dragonfly” and snake-repulsing “Naja.”
Infrastructure-focused robots like “Meerkat” and “Panthera 2.0.”
Nested reconfigurable robots demonstrating adaptability across environments.
A creative exploration of embodied AI, blending research ingenuity with real-world challenges.
3. RMF/ROS Deployments: CAG, CPCA, KABAM Robotics, Punggol Digital District – Panasonic
Panasonic – Fleet Management with RMF
Proprietary AI-enhanced RMF integration.
Features like congestion detection, human presence recognition in elevators, and prevention of unintended companion following.
Practical, operationally relevant fleet management for smart districts.
KABAM Robotics
Smart+ RMF Solution integrating multi-robot coordination with PABX and access systems.
Security robots tied into surveillance, access control, and facility management.
Tour of R&D facilities showcasing innovation in robotics for secure, automated environments.
Changi Airport Group (CAG)
Firsthand insights into CAG’s Open-RMF journey.
Live demonstrations of RMF features supporting airport operations.
Strategic vision for scaling interoperability across one of the world’s busiest airports.
CPCA – Hospitality Robotics Integration
Work-in-progress deployment of cleaning and delivery robots in hotel operations.
Robots integrated with lifts and automated doors via RMF dashboard.
Future vision: hotel staff requesting ad hoc robot tasks via StayPlease app.
Demonstrations of robots performing floor cleaning, restaurant bussing, and seamless interaction with smart infrastructure.
RIC-AP Summit Tour 2025: Key Takeaways
Manufacturing track: ROS is powering industrial transformation, bridging academia and industry.
Healthcare track: Open-RMF is operationalised in hospitals, enhancing patient care and efficiency.
Smart infrastructure track: Airports, hotels, and digital districts are adopting RMF for multi-robot orchestration.
The tours underscored a powerful message: Singapore is not just hosting conversations about robotics—it is living them. From labs to live deployments, the RIC-AP Summit tours demonstrated how open-source robotics is shaping industries, communities, and everyday life.
ROS Discourse General: High frequency log persistence on Jetson Orin (Rosbag alternative?)
Hi everyone,
My team has been working on a storage engine specifically optimized for the Jetson/Orin architecture to handle high bandwidth sensor streams (Lidar/Cameras) that tend to choke rosbag record or mcap writing at the edge.
The main architectural difference is that we bypass the kernel page cache and stream directly to NVMe using custom drivers. We are seeing sustained writes of ~1GB/s with <10us latency on Orin AGX, even ensuring persistence during power cuts (no RAM buffer loss).
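For context, here is a rough conceptual sketch of the page-cache-bypass idea using plain O_DIRECT file I/O. This is not the team's engine (which uses custom drivers); it is only an illustration of the technique, and it falls back to buffered writes on filesystems or platforms that do not support O_DIRECT:

```python
import mmap
import os

BLOCK = 4096  # typical NVMe logical block size; O_DIRECT I/O must be block-aligned


def write_direct(path: str, payload: bytes) -> int:
    """Write payload to path, bypassing the kernel page cache where supported."""
    flags = os.O_WRONLY | os.O_CREAT | os.O_TRUNC
    try:
        fd = os.open(path, flags | os.O_DIRECT)  # Linux-only flag
    except (AttributeError, OSError):
        fd = os.open(path, flags)  # fall back to buffered I/O
    try:
        # O_DIRECT requires a block-aligned buffer, length, and file offset;
        # anonymous mmap memory is page-aligned, and we pad to a block multiple.
        size = ((len(payload) + BLOCK - 1) // BLOCK) * BLOCK
        buf = mmap.mmap(-1, size)
        buf.write(payload)
        written = os.write(fd, buf)
        os.fsync(fd)  # flush device-side caches for power-loss persistence
        return written
    finally:
        os.close(fd)
```

Even with O_DIRECT, achieving the latency figures quoted above also depends on queue-depth management and device-side write caching, which a simple sketch like this does not address.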
We are looking for 3-5 teams running ROS 2 on hardware to test a binary adapter we wrote. It exposes a standard ROS 2 subscriber but pipes the data into our crash-proof storage instead of the standard recorder.
If you are hitting bottlenecks with dropped messages at high frequency or struggling with data corruption on power loss, this might solve it.
DM me or reply here and I can send over the binary for aarch64.
1 post - 1 participant
ROS Discourse General: Best practices for thermal camera intrinsics (FLIR A400) in sensor fusion
I’m working with a FLIR A400 thermal camera as part of a sensor-fusion pipeline (thermal + radar / LiDAR).
I just found that, unlike RGB cameras, FLIR does not expose factory intrinsics, and traditional OpenCV checkerboard calibration has proven unreliable due to thermal contrast limitations.
I wanted to start a discussion on what practitioners typically do in this case:
- Using FOV-derived pinhole intrinsics (fx, fy from datasheet FOV)
- Optimizing intrinsics during downstream tasks (SLAM / NeRF / reconstruction)
- Avoiding explicit intrinsics and relying on extrinsics only
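For the first option, the pinhole relation is fx = W / (2 · tan(HFOV / 2)), and likewise for fy with the vertical FOV, with the principal point assumed at the image center. A small sketch (the resolution and FOV numbers below are placeholders for illustration, not actual FLIR A400 datasheet values):

```python
import math


def intrinsics_from_fov(width_px, height_px, hfov_deg, vfov_deg):
    """Pinhole intrinsics from datasheet field-of-view (no distortion model).

    fx = W / (2 * tan(HFOV / 2)); principal point assumed at image center.
    """
    fx = width_px / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))
    fy = height_px / (2.0 * math.tan(math.radians(vfov_deg) / 2.0))
    cx, cy = width_px / 2.0, height_px / 2.0
    return fx, fy, cx, cy


# Placeholder numbers for illustration only -- substitute your camera's values.
fx, fy, cx, cy = intrinsics_from_fov(640, 480, 24.0, 18.0)
```

Note this ignores lens distortion and any offset of the principal point, which is part of why FOV-derived intrinsics are usually only a starting point for downstream refinement.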
I’m especially interested in what has worked in real robotic systems rather than textbook calibration.
Looking forward to hearing how others approach this.
5 posts - 5 participants
ROS Discourse General: Importing PiPER URDF into Isaac_Sim
Preface
With the continuous development of robotics technology, simulation platforms play an increasingly important role in the research, development and testing of robotic arms. As a high-performance simulation tool launched by NVIDIA, Isaac Sim helps developers efficiently model, simulate and verify algorithms for robotic arms. This article will detail how to import the URDF model of the PiPER robotic arm into Isaac Sim, perform relevant configurations and operations, providing a reference for subsequent development and applications.
Tags
PiPER robotic arm, Isaac Sim
Repositories
- Navigation Repository: GitHub - agilexrobotics/Agilex-College: Agilex College
- Project Repository: GitHub - agilexrobotics/piper_isaac_sim: piper_isaac_sim
Environment Configuration
- Operating System: Ubuntu 24.04
- ROS Version: ROS 2 Jazzy
- Graphics Card: 5090
Install Graphics Card Driver
sudo apt update
sudo apt upgrade
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
sudo ubuntu-drivers autoinstall
#reboot
reboot
After rebooting, you can use the following command to verify if the driver is installed successfully:
nvidia-smi
Install Isaac Sim
Installation Method: Installation — Isaac Sim Documentation
After downloading from the link above, unzip it as follows:
cd ~/Downloads
unzip "isaac-sim-standalone-5.1.0-linux-x86_64.zip" -d ~/
cd isaac-sim-standalone-5.1.0-linux-x86_64/
./post_install.sh
./isaac-sim.selector.sh
Select isaacsim.ros2.bridge for the ROS Bridge Extension; then click Start to launch:
Once successfully opened, you can prepare to import the URDF model.
Import URDF
Download URDF Model
Download Link: GitHub - agilexrobotics/piper_isaac_sim: piper_isaac_sim
The URDF and USD files for the PiPER series will be continuously updated in the future.
After opening Isaac Sim, select File->Import in the upper-left corner, then choose the URDF model to import from its actual path:
After successful import, you can see the robotic arm appear at the center position; you can add a ground plane and increase the brightness:
Click the triangle button on the left; after starting the simulation, you will find that the gripper moves. This is because some physical parameters are not defined in the imported URDF and need to be set in Isaac Sim:
The parameter setting method is as follows: open joint1 of the robotic arm and, under Drive->Angular, set Damping to 80 and Stiffness to 400; set all movable joints in the same way.
These parameters are for reference only.
After setting, start the simulation again, and the robotic arm is successfully imported.
Add Camera
Right-click in the blank space and select Create->Camera to create a new camera perspective:
After creation, adjust the camera perspective in Property->Transform; then, in Visual->Visibility, select invisible to hide the camera:
Right-click in the blank space and select Create->Visual Scripting->Action Graph to create an Action Graph that publishes the camera perspective via ROS 2:
The content in the Action Graph is shown below:
After connecting the modules, you need to set some parameters:
For Isaac Create Render Product, select the newly created camera perspective for Camera Prim:
For ROS2 Camera Helper, you can set the frame id and topicName of the camera topic:
Press Ctrl+S to save the USD model, and the import of the Piper USD model is completed.
This article details the complete process of importing the PiPER robotic arm URDF model into the Isaac Sim environment, including environment configuration, model import, physical parameter setting, and camera perspective creation and ROS2 topic publishing. Through these steps, developers can quickly realize visualization and interaction of the PiPER robotic arm in the simulation environment, laying a solid foundation for subsequent algorithm development and system integration. If you encounter problems during actual operation, you can refer to relevant official documents or community resources for further study and communication.
1 post - 1 participant
ROS Discourse General: ROS 2 Kilted on macOS Apple Silicon — TurtleBot4 + Navigation Stack Working End-to-End
I’d like to share the third major milestone in my ongoing work to run ROS 2 natively on macOS Apple Silicon.
At this point, TurtleBot4 is fully functional on ROS 2 Kilted, tested end-to-end with the complete mobile robotics stack:
- Gazebo Ionic
- slam_toolbox
- Navigation2
- ros2_control (gz_ros2_control)
This setup validates the full mobile robotics pipeline on macOS: simulation, control, SLAM, and autonomous navigation.
What is working reliably
- SLAM + Navigation2 running stably on Apple Silicon
- gz_ros2_control integrated and functioning correctly with TurtleBot4
- Clean integration with ROS 2 Kilted (much closer to upstream than Humble)
- No runtime hacks, manual relinking, or environment-variable workarounds
After successfully validating Gazebo, MoveIt 2, ros2_control, and now Navigation2, I’ve published the Kilted branch publicly:
ROS 2 macOS (Kilted):
https://github.com/idesign0/ros2_macOS/tree/kilted
To make TurtleBot4 work correctly with Gazebo Ionic + ROS 2 Kilted, some targeted changes were required compared to the current official simulator instructions.
Those changes are now packaged and ready to test here:
TurtleBot4 (Gazebo Ionic + Kilted, macOS):
https://github.com/idesign0/ROS2_Humble/tree/kilted-mac/turtlebot4
Reference documentation (for comparison):
https://turtlebot.github.io/turtlebot4-user-manual/software/turtlebot4_simulator.html
Toolchain improvements (Kilted, macOS)
One of the biggest improvements over my earlier Humble-on-macOS setup is the maturity of the Kilted toolchain, particularly around RPATH handling and merged installs.
The updated toolchain provides:
- Proper RPATH configuration for merged installs
- Reliable runtime library resolution using @loader_path
- Clean handling of external dependencies (e.g., Boost)
- No reliance on DYLD_LIBRARY_PATH or manual relinking
As a result, I’ve seen no dynamic library loading issues at runtime, even when launching large stacks like Navigation2 and MoveIt 2.
Startup behavior is noticeably smoother, lifecycle transitions are clean, and systems come up consistently—issues that were common for me on macOS Humble.
At this point, the toolchain feels mature enough to support Gazebo, MoveIt 2, ros2_control, and Nav2 under a single, consistent build setup.
Toolchain reference:
https://github.com/idesign0/ros2_macOS/blob/kilted/cmake/toolchain.cmake
Demo video
A short demo of TurtleBot4 running SLAM and Navigation2 on macOS Apple Silicon (Gazebo Ionic + ROS 2 Kilted) is available here:
https://www.linkedin.com/posts/classy29_ros2-ros-navigation2-activity-7409757575837036544-vJtq
This feels like the point where ROS 2 on macOS Apple Silicon moves from experimental to genuinely usable for larger systems.
If you are running ROS 2 on Apple Silicon, feedback and testing are very welcome.
I’ll also be adding Kilted-specific setup and usage instructions to the README soon.
More demos and upstream-related work to follow.
1 post - 1 participant
ROS Discourse General: How to get the type of a message from its string type name in C++?
As an input I have a topic name. I want to get the message type of this topic.
- I know how to find the string type name using node->get_topic_names_and_types(). The resulting string type name is something like rcl_interfaces/msg/ParameterEvent (just for example).
- Is there any way in C++ to get an instance of rcl_interfaces::msg::ParameterEvent, or does this run into C++ being a statically typed language?
In Python I found the following way of doing that:
import argparse

from ros2cli.node.strategy import NodeStrategy
import rosidl_runtime_py.utilities

with NodeStrategy(argparse.Namespace()) as node:
    for (
        topic,
        topic_type,
    ) in node.daemon_node.get_topic_names_and_types():
        if topic == requested_topic:  # requested_topic is the input topic name
            requested_type_str = topic_type[0]
            topic_type = rosidl_runtime_py.utilities.get_message(requested_type_str)
            print(topic_type)  # e.g. <class 'my_custom_package.msg._my_custom_message.MyCustomMessage'>
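Under the hood, get_message essentially resolves the "package/namespace/Type" string into a module attribute via importlib. A generic sketch of that resolution pattern (using a stdlib module path as a stand-in, since no ROS environment is assumed here):

```python
import importlib


def class_from_type_name(type_name: str):
    """Resolve a 'package/namespace/Type' string to the corresponding class.

    This mirrors the lookup that rosidl_runtime_py.utilities.get_message
    performs: import <package>.<namespace> and fetch the Type attribute.
    """
    package, namespace, name = type_name.split("/")
    module = importlib.import_module(f"{package}.{namespace}")
    return getattr(module, name)


# Stand-in demonstration with a stdlib module (no ROS installation required):
cls = class_from_type_name("collections/abc/Mapping")
```

With a sourced ROS 2 workspace, the same lookup for rcl_interfaces/msg/ParameterEvent would import rcl_interfaces.msg and return the ParameterEvent class. On the C++ side, because types are fixed at compile time, runtime resolution from a string generally goes through type-support libraries or generic (serialized) subscriptions rather than a concrete message class.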
3 posts - 3 participants
ROS Discourse General: UrdfArchitect: An AI-powered visual editor to build robot models without manual XML coding
Hi ROS Community,
I’m excited to share a new tool I’ve been developing called UrdfArchitect.
UrdfArchitect is a state-of-the-art, web-based visual environment engineered for the seamless creation, manipulation, and export of Unified Robot Description Format (URDF) models. By abstracting the complexities of raw XML authoring into an intuitive graphical interface, it empowers roboticists to focus on design and innovation.
This platform orchestrates the entire robotic modeling lifecycle—from kinematic skeleton definition to high-fidelity geometric detailing and precise hardware specification. Enhanced by Generative AI, UrdfArchitect accelerates prototyping and ensures compatibility with industry-standard simulation ecosystems like MuJoCo.
Live demo: https://urdf.d-robotics.cc/
GitHub Link: https://github.com/OpenLegged/URDF-Architect
Core Capabilities
- Multi-Mode Design: Seamlessly switch between Skeleton (kinematics), Detail (meshes/collisions), and Hardware (actuator/transmission) design phases.
- Immersive 3D Workspace: Real-time, high-fidelity visualization powered by Three.js. Includes professional transformation gizmos and instant visual analytics for joint axes and frames.
- AI-Augmented Engineering: A natural language interface (OpenAI/DeepSeek) to automate complex tasks, such as generating entire quadruped platforms or integrating sensors using simple text prompts.
- Seamless Interoperability:
  - Import: One-click ZIP ingest of URDFs and meshes.
  - Export: Production-ready packages including standard URDFs, automated BOM (CSV), and pre-configured MuJoCo XML.
- Built-in Motor Library: Instant access to industry-standard actuators from Unitree (Go1/A1) and RobStride, with easy custom extensions.
I’d love to hear your feedback or feature requests! If you find it useful, feel free to give it a star on GitHub or contribute to the development.
2 posts - 1 participant
ROS Industrial: ROSCon 2025 & RIC-AP Summit 2025 Blog Series: Singapore’s Defining Week for Open-Source Robotics
On 30 October, the focus shifted from global to regional at the ROS-Industrial Consortium Asia Pacific Summit, held in conjunction with SWITCH and ROSCon.
ROSCon focused more on the global community, while the RIC-AP Summit was about the growth and contributions within Asia Pacific: deployments, testbeds, standards, talent, and industry partnerships.
1. Launch of ELEVATE @ BCA Braddell Campus
A headline moment of the Annual Summit was the showcase of ELEVATE @ BCA Braddell Campus—an Open-RMF sandbox designed for robot OEMs, system integrators and even end users to validate interoperable solutions, test technical compliance and adopt standard practices for robotics within the built environment space.
Key aspects:
Cyber-physical test environment to validate how robots interact with lifts, doors, building management systems and shared infrastructure using Open-RMF.
Open to end users, OEMs, system integrators and startups to trial real deployments before scaling into commercial operation.
Backed by a multi-agency partnership (A*STAR ARTC, BCA and S&TPPO) and seeded with leading industry participants including robotics and infrastructure providers.
ELEVATE positions Singapore as a reference site for Open-RMF adoption and conformance, giving both local and international players a low-friction path to test, integrate and prove interoperable systems.
During the summit, eight companies showed their commitment to advancing Open-RMF by signing up to use ELEVATE for testing. The companies that have signed a Letter of Support (LoS) are:
Black Sesame Technologies
Delta Electronics
HOPE Technik
KABAM Robotics
Lionsbot
Megazo Technologies
Panasonic
SIMPPLE
1 ELEVATE stands for EvaLuatE, VAlidate, Test Environment.
Photographs used with permission from Singapore Week of Innovation and Technology (SWITCH) www.switchsg.org
2. Expansion of the ROS-Industrial Train and Place Programme with SGInnovate
The Annual Summit also marked the announcement of a partnership between the ROS-Industrial Consortium Asia Pacific (led by ARTC) and SGInnovate to leverage the Deep Tech Central platform to accelerate ROS and robotics talent placement into industry roles.
Key Highlights:
Future-Ready Robotics Workforce: ROS-Industrial Consortium Asia Pacific (ROS-I AP), led by A*STAR ARTC, partners with SGInnovate to strengthen Singapore’s robotics talent pipeline.
Industry Placement Focus: The new partnership emphasizes connecting talent directly with robotics companies such as AiTreat, Fabrica AI, Griffin Labs, Hivebotics, Vilota, and many others.
Hands-On Experience: Trainees gain practical deployment exposure, addressing talent gaps in robotics and embodied AI.
Data-Driven Insights: Deep Tech Central provides analytics to understand talent needs and industry demand, strengthening Singapore’s robotics ecosystem and contributing globally.
Through this collaboration, we connect talent with real deployment opportunities, strengthening Singapore’s position in robotics and deep tech.
What We Shared at RIC-AP Summit 2025 @ SWITCH Beyond: Highlights from the Stage
The summit showcased cutting-edge developments in robotics and embodied AI, starting with a keynote on open-source frameworks like ROS 2 and Open-RMF driving adaptive systems, with Yadu from Intrinsic setting the stage for this transformative conversation.
Global leaders shared the state of ROS-Industrial across APAC, America, and Europe, emphasizing collaboration and talent exchange, with insights from Paul Evans (Executive Director, SwRI), Vishnuprasad Prachandabhanu (Consortium Manager, ROS-I Europe at Fraunhofer), and Maria Vergo (Consortium Manager, ROS-I APAC).
Real-world Open-RMF deployments in airports and hospitals demonstrated its maturity as production infrastructure, while the ELEVATE Sandbox at BCA Braddell Campus was introduced as the national testbed for interoperability.
Panels spotlighted diversity through Women in Robotics and explored multidisciplinary challenges in embodied AI beyond code. It was refreshing to hear from women leaders such as Suchitra Narayan (SGInnovate), Chan Min Ling (HMGICS), Samantha Su (IMDA), and Prof. Malika Meghjani (SUTD), who shared inspiring stories of leadership, innovation, and resilience in robotics.
Future-facing sessions explored two critical themes: the rise of humanoid robotics and the need for a strong talent pipeline. Prof. Han Boon Siew (Schaeffler) delivered an insightful presentation on innovations in humanoid design, mobility, and interaction, framing their societal impact and strategic opportunities in Asia.
Adding to the excitement, Panasonic’s Duyhinh Nguyen shared their journey with Open-RMF, underscoring growing interest from Japanese companies in interoperability and real-world deployments.
On the talent front, Priscilla (SGInnovate) and Sheila (ROS-I APAC) introduced the ROS-Industrial Train-and-Place Programme, calling on industry partners to collaborate in building a future-ready robotics workforce through Deep Tech Central.
The RIC-AP Summit 2025 made one thing clear: Asia Pacific is not just participating in the robotics revolution—it is leading it. From the heart of Singapore, the region is building the future of interoperable robotics, where open-source frameworks, industry partnerships, and talent converge to transform industries and societies.
A special thanks to all our speakers and panellists for sharing their insights and driving meaningful conversations that shape the future of robotics.
ROS Discourse General: Low-cost ROS2/SLAM educational kit upgraded to Jazzy
Hello,
a while ago I designed, manufactured, and made commercially available a low-cost LiDAR robot kit for ROS 2 beginners. It includes Nav2, SLAM, Gazebo, and complete step-by-step from-scratch video instructions. Everything is open source.
I’d like to share that the kit’s software has been upgraded to ROS 2 Jazzy, including ports of all Gazebo simulations.
This means that if you are building (or migrating) a differential-drive LiDAR robot for Jazzy or later, you can copy my working Gazebo simulations.
Here are pointers to more information:
- Kit Online Store | Maker's Pet
- Release announcement Kaia.ai ROS2 Jazzy Released | Kaia.ai
- Complete step-by-step instructions (recorded for Iron; just use the Jazzy Docker image): https://www.youtube.com/watch?v=RCPUQmvS37Q&list=PLOSXKDW70aR8uA1IFahSKVuk5ODDfjTZV
1 post - 1 participant
ROS Discourse General: Upcoming new RMW implementation RMW_SWIFTDDS by Greenstone
We are excited to announce that a new ROS2 RMW (ROS Middleware) layer implementation based on SWIFT_DDS will be released soon! This integration enables ROS2 developers to leverage the high-performance, safety-certified commercial DDS communication middleware developed by Greenstone Technology Co., Ltd.
This RMW implementation consists of one project:
- RMW for SWIFT_DDS: https://github.com/greenstonesoft/rmw_swiftdds
About Greenstone
Greenstone Technology Co., Ltd. (Greenstone) was founded in 2020 by a team with strong roots in Tsinghua University. The company brings together years of research and commercial experience across multiple domains including communications, computer science, intelligent driving, and artificial intelligence. Greenstone is dedicated to building fully autonomous and controllable intelligent driving foundational software platforms through its proprietary core technologies.
For more information, please visit https://www.greenstonesoft.com/en_homepage.
About SWIFT_DDS
1. Remarkable Performance & High Reliability
- Commercial DDS product with 6 years of R&D, verified in projects with OEMs and Tier 1 suppliers
- Proprietary deterministic execution/communication technology that guarantees real-time data communication
- Low overhead and high throughput
- Efficient resource utilization
2. Industry-Leading Safety Guarantees
- ISO 26262 ASIL-D functional safety product certification
- Suitable for safety-critical applications in autonomous driving
- Compliant with MISRA C/C++ coding standards; passes static code analysis in QAC
- Comprehensive tests (unit, integration, embedded, and fault injection) achieving 100% safety requirements coverage and mandatory MC/DC code coverage, as required by ISO 26262 ASIL-D certification
3. Minimal Dependencies
- No third-party code included
- Easily customized/extended according to users’ needs
- Supports a wide range of SoC/OS and MCU/RTOS platforms; easy cross-platform porting
4. Full DDS Standard Compliance and Performance-Enhancing Extensions
- Full compliance with DDS standard specifications
- Support for all DDS QoS policies
- Seamless interoperability with other DDS implementations
- Supported customized features:
  - ZeroCopy: Enables direct data transfer mechanisms to eliminate unnecessary memory copying operations
  - Inter-core communication: Supports direct inter-core communication on heterogeneous chips (Horizon Robotics Journey 6, TDA4, S32G, etc.), such as Arm A-core with R-core and Arm A-core with M-core
  - UDP_FlowControl: Supports configuring DataWriter transmission and DataReader reception bandwidth at the process level, with flexible options to set either individually or simultaneously; the most restrictive flow limit takes effect
  - PreferTransportOrder: Dynamically selects the optimal communication channel based on the configured channel priority order
  - NetworkPortRangeQoSPolicy: Restricts processes to operate within a specified UDP port range
  - ThreadInstanceConfigQoSPolicy: When enabled, elevates thread resources from the participant level to the process level, reducing the number of threads created for nodes
5. Comprehensive Toolchains That Further Accelerate System Integration
- Developers only need to focus on intelligent-driving logic and algorithms. Compatible with common intelligent-driving chips, helping fast integration and verification of intelligent-driving products
Performance Results
Based on the ROS2 standard performance testing framework (https://github.com/ros2/performance_test), we executed systematic performance evaluations of SWIFT_DDS for both fixed-length array data and variable-length string data.
I. Hardware Configuration
1. CPU: x86_64, Intel(R) Xeon(R) E-2314 CPU, 4 cores/4 threads, 2.80 GHz
2. Memory: 32 GB DDR4 3200 MHz
3. Network Card: NetXtreme BCM5720 Gigabit Ethernet PCIe
II. System Configuration
1. Operating System: Ubuntu 24.04 LTS, kernel 6.8.0-87-generic, x86_64
2. Compiler: GCC 13.3.0
The tests covered key metrics including latency, throughput, and CPU utilization. The results show that SWIFT_DDS delivers excellent performance across all tests. Detailed data and testing methodologies are documented in the attached test reports (https://greenstonesoft.github.io/performance_test_report/swift_array_result.html and https://greenstonesoft.github.io/performance_test_report/swift_string_result.html), and we welcome the community to review and provide feedback.
The Merge Request for the RMW_SWIFTDDS binary package has now been submitted. Once the implementation is approved and officially released, we will post an update announcement here. Stay tuned for further updates! Thanks.
Below are some test results excerpted from the attached test report. The latency results for different transport channels are shown: intra-process communication using local transport (INTRA), inter-process communication using shared-memory transport (SHMEM), and inter-process communication using zero-copy transport (Zero Copy), for fixed-length data type (array) packets with packet sizes of 1k, 4k, 16k, 64k, 256k, 1M, and 4M, and ROS_DISABLE_LOANED_MESSAGES=0.
1 post - 1 participant
ROS Discourse General: ROSCon Review Continued | Cloud Robotics WG Meeting 2025-12-17
Please come and join us for this coming meeting at Wed, Dec 17, 2025 4:00 PM UTC→Wed, Dec 17, 2025 5:00 PM UTC, where we plan to dive deeper into the ROSCon talks collected together during the last session. By examining more details about the talks, we can highlight any that would be relevant to Logging & Observability, the current focus of the group. We can also pull out interesting tips to release as part of a blog post.
Last meeting we looked at the talks listed for ROSCon, examining the description of each for any that would be relevant to Logging & Observability. If you’re interested to see the meeting, it is available on YouTube.
The meeting link for next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.
Hopefully we will see you there!
1 post - 1 participant
ROS Discourse General: Toio ROS 2 wrapper
I published a ROS 2 package for using toio.
- GitHub - atinfinity/toio_ros2: toio_ros2 is ROS 2 package for using toio.
- GitHub - atinfinity/toio_description: toio_description provides a 3D model of the toio for visualization.
You can watch a demo movie.
1 post - 1 participant
ROS Discourse General: MoveIt 2 Motion Planning Working on ROS 2 Kilted for macOS Apple Silicon
ROS 2 Kilted running natively on macOS Apple Silicon — MoveIt 2 Planning Demo
After getting Gazebo Ionic and the ros_gz_bridge running smoothly on macOS Apple Silicon, I’m excited to share the next big milestone:
MoveIt 2 motion planning is now working natively on the ROS 2 Kilted build for macOS.
Using modified MoveIt 2 tutorials adapted for the Kilted branch, I’ve been testing multiple planners:
- OMPL
- CHOMP
- pilz_industrial_motion_planner
The results are surprisingly stable — even when pushing the robot into unusual joint configurations, MoveIt 2 consistently recovers, replans, and stays responsive.
Highlights:
- No crashes on failed planning requests — MoveIt 2 cleanly handles errors and retries.
- ros2_control runtime controller switching works reliably.
- More stable than Humble on macOS — Kilted is closer to upstream, with smoother behavior and less patching.
Resources:
- MoveIt 2 tutorials Kilted branch update: Commit link
- Kilted macOS ROS 2 repo: ros2_macOS
1 post - 1 participant
ROS Discourse General: What do you use for localization and mapping?
Hi folks!
Ditto. There are plenty of options out there these days, and perhaps that (partially?) explains why the choice isn’t as clear cut as it was back when robot_pose_ekf, amcl, and karto or cartographer were as much as you could get, at least for mobile robots. GitHub insights and packages.ros.org stats do not tell the whole story, so I come to you for the wisdom of the crowds.
What are you using? Why? Not looking for theoretical justifications (+1 if you have them though), some account of a package or library that you fall back often because it’s proven useful is enough.
And I must confess this isn’t pure intellectual curiosity. At Ekumen we’ve been working on Ekumen-OS/beluga for some time now, and we’ve crossed paths with unexpected users more than once. Often by chance. So if you are using it we definitely want to hear from you too.
Thanks in advance!
PS: it sounds like work but it’d be cool to have something like a community tech radar for stuff like this.
3 posts - 3 participants
ROS Industrial: One Attendee’s Observations from ROSCon 2025
ROSCon 2025 took place from October 27–29 in Singapore, drawing approximately 1,000 attendees from 52 countries. It was followed by the ROS-I Asia-Pacific Annual Summit on October 30, which featured speakers, panels, lab tours, and a networking dinner with about 150 participants. This year’s ROSCon showcased an array of sessions, workshops, and discussions designed to share advancements and inspire innovation in the ROS ecosystem.
The event kicked off with Day 0 workshops, showcasing highlights such as PickNik's MoveIt Pro performance demonstrations, ROS-Industrial's Scan-N-Plan approach, and a session on deploying ros2_control on ESP32 microcontrollers, alongside an engaging underwater robotics workshop. The main conference centered on emerging robotics trends, including AI integration with ROS, advanced motion planning, and the use of physics simulators like MuJoCo. Key discussions covered multi-fleet coordination with Open-RMF, improved ROS workflows, Rust-based development with the feature-complete rclrs, and challenges like scaling AI with multi-modal sensors and improving policy execution pipelines.
Attendees left with actionable insights, including the importance of building familiarity with RMW Zenoh, exploring AI-driven approaches like imitation and reinforcement learning, and optimizing ROS 2’s capabilities for industrial robotics applications. The conference concluded with significant community enthusiasm and a look ahead to ROSCon 2026, which will take place in Toronto. Participants were encouraged to engage and share their work, in particular by submitting talk abstracts for ROSCon 2026, and there is an opportunity for industry to engage and drive the improvements and awareness that make sustainable leverage of ROS 2 possible on shop floors.
Stay tuned and we look forward to hearing more around industry needs and opportunities for industrial leverage of ROS 2.
ROS Discourse General: DDS in ROS 2: Consolidated User Insights
Hello everyone,
Before anything else: my intention here is not to turn this thread into a complaint fest.
DDS and ROS 2 are topics that periodically raise questions for me - and for others as well - so I’d like to collect and consolidate existing insights. Positive experiences, success stories, and best-case user datapoints about DDS are especially welcome!
My goal is to gather constructive feedback or identify where such information has already been summarized. I noticed several ROSCon DE 2024 talks in Strasbourg touched on related challenges, which inspired me to start this thread.
My question: What solid resources already exist - talks, guides (e.g., Clearpath’s ROS 2 networking material), previous discourse threads, etc. - that I could collect into one place?
I’d really appreciate neutral or positive, technically grounded feedback.
Thanks in advance for your help!
Cheers from Franconia
Here would be my starting list:
- ROS2 Default Behavior (Wifi) - 2020 (old)
- Clearpath ROS 2 Connectivity Guide - Jazzy Version (1-2 years old?)
- Zenoh ROSCon Talk 2023
- ROS 2 Alternative middleware report - 2023
- Another Zenoh ROSCon Talk 2024
- From DDS to Zenoh Migrating the Dexory Autonomy ROS Stack—Configuration, Performance, and External Integration - 2025 ROSCon
- Zenoh Strikes Back: From a New Hope to Tier-1 - 2025 ROSCon
- DoosanRobotics - Toward Scalable Collaborative Robot Controllers: Internalizing ROS 2 With Zenoh - ROSCon 2025
Please share if you have any more datapoints.
11 posts - 10 participants
ROS Discourse General: ROS 2 Kilted Running Natively on macOS Apple Silicon – First Demo
Hi everyone,
After months of debugging, patching, and testing, I’m excited to share a first working demo of ROS 2 Kilted running natively on macOS Apple Silicon. This includes support for major robotics frameworks:
- Gazebo (Ionic)
- ros2_control
- MoveIt 2
- Navigation2
Demo (Video) Link: LinkedIn Demo
The demo showcases ros_gz_bridge in action, with real-time sensor and image data flowing between ROS 2 and Gazebo Ionic. Even ros2_control is running reliably — I’m still verifying some edge cases, but the core functionality is stable.
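As background for readers new to ros_gz_bridge: each bridged topic is described by a `<topic>@<ROS 2 type>@<Gazebo type>` triplet passed to parameter_bridge. A minimal sketch of the syntax, using the common /clock mapping as an example (the actual invocation is commented out, since it requires ROS 2 and ros_gz to be installed):

```shell
# ros_gz_bridge's parameter_bridge takes one argument per bridged topic,
# in the form <topic>@<ROS 2 message type>@<Gazebo message type>.
BRIDGE_SPEC="/clock@rosgraph_msgs/msg/Clock@gz.msgs.Clock"
echo "$BRIDGE_SPEC"

# With ROS 2 and ros_gz installed, the bridge would be started as:
# ros2 run ros_gz_bridge parameter_bridge "$BRIDGE_SPEC"
```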
Most of the related pull requests have been merged upstream, so macOS support should soon be visible in the community.
I haven’t published the full Kilted branch yet; I plan to release it once I have solid demos for MoveIt 2 and Navigation2. Meanwhile, the Humble branch is already functional:
Humble Branch on GitHub
I’d love to hear feedback, suggestions, or experiences from anyone trying ROS 2 on macOS Apple Silicon!
Thanks,
Dhruv Patel
LinkedIn: https://www.linkedin.com/in/classy29
GitHub: https://github.com/idesign0
1 post - 1 participant
ROS Discourse General: Announcing synchros2 from the RAI Institute!
We’re excited to announce the release of synchros2 for ROS 2!
synchros2 is our open-source package designed to make synchronization in ROS 2 easier, faster, and more reliable. It simplifies synchronous programming in ROS 2 by providing a clean, modern Python API and flexible synchronization policies. We hope it helps developers and researchers use ROS 2 more easily through features such as:
- Blocking in callbacks (subscribers, services, actions, etc) without creating deadlocks
- Single-node-per-process semantics similar to ROS 1 (optional)
- Wrapper APIs for subscribers, with features like looping through messages in real time and waiting for the next message
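The deadlock in the first bullet can be sketched without ROS at all: if a callback blocks on work that must run on the same single-threaded executor, it can never finish. A minimal pure-Python analogue of that situation (illustrative only — this uses no ROS or synchros2 APIs):

```python
# Illustrative only: a pure-Python analogue of the callback deadlock that
# synchros2 is designed to avoid. No ROS or synchros2 APIs are used here.
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def service_call():
    return "response"

def callback(executor):
    # Block on work scheduled on the given executor, much as a ROS callback
    # might block on a service call handled by the same executor.
    return executor.submit(service_call).result(timeout=0.5)

# One worker thread: the callback occupies the only thread while waiting
# for service_call, which therefore never runs -> effectively a deadlock.
single = ThreadPoolExecutor(max_workers=1)
try:
    single.submit(callback, single).result(timeout=1.0)
    single_result = "completed"
except TimeoutError:
    single_result = "deadlocked"
print("single worker:", single_result)   # single worker: deadlocked

# Two worker threads: the inner task runs on the second thread, so the
# blocking callback completes normally.
multi = ThreadPoolExecutor(max_workers=2)
multi_result = multi.submit(callback, multi).result(timeout=1.0)
print("two workers:", multi_result)      # two workers: response
```

Making this "block in a callback without deadlocking" pattern safe by default is exactly the kind of ergonomics the package advertises.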
We recommend starting with the Getting Started Guide:
You can explore the full API and usage docs here:
If you already have a ROS 2 project in the works and are interested in migrating to synchros2, a guide is available for that here: Migrating from rclpy to synchros2 — synchros2 1.0.4 documentation
synchros2 has been released into Humble and is in staging for Jazzy.
A big thank you to everyone who provided feedback, tested early versions, and helped get synchros2 to this milestone! And an extra big thank you to @hidmic, @khughes1, and @jbarry!
As always, contributions and issues are welcome on our repository!
— The RAI Institute
1 post - 1 participant
ROS Discourse General: Seeking ROS2-Based Drone with Onboard SLAM Capability for Educational Use
Hello everyone,
I would like your recommendations for ROS2-compatible drones suitable for educational and research purposes. I’ve looked through several options but haven’t found the ideal solution yet.
My Requirements:
- ROS2 native support or well-maintained ROS2 integration
- Onboard sensors capable of SLAM (3D LiDAR, RGBD camera, or stereo camera)
- Ability to operate indoors without external positioning infrastructure
- Budget: approximately $6,000 USD
What I’ve Tried/Considered:
I came across this helpful discussion: https://discourse.openrobotics.org/t/trying-to-find-pre-built-drones/44168, which recommends the Crazyflie platform. While Crazyflie is excellent for swarm research and basic control, it requires external infrastructure such as motion capture systems or marker-based localisation (e.g., Lighthouse or Loco Positioning), which isn’t practical for my use case.
Similarly, I’ve used DJI Tello drones, but they share the same limitation—reliance on external environmental setup for accurate localisation and mapping.
What I’m Currently Considering:
I’ve been looking at the ModalAI Starling 2 Max (https://www.modalai.com/products/starling-2-max?variant=48172375900484), which appears promising with its VOXL 2 flight computer, stereo cameras, and PX4/ROS2 support. However, I’d appreciate feedback from anyone who has hands-on experience with this platform, particularly regarding:
- Ease of integration with ROS2
- Reliability of onboard VIO/SLAM for indoor navigation
- Suitability for student projects and coursework
- Documentation quality and community support
Use Case:
The drones will be used for teaching autonomous navigation, path planning, and SLAM concepts to postgraduate students. Ideally, students should be able to develop and test algorithms in simulation (Gazebo/Webots/PyBullet) and deploy them on real hardware with minimal friction.
I’d greatly appreciate any recommendations, alternatives, or insights from those with experience in this area. If there are other platforms I should consider within this budget range, please do share.
Thank you in advance for your help!
1 post - 1 participant
ROS Discourse General: Awesome list of ROS2 packages
Hi community.
I have created an awesome list of ROS2 packages on my GitHub. It covers a wide range of topics, like motion and planning, localization (SLAM algorithms), logging, monitoring, client libraries for different languages, useful development tools, AI-based tools, etc.
It’s regularly updated with new interesting packages. I hope it will be useful to everyone in the community.
Here is the Awesome list of ROS2 packages.
2 posts - 1 participant