
Planet ROS

Planet ROS - http://planet.ros.org



ROS Industrial: ROSCon 2025 & RIC-AP Summit 2025 Blog Series: Singapore’s Defining Week for Open-Source Robotics

From 27–30 October 2025, Singapore became the beating heart of the global ROS ecosystem.

Over three days, **ROSCon 2025 (27–29 Oct)** convened more than 1,000 participants from 52 countries: maintainers, developers, startups, MNCs, public agencies and researchers, all united by a shared belief in open source as the fastest path to real-world robotics at scale.

This was more than a conference week. It was a signal: open, interoperable robotics, anchored in Singapore and built with the world, is here to stay.

Where Code Meets Collaboration: Reflections from ROSCon 2025 Singapore

Hosted in Singapore for the first time, ROSCon 2025 brought the global ROS community to Marina Bay with three intense days of technical talks, tutorials, demos and hallway architecture debates.

The event was honoured by the presence of Prof Tan Chor Chuan, Chairman of A*STAR.

In his remarks, Prof Tan highlighted how open-source innovation, collaborative standards, and talent development are becoming the cornerstones of Singapore’s advanced manufacturing and robotics strategy.
He commended the Open Source Robotics Foundation (OSRF) and A*STAR’s Advanced Remanufacturing and Technology Centre (ARTC) for their leadership in cultivating an ecosystem that bridges research and industry, noting that:

“Open-source robotics represents not only technological advancement but also a new model of global cooperation. By enabling interoperability and collective innovation, we can accelerate deployment across sectors — from manufacturing to healthcare — while nurturing the next generation of deep-tech talent in Singapore.”

Prof Tan’s message set the tone for the conference — underscoring Singapore’s commitment to being a neutral and collaborative hub for open-source robotics, embodied AI, and digital transformation.

Beyond the energy on stage and in the expo hall, several milestones framed the week:

1. OSRF–ARTC Collaboration on Open-RMF

At ROSCon, the Open Source Robotics Foundation (OSRF) and A*STAR’s Advanced Remanufacturing and Technology Centre (ARTC) announced a strategic collaboration to:

    • Co-develop best practices, guidelines and testing plans for Open-RMF as a foundation for global robot interoperability.

    • Use Singapore’s new national sandbox at BCA Braddell Campus as a reference site for validation and certification of RMF-based deployments.

    • Strengthen community engagement so that Open-RMF continues to evolve as a truly open, production-grade standard.

This partnership cements Singapore’s role not just as a user of open-source robotics, but as a shaper of global interoperability standards.

2. National Standards & Testbeds for Interoperability

Announcements around SS 713 (data exchange between robots, lifts and automated doorways) and TR 130 (interoperability between robots and central command systems) showcased how regulation, infrastructure and open-source can move in lockstep to make multi-vendor robot fleets safe and scalable.

3. Singapore as Neutral, Open Hub

With delegates and contributors from across the US, Europe, China, India, and the broader Asia Pacific, ROSCon 2025 reinforced Singapore’s unique role as:

    • A neutral ground for collaboration amid a more fragmented geopolitical landscape.

    • A trusted environment to host shared infrastructure, reference implementations and standards for open-source robotics, embodied AI and Open-RMF-driven ecosystems.

As the curtains close on ROSCon 2025 in Singapore, we are deeply honoured and inspired to have hosted this extraordinary gathering of over a thousand innovators, engineers, and visionaries from across 52 countries. The energy, ideas, and partnerships sparked over these few days reaffirm the strength of the open-source robotics community — one that thrives on collaboration, inclusivity, and shared purpose.

At A*STAR’s Advanced Remanufacturing and Technology Centre (ARTC) and ROS-Industrial Consortium Asia Pacific, we are excited to continue nurturing these collaborations — strengthening our ties with OSRF and the global ROS community, advancing Open-RMF, and building pathways that connect research to real-world adoption.

As we look ahead, we can’t wait to see how the community will come together again for ROSCon 2026 in Toronto — where new ideas will take flight, new contributors will emerge, and the open-source movement will reach even greater heights.

 

[WWW] https://rosindustrial.org/news/2025/11/27/roscon-2025-amp-ric-ap-summit-2025-blog-series-singapores-defining-week-for-open-source-robotics

ROS Discourse General: Is ament_auto_add_library() recommended or not?

Hi,

I see some projects use ament_auto_add_library extensively, but when I tried to look up the documentation for ament_auto_add_library, I could barely find any.

In the ament_cmake user documentation (ROS 2 Documentation: Jazzy), ament_auto_add_library is not mentioned; instead, it says to use add_library to create a library.
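For reference, the two styles being contrasted look roughly like this. Target and source names are placeholders, and the auto variant assumes ament_cmake_auto, which derives its find_package() calls from package.xml:

```cmake
# Explicit style, as shown in the ament_cmake user docs:
find_package(ament_cmake REQUIRED)
find_package(rclcpp REQUIRED)

add_library(my_lib src/my_lib.cpp)           # plain CMake
target_include_directories(my_lib PUBLIC
  $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
  $<INSTALL_INTERFACE:include>)
ament_target_dependencies(my_lib rclcpp)

# ament_cmake_auto style: terser, driven by package.xml
# find_package(ament_cmake_auto REQUIRED)
# ament_auto_find_build_dependencies()       # find_package() for each <depend>
# ament_auto_add_library(my_lib src/my_lib.cpp)
# ament_auto_package()
```

The trade-off is explicitness versus brevity: the auto macros hide the dependency wiring, which is why some projects avoid them in favour of the documented plain add_library approach.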

So I’d like to check what is the recommended way to use ament_cmake to create a project (library or executable).

Thanks.

4 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/is-ament-auto-add-library-recommended-or-not/51160

ROS Discourse General: New ROS coding agent

Hi everyone, I’ve created an AI coding agent specialized for ROS. I got tired of the current LLMs being useless/hallucinating and decided to train something that actually understands ROS conventions and workspaces. You can find it here at www.contouragent.com, I’d love your feedback.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-ros-coding-agent/51154

ROS Discourse General: Kilted: rmw_zenoh_cpp installed by default by ament_cmake_ros?

Hi guys, I’ve just noticed I had rmw_zenoh_cpp without explicitly installing it. Not that I would mind :wink:

Backtracking who brought it into the system, the reverse dependencies are: rmw_test_fixture_implementation ← ament_cmake_ros.

Is that right? I thought there’s only one default RMW installed (FastDDS).

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/kilted-rmw-zenoh-cpp-installed-by-default-by-ament-cmake-ros/51131

ROS Discourse General: New packages for Humble Hawksbill 2025-11-24

Package Updates for Humble

Added Packages [30]:

Updated Packages [654]:

Removed Packages [2]:

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-packages-for-humble-hawksbill-2025-11-24/51129

ROS Industrial: Scan-N-Plan Workshop Documentation Refresh

Scan-N-Plan technologies provide tools for real-time robot trajectory planning based on 3D scan data, addressing the limitations of traditional industrial robot programming methods like teach-pendant programming or offline simulation. This approach is ideal for applications where:

The ROS-Industrial Consortium has been advancing tools to support the development of innovative end-user applications and has made them accessible for broader use. The scan_n_plan_workshop offers a ROS 2-based software framework for perception-driven surface processing, providing all the foundational elements needed to understand and implement Scan-N-Plan solutions.

Recently, an updated documentation page was published to serve as a comprehensive resource for developers and learners interested in Scan-N-Plan or in using ROS 2 to build industrial applications. It outlines what is included, how to get started, and how ROS 2 can be leveraged effectively.

Key features of the documentation include:

We’re excited to see how the community adopts and engages with these resources. If you have any questions or requests, don’t hesitate to reach out!

[WWW] https://rosindustrial.org/news/2025/11/24/scan-n-plan-workshop-documentation-refresh

ROS Discourse General: Declarative node interfaces + Static recovery of the ROS Graph

Hi all - wanted to share some work we’ve been doing internally to help with static recovery and visualization of our ROS graph. We did this by hijacking some other tools we’ve been using to standardize our package structure and simplify our launch files.

We ended up creating and open sourcing three packages:

cake is a concept that started as trying to simplify the boilerplate required to set up a C++ node by using a more functional approach to node initialization. For the purposes of static graph analysis, we extended it to include a declarative interface file (publishers, subscribers, etc) which would be consumed at build time to generate the ROS code required for these interfaces. A lot of inspiration was taken from the picknik robotics generate_parameter_library (in fact we wrapped this package and included parameters in the node interface file). If you follow the folder structure suggested, cake also provides an automatic cmake macro, which uses ament_cmake_auto under the hood.

clingwrap is yet another Python launch wrapper. Arguments about reinventing the launch wrapper wheel aside, it was convenient for us: we were already using it in all our launch files, so it provided a good way to instrument them and statically extract node launch details. clingwrap provides a LaunchBuilder object, a subclass of LaunchDescription. The idea is that LaunchBuilder is a mutable object which tracks all launch actions as it gets mutated, meaning you just have to return it at the end of generate_launch_description. This lets us add extra tracking logic inside the LaunchBuilder class and expose a get_static_information method on it: the user calls generate_launch_description and then get_static_information on the resulting object, which returns a dataclass of node information such as package and executable name, remappings, etc. We explored parsing the underlying actions that come out of the base launchfile system, but this got complicated quickly (especially recovering composable node information!), so we fell back to this simpler solution.
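The tracking idea can be sketched ROS-free like this. Note this is a toy stand-in, not clingwrap's actual API: the real LaunchBuilder subclasses LaunchDescription, and the names NodeInfo/add_node here are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class NodeInfo:
    """Static metadata recorded for one node, recoverable without launching."""
    package: str
    executable: str
    remappings: dict = field(default_factory=dict)

class LaunchBuilder:
    """Mutable builder that records every node added to it, so the
    launch structure can be inspected without executing a launch."""

    def __init__(self):
        self._nodes = []

    def add_node(self, package, executable, remappings=None):
        # Record the action instead of (only) constructing it.
        self._nodes.append(NodeInfo(package, executable, remappings or {}))

    def get_static_information(self):
        # Hand back the recorded metadata for static graph analysis.
        return list(self._nodes)

builder = LaunchBuilder()
builder.add_node("demo_pkg", "talker", {"chatter": "/chat"})
for node in builder.get_static_information():
    print(node.package, node.executable, node.remappings)
```

Because the builder is returned from generate_launch_description as usual, the same file serves both normal launching and static extraction.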

breadcrumb is a cli tool that uses the static interfaces from cake and the launch information from clingwrap to generate the final runtime ROS graph, based on a launchfile provided (without executing a ROS launch). It then spits out the graph as a json file or a graphviz dot file.

The breadcrumb repo also has an example cartpole project that shows how all three packages work together in one system: breadcrumb/breadcrumb_example at main · greenforge-labs/breadcrumb · GitHub

All of these packages are still very fresh - we are rolling them out in our codebase currently, and expect to extend them where we find extra usecases / corner cases.

Whilst they are somewhat specific to our current system (i.e. you can’t use breadcrumb without rewriting all your launchfiles with clingwrap) I thought it was worth sharing what we’ve come up with for moving our ROS codebase towards being more declarative and statically analyzable.

(the statically generated ROS graph diagram from the breadcrumb example project)

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/declarative-node-interfaces-static-recovery-of-the-ros-graph/51124

ROS Discourse General: NVIDIA Isaac ROS 4.0 for Thor has arrived


NVIDIA Isaac ROS 4.0 for Thor is HERE.

NVIDIA Isaac ROS 4.0, an open-source platform of accelerated packages and reference applications for the ROS 2 ecosystem, is now generally available.

With support for JetPack 7.0 and Isaac Sim 5.1, you can now unlock the power of Jetson AGX Thor with your Isaac ROS applications. This release includes a new Isaac for Manipulation reference application for deploying learned policies with motion planning for a gear insertion task. The new multi-object pick-and-place workflow using behavior tree orchestration showcases new packages for FoundationStereo and GroundingDINO. Finally, improvements for FoundationPose, NITROS performance, and visual mapping and localization along with new Segment Anything 2 round out Isaac ROS with the power of Thor. :mechanical_arm:

Check out the full details :right_arrow: here.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/nvidia-isaac-ros-4-0-for-thor-has-arrived/51079

ROS Discourse General: Invitation: Multi-Robot Systems Summer School 2026 (Prague)

We would like to share the announcement for the IEEE MRS Summer School 2026, taking place July 29 – August 4, 2026 in Prague.

The event is open to anyone working in multi-robot systems, autonomous UAV/UGV control, distributed coordination, perception, planning, or ROS-based robotics. Over the years the summer school has welcomed more than 1000 participants from academic labs and industry teams worldwide.
The main goal is to bring together people working on similar MRS challenges and create space for collaboration, exchange of ideas, and hands-on experimentation.

What the program includes:

Registration:
Early registration fees apply until December 31.
If anyone in your team needs a few extra days, the organizers can extend the reduced fee individually.

Application link:
https://forms.gle/1PWThrYoNZwtAtAU9

Event page:
https://mrs.fel.cvut.cz/summer-school-2026/

If you or your colleagues are working with multi-robot systems, this is a solid opportunity to join the global community, work with real hardware, and connect with people solving similar problems.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/invitation-multi-robot-systems-summer-school-2026-prague/51076

ROS Discourse General: Discoverability of documentation on search engines

I’m running into a bit of a weird problem. Maybe it’s more of an observation. There are times when I struggle to find the documentation for ROS code on Google.

Take the following examples:

I noticed very clearly that Google never gives me ROS source code, and very often serves outdated API documentation pages. This is really frustrating, as it takes longer than I expect to get answers about very normal functions.

I’m really not sure what’s causing this. Is it just bad SEO? Is no one linking to ROS2 docs on the web and so Google doesn’t prioritize them? Am I just bad at googling? I’m curious if other people have noticed this. It’s making me feel a little crazy.

7 posts - 5 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/discoverability-of-documentation-on-search-engines/51059

ROS Discourse General: Depth Anything 3 ROS 2 Wrapper Development

edited: fix link for wrapper repo

This is a repost from openrobotics.zulipchat.com that I made earlier; it was suggested I post here, so here goes…

So, over the past couple days we’ve been working on getting Depth Anything 3 (DA3 - the new monocular depth estimation model from ByteDance) running with ROS2. For those unfamiliar, Depth Anything 3 is basically a neural network that can estimate depth from a single camera image - no stereo rig or LiDAR needed. It’s pretty impressive compared to older methods like MiDaS.

Depth Anything V3 paper: [2408.02532] Singularity categories of rational double points in arbitrary characteristic
Official DA3 repo: GitHub - ByteDance-Seed/Depth-Anything-3: Depth Anything 3
Our GitHub DA3 ROS2 Wrapper Repo: GitHub - GerdsenAI/GerdsenAI-Depth-Anything-3-ROS2-Wrapper: ROS2 wrapper for Depth Anything 3 (https://github.com/ByteDance-Seed/Depth-Anything-3)

Here’s what the system looks like running on our Jetson:

You can see three terminals:

The depth visualization uses a turbo colormap (blue = close, red/orange = far). The quality is honestly better than we expected for monocular depth.

Platform: NVIDIA Jetson AGX Orin 64GB (Syslogic A4AGX64 industrial variant)

Camera: Anker PowerConf C200 2K USB webcam

Software:

Current Performance (This is Where We Need Help)

Here’s what we’re seeing:

Inference Performance:

Is PyTorch the problem? We’re running standard PyTorch with CUDA. Would TensorRT conversion give us a significant speedup? Has anyone done DA3 → TensorRT on Jetson?

Memory bandwidth? Could we be hitting memory bandwidth limits moving tensors around?

Is the model just too big for real-time? The SMALL model is 25M params. Maybe we need to quantize to FP16 or INT8?

FP16 precision - The Ampere GPU supports FP16 tensor cores. Depth estimation might not need FP32 precision.

Optimize the preprocessing - Right now we’re doing image normalization and resizing in Python/PyTorch. Could we push this to GPU kernels?
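To make the FP16 idea above concrete, here is a minimal NumPy sketch of half-precision preprocessing. It is not the DA3 pipeline; the frame size is illustrative, and NumPy only demonstrates the dtype and memory effect (on the GPU the equivalent would be something like model.half() or torch.autocast in PyTorch):

```python
import numpy as np

# Toy stand-in for a camera frame: HxWx3 uint8 image.
frame = np.random.randint(0, 256, (1440, 2560, 3), dtype=np.uint8)

# Normalize to [0, 1] in float32, then cast to half precision.
fp32 = frame.astype(np.float32) / 255.0
fp16 = fp32.astype(np.float16)

# Half the bytes per tensor means half the memory traffic when
# moving data between CPU and GPU - often the bottleneck on Jetson.
assert fp16.nbytes == fp32.nbytes // 2
print(fp16.shape, fp16.dtype, fp16.nbytes)
```

Whether FP16 (or INT8 via TensorRT) is acceptable depends on how much depth accuracy the application can trade for throughput.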

Has anyone done any of this successfully? Especially interested if anyone’s gotten DA3 or similar transformers running fast on Jetson.

The paper claims real-time performance but they’re probably testing on desktop GPUs. Getting this fast on embedded hardware is the challenge.

We got it working, which is cool, but 6 FPS is pretty far from real-time for most robotics applications. We’re probably doing something obviously wrong or inefficient - this is our first attempt at deploying a transformer model on Jetson.

Contact: GerdsenAI · GitHub
License: MIT

Feel free to contribute! :slight_smile:

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/depth-anything-3-ros-2-wrapper-development/51030

ROS Discourse General: The State of ROS Remote Debugging in 2025: Are We Still Stuck with SSH X11 and Fragile Tunnels? (Survey)

I’m a robotics engineer turned product builder. After years in R&D and recently interviewing around 30 robotics teams, I noticed a pattern that honestly surprised me:

Even teams with mature deployments are still relying on fragile, “temporary” setups for remote debugging.

You may recognize some of these:

Enterprise platforms (Formant, Freedom, etc.) are powerful but often expensive or too heavy for simple debugging needs. Meanwhile, open-source solutions feel fragmented.

My Hypothesis: We don’t need another heavy “platform.” We need a simple, reliable, UNIX-style pipe that just works.

I’m exploring a “stupidly simple” API focused purely on transport (low latency, resilient under packet loss). But before I commit to the architecture, I want to validate my assumptions with you.



What’s in it for you?

  1. The Data: I’ll compile the responses into an open “2025 ROS Remote Access Landscape Report” and share it here.

  2. The Access: I’m looking for beta testers — 10 random participants will receive lifetime free access to the managed API tier at launch.

PS: After you fill out the form, drop a quick reply below (e.g., “Done” or your biggest pain point). It helps keep this thread visible so we can get more community data!

Cheers, Russell [russellshe at gmail dot com]

5 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/the-state-of-ros-remote-debugging-in-2025-are-we-still-stuck-with-ssh-x11-and-fragile-tunnels-survey/51025

ROS Discourse General: Building a true release mode only

Hi, I’ve been working on building our software stack in release mode only, without building any packages that are only test_depend. The problem I’m having is that colcon scoops up all the dependencies no matter how they’re marked in package.xml. I don’t use rosdep, as I don’t necessarily trust that every dev out there chose wisely when writing their package.xml, so I’m trying to do this in a more manual way.

I don’t believe I should have to build something like ament_cmake_pep257 if I have no plan to build any tests. I also shouldn’t be installing *-dev Debian packages in release builds. E.g. a package I have depends on libglib2.0-dev for building but only needs libglib2.0-0 at runtime, so the process I want is: build the package in release mode, create a new image with only the release dependencies, and copy the install/ space over to that new image.

Colcon, though, won’t let me separate out those packages I don’t want to build, even though they are only <test_depend>. Does anyone else do this or have thoughts?
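For the release-only half of this, one common baseline (it does not solve the test_depend package filtering, but it keeps test tooling out of each package's build) is to turn testing off at the CMake level:

```shell
# Release build with test targets disabled workspace-wide.
# BUILD_TESTING is the standard CTest/ament switch; packages that
# guard their test deps with if(BUILD_TESTING) will skip them.
colcon build \
  --cmake-args -DCMAKE_BUILD_TYPE=Release -DBUILD_TESTING=OFF
```

As the post notes, colcon still treats test_depend-only workspace packages as part of the graph, so excluding them from the build currently needs an explicit list (e.g. via --packages-skip).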

7 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/building-a-true-release-mode-only/50986

ROS Discourse General: ROSCon Review | Cloud Robotics WG Meeting 2025-11-19

Please come and join us for this coming meeting on Wed, Nov 19, 2025, 4:00–5:00 PM UTC, where we plan to read through the talks from ROSCon 2025 and note down any talks relevant to Logging & Observability, which has been an ongoing topic of research for the group.

Last meeting we had a general catch-up, including reviewing the last few sessions with guest talks and deciding on some meetings to have going forwards. If you’re interested to see the meeting, it is available on YouTube.

The meeting link for next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/roscon-review-cloud-robotics-wg-meeting-2025-11-19/50982

ROS Discourse General: Foxglove announces $40M series B

Hi!

Thought I would share the news with the community. At Foxglove, we just raised a $40M series B. You can read our announcement here: Robots are eating the world that software could not.

We are looking forward to our continued ROS support and growing together with the community!

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/foxglove-announces-40m-series-b/50951

ROS Discourse General: ROS 2 Rust Meeting: November 2025

The next ROS 2 Rust Meeting will be Mon, Nov 10, 2025 2:00 PM UTC

The meeting room will be at https://meet.google.com/rxr-pvcv-hmu

In the unlikely event that the room needs to change, we will update this thread with the new info!

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-rust-meeting-november-2025/50900

ROS Discourse General: Will intrinsic supports ros2 on bazel with bzlmod?

I found out that Intrinsic hosts its own bzlmod registry for ROS 2 core packages.

Will Intrinsic officially support ROS 2 on Bazel with bzlmod?

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/will-intrinsic-supports-ros2-on-bazel-with-bzlmod/50890

ROS Discourse General: Certificate problem on packages.ros.org?

I’ve just had a very confusing time with a dockerfile

I wonder if it’s related to the cert error I see here

https://packages.ros.org/ros2/ubuntu/pool/main/r/ros-humble-image-transport-plugins/ros-humble-image-transport-plugins_2.5.4-1jammy.20251017.031841_amd64.deb

5 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/certificate-problem-on-packages-ros-org/50887

ROS Discourse General: Space ROS Jazzy 2025.10.0 Release

Hello ROS community!

The Space ROS team is excited to announce that Space ROS Jazzy 2025.10.0 was released last week and is available as osrf/space-ros:jazzy-2025.10.0 on DockerHub. Additionally, builds of MoveIt 2 and Navigation 2 on the jazzy-2025.10.0 underlay are also available to accelerate work using these systems, as osrf/space-ros-moveit2:jazzy-2025.10.0 and osrf/space-ros-nav2:jazzy-2025.10.0 on DockerHub, respectively.

Release details

Beyond the addition of the MoveIt and Navigation stacks to DockerHub, this release adds a demo of JAXA’s RACS2 communications bridge by @yuyuqq showing interoperability between Space ROS and NASA’s cFS flight software - check it out in the Demos repository! For an exhaustive list of all the issues addressed and PRs merged, check out the GitHub Project Board for this release here.

Code

Current versions of all packages released with Space ROS are available at:

What’s Next

This release comes 3 months after the last release. The next release is planned for January 31, 2026. If you want to contribute to features, tests, demos, or documentation of Space ROS, get involved on the Space ROS GitHub issues and discussion board.

All the best,

The Space ROS Team

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/space-ros-jazzy-2025-10-0-release/50878

ROS Discourse General: Native buffer type

Physical AI requires DNN inference for learned policies, which in turn requires accelerators. Accelerators have their own memory and compute models that need to be surfaced in ROS 2 under abstractions, similar to how tensors are surfaced in PyTorch (accelerator aware, accelerator agnostic). This abstraction would need to be available at all layers of the ROS stack (client libraries, IDL, rmw), be vendor agnostic (CUDA, ROCm, etc.), allow for runtime graphs of heterogeneous accelerators, and enable RMW implementations to handle transport of externally managed memory efficiently. Developers who implement these concepts in their packages should have CPU backwards compatibility when specified accelerators are not available at runtime.

We propose forming a working group with other vendors hosted by the ROS PMC to introduce the concept of externally managed memory and asynchronous compute that enables accelerated graphs into ROS 2 Lyrical. Tensor semantics and DNN inference standards layered on top of what is proposed here would be designed by the Physical AI SIG.

Our design sketch is a more targeted native buffer type that maps to supplied implementations in client libraries, like rclcpp::buffer. This native type only represents a memory handle for a block that could optionally be managed externally.

namespace rclcpp { 
class buffer {
  protected:
    std::unique_ptr<BufferImplBase> impl;
    std::string device_type;
};
}  // namespace rclcpp

The client library interface does not expose its underlying buffer directly, but manages all access through vendored interfaces that add support for particular frameworks or hardware architectures; for example, an implementation for Torch in a hypothetical torch_support library, as shown below.

By doing so, buffer is a more fundamental type that is focused on data storage abstraction, while semantics like tensors or image buffers can then be layered on top of it.

# MessageWithTensor.msg
#
# a message containing only a buffer that is to be interpreted as a tensor

buffer tensor

// sample callback that receives a message containing a 
// buffer, interprets it as a tensor, performs an operation 
// on it, and publishes a new message with the output, with 
// all operations performed in the Torch-chosen accelerator
// backend
void topic_callback(const msg::MessageWithTensor & input_msg) {
    torch::Tensor input_tensor =
        torch_support::from_buffer(input_msg.tensor);

    auto result = input_tensor.some_operation();

    auto output_msg = msg::MessageWithTensor();
    output_msg.tensor = torch_support::to_buffer(result);

    publisher_.publish(output_msg);
}

A default implementation for CPU backed buffers would be provided as part of the base ROS distribution, while system vendors and framework designers would provide implementations for their respective memory types. All custom implementations would always provide support to convert to and from CPU backed buffers, such that compatibility across implementations is guaranteed.

Relevant tensor type discussion can be found in the other post here: Native rcl::tensor type

3 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/native-buffer-type/50862

ROS Discourse General: ROSCon DE+FR workshop: integrating ROS2 application and Siemens PLCs with ROSie

Workshop: Industrial deployment of ROS applications with Siemens ROSie (ROS↔PLC connector)

Hi everyone! :waving_hand:

At ROSCon FR+DE 2025 in Strasbourg (Nov 18), together with Florian Gramß (@flo), we’ll host a hands-on workshop on connecting ROS to industrial PLCs using Siemens ROSie:
:link: developer.siemens.com/rosie/overview.html

You’ll get to work with a ROS application talking live to a Siemens PLC and explore practical deployment aspects.
:page_facing_up: Workshop info

I’ll also present my template app ManyMove, sharing some of the challenges I’ve faced while bringing it toward production.

Would love to see some of you there and hear your experiences with industrial ROS integration! :mechanical_arm:

5 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/roscon-de-fr-workshop-integrating-ros2-application-and-siemens-plcs-with-rosie/50859

ROS Discourse General: Recursive Fiducial Marker

I would like to share an ongoing development of a fiducial marker with recursive feature. The marker would be useful to multi-range applications.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/recursive-fiducial-marker/50847

ROS Discourse General: ROSGraph Working Group kickoff

Hey all! From the announcement in my ROSCon talk “Escape Velocity: How to be good at launch” - here is the launch of the new working group!

If you’re just in it for the slides & launch snippets, you can find them at GitHub - emersonknapp/roscon2025_launch_snippets: Slides and code snippets for ROSCon 2025 presentation "Escape Velocity: Smarter, Cleaner ROS 2 Launch Patterns, a.k.a How to write good launchfiles"

Otherwise if you’re still with me… Come start the conversation about launch, declarative node interfaces, health monitoring, and more - the tooling that will help us reason holistically about ROS applications and their structure.

Initial meeting is next Tuesday, November 11.

If you join the mailing list https://groups.google.com/u/1/g/rosgraph-wg you will receive the calendar invite.

Key links:

I’m working on the official processes to put up a charter and get onto shared calendars, but for now we can, as Kat said, “just do things” - let’s get started!

I’ll note that this is the continuation/evolution of “Robograph Project” that I discussed earlier this year Declarative ROS presentation slides & the Robograph project kickoff - just a slightly new framing to continue on the same work.

8 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/rosgraph-working-group-kickoff/50846

ROS Discourse General: ROS Asia Pacific Developer Meeting #1 7th.Nov

ROS Users and Developers in Asia Pacific region :rocket:

As I announced at the ROSCon 2025 Singapore Lightning Talk, we will have the 1st ROS Asia Pacific Developer Meeting this Friday, 7 Nov.

ROS_APAC_Developer_Meeting-LT.pdf (3.6 MB)

If you are willing to start the contribution to ROS open source projects, please come and join us :globe_showing_asia_australia: :handshake:

ROS APAC Developer Meeting #1
Friday, November 7 · 17:00 – 17:50 Japan Standard Time
Time zone: Asia/Tokyo
Google Meet joining info
Video call link: https://meet.google.com/miv-whfs-yjy

Please join https://groups.google.com/g/ros-apac-developer-meeting, so you won’t miss the meeting.

looking forward to e-meeting you all :grin:

Best,
Tomoya

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-asia-pacific-developer-meeting-1-7th-nov/50837

ROS Discourse General: FOSDEM 2026: Robotics & Simulation Devroom, 2nd edition!

As teased at ROSCon Singapore: we have another developer room for Robotics and Simulation at FOSDEM! :partying_face: FOSDEM is arguably the largest open source software developer conference in the world (and it’s free!).

The full event takes place on 31 January – 1 February 2026 (Sat 8:00 AM UTC through Sun 5:00 PM UTC), all on the ULB Campus, Brussels, Belgium.

Thanks to the success last year, we will have a full day assigned this time, namely the Saturday (31st). We now have a call for participation open, in which you can find more information about the kinds of talks and projects we are looking for! The deadline for proposal submission is the 1st of December!

In the true spirit of FOSDEM, everyone is welcome from any robotics community working in open source. Of course, we would love many of the ROS community to join this time as well, as so many of you showed up for last year’s edition!

Here are some pictures from last year:

If you want inspiration, you can check out the recorded talks from last year. Note that submissions need to be about open source work, preferably project based, and focused on sharing knowledge.

Also check whether there are any other interesting accepted devrooms you’d like to attend during the event! We will be doing the same ourselves on Sunday, fueled by club mate.

See you in Brussels!

Arnaud, Mat, Fred, Lucas and Kim

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/fosdem-2026-robotics-simulation-devroom-2nd-edition/50828


2025-11-29 12:19