<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://www.modularmachines.ai/feed.xml" rel="self" type="application/atom+xml" /><link href="https://www.modularmachines.ai/" rel="alternate" type="text/html" /><updated>2026-03-06T15:41:49+00:00</updated><id>https://www.modularmachines.ai/feed.xml</id><title type="html">Modular Machines</title><subtitle>Bookmark this page to get project updates about Modular Machines</subtitle><author><name>Modular Machines LLC</name></author><entry><title type="html">Upgrade LEGO Technic 42203 into a Motorized Truck</title><link href="https://www.modularmachines.ai/lego/2026/02/27/LEGO-motorized_dump_truck.html" rel="alternate" type="text/html" title="Upgrade LEGO Technic 42203 into a Motorized Truck" /><published>2026-02-27T16:20:04+00:00</published><updated>2026-02-27T16:20:04+00:00</updated><id>https://www.modularmachines.ai/lego/2026/02/27/LEGO-motorized_dump_truck</id><content type="html" xml:base="https://www.modularmachines.ai/lego/2026/02/27/LEGO-motorized_dump_truck.html"><![CDATA[<p>Want to give your LEGO Technic 42203 a fun makeover? How about turning it into a motorized construction truck that you can remotely control using a gamepad!</p>

<h2 id="introduction">Introduction</h2>

<p>LEGO Technic 42203 is a charming construction truck with a flatbed that can carry a large load. It features doors that open to reveal the cabin’s cozy interior. It’s a great choice for building a robot car. The flatbed is perfect for holding computer gear, and the flat top can be used as a lidar sensor mount. Plus, the cabin has plenty of space for a camera.</p>

<p><img src="/assets/lego/42203_boxprod_v39.png" /></p>

<h2 id="what-makes-a-motorized-car">What Makes a Motorized Car Move</h2>

<p>To make a motorized car move, we need both a steering system and a drive system. The driving and steering gears I have are from the LEGO Technic 42160 Audi RS Q e-tron. They’re much bigger and more complex than the cute little construction truck!</p>

<p>Here is the top-down, under-the-hood view of the simple construction truck:</p>

<p><a href="/assets/lego/IMG_4226.jpeg">
  <img src="/assets/lego/IMG_4226.jpeg" width="450" />
</a></p>

<p>Here are the guts of the Audi rally car. I placed the truck’s tires next to the monster Audi rally car’s tires to show just how much smaller the truck’s tires are. I’ll design a new driving and steering system for the truck, reuse the motors and hub from the LEGO Audi, and keep the truck’s form factor.</p>

<p><a href="/assets/lego/IMG_4233.jpeg">
  <img src="/assets/lego/IMG_4233.jpeg" width="350" />
</a>
<a href="/assets/lego/IMG_4238.jpeg">
  <img src="/assets/lego/IMG_4238.jpeg" width="350" />
</a></p>

<h3 id="gear-rack-with-a-double-bevel-gear-for-steering">Gear Rack with a Double Bevel Gear for Steering</h3>

<p>After some research, I developed this steering system for the truck. It uses a 1x7 Gear Rack with a 12-tooth Double Bevel Gear. The Gear Rack has axle holes and pin holes that can be attached to connectors to form joints for movement. The Double Bevel Gear’s axle connects to the motor. Here’s a closer look at how the connectors and pins work together.</p>

<p><a href="/assets/lego/IMG_4295.jpeg">
  <img src="/assets/lego/IMG_4295.jpeg" />
</a></p>

<p>Bevel Gears are conical gears with angled, rounded teeth designed to transfer torque at 90-degree angles.</p>

<p>The beige-colored axles are commonly used as the motor output. Their segments shaped like a “+” lock the axle to the motor and the gearwheel, while the smooth segment lets the axle spin in the separator with minimal friction.</p>

<p>Here’s a snapshot of how it looks when it’s set up on the truck. I used the same-length front beam to keep the dimensions and shape consistent, so the upgraded truck frame can still fit under the same truck head.</p>

<p><a href="/assets/lego/IMG_4260.jpeg">
  <img src="/assets/lego/IMG_4260.jpeg" />
</a></p>

<h3 id="differential-gear-for-propulsion">Differential Gear for Propulsion</h3>
<p>A differential is a gear mechanism that allows the left and right wheels to rotate at different speeds while still being powered by the same motor.</p>

<p>This is very important when a vehicle turns because the outside wheel travels a longer distance than the inside wheel. Therefore, the outside wheel must spin faster than the inside wheel. Without a differential, the wheels fight each other, and turning becomes jerky.</p>
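<p>To make this concrete, here is a small sketch of the arithmetic (the numbers are illustrative, not measured from the truck). Because the whole vehicle pivots around the turn center at a single angular rate, each wheel’s speed is proportional to the radius of the arc it follows:</p>

```cpp
#include <cassert>
#include <cmath>

// Each wheel's linear speed is omega * r, where omega (the vehicle's
// turn rate) is shared by the whole car and r is that wheel's distance
// from the center of the turn.
struct WheelSpeeds { double inner; double outer; };

WheelSpeeds wheel_speeds(double v_center, double turn_radius, double track_width) {
  double omega = v_center / turn_radius;  // vehicle turn rate (rad/s)
  return { omega * (turn_radius - track_width / 2.0),    // inner wheel slows down
           omega * (turn_radius + track_width / 2.0) };  // outer wheel speeds up
}
```

<p>The differential lets the two output axles settle at these different speeds on their own while the housing carries the average rotation from the motor.</p>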

<p>My LEGO Technic Differential Gear has:</p>
<ul>
  <li>A round housing with five small 12-tooth Bevel Gears inside and a 28-tooth bright yellow Bevel Gear on the outside as the “lid”</li>
  <li>One input gear in the center, which is a 14-tooth bright yellow Bevel Gear that transfers motion at a 90-degree angle</li>
  <li>Two axle outputs for the wheels</li>
</ul>

<p><a href="/assets/lego/IMG_4297.jpeg">
  <img src="/assets/lego/IMG_4297.jpeg" />
</a></p>

<p>In my truck, the motor sits a bit higher than the Differential’s input. To transfer the rotation downward, I stacked two meshing Spur Gears vertically: one on the motor’s output shaft and the other on the Differential’s input axle. A Spur Gear is a flat gear with straight teeth.</p>

<h3 id="putting-all-together">Putting It All Together</h3>

<p>Here’s the base frame for the upgraded truck! I removed the original gear set that lifts and lowers the cargo bed and used the space for the two motors and the Differential Gear.</p>

<p><a href="/assets/lego/IMG_4262.jpeg">
  <img src="/assets/lego/IMG_4262.jpeg" />
</a></p>

<p>Here is the truck’s original base frame for reference:</p>

<p><a href="/assets/lego/IMG_4222.jpeg">
  <img src="/assets/lego/IMG_4222.jpeg" />
</a></p>

<h2 id="show-time">Show Time</h2>
<p>Here is how the upgraded truck looks. You won’t notice much difference at first glance, except for the power supply in the trunk. Keeping the same look is my goal. :)</p>

<p><a href="/assets/lego/IMG_4291.jpeg">
  <img src="/assets/lego/IMG_4291.jpeg" />
</a></p>

<p>Enjoy a quick demo of the construction truck driving autonomously using a <a href="https://github.com/jiayihoffman/lego_audi_etron/blob/main/scripts/demo_drive.py">ROS2 script</a>!</p>

<p>For information on how to control a LEGO motorized car using a gamepad or autonomously, please check out my earlier post on <a href="/lego/2026/01/26/LEGO-ROS2_control.html">ROS2 Control of the LEGO Audi e-tron</a>.</p>

<iframe width="800" height="468" src="https://www.youtube.com/embed/gXDgUqN7Z0k?autoplay=1&amp;mute=0">
</iframe>

<p>Cheers!</p>]]></content><author><name>Modular Machines LLC</name></author><category term="LEGO" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/lego/IMG_4291.jpeg" /><media:content medium="image" url="https://www.modularmachines.ai/assets/lego/IMG_4291.jpeg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">ROS2 Control of the LEGO Audi e-tron</title><link href="https://www.modularmachines.ai/lego/2026/01/26/LEGO-ROS2_control.html" rel="alternate" type="text/html" title="ROS2 Control of the LEGO Audi e-tron" /><published>2026-01-26T16:27:08+00:00</published><updated>2026-01-26T16:27:08+00:00</updated><id>https://www.modularmachines.ai/lego/2026/01/26/LEGO-ROS2_control</id><content type="html" xml:base="https://www.modularmachines.ai/lego/2026/01/26/LEGO-ROS2_control.html"><![CDATA[<p>My journey of building a mobile robot using ROS2 and ros2_control.</p>

<p><img src="/assets/lego/IMG_4165.jpeg" /></p>

<h2 id="introduction">Introduction</h2>

<p>After building a differential robot (<a href="/security_robot/2025/05/26/SecurityRobot-Mapping.html">Robot Auto Mapping using Nav2 SLAM Toolbox</a>) using mechanical parts from <a href="https://osepp.com/mechanical-parts">OSEPP</a>, I am looking to build a more durable mobile robot with stronger motors and a more attractive look.</p>

<p>I started with RC cars. However, to modify an RC car for ROS2 Control, I would have to discard many of its original RC parts, which is not very economical. Besides, the modified RC car does not look good.</p>

<p>I grew up as a LEGO fan and am now an AFOL (Adult Fan of LEGO). LEGO is the ultimate building block: I can take the bricks and build anything I want. Thankfully, LEGO® Technic™ releases model rally cars with real motors and hubs.</p>

<h3 id="lego-technic-42160">LEGO® Technic™ 42160</h3>

<p>The <a href="https://www.lego.com/en-us/product/audi-rs-q-e-tron-42160">LEGO Audi RS Q e-tron (Technic™ 42160)</a> is a model of the 2022 Audi RS Q e-tron Dakar rally car. It features many realistic details, including individual suspension on each of the car’s 4 wheels and wheel elements that reflect the full-sized Audi’s wheel design.</p>

<p>Like a real-world rally car, the LEGO Audi e-tron has front-wheel steering and all-wheel drive. The model car is powered by three <a href="https://www.lego.com/en-us/product/technic-large-motor-88013">Technic™ motors</a>, connected to the ports of the <a href="https://www.lego.com/en-us/product/technic-hub-88012">Technic™ Hub</a>.</p>

<p>The LEGO Audi e-tron can be controlled via LEGO’s Control+ iOS and Android apps, but I want to control it with ROS2 and ROS2 Control so it can integrate with the larger ecosystem and do more than just drive. :)</p>

<h2 id="ros2-controlled-lego-car">ROS2 Controlled LEGO Car</h2>

<h3 id="what-and-why-ros2-control">What and Why ROS2 Control</h3>

<p>ros2_control is a robot control framework in ROS 2 that provides a hardware abstraction layer for controlling robot actuators, sensors, and hardware interfaces in a modular and efficient way.</p>

<p>ros2_control improves performance and offers real-time capabilities by running the control loop in a single process rather than in multiple processes collaborating via messages and topics. It also promotes standardization and modular robot control: through its hardware abstraction layer it supports various hardware interfaces and enables a seamless transition between simulation and different hardware implementations with minimal code changes.</p>

<p>Developers can reuse existing controllers instead of writing their own from scratch. In fact, much of the robot control logic has already been developed by others, so the pre-built ROS2 controllers can be utilized as is in most use cases.</p>


<h3 id="ros2-controllers-for-wheeled-mobile-robots">ROS2 Controllers for Wheeled Mobile Robots</h3>

<p>For wheeled mobile robots, the ros2_control framework offers the following types of controllers:</p>
<ul>
  <li>Differential Drive Controller:
    <ul>
      <li>Controller for mobile robots with differential drive, which has two wheels, each of which is driven independently.</li>
    </ul>
  </li>
  <li>Steering Controllers:
    <ul>
      <li><strong>Bicycle</strong> - with the front wheel(s) steerable and the traction wheel(s) at the rear.</li>
      <li><strong>Tricycle</strong> - with a steerable front wheel and two independent traction wheels at the rear.</li>
      <li><strong>Ackermann</strong> - with two independent steering wheels at the front and two independent traction wheels at the rear.</li>
    </ul>
  </li>
  <li>Mecanum Drive Controllers:
    <ul>
      <li>Controller for a mobile robot with four mecanum wheels, allowing the robot to move sideways, spin, and drive in any direction by controlling each wheel independently.</li>
    </ul>
  </li>
</ul>

<p>For more information about the ROS2 Steering Controller, please see the documentation on <a href="https://control.ros.org/humble/doc/ros2_controllers/doc/mobile_robot_kinematics.html">control.ros.org</a> and <a href="https://control.ros.org/humble/doc/ros2_controllers/steering_controllers_library/doc/userdoc.html#steering-controllers-library-userdoc">steering_controllers</a>. The second article provides detailed information on the controller’s command and state interfaces, parameters, and subscribed and published topics.</p>

<h4 id="bicycle-steering-controllers-for-lego-audi-e-tron">Bicycle Steering Controllers for LEGO Audi e-tron</h4>

<p>This LEGO Audi e-tron is a car-like robot: it has steerable front wheels and all-wheel drive. The two front wheels turn together to steer the car, and all four wheels receive the same traction power to drive it forward and backward.</p>

<p>This resembles the Bicycle Steering model. Therefore, a bicycle steering controller is used for this LEGO car, utilizing two joints, virtual_rear_wheel_joint and virtual_front_wheel_joint, to control its movement.</p>
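<p>For reference, a bicycle steering controller is typically configured with a YAML file like the sketch below. The parameter names follow the ros2_controllers steering controllers documentation; the wheel radius here is a placeholder, and only the 0.24&nbsp;m wheelbase comes from my measurements:</p>

```yaml
bicycle_steering_controller:
  ros__parameters:
    # Kinematics (wheelbase measured; wheel radii are placeholder values)
    wheelbase: 0.24
    front_wheel_radius: 0.045
    rear_wheel_radius: 0.045

    # The two virtual joints exposed by the hardware component
    front_wheels_names: [virtual_front_wheel_joint]
    rear_wheels_names: [virtual_rear_wheel_joint]

    front_steering: true
    open_loop: false
    position_feedback: false
    reference_timeout: 2.0
    velocity_rolling_window_size: 10
```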

<h3 id="architecture">Architecture</h3>
<p>Here is the architecture diagram for a ROS2-controlled LEGO car, where the Controller Manager connects the controllers and the hardware abstraction of the ros2_control framework. On the one hand, the Controller Manager manages controllers (e.g., loading, activating, deactivating, and unloading them). On the other hand, it accesses the hardware components via the Resource Manager.</p>

<p>The control loop is managed by the Controller Manager’s update() method. It reads data from the hardware components, updates the outputs of all active controllers, and writes the results back to the components.</p>

<p>The Resource Manager loads the hardware component, manages its lifecycle, and exposes its command and state interfaces. During control loop execution, the Resource Manager’s read() and write() methods communicate with the hardware component.</p>
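<p>The three steps of the loop can be boiled down to the sketch below. These are hypothetical stub types, not the real Controller Manager API; the point is only the read-update-write ordering:</p>

```cpp
#include <cassert>
#include <vector>

// Hypothetical stubs -- the real classes live in the ros2_control framework.
struct HardwareComponent {
  double state = 0.0;    // e.g., measured wheel velocity
  double command = 0.0;  // e.g., commanded wheel velocity
  void read()  { /* query the LEGO hub over BLE (stubbed) */ }
  void write() { state = command; /* send motor power (stubbed as an echo) */ }
};

struct Controller {
  // A real controller computes commands from subscribed topics and joint state.
  void update(HardwareComponent & hw) { hw.command = 1.0; }
};

// One iteration of the Controller Manager's control loop.
void spin_once(std::vector<Controller> & controllers, HardwareComponent & hw) {
  hw.read();                                      // 1. read states from hardware
  for (auto & c : controllers) { c.update(hw); }  // 2. update active controllers
  hw.write();                                     // 3. write commands to hardware
}
```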

<p>To support the LEGO car, we need to implement the LEGO hardware component, which is the driver program for the physical LEGO car. The remaining components in the diagram are provided by the ros2_control framework.</p>

<p><br />
<a href="/assets/lego/ros2_control_lego-car.drawio.png">
  <img src="/assets/lego/ros2_control_lego-car.drawio.png" />
</a></p>
<div style="text-align: center;"><strong>Architecture Diagram</strong></div>
<p><br /></p>

<h3 id="ros2_control-lego-hardware-component">ros2_control LEGO Hardware Component</h3>
<p>There are three basic types of hardware components:</p>
<ul>
  <li><strong>Actuator</strong>: Simple 1-DOF (Degrees of Freedom) robotic hardware, such as motors and valves. An actuator implementation is related to only one joint.</li>
  <li><strong>System</strong>: Multi-DOF robotic hardware such as industrial robots. This component has reading and writing capabilities and can have multiple joints.</li>
  <li><strong>Sensor</strong>: Robotic hardware used for sensing its environment.</li>
</ul>

<p>I use the “System” type for the LEGO hardware component because it has 2-DOF and uses two joints, virtual_rear_wheel_joint and virtual_front_wheel_joint, for steering and traction.</p>

<h4 id="hardware-description-in-urdf">Hardware Description in URDF</h4>
<p>The ros2_control framework uses the <code class="language-plaintext highlighter-rouge">&lt;ros2_control&gt;</code> tag in the robot’s URDF file to describe the components and the hardware setup. Here is my <code class="language-plaintext highlighter-rouge">ros2_control.xacro</code>:</p>

<div class="language-xml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="cp">&lt;?xml version="1.0"?&gt;</span>
<span class="nt">&lt;robot</span> <span class="na">xmlns:xacro=</span><span class="s">"http://www.ros.org/wiki/xacro"</span><span class="nt">&gt;</span>
  <span class="nt">&lt;xacro:macro</span> <span class="na">name=</span><span class="s">"carlikebot_ros2_control"</span> <span class="na">params=</span><span class="s">"name prefix"</span><span class="nt">&gt;</span>

    <span class="nt">&lt;ros2_control</span> <span class="na">name=</span><span class="s">"${name}"</span> <span class="na">type=</span><span class="s">"system"</span><span class="nt">&gt;</span>
      <span class="nt">&lt;hardware&gt;</span>
        <span class="nt">&lt;plugin&gt;</span>audi_etron/CarlikeBotSystemHardware<span class="nt">&lt;/plugin&gt;</span>
        <span class="c">&lt;!-- LEGO Motor Control Parameters --&gt;</span>
        <span class="c">&lt;!-- Maximum power for traction motors (0-100), default 80  --&gt;</span>
        <span class="nt">&lt;param</span> <span class="na">name=</span><span class="s">"max_traction_power"</span><span class="nt">&gt;</span>90.0<span class="nt">&lt;/param&gt;</span>
        <span class="c">&lt;!-- Maximum power for traction motors (0-100), default 50  --&gt;</span>
        <span class="nt">&lt;param</span> <span class="na">name=</span><span class="s">"max_steering_power"</span><span class="nt">&gt;</span>50.0<span class="nt">&lt;/param&gt;</span>
        <span class="c">&lt;!-- Maximum velocity command for scaling (rad/s) --&gt;</span>
        <span class="nt">&lt;param</span> <span class="na">name=</span><span class="s">"max_traction_velocity"</span><span class="nt">&gt;</span>25.0<span class="nt">&lt;/param&gt;</span>
        <span class="c">&lt;!-- Maximum steering position for scaling (rad), default 0.4 (~22.9 degrees)
             Must match URDF joint limits. Turning radius at max ~0.57m (wheelbase 0.24m) --&gt;</span>
        <span class="nt">&lt;param</span> <span class="na">name=</span><span class="s">"max_steering_position"</span><span class="nt">&gt;</span>0.4<span class="nt">&lt;/param&gt;</span>
        <span class="nt">&lt;param</span> <span class="na">name=</span><span class="s">"steering_deadzone"</span><span class="nt">&gt;</span>0.05<span class="nt">&lt;/param&gt;</span>
        <span class="c">&lt;!-- Hub name pattern to search for during BLE scan, default "Technic" --&gt;</span>
        <span class="nt">&lt;param</span> <span class="na">name=</span><span class="s">"hub_name"</span><span class="nt">&gt;</span>Technic<span class="nt">&lt;/param&gt;</span>
      <span class="nt">&lt;/hardware&gt;</span>
      <span class="nt">&lt;joint</span> <span class="na">name=</span><span class="s">"${prefix}virtual_front_wheel_joint"</span><span class="nt">&gt;</span>
        <span class="nt">&lt;command_interface</span> <span class="na">name=</span><span class="s">"position"</span><span class="nt">/&gt;</span>
        <span class="nt">&lt;state_interface</span> <span class="na">name=</span><span class="s">"position"</span><span class="nt">/&gt;</span>
      <span class="nt">&lt;/joint&gt;</span>
      <span class="nt">&lt;joint</span> <span class="na">name=</span><span class="s">"${prefix}virtual_rear_wheel_joint"</span><span class="nt">&gt;</span>
        <span class="nt">&lt;command_interface</span> <span class="na">name=</span><span class="s">"velocity"</span><span class="nt">/&gt;</span>
        <span class="nt">&lt;state_interface</span> <span class="na">name=</span><span class="s">"velocity"</span><span class="nt">/&gt;</span>
        <span class="nt">&lt;state_interface</span> <span class="na">name=</span><span class="s">"position"</span><span class="nt">/&gt;</span>
      <span class="nt">&lt;/joint&gt;</span>
    <span class="nt">&lt;/ros2_control&gt;</span>

  <span class="nt">&lt;/xacro:macro&gt;</span>
<span class="nt">&lt;/robot&gt;</span>
</code></pre></div></div>

<p>Among the LEGO Motor Control parameters, the <code class="language-plaintext highlighter-rouge">max_steering_power</code> limits the power sent to the steering motor. It defaults to 50% for the following reasons:</p>
<ol>
  <li>Steering requires smooth, precise control, and lower power reduces jerky movements.</li>
  <li>Lower steering power provides mechanical protection because steering components are typically more delicate; limiting power reduces stress.</li>
  <li>Steering only turns the front wheels, which requires less torque than driving the car.</li>
</ol>

<h4 id="hardware-component-class">Hardware Component Class</h4>

<p>The hardware component is a ROS2 package written in C++ with <code class="language-plaintext highlighter-rouge">ament_cmake</code> as the build type. A helpful command for creating the package is <code class="language-plaintext highlighter-rouge">ros2 pkg create</code>. After creating the package, add <code class="language-plaintext highlighter-rouge">&lt;robot_hardware_interface_name&gt;.hpp</code> and <code class="language-plaintext highlighter-rouge">&lt;robot_hardware_interface_name&gt;.cpp</code> for the hardware component implementation. Mine are <code class="language-plaintext highlighter-rouge">carlikebot_system.hpp</code> and <code class="language-plaintext highlighter-rouge">carlikebot_system.cpp</code>.</p>

<p>The hardware_interface class must implement LifecycleNodeInterface’s <code class="language-plaintext highlighter-rouge">on_configure</code>, <code class="language-plaintext highlighter-rouge">on_cleanup</code>, <code class="language-plaintext highlighter-rouge">on_shutdown</code>, <code class="language-plaintext highlighter-rouge">on_activate</code>, <code class="language-plaintext highlighter-rouge">on_deactivate</code>, and <code class="language-plaintext highlighter-rouge">on_error</code> methods, and override SystemInterface’s <code class="language-plaintext highlighter-rouge">on_init</code>, <code class="language-plaintext highlighter-rouge">export_state_interfaces</code>, <code class="language-plaintext highlighter-rouge">export_command_interfaces</code>, <code class="language-plaintext highlighter-rouge">read</code>, and <code class="language-plaintext highlighter-rouge">write</code>.</p>

<p>For my LEGO <code class="language-plaintext highlighter-rouge">CarlikeBotSystemHardware</code> class, the LEGO motor’s control parameters are initialized in the <code class="language-plaintext highlighter-rouge">on_init</code> method. The <code class="language-plaintext highlighter-rouge">on_activate</code> method connects to the LEGO Technic Hub, and the <code class="language-plaintext highlighter-rouge">on_deactivate</code> method disconnects.</p>

<p>Code snippet from <code class="language-plaintext highlighter-rouge">carlikebot_system.cpp</code>:</p>
<div class="language-c++ highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">hardware_interface</span><span class="o">::</span><span class="n">CallbackReturn</span> <span class="n">CarlikeBotSystemHardware</span><span class="o">::</span><span class="n">on_activate</span><span class="p">(</span>
  <span class="k">const</span> <span class="n">rclcpp_lifecycle</span><span class="o">::</span><span class="n">State</span> <span class="o">&amp;</span> <span class="cm">/*previous_state*/</span><span class="p">)</span>
<span class="p">{</span>
  <span class="n">RCLCPP_INFO</span><span class="p">(</span><span class="n">get_logger</span><span class="p">(),</span> <span class="s">"Activating hardware..."</span><span class="p">);</span>

  <span class="c1">// Connect to LEGO Technic Hub</span>
  <span class="n">RCLCPP_INFO</span><span class="p">(</span><span class="n">get_logger</span><span class="p">(),</span> <span class="s">"Connecting to LEGO Technic Hub..."</span><span class="p">);</span>
  <span class="k">if</span> <span class="p">(</span><span class="o">!</span><span class="n">lego_motor_controller_</span><span class="o">-&gt;</span><span class="n">connect</span><span class="p">(</span><span class="n">hub_name_</span><span class="p">))</span> <span class="p">{</span>
    <span class="n">RCLCPP_ERROR</span><span class="p">(</span><span class="n">get_logger</span><span class="p">(),</span> <span class="s">"Failed to connect to LEGO Technic Hub"</span><span class="p">);</span>
    <span class="k">return</span> <span class="n">hardware_interface</span><span class="o">::</span><span class="n">CallbackReturn</span><span class="o">::</span><span class="n">ERROR</span><span class="p">;</span>
  <span class="p">}</span>

  <span class="k">for</span> <span class="p">(</span><span class="k">auto</span> <span class="o">&amp;</span> <span class="n">joint</span> <span class="o">:</span> <span class="n">hw_interfaces_</span><span class="p">)</span>
  <span class="p">{</span>
    <span class="n">joint</span><span class="p">.</span><span class="n">second</span><span class="p">.</span><span class="n">state</span><span class="p">.</span><span class="n">position</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>

    <span class="k">if</span> <span class="p">(</span><span class="n">joint</span><span class="p">.</span><span class="n">first</span> <span class="o">==</span> <span class="s">"traction"</span><span class="p">)</span>
    <span class="p">{</span>
      <span class="n">joint</span><span class="p">.</span><span class="n">second</span><span class="p">.</span><span class="n">state</span><span class="p">.</span><span class="n">velocity</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>
      <span class="n">joint</span><span class="p">.</span><span class="n">second</span><span class="p">.</span><span class="n">command</span><span class="p">.</span><span class="n">velocity</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>
    <span class="p">}</span>

    <span class="k">else</span> <span class="k">if</span> <span class="p">(</span><span class="n">joint</span><span class="p">.</span><span class="n">first</span> <span class="o">==</span> <span class="s">"steering"</span><span class="p">)</span>
    <span class="p">{</span>
      <span class="n">joint</span><span class="p">.</span><span class="n">second</span><span class="p">.</span><span class="n">command</span><span class="p">.</span><span class="n">position</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>
    <span class="p">}</span>
  <span class="p">}</span>

  <span class="c1">// Ensure all motors are stopped initially</span>
  <span class="n">lego_motor_controller_</span><span class="o">-&gt;</span><span class="n">stop_all_motors</span><span class="p">();</span>

  <span class="n">RCLCPP_INFO</span><span class="p">(</span><span class="n">get_logger</span><span class="p">(),</span> <span class="s">"Successfully activated!"</span><span class="p">);</span>

  <span class="k">return</span> <span class="n">hardware_interface</span><span class="o">::</span><span class="n">CallbackReturn</span><span class="o">::</span><span class="n">SUCCESS</span><span class="p">;</span>
<span class="p">}</span>
</code></pre></div></div>

<p>The class’s <code class="language-plaintext highlighter-rouge">write</code> method converts the traction velocity into <code class="language-plaintext highlighter-rouge">traction_power</code> and the steering position into <code class="language-plaintext highlighter-rouge">steering_power</code>, enabling forward/backward motion and steering.</p>

<div class="language-c++ highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">hardware_interface</span><span class="o">::</span><span class="n">return_type</span> <span class="n">audi_etron</span><span class="o">::</span><span class="n">CarlikeBotSystemHardware</span><span class="o">::</span><span class="n">write</span><span class="p">(</span>
  <span class="k">const</span> <span class="n">rclcpp</span><span class="o">::</span><span class="n">Time</span> <span class="o">&amp;</span> <span class="cm">/*time*/</span><span class="p">,</span> <span class="k">const</span> <span class="n">rclcpp</span><span class="o">::</span><span class="n">Duration</span> <span class="o">&amp;</span> <span class="cm">/*period*/</span><span class="p">)</span>
<span class="p">{</span>
  <span class="c1">// Check if motor controller is connected</span>
  <span class="k">if</span> <span class="p">(</span><span class="o">!</span><span class="n">lego_motor_controller_</span> <span class="o">||</span> <span class="o">!</span><span class="n">lego_motor_controller_</span><span class="o">-&gt;</span><span class="n">is_connected</span><span class="p">())</span> <span class="p">{</span>
    <span class="n">RCLCPP_WARN_THROTTLE</span><span class="p">(</span><span class="n">get_logger</span><span class="p">(),</span> <span class="o">*</span><span class="n">get_clock</span><span class="p">(),</span> <span class="mi">1000</span><span class="p">,</span> <span class="s">"LEGO motor controller not connected"</span><span class="p">);</span>
    <span class="k">return</span> <span class="n">hardware_interface</span><span class="o">::</span><span class="n">return_type</span><span class="o">::</span><span class="n">OK</span><span class="p">;</span>
  <span class="p">}</span>

  <span class="c1">// Convert traction velocity command to motor power (needed to determine direction for steering)</span>
  <span class="kt">double</span> <span class="n">traction_velocity</span> <span class="o">=</span> <span class="n">hw_interfaces_</span><span class="p">[</span><span class="s">"traction"</span><span class="p">].</span><span class="n">command</span><span class="p">.</span><span class="n">velocity</span><span class="p">;</span>

  <span class="c1">// Convert steering position command to motor power</span>
  <span class="kt">double</span> <span class="n">steering_command</span> <span class="o">=</span> <span class="n">hw_interfaces_</span><span class="p">[</span><span class="s">"steering"</span><span class="p">].</span><span class="n">command</span><span class="p">.</span><span class="n">position</span><span class="p">;</span>
  
  <span class="c1">// Apply deadzone: if command is within deadzone threshold, force to zero to actively center steering</span>
  <span class="k">if</span> <span class="p">(</span><span class="n">std</span><span class="o">::</span><span class="n">abs</span><span class="p">(</span><span class="n">steering_command</span><span class="p">)</span> <span class="o">&lt;</span> <span class="n">steering_deadzone_</span><span class="p">)</span> <span class="p">{</span>
    <span class="n">steering_command</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>
  <span class="p">}</span>
  
  <span class="c1">// Scale steering position (-max_steering_position to max_steering_position) to power (-max_steering_power to max_steering_power)</span>
  <span class="kt">double</span> <span class="n">steering_power_raw</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>
  <span class="k">if</span> <span class="p">(</span><span class="n">max_steering_position_</span> <span class="o">&gt;</span> <span class="mf">0.0</span><span class="p">)</span> <span class="p">{</span>
    <span class="n">steering_power_raw</span> <span class="o">=</span> <span class="p">(</span><span class="n">steering_command</span> <span class="o">/</span> <span class="n">max_steering_position_</span><span class="p">)</span> <span class="o">*</span> <span class="n">max_steering_power_</span><span class="p">;</span>
  <span class="p">}</span>
  <span class="c1">// Clamp to valid range</span>
  <span class="n">steering_power_raw</span> <span class="o">=</span> <span class="n">std</span><span class="o">::</span><span class="n">max</span><span class="p">(</span><span class="o">-</span><span class="n">max_steering_power_</span><span class="p">,</span> <span class="n">std</span><span class="o">::</span><span class="n">min</span><span class="p">(</span><span class="n">max_steering_power_</span><span class="p">,</span> <span class="n">steering_power_raw</span><span class="p">));</span>
  <span class="kt">int8_t</span> <span class="n">steering_power</span> <span class="o">=</span> <span class="k">static_cast</span><span class="o">&lt;</span><span class="kt">int8_t</span><span class="o">&gt;</span><span class="p">(</span><span class="n">std</span><span class="o">::</span><span class="n">round</span><span class="p">(</span><span class="n">steering_power_raw</span><span class="p">));</span>
  
  <span class="c1">// Negate steering only when moving backward or stationary (forward motion doesn't need negation)</span>
  <span class="c1">// When velocity &gt; 0 (forward): negate (to fix opposite direction)</span>
  <span class="c1">// When velocity &lt;= 0 (backward/stationary): don't negate (steering is already correct)</span>
  <span class="kt">int8_t</span> <span class="n">steering_power_to_send</span> <span class="o">=</span> <span class="p">(</span><span class="n">traction_velocity</span> <span class="o">&gt;</span> <span class="mf">0.0</span><span class="p">)</span> <span class="o">?</span> <span class="o">-</span><span class="n">steering_power</span> <span class="o">:</span> <span class="n">steering_power</span><span class="p">;</span>
  
  <span class="c1">// Send steering command to PORT_D</span>
  <span class="n">lego_motor_controller_</span><span class="o">-&gt;</span><span class="n">set_motor_power</span><span class="p">(</span><span class="n">LegoPort</span><span class="o">::</span><span class="n">PORT_D</span><span class="p">,</span> <span class="n">steering_power_to_send</span><span class="p">);</span>
  <span class="c1">// Scale velocity (-max_traction_velocity to max_traction_velocity) to power (-max_traction_power to max_traction_power)</span>
  <span class="kt">double</span> <span class="n">traction_power_raw</span> <span class="o">=</span> <span class="mf">0.0</span><span class="p">;</span>
  <span class="k">if</span> <span class="p">(</span><span class="n">max_traction_velocity_</span> <span class="o">&gt;</span> <span class="mf">0.0</span><span class="p">)</span> <span class="p">{</span>
    <span class="n">traction_power_raw</span> <span class="o">=</span> <span class="p">(</span><span class="n">traction_velocity</span> <span class="o">/</span> <span class="n">max_traction_velocity_</span><span class="p">)</span> <span class="o">*</span> <span class="n">max_traction_power_</span><span class="p">;</span>
  <span class="p">}</span>
  <span class="c1">// Clamp to valid range</span>
  <span class="n">traction_power_raw</span> <span class="o">=</span> <span class="n">std</span><span class="o">::</span><span class="n">max</span><span class="p">(</span><span class="o">-</span><span class="n">max_traction_power_</span><span class="p">,</span> <span class="n">std</span><span class="o">::</span><span class="n">min</span><span class="p">(</span><span class="n">max_traction_power_</span><span class="p">,</span> <span class="n">traction_power_raw</span><span class="p">));</span>
  <span class="kt">int8_t</span> <span class="n">traction_power</span> <span class="o">=</span> <span class="k">static_cast</span><span class="o">&lt;</span><span class="kt">int8_t</span><span class="o">&gt;</span><span class="p">(</span><span class="n">std</span><span class="o">::</span><span class="n">round</span><span class="p">(</span><span class="n">traction_power_raw</span><span class="p">));</span>

  <span class="c1">// Send traction commands to PORT_A and PORT_B (both wheels)</span>
  <span class="n">lego_motor_controller_</span><span class="o">-&gt;</span><span class="n">set_motor_power</span><span class="p">(</span><span class="n">LegoPort</span><span class="o">::</span><span class="n">PORT_A</span><span class="p">,</span> <span class="n">traction_power</span><span class="p">);</span>
  <span class="n">lego_motor_controller_</span><span class="o">-&gt;</span><span class="n">set_motor_power</span><span class="p">(</span><span class="n">LegoPort</span><span class="o">::</span><span class="n">PORT_B</span><span class="p">,</span> <span class="n">traction_power</span><span class="p">);</span>


  <span class="k">return</span> <span class="n">hardware_interface</span><span class="o">::</span><span class="n">return_type</span><span class="o">::</span><span class="n">OK</span><span class="p">;</span>
<span class="p">}</span>
</code></pre></div></div>
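<p>The scale-and-clamp mapping above is easy to verify in isolation before running it on the robot. Here is an illustrative Python mirror of that logic; the standalone function and its name are mine, not part of the ros2_control component:</p>

```python
# Illustrative Python mirror of the scale-and-clamp logic in the C++ write()
# method above; handy for unit-testing the mapping without hardware.
def command_to_power(command: float, max_command: float, max_power: float) -> int:
    """Scale a command in [-max_command, max_command] to an integer motor
    power in [-max_power, max_power], clamping out-of-range inputs."""
    if max_command <= 0.0:
        return 0
    raw = (command / max_command) * max_power
    raw = max(-max_power, min(max_power, raw))  # clamp to valid range
    return round(raw)
```

<p>With a maximum command of 1.0 and a maximum power of 100, a command of 0.5 maps to power 50, and out-of-range commands saturate at ±100, matching the std::max/std::min clamp in the C++ code.</p>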

<p>I have a separate <code class="language-plaintext highlighter-rouge">LegoMotorController</code> class that handles all communication with the <a href="https://www.lego.com/en-us/product/technic-hub-88012">Technic™ Hub</a> via the SimpleBLE library. The main methods in <code class="language-plaintext highlighter-rouge">LegoMotorController</code> are <code class="language-plaintext highlighter-rouge">set_motor_power</code>, <code class="language-plaintext highlighter-rouge">connect</code>, <code class="language-plaintext highlighter-rouge">disconnect</code>, and <code class="language-plaintext highlighter-rouge">stop_all_motors</code>. SimpleBLE is a cross-platform Bluetooth Low Energy (BLE) library with bindings for C++ and several other languages.</p>

<p>For more information about creating a new ros2_control hardware component, please see the documentation on <a href="https://control.ros.org/humble/doc/ros2_control/hardware_interface/doc/writing_new_hardware_component.html">control.ros.org</a>.</p>

<p>The source code repository for this LEGO car’s ros2_control hardware component is: <a href="https://github.com/jiayihoffman/lego_audi_etron">https://github.com/jiayihoffman/lego_audi_etron</a>.</p>

<h4 id="build-the-project">Build the project</h4>
<p>When running ROS2 robots, I prefer using a ROS2 Docker container because it provides a complete ROS2 environment with all dependencies and runs in any environment. I have a Dockerfile prepared for this project, located in the project’s <a href="https://github.com/jiayihoffman/lego_audi_etron/tree/main/docker">docker</a> folder.</p>

<p>To build the Docker image:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>docker build <span class="nt">-t</span> lego_audi_etron <span class="nt">-f</span> <span class="s2">"./docker/Dockerfile"</span> <span class="nb">.</span>
</code></pre></div></div>

<h3 id="let-the-robot-march">Let the Robot March</h3>

<h4 id="start-the-robot-on-raspberry-pi">Start the robot on Raspberry Pi</h4>
<p>To start the robot, I launch the “lego_audi_etron” Docker container. The container’s launch file loads and initializes the ROS2 hardware interface and the controllers.</p>

<p>I turn on the LEGO car’s Hub immediately after starting the Docker container. This is because the ROS2 control manager needs to connect to the Hub upon activation.</p>

<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>docker run <span class="nt">-it</span> <span class="nt">--rm</span> <span class="se">\</span>
    <span class="nt">--network</span><span class="o">=</span>host <span class="se">\</span>
    <span class="nt">--privileged</span> <span class="se">\</span>
    <span class="nt">-v</span> /var/run/dbus:/var/run/dbus <span class="se">\</span>
    <span class="s2">"lego_audi_etron"</span>   
</code></pre></div></div>

<p>Explaining the <code class="language-plaintext highlighter-rouge">docker run</code> command:</p>
<ul>
  <li><code class="language-plaintext highlighter-rouge">--privileged</code>: Required to bypass AppArmor restrictions and access D-Bus/Bluetooth hardware</li>
  <li><code class="language-plaintext highlighter-rouge">-v /var/run/dbus:/var/run/dbus</code>: Mounts the host’s D-Bus socket so SimpleBLE can communicate with BlueZ</li>
</ul>

<p>SimpleBLE uses D-Bus to communicate with the BlueZ daemon for Bluetooth Low Energy operations.</p>

<p>When the following messages appear on the command line, the ROS2 control manager has activated the LEGO car via the ROS2 control hardware interface:</p>
<pre><code class="language-log">[ros2_control_node-1] [INFO] [1768345118.037541315] [controller_manager.resource_manager.hardware_component.system.CarlikeBot]: Connecting to LEGO Technic Hub...
[ros2_control_node-1] [INFO] [1768345121.052584207] [controller_manager.resource_manager.hardware_component.system.CarlikeBot]: Found Technic Hub: Technic Hub (Address: ...)
[ros2_control_node-1] [INFO] [1768345121.052596873] [controller_manager.resource_manager.hardware_component.system.CarlikeBot]: Connecting to hub...
[ros2_control_node-1] [INFO] [1768345123.247685785] [controller_manager.resource_manager.hardware_component.system.CarlikeBot]: Successfully connected to LEGO Technic Hub!
[ros2_control_node-1] [INFO] [1768345123.498398953] [controller_manager.resource_manager.hardware_component.system.CarlikeBot]: Successfully activated!
</code></pre>

<p>I then wait for the messages indicating that the <code class="language-plaintext highlighter-rouge">bicycle_steering_controller</code> and <code class="language-plaintext highlighter-rouge">joint_state_broadcaster</code> have been loaded and configured:</p>
<pre><code class="language-log">[ros2_control_node-1] [INFO] [1768345123.647514626] [controller_manager]: Loading controller 'bicycle_steering_controller'
[spawner-3] [INFO] [1768345123.792577584] [spawner_bicycle_steering_controller]: Loaded bicycle_steering_controller
[ros2_control_node-1] [INFO] [1768345123.794617367] [controller_manager]: Configuring controller 'bicycle_steering_controller'
[ros2_control_node-1] [INFO] [1768345123.794807106] [bicycle_steering_controller]: bicycle odometry configure successful
[ros2_control_node-1] [INFO] [1768345123.803230163] [bicycle_steering_controller]: configure successful
[spawner-3] [INFO] [1768345124.718624417] [spawner_bicycle_steering_controller]: Configured and activated bicycle_steering_controller
[ros2_control_node-1] [INFO] [1768345125.248599089] [controller_manager]: Loading controller 'joint_state_broadcaster'
[spawner-4] [INFO] [1768345125.515453413] [spawner_joint_state_broadcaster]: Loaded joint_state_broadcaster
[ros2_control_node-1] [INFO] [1768345125.516909386] [controller_manager]: Configuring controller 'joint_state_broadcaster'
[ros2_control_node-1] [INFO] [1768345125.517096273] [joint_state_broadcaster]: 'joints' or 'interfaces' parameter is empty. All available state interfaces will be published
[spawner-4] [INFO] [1768345126.473333746] [spawner_joint_state_broadcaster]: Configured and activated joint_state_broadcaster
</code></pre>

<p>At this point, my ros2_controlled robot has started successfully.</p>

<h4 id="start-the-ros2-joystick">Start the ROS2 Joystick</h4>

<p>To use a joystick (gamepad) to control the LEGO car, I use the ROS2 <code class="language-plaintext highlighter-rouge">teleop_twist_joy</code> package.</p>

<p>Each gamepad has an enable button; hold it while moving the joystick to control the robot. On the PS5 controller, it is the “PS” button.</p>

<p>One important thing to note is that the <code class="language-plaintext highlighter-rouge">ROS_DOMAIN_ID</code> environment variable must be the same on the dev machine running the “teleop_twist_joy” and in the Docker container running the robot. This environment variable controls who can access the robot’s published data and which ROS2 applications can interact with one another. If your robot cannot see the commands published by the joystick, please check that the <code class="language-plaintext highlighter-rouge">ROS_DOMAIN_ID</code> is set correctly.</p>
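<p>ROS2 falls back to domain 0 when the variable is unset, so an unset variable on one side and an explicit value on the other is a common source of silent discovery failures. A quick way to compare the two sides is a tiny check script; this helper is illustrative, not part of the project:</p>

```python
import os

def effective_domain_id(env) -> int:
    """Return the ROS 2 domain ID a process would use.
    ROS 2 falls back to domain 0 when ROS_DOMAIN_ID is unset."""
    return int(env.get("ROS_DOMAIN_ID", "0"))

# Run this on both the dev machine and inside the Docker container;
# the two printed values must match for DDS discovery to work.
print(effective_domain_id(os.environ))
```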

<p>To start the ROS2 <code class="language-plaintext highlighter-rouge">teleop_twist_joy</code>:</p>

<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>ros2 launch teleop_twist_joy teleop-launch.py joy_config:<span class="o">=</span><span class="s1">'ps3'</span> publish_stamped_twist:<span class="o">=</span><span class="nb">true</span>
</code></pre></div></div>

<p>Here is a demo of the car driving autonomously using a <a href="https://github.com/jiayihoffman/lego_audi_etron/blob/main/scripts/demo_drive.py">ROS2 script</a>, visualized in RViz.</p>

<iframe width="800" height="468" src="https://www.youtube.com/embed/SI8qrDnAVWQ?autoplay=1&amp;mute=0">
</iframe>

<p>Enjoy!</p>]]></content><author><name>Modular Machines LLC</name></author><category term="LEGO" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/lego/IMG_4165.jpeg" /><media:content medium="image" url="https://www.modularmachines.ai/assets/lego/IMG_4165.jpeg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Remotely Control Security Robots over the Internet</title><link href="https://www.modularmachines.ai/droid_vision/2025/12/15/Manage-Robot-Over-Internet.html" rel="alternate" type="text/html" title="Remotely Control Security Robots over the Internet" /><published>2025-12-15T14:45:28+00:00</published><updated>2025-12-15T14:45:28+00:00</updated><id>https://www.modularmachines.ai/droid_vision/2025/12/15/Manage-Robot-Over-Internet</id><content type="html" xml:base="https://www.modularmachines.ai/droid_vision/2025/12/15/Manage-Robot-Over-Internet.html"><![CDATA[<p>In robotics, the Robot Operating System (ROS) offers software libraries and tools that help developers connect motors, sensors, and other software, speeding up the development of advanced robots.</p>

<p>However, ROS nodes in robots communicate with each other via TCP/UDP sockets, which are not accessible outside the robot’s LAN. Rosbridge WebSocket is a communication interface that enables non-ROS applications, such as web or mobile apps, to interact with a ROS system over a WebSocket connection established via a standard HTTP upgrade.</p>

<p>To use the robot as a security robot, I need to be able to interact with it over the internet while I’m away. How can I do that securely?</p>

<p>In my previous post, <a href="/droid_vision/2025/07/07/Video-Streaming-Server.html">Live Video Streaming of Security Robots</a>, I explained how to stream video from the robot over the internet securely. In this article, I will discuss how to control the robot over the internet securely.</p>

<h3 id="architecture">Architecture</h3>

<p>There are three components involved in the project. The following sections explain in detail what they are and why they are needed:</p>
<ol>
  <li>The <strong>robot</strong>, where the ROS framework and ROS bridge operate.</li>
  <li>The <strong>cloud server</strong>, which acts as an SSH Tunnel to relay WebSocket messages from the mobile app to the robot.</li>
  <li>The <strong>mobile app</strong>, such as <a href="/droid_vision/2025/03/06/DroidVision-Teleop.html">Droid Vision with built-in Joystick and Keypad</a>, connects to the cloud server <code class="language-plaintext highlighter-rouge">wss://cloud-server-domain.com/rosbridge/</code> and sends ROS commands in JSON format.</li>
</ol>

<p><br />
<img src="/assets/teleop/rosbridge_tunnel.drawio.png" alt="Alt text" /></p>

<h4 id="the-robot">The Robot</h4>

<p>Robot Operating System (ROS) runs entirely on the robot:</p>
<ol>
  <li>The robot exposes ROS2 topics, such as /cmd_vel, /odom, and sensor topics, on the robot’s LAN. For details on running ROS2 on a physical robot, please see my article <a href="/security_robot/2025/02/22/SecurityRobot-Ros2_control.html">ROS 2 Control, Robot Control the Right Way</a>
    <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> ros2 launch my_bot robot.launch.py
</code></pre></div>    </div>
  </li>
  <li>The robot launches the ROS bridge server to accept ROS commands over a WebSocket connection
    <div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> ros2 launch rosbridge_server rosbridge_websocket_launch.xml
</code></pre></div>    </div>
  </li>
</ol>

<p>At this point, if I expose the robot’s IP address, I can control it from the iOS app through the websocket URL: <code class="language-plaintext highlighter-rouge">ws://robot-ip-address:9090</code>. However, I am not comfortable making my robot’s IP address public.</p>

<p>After exploring various options, I chose the “Reverse SSH Tunnel” method. It is similar to a Cloudflare tunnel but is self-hosted on a cloud compute instance. Since I already use a cloud instance as the media relay server for video streaming (<a href="/droid_vision/2025/07/07/Video-Streaming-Server.html">Live Video Streaming of Security Robots</a>), this approach fits well with my robot’s deployment architecture.</p>

<p>Here is the command I run on the robot to create a reverse SSH tunnel. The robot makes an outbound SSH connection to the cloud server, and the tunnel carries WebSocket traffic back to the robot.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>ssh -N -R 10000:localhost:9090 user@cloud-server-ip
</code></pre></div></div>

<p>Explanation of the argument <code class="language-plaintext highlighter-rouge">-R 10000:localhost:9090</code>:</p>
<ul>
  <li>“-R” is reverse port forwarding. This means we open port 10000 on the cloud server and forward all traffic arriving there to port 9090 on the robot.</li>
  <li>“localhost” refers to the robot where this “ssh” command runs.</li>
  <li>The rosbridge only listens locally on port 9090 on the robot. It is not accessible from the outside.</li>
</ul>

<p>The SSH tunnel disappears if the robot reboots or the network drops. Therefore, I have a <code class="language-plaintext highlighter-rouge">stream_control.py</code> program that dynamically opens and closes the tunnel based on whether the user goes live on the robot.</p>
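<p>The internals of stream_control.py aren’t shown here, but the idea can be sketched as a small wrapper around the ssh command; all names below are illustrative, not the script’s actual code:</p>

```python
import subprocess

def tunnel_command(remote_port: int, local_port: int, user_host: str) -> list:
    """Build the reverse-tunnel command: expose remote_port on the cloud
    server and forward its traffic to local_port on the robot."""
    return ["ssh", "-N", "-R", f"{remote_port}:localhost:{local_port}", user_host]

class TunnelController:
    """Open/close the reverse SSH tunnel on demand, e.g. when the user goes live."""

    def __init__(self, remote_port=10000, local_port=9090,
                 user_host="user@cloud-server-ip"):
        self._cmd = tunnel_command(remote_port, local_port, user_host)
        self._proc = None

    def open(self):
        # (Re)start the tunnel if it isn't currently running.
        if self._proc is None or self._proc.poll() is not None:
            self._proc = subprocess.Popen(self._cmd)

    def close(self):
        if self._proc is not None and self._proc.poll() is None:
            self._proc.terminate()
            self._proc.wait()
        self._proc = None
```

<p>A production version would also restart the tunnel automatically after failures, for example with autossh or a systemd service, since the tunnel disappears on reboots and network drops.</p>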

<h4 id="cloud-server">Cloud Server</h4>
<p>The cloud server here functions solely as a relay and is not a ROS node. For simplicity, there is no ROS component on the cloud server.</p>

<p>Once the Reverse SSH Tunnel is established on the robot, any traffic on the cloud server’s port 10000 is automatically forwarded to the robot’s rosbridge.</p>

<p>However, there are two issues:</p>
<ol>
  <li>The WebSocket traffic isn’t encrypted.</li>
  <li>Exposing the tunnel directly would require an ingress firewall rule on port 10000 for the cloud instance.</li>
</ol>

<p>To improve security, I will use Nginx to terminate the TLS WebSocket connection and upgrade/forward the WebSocket frames to the rosbridge tunnel at port 10000.</p>

<p>Here is the corresponding Nginx configuration:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>server {
    listen 443 ssl;
    server_name cloud-server-domain.com;

    # SSL cert paths
    ssl_certificate ...
    ssl_certificate_key ...

    location /rosbridge/ {
        proxy_pass http://127.0.0.1:10000/;

        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";

        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # prevents 101 switching errors
        proxy_read_timeout 3600;
        proxy_send_timeout 3600;
    }
}
</code></pre></div></div>
<p>This simple configuration encrypts WebSocket traffic over the internet, with only the standard HTTPS port open on the cloud server. Messages to <code class="language-plaintext highlighter-rouge">wss://cloud-server-domain.com/rosbridge/</code> are forwarded to port 10000, which the tunnel automatically relays to the robot’s rosbridge at port 9090.</p>

<h4 id="droid-vision-ios-app">Droid Vision iOS app</h4>

<p>Now, the iOS app can connect to <code class="language-plaintext highlighter-rouge">wss://cloud-server-domain.com/rosbridge</code> to send commands to the “/cmd_vel” topic, which the robot uses to control its driving speed and direction.</p>
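<p>The rosbridge protocol wraps each ROS interaction in a small JSON envelope. Here is a hedged sketch of the publish operation an app could send for /cmd_vel; it uses a plain geometry_msgs/msg/Twist for clarity, while a stamped Twist would add a header field depending on the controller configuration:</p>

```python
import json

def cmd_vel_message(linear_x: float, angular_z: float) -> str:
    """Serialize a rosbridge 'publish' operation carrying a
    geometry_msgs/msg/Twist for the /cmd_vel topic."""
    return json.dumps({
        "op": "publish",
        "topic": "/cmd_vel",
        "msg": {
            "linear": {"x": linear_x, "y": 0.0, "z": 0.0},
            "angular": {"x": 0.0, "y": 0.0, "z": angular_z},
        },
    })
```

<p>The app sends this string over the wss connection; rosbridge deserializes it and republishes the message on the robot’s /cmd_vel topic.</p>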

<p><img src="/assets/teleop/IMG_0519.PNG" alt="Alt text" /></p>

<h3 id="conclusion">Conclusion</h3>

<p>This approach is highly secure because:</p>
<ul>
  <li>The robot is never exposed to the internet.</li>
  <li>SSH encryption secures the tunnel and WebSocket connection.</li>
  <li>Only HTTPS port 443 is open on the cloud server.</li>
  <li>The cloud relay can restrict what the user can do to the robot.</li>
</ul>

<p>Here is a quick summary of what each component does:</p>

<h4 id="robot">Robot</h4>
<ul>
  <li>Runs ROS2</li>
  <li>Runs rosbridge on port 9090 locally</li>
  <li>Opens an SSH reverse tunnel between the robot and the remote cloud server.</li>
</ul>

<h4 id="cloud-server-1">Cloud Server</h4>
<ul>
  <li>The cloud server functions solely as a pass-through, not as a ROS node. It relays messages to the robot via the SSH tunnel.</li>
  <li>It provides a secure WebSocket connection for the mobile app.</li>
</ul>

<h4 id="ios-app">iOS app</h4>
<ul>
  <li>Sends ROS commands to the robot by invoking the secure cloud server URL.</li>
</ul>]]></content><author><name>Modular Machines LLC</name></author><category term="Droid_Vision" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/teleop/rosbridge_tunnel.drawio.png" /><media:content medium="image" url="https://www.modularmachines.ai/assets/teleop/rosbridge_tunnel.drawio.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Stock Analyzer AI Agent</title><link href="https://www.modularmachines.ai/ai_agent/2025/07/14/Stock-Analyzer-AI-Agent.html" rel="alternate" type="text/html" title="Stock Analyzer AI Agent" /><published>2025-07-14T14:45:28+00:00</published><updated>2025-07-14T14:45:28+00:00</updated><id>https://www.modularmachines.ai/ai_agent/2025/07/14/Stock-Analyzer-AI-Agent</id><content type="html" xml:base="https://www.modularmachines.ai/ai_agent/2025/07/14/Stock-Analyzer-AI-Agent.html"><![CDATA[<p>I enjoy trading stocks for long-term investing because it requires knowledge, analytical skills, and a long-term perspective. Over time, I have developed technical skills in analyzing stock charts to identify patterns, resistance and support levels, and to gauge whether the stock is becoming bearish or bullish. I also enjoy reading news and listening to tech podcasts to stay up-to-date on upcoming trends.</p>

<p>In addition to personal interests, in the era of AGI, wealth is predominantly generated through equity market investments.</p>

<h2 id="a-stock-ai-agent">A Stock AI Agent</h2>
<p>That said, I don’t always have time to watch live stock charts. That makes me want to create a stock app where AI agents analyze the charts and relate them to market sentiment to give trading insights.</p>

<p>Additionally, in many ways, trading stocks is more of a psychological game than a numbers game. Therefore, using AI agents to recommend stock actions can be helpful during major pullbacks when everyone is scared, as well as at market peaks when everyone is greedy. AI has a much calmer mind than we humans. :)</p>

<p>I have a few stocks to start with, but I’d love to add more to the list as the AI agent researches and recommends new stocks in sectors that interest me.</p>

<h3 id="architecture">Architecture</h3>
<p>Here’s a brief overview of the architecture. The benefit of agent-based design is that:</p>
<ol>
  <li>It is modular and flexible, with expert stock-analyst intelligence built in.</li>
  <li>This design supports “in the loop” evaluation, enabling the Orchestrator agent to oversee and critique the output of other agents, promoting reflection - a key aspect of the agentic reasoning pattern that helps improve performance.</li>
  <li>Additionally, the logging agent records the stock’s insights and feedback to facilitate ongoing iterative review and improvement.</li>
</ol>

<p><a href="/assets/ai_agent/stock_analyzer_components.drawio.png">
  <img src="/assets/ai_agent/stock_analyzer_components.drawio.png" />
</a></p>

<p>Here are the responsibilities of each agent:</p>

<ul>
  <li>The Orchestrator Agent initiates the flow (e.g., on schedule or event).</li>
  <li>It instructs the Technical Analysis Agent and Sentiment Agent to analyze the stocks.</li>
  <li>The Technical Agent contacts the Market Data Agent for stock data, and the Sentiment Agent uses the News Agent to fetch news data for the stocks.</li>
  <li>The Orchestrator Agent combines all signals and makes a recommendation.</li>
  <li>All insights are logged by the Logging Agent.</li>
</ul>

<p>The word “agent” is a general term here. It can refer to a large language model (LLM) or an entity that performs specific tasks. For example:</p>
<ul>
  <li>The “Technical Analysis Agent” is a large language model. It calculates various technical indicators for stocks and uses reasoning to provide a technical assessment.</li>
  <li>Conversely, the “Market Data Agent” retrieves stock data from the stock exchange.</li>
</ul>
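<p>The division of labor above can be sketched in plain Python. In the real system each analysis agent wraps an LLM; the toy rules and function names below are illustrative only:</p>

```python
# Illustrative sketch of the orchestration flow; all names and rules are mine.
def market_data_agent(symbol):
    # Placeholder for fetching price data from the stock exchange.
    return {"symbol": symbol, "rsi": 78}

def news_agent(symbol):
    # Placeholder for fetching recent headlines.
    return ["Company beats earnings expectations"]

def technical_analysis_agent(data):
    # In the real system this is an LLM reasoning over many indicators.
    return "overbought" if data["rsi"] > 70 else "neutral"

def sentiment_agent(headlines):
    return "bullish" if any("beats" in h for h in headlines) else "neutral"

def orchestrator(symbol):
    # Combine the technical and sentiment signals into a recommendation.
    technical = technical_analysis_agent(market_data_agent(symbol))
    sentiment = sentiment_agent(news_agent(symbol))
    if technical == "overbought":
        return "trim into strength"
    return "hold" if sentiment == "bullish" else "watch"
```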

<h3 id="technologies">Technologies</h3>
<p>I use ChatGPT-5 as the large language model (LLM) and LangGraph as the agent framework. The agent backend is built in Python, and the Angular web user interface was developed entirely with Cursor, an AI-assisted coding tool.</p>

<p>The agent functions as a REST service with endpoints for on-demand stock analysis requests. It also runs the “watchlist” job in the background to automatically generate insightful alerts for the user.</p>

<!-- Additionally, it has the "Trade Monitoring" job running in the background by the orchestrator to oversee the stocks. Just like a real Wall Street trader, the job uses a scanner that constantly monitors the list of stocks and only drills down when the technicals indicate a potential move worth trading. -->

<!-- #### LangGraph
Here is the generated LangGraph illustrating the agent workflow where the task is divided into fixed subtasks for greater accuracy and predictability due to the nature of the use case. The workflow includes a human approval step to review and authorize trade execution. 

<a href="/assets/ai_agent/agent_workflow_graph.png" >
  <img src="/assets/ai_agent/agent_workflow_graph.png" width="500"/>

The graph describes nodes involved in Prompt Chaining, where each agent node processes the output of the previous one. Here is the code for chaining the nodes. 

```
def build_state_graph(self):
    workflow = StateGraph(State)

    # Add nodes
    workflow.add_node("fetch_market_data", self.fetch_market_data)
    workflow.add_node("check_technical_indicators", self.check_technical_indicators)
    workflow.add_node("technical_analysis", self.technical_analysis)
    ...

    # Add edges to connect nodes
        workflow.add_edge(START, "fetch_market_data")
        workflow.add_edge("fetch_market_data", "check_technical_indicators")
        workflow.add_conditional_edges(
            "check_technical_indicators", self.check_should_analyze, {True: "technical_analysis", False: END}
        )
    ...

    # Compile
    memory = MemorySaver()
    graph = workflow.compile(interrupt_before=["user_trade_approval"], checkpointer=memory)
    return graph
``` -->

<h3 id="the-stock-analyzer-ai-agent">The Stock Analyzer AI Agent</h3>
<p>Here is the Stock Analyzer, the AI agent that offers professional stock technical insights and market sentiment analysis.</p>

<!-- The Stock Analyzer app is available at: [https://stock-analyzer.modularmachines.ai](https://stock-analyzer.modularmachines.ai/). It is access-controlled. If you'd like to try it, let me know, and I will create an account for you. You can reset your password once you log in.

Here are the screenshots of the product: -->

<p><a href="/assets/ai_agent/oklo_screenshot.png">
  <img src="/assets/ai_agent/oklo_screenshot.png" /></a></p>

<p>In addition to “Quick Insights,” the user can also chat with the agent about specific stock topics. Unlike a generic chatbot, the agent has access to the stock’s charts and the latest news, so it can provide the user with solid data points for quick decision-making.</p>

<p><a href="/assets/ai_agent/T_response1.png">
  <img src="/assets/ai_agent/T_response1.png" /></a></p>

<p><a href="/assets/ai_agent/T_response2.png">
  <img src="/assets/ai_agent/T_response2.png" /></a></p>

<h3 id="testimony">Testimony</h3>

<h4 id="palantir-pltr">Palantir (PLTR)</h4>

<p>Two days before Palantir’s earnings report, the Stock Analyzer’s insights recommended holding the position and not trimming it. It also noted that news headlines reinforced the technical outlook and supported a bullish run. After the earnings report, PLTR stock surged by 22%, from $154 to $188.</p>

<p>Then, on Friday, August 08, the Analyzer recommended that I trim slightly and take some profits because the stock was strongly overbought. One week later, the stock dropped 6% to $177. Ten days later, on August 19, PLTR sat at $157, 16% below the price at which the Stock Analyzer recommended trimming.</p>

<p><a href="/assets/ai_agent/pltr_stock_chart.png">
  <img src="/assets/ai_agent/pltr_stock_chart.png" /></a></p>

<p><a href="/assets/ai_agent/pltr_analysis_8-3.png">
  <img src="/assets/ai_agent/pltr_analysis_8-3.png" /></a></p>

<p><a href="/assets/ai_agent/pltr_analysis_8-8.png">
  <img src="/assets/ai_agent/pltr_analysis_8-8.png" /></a></p>

<h4 id="oklo-inc-oklo">Oklo Inc. (OKLO)</h4>

<p>On September 25, the Stock Analyzer Agent recommended that I trim my OKLO position modestly at $131, as the OKLO chart indicates a loss of momentum and signals a potential distribution following the big run.</p>

<p>This proved to be a very helpful tip. Over the next two trading days, OKLO fell more than $21, closing at $110. Additionally, CNBC later reported that “<a href="https://www.cnbc.com/2025/09/25/oklo-nuclear-shares-fall-ai-data-center.html">Oklo has also seen a cluster of insider selling over the past few days</a>”, which matches my AI agent’s speculation based on the chart.</p>

<p><a href="/assets/ai_agent/oklo_stock_chart.png">
  <img src="/assets/ai_agent/oklo_stock_chart.png" /></a></p>

<p><a href="/assets/ai_agent/oklo_analysis_9-25.png">
  <img src="/assets/ai_agent/oklo_analysis_9-25.png" /></a></p>

<p>Cheers!</p>]]></content><author><name>Modular Machines LLC</name></author><category term="AI_Agent" /><summary type="html"><![CDATA[I enjoy trading stocks for long-term investing because it requires knowledge, analytical skills, and a long-term perspective. Over time, I have developed technical skills in analyzing stock charts to identify patterns, resistance and support levels, and to gauge whether the stock is becoming bearish or bullish. I also enjoy reading news and listening to tech podcasts to stay up-to-date on upcoming trends.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/ai_agent/pltr_stock_chart.png" /><media:content medium="image" url="https://www.modularmachines.ai/assets/ai_agent/pltr_stock_chart.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Live Video Streaming of Security Robots</title><link href="https://www.modularmachines.ai/droid_vision/2025/07/07/Video-Streaming-Server.html" rel="alternate" type="text/html" title="Live Video Streaming of Security Robots" /><published>2025-07-07T14:45:28+00:00</published><updated>2025-07-07T14:45:28+00:00</updated><id>https://www.modularmachines.ai/droid_vision/2025/07/07/Video-Streaming-Server</id><content type="html" xml:base="https://www.modularmachines.ai/droid_vision/2025/07/07/Video-Streaming-Server.html"><![CDATA[<p>In one of my early posts, <a href="/droid_vision/2024/11/19/DroidVision-RC.html">Droid Vision on an RC Truck</a>, I showed how to set up a streaming server on a mobile robot using GStreamer and visualize the robot’s surroundings from my mobile device. This works well if I’m using the robot at home to record videos of pets and flowers. Both the phone and the robot are on the same network, allowing the phone to access the robot’s IP address directly.</p>

<p>However, when I use the robot as a security camera, I am away from home and the robot’s IP address is not reachable from my phone. How can I monitor my home in real-time while I am away? Exposing the robot’s IP address publicly poses a significant security risk and is not an option.</p>

<p><img src="/assets/media_server/IMG_3341.jpeg" /></p>

<h2 id="relay-the-video-using-a-public-media-server">Relay the Video using a Public Media Server</h2>
<p>The solution is to use a public server to relay the video: the video stream from the robot is sent to a cloud server, which then re-publishes the stream to the mobile device. This way, the robot can stay within the private network with all the safety and security, while its video is accessible from a public media server protected by credentials and a firewall.</p>

<p>All video data sent to the server stays in memory and simply passes through; nothing is stored unless recording is explicitly requested.</p>

<p><a href="/assets/media_server/video_relay.drawio.png">
  <img src="/assets/media_server/video_relay.drawio.png" />
</a></p>

<h3 id="mediamtx-media-server">MediaMTX Media Server</h3>
<p>A standard relay option is an RTMP media server. RTMP (Real-Time Messaging Protocol) is designed for streaming audio, video, and data over the internet with low latency. It allows live video and audio from a source (such as a camera) to be sent to a platform (like YouTube Live or Twitch) for viewers to watch in real time.</p>

<p>The RTMP media server I use is <a href="https://github.com/bluenviron/mediamtx">MediaMTX</a>, an open-source solution that supports live streaming via RTSP, HLS, and WebRTC. Therefore, the Droid Vision app can continue using RTSP, and I just need to change the RTSP URL to the public media server: <code class="language-plaintext highlighter-rouge">rtsp://PUBLIC_CLOUD_SERVER:8554/live/stream</code>.</p>

<p>MediaMTX is easy to run. I can run it as a Docker container or download and run its binaries. The binary download page is on <a href="https://github.com/bluenviron/mediamtx/releases">GitHub</a>.</p>

<p>Here’s the output from MediaMTX, which I ran as a Docker container. When the robot publishes the video to the MediaMTX server, the message “[conn &lt;robot ip&gt;:34668] is publishing to path ‘live/stream’” is printed to the screen. When the Droid Vision app contacts the media server for the video stream, the message “[session &lt;session id&gt;] is reading from path ‘live/stream’” is displayed.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>% docker run --rm -it --network=host bluenviron/mediamtx:latest

2025/07/07 21:02:36 INF MediaMTX v1.13.0
2025/07/07 21:02:36 INF configuration loaded from /mediamtx.yml
2025/07/07 21:02:36 INF [RTSP] listener opened on :8554 (TCP), :8000 (UDP/RTP), :8001 (UDP/RTCP)
2025/07/07 21:02:36 INF [RTMP] listener opened on :1935
2025/07/07 21:02:36 INF [HLS] listener opened on :8888
2025/07/07 21:02:36 INF [WebRTC] listener opened on :8889 (HTTP), :8189 (ICE/UDP)
2025/07/07 21:02:36 INF [SRT] listener opened on :8890 (UDP)
2025/07/07 21:03:01 INF [RTMP] [conn &lt;robot ip&gt;:34668] opened
2025/07/07 21:03:03 INF [RTMP] [conn &lt;robot ip&gt;:34668] is publishing to path 'live/stream', 1 track (H264)
2025/07/07 21:03:10 INF [RTSP] [conn &lt;mobile device ip&gt;:62634] opened
2025/07/07 21:03:10 INF [RTSP] [session c7b3d9d8] created by &lt;mobile device ip&gt;:62634
2025/07/07 21:03:11 INF [RTSP] [session c7b3d9d8] is reading from path 'live/stream', with UDP, 1 track (H264)

</code></pre></div></div>

<!-- To enable recording through the MediaMTX media server add "runOnPublish: ffmpeg" option to the mediamtx.yml file.
```
paths:
  all:
    runOnPublish: ffmpeg -i rtsp://localhost:$RTSP_PORT/$RTSP_PATH -c copy myfile.mp4
``` -->

<h3 id="video-stream-pushes-to-the-media-server">Push the Video Stream to the Media Server</h3>
<p>Here is the GStreamer pipeline for the robot to publish the video stream to the media server. Please replace PUBLIC_CLOUD_SERVER with your server’s IP address in the cloud.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>gst-launch-1.0 libcamerasrc ! \
  video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! \
  videoconvert ! \
  queue ! \
  x264enc tune=zerolatency speed-preset=ultrafast bitrate=1500 ! \
  queue ! \
  flvmux streamable=true name=mux ! \
  queue ! \
  rtmpsink location="rtmp://PUBLIC_CLOUD_SERVER/live/stream"
</code></pre></div></div>

<p>Here is the corresponding Python code “push_rtmp.py” for the robot. Either method can publish the video stream to the RTMP server.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import gi

gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

class RTMPPush:
    def __init__(self):
        Gst.init(None)

        self.rtmp_url = "rtmp://PUBLIC_CLOUD_SERVER/live/stream"

        self.pipeline = Gst.parse_launch(
            f"""
            libcamerasrc !
            video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 !
            videoconvert !
            queue !
            x264enc tune=zerolatency speed-preset=ultrafast bitrate=1500 key-int-max=30 !
            queue !
            flvmux streamable=true name=mux !
            queue !
            rtmpsink location={self.rtmp_url}
            """
        )

    def run(self):
        self.pipeline.set_state(Gst.State.PLAYING)
        loop = GLib.MainLoop()
        loop.run()

if __name__ == "__main__":
    pusher = RTMPPush()
    pusher.run()

</code></pre></div></div>

<p>RTMP is very efficient, and MediaMTX is just a relay. Additionally, the receiving pipeline on the Droid Vision app uses UDP, which is extremely fast. So, there’s no noticeable delay when using the MediaMTX server to route the video stream.</p>

<h3 id="video-stream-fetched-for-the-droid-vision-app">Fetch the Video Stream in the Droid Vision App</h3>
<p>As mentioned earlier, we will continue to use RTSP to stream video to the mobile app. Therefore, the Droid Vision app only needs a URL update. The new URL is now: <code class="language-plaintext highlighter-rouge">rtsp://PUBLIC_CLOUD_SERVER:8554/live/stream</code>.</p>

<h3 id="protect-mediamtx-with-credentials">Protect MediaMTX with credentials</h3>
<p>By default, MediaMTX has no authentication — anyone who knows its RTSP/RTMP/HLS URL can push or pull. So it’s essential to protect it, especially when it runs in the public cloud.</p>

<p>MediaMTX has built-in user/password auth for:</p>
<ul>
  <li>Publishers</li>
  <li>Readers/viewers</li>
</ul>

<p>I configured users and passwords in the <code class="language-plaintext highlighter-rouge">mediamtx.yml</code> file. The credentials apply to all paths.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># mediamtx.yml
paths:
  all_others:
    # protect push
    publishUser: mypublisher
    publishPass: secret123

    # protect pull
    readUser: myviewer
    readPass: view456
</code></pre></div></div>

<p>To run the docker container with the updated yaml file, I mount the local “mediamtx.yml” into the container:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>% docker run --rm -it --network=host \
    -v /path/to/local/config/mediamtx.yml:/mediamtx.yml \
    bluenviron/mediamtx:latest
</code></pre></div></div>

<p>When publishing or reading the stream, I include the credentials in the URL. For example, on the robot, I change the rtmp_url of the RTMPPush class to:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>MEDIA_SERVER = os.getenv("MEDIA_SERVER")
MEDIA_PUBLISH_USER = os.getenv("MEDIA_PUBLISH_USER")
MEDIA_PUBLISH_PASS = os.getenv("MEDIA_PUBLISH_PASS")

self.rtmp_url = f"rtmp://{MEDIA_SERVER}/live/stream?user={MEDIA_PUBLISH_USER}&amp;pass={MEDIA_PUBLISH_PASS}"
</code></pre></div></div>
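<p>One detail worth noting: if the username or password contains characters such as “&amp;”, “@”, or “/”, building the URL with a plain f-string as above breaks it. A small helper (the function name is my own, not part of MediaMTX) can percent-encode the credentials before appending them:</p>

```python
# Sketch: append percent-encoded credentials to an RTSP/RTMP URL, using the
# "?user=...&pass=..." query form shown above. quote(..., safe="") escapes
# every reserved character, so passwords with "&", "@", or "/" survive intact.
from urllib.parse import quote

def with_credentials(base_url: str, user: str, password: str) -> str:
    # Use "&" if the URL already carries a query string, "?" otherwise.
    sep = "&" if "?" in base_url else "?"
    return f"{base_url}{sep}user={quote(user, safe='')}&pass={quote(password, safe='')}"

url = with_credentials("rtmp://example.com/live/stream", "mypublisher", "p@ss/123")
# -> rtmp://example.com/live/stream?user=mypublisher&pass=p%40ss%2F123
```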

<p>The Droid Vision app has been updated so users can configure the “Use Media Server” option with the media server’s username and password as separate fields. The app automatically inserts these credentials into the URL.</p>

<p><a href="/assets/media_server/IMG_0597.PNG">
  <img src="/assets/media_server/IMG_0597.PNG" />
</a></p>

<p>This method of protection is straightforward but not ideal for large-scale or highly secure applications because it uses hardcoded credentials. Therefore, MediaMTX offers alternative methods that utilize external authentication. For more information, please see the MediaMTX <a href="https://github.com/bluenviron/mediamtx">product site</a>.</p>

<h2 id="on-demand-video-streaming">On-Demand Video Streaming</h2>
<p>With a security camera, another critical feature is on-demand video streaming. Instead of streaming continuously, the robot streams only when it detects something or when the user requests it.</p>

<h3 id="media-control-server-polling">Media Control Server (Polling)</h3>
<p>A quick solution is for the robot to run a small control client that periodically polls the cloud server for the desired streaming state, which the mobile app sets.</p>
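<p>A minimal sketch of such a polling client, with the HTTP call injected as a callable so it can be swapped or mocked (the endpoint and its JSON payload are hypothetical, not an existing API):</p>

```python
# Polling sketch: the robot repeatedly asks the cloud server for the desired
# streaming state and fires start/stop callbacks only on state transitions.
import json
import urllib.request

class StreamPoller:
    """Poll a control endpoint and trigger callbacks when the state changes."""

    def __init__(self, fetch_state, on_start, on_stop):
        # fetch_state: callable returning "start" or "stop". Injected so the
        # HTTP layer can be replaced or faked in tests.
        self.fetch_state = fetch_state
        self.on_start = on_start
        self.on_stop = on_stop
        self.streaming = False

    def poll_once(self):
        desired = self.fetch_state()
        if desired == "start" and not self.streaming:
            self.on_start()        # e.g. launch the GStreamer pipeline
            self.streaming = True
        elif desired == "stop" and self.streaming:
            self.on_stop()         # e.g. terminate the GStreamer pipeline
            self.streaming = False

def http_fetch_state(url):
    # Hypothetical API: GET returns {"state": "start"} or {"state": "stop"}.
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)["state"]
```

In a real deployment the robot would call <code class="language-plaintext highlighter-rouge">poll_once()</code> in a loop with a sleep interval, trading responsiveness against server load.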

<p><a href="/assets/media_server/media_control_server.drawio.png">
  <img src="/assets/media_server/media_control_server.drawio.png" />
</a></p>

<h3 id="mqtt-broker-messaging">MQTT Broker (Messaging)</h3>
<p>The second approach involves using Message Queuing Telemetry Transport (MQTT) for push notifications: set up a public cloud MQTT broker. The robot subscribes to a specific MQTT topic, while the mobile app publishes <code class="language-plaintext highlighter-rouge">start</code> or <code class="language-plaintext highlighter-rouge">stop</code> messages to that topic.</p>

<p>Compared to the previous solution, this method eliminates polling, allowing the robot to react instantly. It also scales well to many cameras. MQTT is widely used for smart cameras, doorbells, drones, and other devices, which is why I chose this approach for my security robot.</p>

<h4 id="media-bridge-server">Media Bridge Server</h4>
<p>To use messaging in the mobile app, I need to make some deployment adjustments: instead of sending start/stop commands directly from the mobile app, which would require embedding the MQTT Swift client library, I use a cloud-based bridge server. The bridge server is an HTTP server that posts start/stop messages to the MQTT broker.</p>

<p>This setup keeps the Droid Vision app simple - it can use HTTP requests to start and stop the robot’s streaming through the bridge server rather than sending direct messages by embedding the MQTT client library.</p>
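<p>As a rough sketch of what such a request looks like (the host, robot name, and API key are placeholders; the endpoint paths follow the bridge server’s routes), the app only needs a plain HTTPS POST with an authentication header:</p>

```python
# Sketch of the HTTP call a client makes to the bridge server.
import urllib.request

def stream_request(host: str, robot: str, action: str, api_key: str) -> urllib.request.Request:
    # action is "start" or "stop"; authentication uses the X-API-Key header.
    return urllib.request.Request(
        f"https://{host}/robots/{robot}/stream/{action}",
        method="POST",
        headers={"X-API-Key": api_key},
    )

req = stream_request("bridge.example.com", "Robot1", "start", "my-secret-key")
# Pass req to urllib.request.urlopen(req) to actually send it.
```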

<p><a href="/assets/media_server/MQTT.drawio.png">
  <img src="/assets/media_server/MQTT.drawio.png" />
</a></p>

<h4 id="1-install-and-config-the-mosquitto-mqtt-broker-on-the-cloud-server">1. Install and Config the Mosquitto MQTT Broker on the Cloud Server</h4>

<p>Update packages, install, and configure Mosquitto MQTT broker:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sudo apt update
sudo apt install mosquitto mosquitto-clients -y

# Enable and start Mosquitto service
sudo systemctl enable mosquitto
sudo systemctl start mosquitto

# add password auth
sudo mosquitto_passwd -c /etc/mosquitto/passwd mymqttuser
# Enter password when prompted

</code></pre></div></div>

<p>Create Mosquitto config <code class="language-plaintext highlighter-rouge">/etc/mosquitto/conf.d/default.conf</code> as the “root” user:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>allow_anonymous false
password_file /etc/mosquitto/passwd
listener 1883 0.0.0.0
</code></pre></div></div>

<p>Restart Mosquitto for the config changes:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>sudo systemctl restart mosquitto
</code></pre></div></div>

<h4 id="2-fastapi-media-bridge-on-the-cloud-server">2. FastAPI Media Bridge on the Cloud Server</h4>

<p>Install FastAPI and MQTT client:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Create virtual environment:
python3 -m venv venv
source venv/bin/activate

# Install FastAPI and MQTT client
pip install fastapi uvicorn paho-mqtt
</code></pre></div></div>

<p>Below is the Python code <code class="language-plaintext highlighter-rouge">bridge.py</code> for the media bridge server. The APIs use the “X-API-Key” header for authentication.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
from typing import Optional

from fastapi import Depends, FastAPI, Header, HTTPException
from paho.mqtt import publish

app = FastAPI()

MQTT_BROKER = os.getenv("MQTT_BROKER", "localhost")
MQTT_PORT = int(os.getenv("MQTT_PORT", "1883"))
MQTT_USER = os.getenv("MQTT_USER")
MQTT_PASS = os.getenv("MQTT_PASS")
API_KEY = os.getenv("API_KEY")

def _publish(command: str, robot_name: str) -&gt; None:
    auth: Optional[dict] = None
    if MQTT_USER and MQTT_PASS:
        auth = {"username": MQTT_USER, "password": MQTT_PASS}

    # Use robot-specific topic: robots/{robot_name}/stream
    topic = f"robots/{robot_name}/stream"

    try:
        publish.single(
            topic,
            command,
            hostname=MQTT_BROKER,
            port=MQTT_PORT,
            auth=auth,
        )
    except Exception as exc:  # noqa: BLE001
        raise HTTPException(status_code=502, detail=f"Failed to publish MQTT message: {exc}") from exc

def verify_api_key(x_api_key: Optional[str] = Header(None, alias="X-API-Key")) -&gt; None:
    """Verify API key authentication."""
    if not API_KEY:
        # If no API key is configured, skip authentication
        return
    
    if not x_api_key or x_api_key != API_KEY:
        raise HTTPException(
            status_code=401, 
            detail="Invalid or missing API key. Include X-API-Key header."
        )


@app.get("/")
def index():
    return {"msg": "Media bridge service running!"}


@app.post("/robots/{robot_name}/stream/start")
def start(robot_name: str, _: None = Depends(verify_api_key)):
    _publish("start", robot_name)
    return {"msg": f"Published start for robot: {robot_name}"}


@app.post("/robots/{robot_name}/stream/stop")
def stop(robot_name: str, _: None = Depends(verify_api_key)):
    _publish("stop", robot_name)
    return {"msg": f"Published stop for robot: {robot_name}"}
</code></pre></div></div>

<p>Start FastAPI media bridge server in the Python virtual environment:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>uvicorn bridge:app --host 0.0.0.0 --port 8000 --reload
</code></pre></div></div>

<p>Verify that the messages are sent by the bridge server:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>mosquitto_sub -h localhost -u mymqttuser -P &lt;mqtt passwd&gt; -t "robot/stream"
</code></pre></div></div>

<p>For security reasons, the Media bridge must have SSL enabled. I prefer to use Nginx and generate the SSL certificate with Certbot. Nginx acts as a reverse proxy, forwarding stream commands to the FastAPI application. This way, the FastAPI port 8000 remains hidden behind the firewall, and the cloud instance exposes only the default HTTP ports, 443 and 80.</p>
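<p>A minimal Nginx site config for this setup might look like the fragment below. It is a hypothetical sketch: the domain and certificate paths are placeholders, and Certbot normally writes the <code class="language-plaintext highlighter-rouge">ssl_certificate</code> lines itself when it issues the certificate.</p>

```nginx
# Reverse proxy: terminate TLS on 443 and forward to FastAPI on 127.0.0.1:8000
server {
    listen 443 ssl;
    server_name bridge.example.com;   # placeholder domain

    ssl_certificate     /etc/letsencrypt/live/bridge.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/bridge.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```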

<h4 id="3-streaming-control-on-the-robot">3. Streaming Control on the Robot</h4>

<p>The Robot connects to the MQTT broker, waits for messages, and starts or stops GStreamer accordingly.</p>

<p>To begin, I install the “MQTT client” Python library on the robot.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Create virtual environment:
python3 -m venv venv --system-site-packages
source venv/bin/activate

# install MQTT client
pip install paho-mqtt
</code></pre></div></div>

<p>Then I create <code class="language-plaintext highlighter-rouge">stream_control.py</code>, which waits for messages on the robot-specific “robots/{ROBOT_NAME}/stream” topic. If the message is “start”, the robot starts streaming video and publishes it to the media server. If the message is “stop”, the robot stops streaming.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import os
import shlex
import subprocess

import paho.mqtt.client as mqtt

MQTT_BROKER = os.getenv("MQTT_BROKER")
MQTT_PORT = int(os.getenv("MQTT_PORT", "1883"))
MQTT_USER = os.getenv("MQTT_USER")
MQTT_PASS = os.getenv("MQTT_PASS")
ROBOT_NAME = os.getenv("ROBOT_NAME", "Robot1")

# Construct robot-specific topic: robots/{robot_name}/stream
MQTT_TOPIC = f"robots/{ROBOT_NAME}/stream"

MEDIA_SERVER = os.getenv("MEDIA_SERVER")
MEDIA_PUBLISH_USER = os.getenv("MEDIA_PUBLISH_USER")
MEDIA_PUBLISH_PASS = os.getenv("MEDIA_PUBLISH_PASS")

RTMP_URL = f"rtmp://{MEDIA_SERVER}/live/stream?user={MEDIA_PUBLISH_USER}&amp;pass={MEDIA_PUBLISH_PASS}"

GST_PIPELINE = f"""
libcamerasrc ! \
  video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! \
  videoconvert ! \
  queue ! \
  x264enc tune=zerolatency speed-preset=ultrafast bitrate=1500 ! \
  queue ! \
  flvmux streamable=true name=mux ! \
  queue ! \
  rtmpsink location="{RTMP_URL}"
"""

gst_process = None

def on_connect(client, userdata, flags, rc):
    print(f"Connected with result code {rc}")
    print(f"Subscribing to topic: {MQTT_TOPIC}")
    client.subscribe(MQTT_TOPIC)

def on_message(client, userdata, msg):
    global gst_process
    payload = msg.payload.decode()
    print(f"Received: {payload}")

    if payload == "start":
        if gst_process is None:
            print("Starting GStreamer...")
            gst_process = subprocess.Popen(
                shlex.split(f"gst-launch-1.0 {GST_PIPELINE}"),
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE
            )
        else:
            print("Stream already running.")

    elif payload == "stop":
        if gst_process is not None:
            print("Stopping GStreamer...")
            gst_process.terminate()
            gst_process.wait()  # reap the child process so it doesn't linger
            gst_process = None
        else:
            print("No stream to stop.")

# paho-mqtt 2.x requires an explicit callback API version (use mqtt.Client() on 1.x)
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
if MQTT_USER:
    client.username_pw_set(username=MQTT_USER, password=MQTT_PASS)
client.on_connect = on_connect
client.on_message = on_message

client.connect(MQTT_BROKER, MQTT_PORT, 60)
client.loop_forever()
</code></pre></div></div>

<p>Finally, let’s start the robot’s stream control in the Python virtual environment.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>python stream_control.py
</code></pre></div></div>

<p>With that, the robot now waits for the start/stop command for video streaming.</p>

<h4 id="4-start-or-stop-streaming-using-the-droid-vision-app">4. Start or Stop Streaming using the Droid Vision App</h4>

<p>Last but not least, in the Droid Vision app, I configure the media server API key and tap “Go Live View” to start the streaming request on the robot. Closing the streaming view stops the video streaming from the robot.</p>

<p><a href="/assets/media_server/IMG_0599.PNG">
  <img src="/assets/media_server/IMG_0599.PNG" />
</a></p>

<h2 id="all-in-one-picture">All in One Picture</h2>
<p>Putting video relay and on-demand streaming together:</p>

<p><a href="/assets/media_server/remote_video_streaming.drawio.png">
  <img src="/assets/media_server/remote_video_streaming.drawio.png" />
</a></p>

<p>Cheers!</p>]]></content><author><name>Modular Machines LLC</name></author><category term="Droid_Vision" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/media_server/IMG_3341.jpeg" /><media:content medium="image" url="https://www.modularmachines.ai/assets/media_server/IMG_3341.jpeg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Build Backyard Garden Bridges</title><link href="https://www.modularmachines.ai/diy/2025/07/04/Garden-Bridge.html" rel="alternate" type="text/html" title="Build Backyard Garden Bridges" /><published>2025-07-04T13:45:28+00:00</published><updated>2025-07-04T13:45:28+00:00</updated><id>https://www.modularmachines.ai/diy/2025/07/04/Garden-Bridge</id><content type="html" xml:base="https://www.modularmachines.ai/diy/2025/07/04/Garden-Bridge.html"><![CDATA[<p>I live in the Hill Country of Austin, Texas. The weather here can be dry, but we experience thunderstorms throughout the year. When the storms hit, rainwater rushes down from the roof, flowing toward the lawn. Since the house is built on a hill, there is a consistent path that the rainwater travels on the lawn. Over time, the grass along this path is washed away, leaving bare soil, which doesn’t look appealing during sunny days.</p>

<p>A year ago, I had someone install a dry riverbed along the water path to help manage rainwater. The riverbed is made of rocks of different kinds and colors. It loops around the house and directs the water to the backyard and down the hill. Besides its practical use, the riverbed also adds visual interest to the backyard.</p>

<!-- The dry riverbed is made up of three types of rocks: large rocks about the size of two fists line the edges to shape it; small gray rocks form the foundation, and red and white rocks act as accents. We also randomly place some large rocks on top of the dry riverbed to create patterns and break up the uniform appearance. -->

<p><a href="/assets/garden_bridge/IMG_3331.jpeg">
  <img src="/assets/garden_bridge/IMG_3331.jpeg" />
</a></p>

<p><a href="/assets/garden_bridge/IMG_3275.jpeg">
  <img src="/assets/garden_bridge/IMG_3275.jpeg" />
</a></p>

<h2 id="garden-bridges">Garden Bridges</h2>

<p>The drainage problem has been solved, but it introduced a new issue. The yard is now split into two sections. To open the backyard gate, we must walk on the rocks, which disrupts the layered structure of the riverbed. To mow the lawn, my husband has to lift the lawnmower over the rocks. This makes me want to build a garden bridge that is large enough for a lawnmower and a person to cross easily, but small enough to stay roughly within the bounds of the dry riverbed.</p>

<h3 id="tools-and-materia">Tools and Materials</h3>

<p>I have some 2x6 construction heart redwood lumber left over from a previous decking project. I also kept the pallet boards from the Tesla Powerwall and solar panels. The pallet wood is treated and sturdy, making it perfect for the bridge’s foundation. The California redwood is beautiful and a rarity in the Texas Hill Country, since coastal redwood grows only on the American West Coast and has to be shipped to Texas. I also have galvanized nails and decking screws from the earlier project. So, I have most of the materials needed to build the garden bridge; I only need to buy the paint and brackets.</p>

<p>Regarding the tools, I need a miter saw to cut lumber to specific lengths. I also need a jigsaw for making curves, along with a drill, impact driver, sander, hammer, rafter square, tape measure, pencils, and a paintbrush. I purchased these tools during the decking project and plan to reuse them for future woodworking. I enjoy woodworking because it requires creativity, artistry, and engineering precision.</p>

<p><a href="/assets/garden_bridge/IMG_3320.jpeg">
  <img src="/assets/garden_bridge/IMG_3320.jpeg" width="350" />
</a> 
<a href="/assets/garden_bridge/IMG_3326.jpeg">
  <img src="/assets/garden_bridge/IMG_3326.jpeg" width="370" />
</a></p>

<h3 id="design-and-measurement">Design and Measurement</h3>

<p>A flat bridge is easy to build, but an arched one is much more visually appealing. To draw an arch on the wood, I bent a thin, flexible piece of metal and traced along it, using small nails to hold the metal in place on the board. A 3B drawing pencil works best for marking on wood because of its dark, precise lead.</p>

<p><a href="/assets/garden_bridge/IMG_3250.jpeg">
  <img src="/assets/garden_bridge/IMG_3250.jpeg" width="350" />
</a></p>

<p><a href="/assets/garden_bridge/IMG_3297.jpeg">
  <img src="/assets/garden_bridge/IMG_3297.jpeg" />
</a></p>

<p>I use a board to connect the two arched boards to create an H-shaped base for the bridge. My lawnmower is 23 inches wide, so I designed the surface area of the bridge to be 27 inches wide, with an inch overhang on each side of the bridge. Since the board is 1.5 inches thick, the cross board must be 22 inches long - <code class="language-plaintext highlighter-rouge">27 - 1*2 - 1.5*2 = 22</code>.</p>

<p>I cut the redwood boards to 27 inches, and one of the pallet boards to 22 inches long. Once cut, all boards need to be stained and sealed. I like to wash the boards before painting them because it removes dirt and mold and gives the wood a fresh, new look.</p>

<p><a href="/assets/garden_bridge/IMG_3269.jpeg">
  <img src="/assets/garden_bridge/IMG_3269.jpeg" />
</a></p>

<h3 id="installation">Installation</h3>

<p>To connect the two arched boards, I use galvanized nails with metal brackets. I learned this from my decking project, where galvanized nails are used for all joist hangers and joints because they protect against corrosion and offer superior shear strength, supporting the weight and resisting movement. Before driving in the nails, I pre-drilled the holes to prevent the wood from splitting; it also makes the nails go in more easily. I asked ChatGPT, and it suggested using a 1/8-inch drill bit for the 10d galvanized nails.</p>

<p><a href="/assets/garden_bridge/IMG_3298.jpeg">
  <img src="/assets/garden_bridge/IMG_3298.jpeg" />
</a></p>

<p><a href="/assets/garden_bridge/IMG_3299.jpeg">
  <img src="/assets/garden_bridge/IMG_3299.jpeg" width="360" />
</a> 
<a href="/assets/garden_bridge/IMG_3300.jpeg">
  <img src="/assets/garden_bridge/IMG_3300.jpeg" width="360" />
</a></p>

<p>To create a precise one-inch overhang on both sides of the bridge, I marked one inch in from each end of the boards so that, when I install them onto the base, I can check that both sides are even. To secure the boards, I measure and pre-drill the holes to avoid splitting.</p>

<p><a href="/assets/garden_bridge/IMG_3301.jpeg">
  <img src="/assets/garden_bridge/IMG_3301.jpeg" />
</a> 
<a href="/assets/garden_bridge/IMG_3305.jpeg">
  <img src="/assets/garden_bridge/IMG_3305.jpeg" />
</a></p>

<p>I set the impact driver’s setting to “1” so it does not overpower the screws. Once the screws are in, I change the setting to “2” so I can push the screws slightly below the wood surface to create a finished look.</p>

<p><a href="/assets/garden_bridge/IMG_3304.jpeg">
  <img src="/assets/garden_bridge/IMG_3304.jpeg" />
</a></p>

<p><a href="/assets/garden_bridge/IMG_3306.jpeg">
  <img src="/assets/garden_bridge/IMG_3306.jpeg" />
</a></p>

<h3 id="backyard-time">Backyard Time</h3>
<p>Check out my backyard! I’ve constructed two bridges, one before and one after the gate.</p>

<p><a href="/assets/garden_bridge/IMG_3264.jpeg">
  <img src="/assets/garden_bridge/IMG_3264.jpeg" />
</a></p>

<p><a href="/assets/garden_bridge/IMG_3335.jpeg">
  <img src="/assets/garden_bridge/IMG_3335.jpeg" />
</a></p>

<p>Cheers!</p>]]></content><author><name>Modular Machines LLC</name></author><category term="DIY" /><summary type="html"><![CDATA[I live in the Hill Country of Austin, Texas. The weather here can be dry, but we experience thunderstorms throughout the year. When the storms hit, rainwater rushes down from the roof, flowing toward the lawn. Since the house is built on a hill, there is a consistent path that the rainwater travels on the lawn. Over time, the grass along this path is washed away, leaving bare soil, which doesn’t look appealing during sunny days.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/garden_bridge/IMG_3264.jpeg" /><media:content medium="image" url="https://www.modularmachines.ai/assets/garden_bridge/IMG_3264.jpeg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">MGRS GPS for iOS</title><link href="https://www.modularmachines.ai/gps/2025/06/01/MGRS-GPS.html" rel="alternate" type="text/html" title="MGRS GPS for iOS" /><published>2025-06-01T06:45:28+00:00</published><updated>2025-06-01T06:45:28+00:00</updated><id>https://www.modularmachines.ai/gps/2025/06/01/MGRS-GPS</id><content type="html" xml:base="https://www.modularmachines.ai/gps/2025/06/01/MGRS-GPS.html"><![CDATA[<p><a href="/assets/mgrs/iTunesArtwork@2x.png">
  <img src="/assets/mgrs/iTunesArtwork@2x.png" width="200" />
</a></p>

<p>MGRS GPS is an app that gives your location using the MGRS coordinate system, your heading, speed, and elevation.</p>

<p>Waypoints can be saved and displayed on the map as you navigate.</p>

<p>If enabled, waypoints are saved to iCloud and can be accessed from all your devices, allowing you to save a waypoint on your iPhone and then edit it on your iPad later.</p>

<p>All of your waypoints can be exported to a KML file, allowing you to display them in applications like Google Earth.</p>

<p>Download the MGRS GPS from the <a href="https://apps.apple.com/us/app/mgrs-gps/id533047095">App Store</a>.</p>

<p><a href="/assets/mgrs/map_p.png">
  <img src="/assets/mgrs/map_p.png" width="235" />
</a>
<a href="/assets/mgrs/settings1.png">
  <img src="/assets/mgrs/settings1.png" width="235" />
</a> 
<a href="/assets/mgrs/settings2.png">
  <img src="/assets/mgrs/settings2.png" width="235" />
</a></p>

<p><a href="/assets/mgrs/map_h.png">
  <img src="/assets/mgrs/map_h.png" />
</a></p>]]></content><author><name>Modular Machines LLC</name></author><category term="GPS" /><summary type="html"><![CDATA[]]></summary></entry><entry><title type="html">Robot Auto Mapping using Nav2 SLAM Toolbox</title><link href="https://www.modularmachines.ai/security_robot/2025/05/26/SecurityRobot-Mapping.html" rel="alternate" type="text/html" title="Robot Auto Mapping using Nav2 SLAM Toolbox" /><published>2025-05-26T16:27:08+00:00</published><updated>2025-05-26T16:27:08+00:00</updated><id>https://www.modularmachines.ai/security_robot/2025/05/26/SecurityRobot-Mapping</id><content type="html" xml:base="https://www.modularmachines.ai/security_robot/2025/05/26/SecurityRobot-Mapping.html"><![CDATA[<p>I am always curious about how the Roomba vacuum automatically creates a map of the floor by driving around. In this blog, I will explain how to create the floor map using a mobile robot with the Nav2 SLAM Toolbox.</p>

<h2 id="get-to-know-the-slam-toolbox">Get to Know the SLAM Toolbox</h2>

<p>The Nav2 (Navigation 2) stack is a modular system that enables autonomous navigation for mobile robots in the ROS framework. The <strong>SLAM Toolbox</strong>, which stands for Simultaneous Localization and Mapping, plays a crucial role in the navigation stack and is part of ROS Navigation 2.</p>

<p>The SLAM Toolbox is a 2D mapping and localization system based on LiDAR data and odometry. It performs the first two steps of the Nav2 pipeline, which consists of:</p>
<ol>
  <li><strong>Mapping</strong>: generating and publishing the map of the environment</li>
  <li><strong>Localization</strong>: using the saved map to localize the robot in the known environment</li>
  <li><strong>Planning</strong>: computing the path from the current pose to the goal</li>
  <li><strong>Control</strong>: following the path using velocity commands</li>
  <li><strong>Recovery</strong>: handling failure conditions, such as scenarios where the robot gets stuck</li>
</ol>

<h3 id="mapping">Mapping</h3>

<p>The SLAM Toolbox enables the robot to generate a map of an unknown environment while tracking its position on that map in real time. SLAM Toolbox builds a 2D occupancy grid map using the robot’s laser scans and motion data.</p>

<p>Here is the kitchen map created by my robot, R4, as it circled the kitchen using the <a href="https://apps.apple.com/us/app/droid-vision/id6737351549">Droid Vision app</a>.</p>

<p><a href="/assets/slam/DroidVision_SLAM2.png">
  <img src="/assets/slam/DroidVision_SLAM2.png" />
</a></p>

<p>The tool that displays the map is RViz2, which is a 3D visualization tool within the ROS 2 framework. RViz2 enables users to view and interact with a robot’s state, sensor data, and environment in a 3D space. It provides a window into the robot’s world, illustrating what the robot “sees” and how it is positioned.</p>

<p>The generated map can be saved in RViz2 using the “slam_toolbox” panel. This panel contains the “save” and “serialization” options, where the “serialization” format is intended for use by the SLAM Toolbox itself, while the “save” format allows the map to be utilized by other Nav2 localization tools, such as Adaptive Monte Carlo Localization (AMCL).</p>

<p>Mapping is the first step of the navigation pipeline, and SLAM Toolbox plays a crucial role in map generation.</p>

<h3 id="localization">Localization</h3>

<p>The SLAM Toolbox is also a localization system that identifies the robot’s position and orientation (pose) within its perceived world.</p>

<p>To explain the concept of “Localization” in ROS, we must discuss the ROS Frame and Transform.</p>

<h4 id="frame">Frame</h4>
<ul>
  <li>A <strong>Frame</strong> refers to a 3D coordinate system used to define the spatial location and orientation of an entity (like a robot, sensor, or object) in space.</li>
  <li>The <strong>odom Frame</strong> is the starting point of the robot’s trajectory. As the robot moves, its pose changes in the odom frame based on odometry data.</li>
  <li>The <strong>base_link Frame</strong> is a frame rigidly attached to the robot’s chassis or main body. It serves as a reference point for other robot frames, such as wheels and lidar sensors.</li>
</ul>

<p>ROS coordinate frames follow the Right-Hand Rule: X points forward from the robot’s center, Y points to the left, and Z points upward, perpendicular to the floor.</p>

<p>Here are the frames of my mobile robot, as seen in RViz, with red representing the X-axis, green representing the Y-axis, and blue representing the Z-axis. Every joint defined in the robot’s URDF has a frame.</p>

<p><a href="/assets/slam/rviz2.png">
  <img src="/assets/slam/rviz2.png" />
</a></p>

<h4 id="transform-chain">Transform chain</h4>
<p>What is Transform in ROS?</p>

<ul>
  <li><strong>Transform (TF)</strong> represents the position and orientation of one coordinate frame relative to another. It’s a fundamental concept in ROS that describes how objects and coordinate systems are positioned in 3D space, enabling systems to understand the relative positions of various components and their interactions.</li>
</ul>

<p>A typical Transform chain for mobile robots looks like this:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>map → odom → base_link → laser_frame
</code></pre></div></div>

<p>Each transform has a purpose:</p>
<ul>
  <li>
    <p>The <strong>odom → base_link</strong> transform comes from the wheel encoders or other local sensors. It tracks how far the robot has moved from its starting point. It is fast to compute and continuous, but it accumulates drift over time. <br /><br />To correct for odometry drift, a higher-level system like SLAM Toolbox computes the robot’s true global pose and broadcasts a correction.</p>
  </li>
  <li>
    <p>The <strong>map → odom</strong> transform comes from the SLAM Toolbox and provides a global correction to the drifting odometry. <br /><br />The SLAM Toolbox estimates the robot’s pose in the global map frame based on laser scans and odometry data. It uses this information to compute a correction transform and broadcasts it as the map → odom transform. This allows the robot’s current global pose to be obtained by expressing base_link in the map frame.</p>
  </li>
</ul>
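<p>The transform chain above can be composed numerically. Below is a minimal 2D sketch of that composition, treating each transform as a planar pose (x, y, yaw); the numbers are made-up examples, and real ROS code would use the tf2 library instead.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import math

def compose(parent, child):
    # apply the parent transform to the child: rotate, then translate
    x1, y1, t1 = parent
    x2, y2, t2 = child
    return (x1 + x2 * math.cos(t1) - y2 * math.sin(t1),
            y1 + x2 * math.sin(t1) + y2 * math.cos(t1),
            t1 + t2)

# odom → base_link from wheel odometry (drifting local estimate)
odom_to_base = (2.0, 0.0, 0.0)
# map → odom correction broadcast by the SLAM Toolbox
map_to_odom = (0.5, 0.0, math.pi / 2)

# map → base_link: the robot’s corrected global pose
map_to_base = compose(map_to_odom, odom_to_base)
</code></pre></div></div>

<p>Composing map → odom with odom → base_link yields map → base_link, which is the robot’s drift-corrected pose on the map.</p>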

<p>The “map → odom” transform is essential for:</p>
<ol>
  <li>RViz visualization: the robot appears in the correct position on the map.</li>
  <li>Path planning: the global planners can compute paths from the robot’s actual location.</li>
  <li>Localization: the robot can recognize when it returns to a previously visited location, known as loop closure.</li>
</ol>

<p>In summary, the SLAM Toolbox broadcasts the map → odom transform to anchor the robot’s local odometry in the global map frame. This corrects for drift and ensures accurate global navigation. Without this transform, Nav2 cannot correctly localize the robot or plan valid paths.</p>

<p>With the SLAM Toolbox for mapping and localization, the navigation server performs the remaining steps of planning, control, and recovery in the navigation pipeline.</p>

<h2 id="meet-robot-r4">Meet Robot R4</h2>
<p>This is my differential-drive mobile robot, featuring two independently driven wheels and two caster wheels for balance. A differential-drive robot moves forward or backward by rotating both wheels in the same direction, and it rotates in place by spinning the wheels in opposite directions.</p>

<p>At the top of the robot is the RPLidar, which generates the scan data. I have a blog <a href="/security_robot/2025/02/08/SecurityRobot-RPLidar.html">RPLidar in ROS 2 Docker on Raspberry Pi</a> that details how to configure and run the Lidar nodes on a real robot. Please check that out.</p>

<p><a href="/assets/teleop/IMG_3220.jpeg">
  <img src="/assets/teleop/IMG_3220.jpeg" width="400" />
</a></p>

<h2 id="robot-mapping">Robot Mapping</h2>

<p>To run the robot that generates maps, we need a ROS environment that includes the robot module, the RPLidar driver, the navigation stack, the rosbridge suite, and many other dependencies. The most straightforward approach is to use a Docker container. Docker simplifies deployment and makes the environment portable, consistent, and shareable.</p>

<h3 id="a-docker-container">A Docker Container</h3>
<p>A Docker container is created from a Docker image, which serves as a template defining its structure and dependencies. Docker is widely used in enterprise software’s microservice architecture, but is relatively new in robotic applications. I have a blog, <a href="/security_robot/2025/02/08/SecurityRobot-RPLidar.html">RPLidar in ROS 2 Docker on Raspberry Pi</a>, which explains in detail how to use a Docker container in a robot.</p>

<p>The Dockerfile I prepared for my robot can be downloaded from here: <a href="/code/Dockerfile">Dockerfile</a> and <a href="/code/requirements.txt">requirements.txt</a>. The Docker image created from the Dockerfile includes all the necessary ROS 2 modules, Python libraries, and source code repositories. Access to the private Git repository is granted using an SSH key.</p>

<p>We can build the Docker image and start the Docker container using the following commands. Building the Docker image compiles the source code and creates the bot environment. You only need to build the image once and can reuse it to start the Docker container whenever needed.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># build the docker image "my_bot_image"
docker build -t my_bot_image .

# start the docker container from the image
docker run -it --network=host --ipc=host -v /dev:/dev \
    --device-cgroup-rule='c 188:* rmw' \
    --device-cgroup-rule='c 166:* rmw' \
    my_bot_image

</code></pre></div></div>

<h3 id="commands-to-run-the-robot-mapping">Commands to Run the Robot Mapping</h3>

<p>I launch the robot, the RPLidar, and the SLAM Toolbox in the Raspberry Pi’s Docker container.</p>

<p>Additionally, I start the “rosbridge_server” because I use the Droid Vision app to view and control my robot remotely from my phone. Please see the article <a href="/droid_vision/2025/03/06/DroidVision-Teleop.html">Droid Vision with built-in Joystick and Keypad</a> for more information. If you prefer the traditional ROS teleop_twist_keyboard or teleop_twist_joy for remote control, feel free to use those and skip launching the rosbridge_server.</p>
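<p>Under the hood, rosbridge exposes ROS topics over a WebSocket using a JSON protocol. As a sketch of what a teleop client sends, here is how a “publish” operation carrying a velocity command could be built. The topic name /cmd_vel is an assumption about the robot’s configuration, and a real client would normally send an “advertise” operation for the topic first.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import json

def make_publish_msg(topic, linear_x, angular_z):
    # rosbridge protocol "publish" operation carrying a Twist message
    return json.dumps({
        "op": "publish",
        "topic": topic,
        "msg": {
            "linear": {"x": linear_x, "y": 0.0, "z": 0.0},
            "angular": {"x": 0.0, "y": 0.0, "z": angular_z},
        },
    })

# drive forward at 0.2 m/s while turning at 0.5 rad/s
msg = make_publish_msg("/cmd_vel", 0.2, 0.5)
</code></pre></div></div>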

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 1. Start the robot itself
ros2 launch my_bot robot.launch.py

# 2. Start the robot's LiDAR scan
ros2 launch rplidar_ros rplidar_c1_launch.py serial_port:=/dev/ttyUSB0 frame_id:=laser_frame

# 3. Start the slam toolbox to create the map 
ros2 launch slam_toolbox online_async_launch.py use_sim_time:=false

# 4. Start the rosbridge_server for the Droid Vision Teleop
ros2 launch rosbridge_server rosbridge_websocket_launch.xml
</code></pre></div></div>

<p>On the Linux development machine, I launch RViz to visualize the robot in its world.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># 1. Launch RViz
rviz2 -d ~/dev/dev_ws/src/my_bot/config/view_bot_map.rviz
</code></pre></div></div>
<iframe width="800" height="468" src="https://youtube.com/embed/5HTosrSPC9A?autoplay=1&amp;mute=0">
</iframe>

<p>Cheers! :D</p>]]></content><author><name>Modular Machines LLC</name></author><category term="Security_Robot" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/slam/DroidVision_SLAM.png" /><media:content medium="image" url="https://www.modularmachines.ai/assets/slam/DroidVision_SLAM.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Simplify Robot Cars using TB6612 Motor Shield</title><link href="https://www.modularmachines.ai/security_robot/2025/04/21/SecurityRobot-MotorShield.html" rel="alternate" type="text/html" title="Simplify Robot Cars using TB6612 Motor Shield" /><published>2025-04-21T14:27:08+00:00</published><updated>2025-04-21T14:27:08+00:00</updated><id>https://www.modularmachines.ai/security_robot/2025/04/21/SecurityRobot-MotorShield</id><content type="html" xml:base="https://www.modularmachines.ai/security_robot/2025/04/21/SecurityRobot-MotorShield.html"><![CDATA[<p>Recall from my blog “<a href="/security_robot/2025/02/22/SecurityRobot-Ros2_control.html">ROS 2 Control, Robot Control the Right Way</a>” that my robot uses an Arduino microcontroller. The Arduino cannot directly drive the motors because they require much more current than the Arduino can supply. Additionally, when the motors are turned off, they produce high voltages that could potentially damage the microcontroller pins. Therefore, the Arduino employs a DC motor driver to control the motors’ speed and direction.</p>

<h2 id="motor-driver">Motor Driver</h2>
<p>I select motor drivers based on the motor’s <a href="https://osepp.com/downloads/pdf/DC-Motor-Spec.pdf">specifications</a>, particularly its rated voltage and current. I also consider the motor driver’s size and mounting methods: if I build a tiny robot, I must use a tiny motor driver.</p>

<p>To control the motor’s direction, we commonly use a design called an “H-Bridge”, which consists of four switches (typically transistors) arranged in an “H” pattern, with the motor connected between the center points of the left and right legs. By turning on specific pairs of switches, current flows through the motor in one direction or the other, allowing it to spin forward or backward.</p>

<p><a href="/assets/motor_driver/H_bridge.png">
  <img src="/assets/motor_driver/H_bridge.png" width="500" />
</a></p>

<p>To control motor speed, we use PWM (Pulse Width Modulation) on the motor driver’s PWM input pin. PWM works by rapidly switching the motor’s power on and off and adjusting the duty cycle — the percentage of time the signal is HIGH versus LOW. A higher duty cycle (e.g., 80%) means the motor receives more power and spins faster, while a lower duty cycle (e.g., 30%) slows it down. In Arduino, this is done using the analogWrite function, where the value ranges from 0 (off) to 255 (full speed).</p>
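<p>The duty-cycle arithmetic above can be captured in two small helpers. This is an illustrative sketch of the math, not part of the Arduino API; pwm_value and average_voltage are hypothetical names.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>def pwm_value(duty_pct):
    # map a duty cycle percentage (0 to 100) to an analogWrite value (0 to 255)
    duty = max(0.0, min(100.0, duty_pct))
    return round(duty / 100.0 * 255)

def average_voltage(supply_v, duty_pct):
    # the motor effectively sees the supply voltage scaled by the duty cycle
    return supply_v * duty_pct / 100.0
</code></pre></div></div>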

<p>Examples of H-Bridge Motor Drivers. All of them have two channels for controlling two DC motors.</p>
<ul>
  <li>L298N: supports motor voltage 5V - 35V and continuous current up to 2A.</li>
  <li>TB6612FNG: a tiny module. It supports motor voltage 4.5V - 13.5V and continuous current up to 1.2A.</li>
  <li>MX1508: an ideal choice for tiny motors. It supports motor voltage 2V - 10V and continuous current up to 1.5A.</li>
</ul>

<h3 id="l298n-dual-h-bridge">L298N Dual H-Bridge</h3>
<p>The L298N is a popular dual H-bridge motor driver that can control the speed and direction of two DC motors. It operates at voltages up to 35V and can handle up to 2A per channel, making it suitable for small to medium-sized motor control applications. It is very affordable, widely available, and easy to use.</p>

<p>The L298N uses bipolar junction transistors (BJTs) in its H-bridge design, which are less efficient than MOSFETs, resulting in more heat and significant voltage drop. Typically, a single bipolar transistor experiences a drop of approximately 0.7 volts. When two transistors are employed simultaneously in an H-Bridge configuration, the total drop is estimated to be around 1.4 volts. Consequently, if a 9-volt supply is applied to the H-Bridge, the motor will actually receive only 7.6 volts.</p>

<p>Another problem I face when using the L298N is that it is bulky and requires messy wiring to connect the necessary L298N pins to the Arduino. Here are the pins that must be connected:</p>
<ul>
  <li>EnableA, EnableB (connect to Arduino’s PWM pins to control the motor’s speed)</li>
  <li>IN1, IN2, IN3, IN4 (control the motor direction)</li>
  <li>5V logic voltage, GND</li>
</ul>

<p><a href="/assets/motor_driver/IMG_3119.jpeg">
  <img src="/assets/motor_driver/IMG_3119.jpeg" width="350" />
</a>
<a href="/assets/motor_driver/IMG_3079.jpeg">
  <img src="/assets/motor_driver/IMG_3079.jpeg" width="350" />
</a></p>

<h3 id="tb6612fng-dual-h-bridge">TB6612FNG Dual H-Bridge</h3>
<p>The voltage drop, heat generation, and bulky size led me to explore other dual H-bridge motor drivers.</p>

<p>TB6612FNG features a newer, more efficient design, utilizing MOSFETs for switching, which results in a lower voltage drop and higher efficiency. Overall, TB6612FNG is smaller, cooler, and more efficient, making it a better choice for most projects unless higher voltage tolerance is needed from L298N.</p>

<p>Regarding the wiring, TB6612FNG uses the same pins as L298N and can serve as a drop-in replacement for L298N.</p>

<p>Here are L298N and TB6612FNG side by side; TB6612FNG is less than one-fifth the size of L298N.</p>

<p><a href="/assets/motor_driver/IMG_3125.jpeg">
  <img src="/assets/motor_driver/IMG_3125.jpeg" width="350" />
</a></p>

<h2 id="motor-shield">Motor Shield</h2>
<p>An Arduino Motor Shield is an add-on board (or “shield”) that plugs directly onto an Arduino to make it easy to control DC motors, stepper motors, and servos. It contains motor driver chips, such as the L298N or TB6612FNG, that handle the high current and voltage required by motors, which the Arduino alone cannot supply.</p>

<p>The shield simplifies wiring and programming by using a standard pin layout, and many shields come with Arduino libraries to make coding easier.</p>

<h3 id="meet-the-osepp-tb6612-motor-shield">Meet the OSEPP TB6612 Motor Shield</h3>
<p>The OSEPP TB6612 Motor Shield uses two TB6612FNG dual H-bridge driver chips, allowing independent control of up to four DC motors (M1–M4). All pins from the shield are already connected to the Arduino once the shield is stacked onto the Arduino board, removing eight wires from the configuration.</p>

<p>Here are the pictures of the updated wiring. I replaced the L298N motor driver with the TB6612 motor shield. Once I connect the motors to the motor shield using the wire clips, I don’t need any additional wires to control the motor. The four wires on the motor shield are for the motor encoders, not for the motors themselves.</p>

<p>The motor shield has convenient “V” and “G” rails to provide the 5V Logic Voltage and GND for the encoders and sensors. However, they are NOT directly connected to Arduino’s 5V/GND rails, so be careful when using them. More details will be discussed in the next section on how to use them correctly.</p>

<p><a href="/assets/motor_driver/IMG_3098.jpeg">
  <img src="/assets/motor_driver/IMG_3098.jpeg" width="350" />
</a>
<a href="/assets/motor_driver/IMG_3099.jpeg">
  <img src="/assets/motor_driver/IMG_3099.jpeg" width="350" />
</a></p>

<h3 id="troubleshooting-lessons-common-pitfalls">Troubleshooting Lessons, Common Pitfalls</h3>
<p>Once the wires are cleaned up, the next step is to update the code to make it work again. I struggled with that for a couple of days, and here are some crucial details I learned about using the TB6612 Motor Shield.</p>

<h4 id="dedicated-arduino-digital-pins">Dedicated Arduino Digital Pins</h4>
<p>Unlike a standalone motor driver, the OSEPP TB6612 motor shield has a <a href="https://osepp.com/downloads/pdf/tb6612.pdf">Driver Schematic</a>, which indicates that each H-bridge connects to dedicated Arduino digital pins for direction and PWM control, with motor power supplied through a VIN terminal and routed internally to the driver chips’ VM pins. This means I need to update the Arduino sketch to use those specific pins for motor direction and PWM control.</p>

<p>The STBY (standby) pins are internally connected to 5V, keeping both motor drivers active by default. This design integrates neatly with Arduino UNO headers, requiring no external wiring for basic motor control.</p>

<p>Each shield has its own schematic. Grok and ChatGPT are currently unaware of this one and hallucinate heavily. For example, they believe the reason the code is not working with the TB6612 motor shield is that the STBY pin is not set to HIGH, and they insist I run a jumper wire from pin 10 to 5V since pin 10 is usually the STBY pin. :D</p>

<h4 id="pwm-with-one-direction-input">PWM with One DIRECTION Input</h4>
<p>From the OSEPP example code, I learned that this shield uses only one pin per motor to control direction, unlike the L298N or a standalone TB6612FNG, which use two input pins for direction. Therefore, to stop the motor, I must set PWM to 0 rather than setting the input pins to LOW.</p>

<p>Here is the Arduino sketch I created to test the DC motor with the OSEPP TB6612 Motor Shield:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Motor M1 (Motor A)
int enA = 11;    // PWM speed control
int dirA = 8;     // DIRECTION control

// Motor M2 (Motor B)
int enB = 3;     // PWM speed control
int dirB = 12;    // DIRECTION control

void setup()
{
  // Set all the motor control pins to outputs
  pinMode(enA, OUTPUT);
  pinMode(enB, OUTPUT);
  pinMode(dirA, OUTPUT);
  pinMode(dirB, OUTPUT);

  Serial.begin(9600);
}

// Run the motors in both directions at a fixed speed
void loop()
{
  // Run Motor M1 and M2 forward
  Serial.println("Forward");
  digitalWrite(dirA, HIGH);
  analogWrite(enA, 150); // speed 0–255

  digitalWrite(dirB, HIGH);
  analogWrite(enB, 150); // speed 0–255
  delay(2000);

  // Reverse the motor directions
  Serial.println("Reverse");
  digitalWrite(dirA, LOW);
  digitalWrite(dirB, LOW);
  delay(2000);

  // Stop the motor
  Serial.println("Stop motors");
  analogWrite(enA, 0);
  analogWrite(enB, 0);
  delay(1000);
}

</code></pre></div></div>

<h4 id="0-volts-at-m1-m2">0 Volts at M1, M2?</h4>
<p>OSEPP TB6612 Motor Shield has a toggle switch. Even if the shield’s VIN terminal is connected to a battery and the shield is powered, the motors are NOT powered until the shield’s switch is turned on. I was stuck on this and could not figure out why the motor did not spin until I noticed the switch on the shield.</p>

<h4 id="the-power-puzzle-vin-vm-and-the-v-and-g-rails">The Power Puzzle: VIN, VM, and the “V” and “G” Rails</h4>
<p>The motor shield includes logic rails labeled “V” and GND rails labeled “G” for powering encoders or sensors, and it exposes signal pins alongside them for convenient 3-pin connections. However, the pins on those rails are not directly connected to Arduino’s 5V/GND rails. They are only powered when the shield’s VIN terminal is connected to the battery and the motor shield switch is turned on. Otherwise, the encoder signal lines float and produce oscillating HIGH/LOW readings.</p>

<p>The alternative is to use Arduino 5V and GND for a clean and stable reference voltage, which is what I ended up choosing in my setup.</p>

<h3 id="update-to-the-ros-arduino-bridge-sketch">Update to the ROS Arduino Bridge Sketch</h3>
<p>I updated the <a href="https://github.com/jiayihoffman/ros_arduino_bridge/tree/MS_TB6612">ros_arduino_bridge</a> repository to support the TB6612 motor shield and modified the encoder_driver to use analog pins A2-A5 for encoder readings. This repository serves as the hardware interface for the ros2_control of my robot car.</p>

<p>Cheers!</p>]]></content><author><name>Modular Machines LLC</name></author><category term="Security_Robot" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/motor_driver/IMG_3098.jpeg" /><media:content medium="image" url="https://www.modularmachines.ai/assets/motor_driver/IMG_3098.jpeg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Enhance DC Motors using Motor Encoders</title><link href="https://www.modularmachines.ai/security_robot/2025/04/08/SecurityRobot-MotorEncoder.html" rel="alternate" type="text/html" title="Enhance DC Motors using Motor Encoders" /><published>2025-04-08T16:27:08+00:00</published><updated>2025-04-08T16:27:08+00:00</updated><id>https://www.modularmachines.ai/security_robot/2025/04/08/SecurityRobot-MotorEncoder</id><content type="html" xml:base="https://www.modularmachines.ai/security_robot/2025/04/08/SecurityRobot-MotorEncoder.html"><![CDATA[<p>In the ROS (Robot Operating System) ecosystem, a motor encoder plays a crucial role in enabling feedback control and odometry estimation for mobile robots. I will show you how to enhance standard DC motors using motor encoders.</p>

<iframe width="800" height="468" src="https://www.youtube.com/embed/NX-1zldg81s?autoplay=1&amp;mute=0">
</iframe>

<h2 id="what-is-a-motor-encoder">What is a Motor Encoder</h2>

<p>A motor encoder is a sensor attached to a motor shaft that measures the rotation of the shaft. It converts mechanical motion into digital signals that a controller can interpret to manage the motor’s speed.</p>

<p>Recall from my blog “<a href="/security_robot/2025/02/22/SecurityRobot-Ros2_control.html">ROS 2 Control, Robot Control the Right Way</a>” that I briefly touched on using motor encoders and ros2_control for a closed-loop robot system. In the following component diagram, the motor encoder contributes to the elements circled in red.</p>

<p><a href="/assets/motor_encoder/ros2_control.drawio.png">
  <img src="/assets/motor_encoder/ros2_control.drawio.png" />
</a></p>

<h3 id="the-purpose-of-motor-encoder-in-ros">The Purpose of Motor Encoder in ROS</h3>
<h4 id="odometry">Odometry</h4>

<p>In robotics, odometry (or odom for short) refers to the method of estimating a robot’s position and orientation over time based on sensor data from wheel encoders. In ROS, odometry is published as messages on the /odom topic. Specifically, encoders track how far each wheel has turned, and the ROS nodes utilize this data to estimate:</p>

<ul>
  <li>Distance traveled</li>
  <li>Robot’s pose (position and orientation)</li>
  <li>Velocity</li>
</ul>
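<p>The estimate above boils down to a short computation. Here is a sketch of a differential-drive odometry update from encoder tick deltas; the tick count, wheel radius, and wheel separation defaults are placeholder values, and real controllers such as diff_drive_controller implement this more carefully.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import math

def update_pose(pose, left_ticks, right_ticks, ticks_per_rev=2000,
                wheel_radius=0.033, wheel_separation=0.17):
    # distance traveled per encoder tick
    per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    dl, dr = left_ticks * per_tick, right_ticks * per_tick
    # body-frame distance and heading change
    d = (dl + dr) / 2.0
    dtheta = (dr - dl) / wheel_separation
    x, y, theta = pose
    # integrate along the mid-arc heading
    return (x + d * math.cos(theta + dtheta / 2.0),
            y + d * math.sin(theta + dtheta / 2.0),
            theta + dtheta)
</code></pre></div></div>

<p>Equal tick counts move the robot straight ahead; equal and opposite tick counts rotate it in place, matching the differential-drive behavior described earlier.</p>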

<h4 id="feedback-for-motor-control">Feedback for Motor Control</h4>

<p>The “diff_drive_controller” in ros2_control relies on encoder feedback for:</p>
<ul>
  <li>Position control: Moving to a desired position.</li>
  <li>Velocity control: Maintaining a target speed.</li>
</ul>

<p>The encoder data is part of the closed-loop system that adjusts motor commands in real time.</p>

<h4 id="joint-state-publishing">Joint State Publishing</h4>

<p>The readings from the encoders assist in reporting the position, angle, and velocity of the left and right wheel joints through the /joint_states topic, allowing users to visualize the robot in tools like RViz. This is how my robot appears in RViz.</p>

<p><a href="/assets/motor_encoder/rviz2.png">
  <img src="/assets/motor_encoder/rviz2.png" />
</a></p>

<h2 id="upgrade-dc-motors-using-motor-encoders">Upgrade DC Motors using Motor Encoders</h2>
<p>I have several <a href="https://osepp.com/accessories/motors/143-ls-00041-high-torque-electric-motor-6v">OSEPP DC motors</a> that I used for different DIY robotics projects. These motors are high-quality, made from durable materials, and they provide high torque and a long motor shaft. They work well with the mechanical components of my robot.</p>

<p><a href="/assets/IMG_2910.jpeg">
  <img src="/assets/IMG_2910.jpeg" width="350" />
</a>
<a href="/assets/motor_encoder/IMG_2990.jpeg">
  <img src="/assets/motor_encoder/IMG_2990.jpeg" width="350" />
</a></p>

<p>I considered using a new pair of DC motors with built-in encoders to simplify the setup. However, the ones I found on Amazon do not have a long enough motor shaft to work with my robotic parts.</p>

<p><a href="/assets/motor_encoder/IMG_3065.jpeg">
  <img src="/assets/motor_encoder/IMG_3065.jpeg" width="600" />
</a></p>

<p>After careful consideration, I chose to reuse my existing DC motors and enhance them with the <a href="https://osepp.com/accessories/motors/114-motor-encoder">OSEPP Motor Encoder</a>. This motor encoder comes with a datasheet and detailed <a href="https://osepp.com/downloads/pdf/Encoder%20Assembly.pdf">assembly instruction</a>. Furthermore, I appreciate the build quality of OSEPP products.</p>

<h3 id="quadrature-encoders">Quadrature Encoders</h3>
<p>The OSEPP motor encoder is a quadrature encoder, a type of rotary encoder. It provides two output signals, Channel A and Channel B, that are 90° out of phase. This allows the system to determine:</p>
<ol>
  <li>Speed (the rate at which the motor is rotating).</li>
  <li>Direction (If A leads B → clockwise. If B leads A → counterclockwise).</li>
</ol>
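<p>The decoding logic can be sketched with a small transition table over sampled (A, B) states. This is illustrative, not the OSEPP or Arduino library code, and the clockwise convention shown is an assumption that depends on how the channels are wired.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># valid clockwise order of (A, B) states when A leads B:
# (0,0) → (1,0) → (1,1) → (0,1) → (0,0)
CW_NEXT = {(0, 0): (1, 0), (1, 0): (1, 1), (1, 1): (0, 1), (0, 1): (0, 0)}

def count_ticks(samples):
    # +1 per clockwise step, -1 per counterclockwise step
    count = 0
    for prev, cur in zip(samples, samples[1:]):
        if cur == CW_NEXT[prev]:
            count += 1
        elif prev == CW_NEXT[cur]:
            count -= 1
    return count
</code></pre></div></div>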

<p><a href="/assets/motor_encoder/Assembly_Guide_Magnet.jpeg">
  <img src="/assets/motor_encoder/Assembly_Guide_Magnet.jpeg" width="350" />
</a>
<a href="/assets/motor_encoder/Assembly_Guide_A3144_Mount.jpeg">
  <img src="/assets/motor_encoder/Assembly_Guide_A3144_Mount.jpeg" width="350" />
</a></p>

<h3 id="test-the-encoder-installation">Test the Encoder Installation</h3>
<p>After following the assembly instructions, I installed the encoder mount next to the motor’s magnet. I wired the encoders to the Arduino, and now it’s time to test the encoders’ readings.</p>

<p>I used the following Arduino program to check the readings.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>#define ENCA A4 // Pin 2, A4
#define ENCB A5 // Pin 3, A5

void setup() {
  Serial.begin(9600);
  pinMode(ENCA,INPUT);
  pinMode(ENCB,INPUT);
}

void loop() {
  int a = digitalRead(ENCA);
  int b = digitalRead(ENCB);
  Serial.print(a*5); 
  Serial.print(" ");
  Serial.print(b*5);
  Serial.println();
}
</code></pre></div></div>

<p>If everything is installed and wired correctly, we should see “value A” and “value B” interlacing in the Serial Plotter when the wheel is rotated manually.</p>

<p><a href="/assets/motor_encoder/reading.png">
  <img src="/assets/motor_encoder/reading.png" width="600" />
</a></p>

<p>If the plot appears as a flat line or shows only one value oscillating while rotating the wheel, the issue may be related to the wire connection or the positioning of the encoder.</p>

<p>In my case, the problem was that the encoder mount was too far from the magnet. Additionally, the two channels were angled relative to the magnet. The A3144 sensor’s Channels A and B must be close to and parallel to the magnet’s side to detect the magnetic field while the motor shaft spins. Here is a picture of a properly installed motor encoder mount.</p>

<p><a href="/assets/motor_encoder/IMG_3068.jpeg">
  <img src="/assets/motor_encoder/IMG_3068.jpeg" width="600" />
</a></p>

<h3 id="encoder-count-per-revolution">Encoder Count Per Revolution</h3>
<p>To use the encoder in ROS2 Control, an important parameter is enc_counts_per_rev, which refers to the encoder counts generated for one full revolution of the wheel or motor shaft.</p>

<p>This parameter is defined in “ros2_control.xacro” and is used in the “read” and “write” functions of the hardware interface “DiffDriveArduinoHardware” to calculate the velocity command sent to the motor, as well as to compute the robot’s current pose and velocity.</p>

<p>Here is the “ros2_control.xacro”:</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>&lt;?xml version="1.0"?&gt;
&lt;robot xmlns:xacro="http://www.ros.org/wiki/xacro"&gt;
    &lt;ros2_control name="RealRobot" type="system"&gt;
        &lt;hardware&gt;
            &lt;plugin&gt;diffdrive_arduino/DiffDriveArduinoHardware&lt;/plugin&gt;
			...
            &lt;param name="enc_counts_per_rev"&gt;2000&lt;/param&gt;
        &lt;/hardware&gt;
</code></pre></div></div>
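<p>The read step of the hardware interface essentially converts raw counts into wheel angle and angular velocity using enc_counts_per_rev. Here is a sketch of that arithmetic (not the actual DiffDriveArduinoHardware code):</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>import math

ENC_COUNTS_PER_REV = 2000  # matches the xacro parameter above

def counts_to_angle(counts):
    # wheel angle in radians from raw encoder counts
    return counts / ENC_COUNTS_PER_REV * 2 * math.pi

def counts_to_velocity(delta_counts, dt):
    # wheel angular velocity in rad/s from counts seen over dt seconds
    return counts_to_angle(delta_counts) / dt
</code></pre></div></div>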

<h4 id="compute-the-enc_counts_per_rev">Compute the enc_counts_per_rev</h4>

<p>To determine the enc_counts_per_rev, I wrote an Arduino program that outputs the count after I manually rotate the motor shaft for one full revolution.  I perform this rotation several times and calculate the average count, which gives me the enc_counts_per_rev of the motor.</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>#define ENC_A 2  // Encoder Channel A (Digital Pin 2)
#define ENC_B 3  // Encoder Channel B (Digital Pin 3)

volatile int pulseCount = 0;
bool counting = false;

void encoderISR() {
    if (counting) {
        pulseCount++;  // Increment count on each rising edge
    }
}

void setup() {
    Serial.begin(9600);
    pinMode(ENC_A, INPUT_PULLUP);
    pinMode(ENC_B, INPUT_PULLUP);
    
    // Attach interrupt for counting pulses
    attachInterrupt(digitalPinToInterrupt(ENC_A), encoderISR, RISING);
    
    Serial.println("Rotate the motor one full revolution and press ENTER.");
}

void loop() {
    if (Serial.available()) {
        Serial.read();  // Clear serial buffer
        
        pulseCount = 0;  // Reset pulse count
        counting = true;
        Serial.println("Start rotating the motor one full revolution...");
        
        delay(5000);  // Wait 5 seconds for manual rotation
        
        counting = false;
        Serial.print("Total pulses counted: ");
        Serial.println(pulseCount);
        
        Serial.print("Estimated CPR: ");
        Serial.println(pulseCount * 4);  // Multiply by 4 for quadrature encoders
        Serial.println("Press ENTER and rotate again.");
    }
}
</code></pre></div></div>

<h2 id="let-robot-dance-in-rviz">Let robot dance in RViz</h2>
<p>As mentioned earlier, one application of motor encoders is to help users visualize the robot using tools like RViz. Therefore, I created this robot dance clip. Cheers! :D</p>

<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># start the robot on Raspberry Pi
ros2 launch my_bot robot.launch.py

# view the robot in rviz on the linux dev machine
rviz2 -d ~/dev/dev_ws/src/my_bot/config/view_bot.rviz
</code></pre></div></div>

<iframe width="800" height="468" src="https://www.youtube.com/embed/NX-1zldg81s?autoplay=1&amp;mute=0">
</iframe>]]></content><author><name>Modular Machines LLC</name></author><category term="Security_Robot" /><summary type="html"><![CDATA[]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://www.modularmachines.ai/assets/motor_encoder/IMG_3065.jpeg" /><media:content medium="image" url="https://www.modularmachines.ai/assets/motor_encoder/IMG_3065.jpeg" xmlns:media="http://search.yahoo.com/mrss/" /></entry></feed>