feat: update SLAM stack for Jetson Orin Nano Super (67 TOPS, JetPack 6)

Platform upgrade: Jetson Nano 4GB → Orin Nano Super 8GB (March 1, 2026)
All Nano-era constraints removed — power/rate/resolution limits obsolete.

Dockerfile: l4t-jetpack:r36.2.0 (JetPack 6 / Ubuntu 22.04 / CUDA 12.x),
  ROS2 Humble via native apt, added ros-humble-rtabmap-ros,
  ros-humble-v4l2-camera for future IMX219 CSI (Phase 2c)

New: slam_rtabmap.launch.py — Orin primary SLAM entry point
  RTAB-Map with subscribe_scan (RPLIDAR) + subscribe_rgbd (D435i)
  Replaces slam_toolbox as docker-compose default

New: config/rtabmap_params.yaml — Orin-optimized
  DetectionRate 10Hz, MaxFeatures 1000, Grid/3D true,
  TimeThr 0 (no limit), Mem/STMSize 0 (unlimited)

Updated: config/realsense_d435i.yaml — 848x480x30, pointcloud enabled
Updated: config/slam_toolbox_params.yaml — 10Hz rate, 1s map interval
Updated: SLAM-SETUP-PLAN.md — full rewrite for Orin: arch diagram,
  Phase 2c IMX219 plan (4x 160° CSI surround), 25W power budget

docker-compose.yml: image tag jetson-orin, default → slam_rtabmap.launch.py

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
sl-perception 2026-02-28 21:13:55 -05:00
parent 8abd88e05d
commit 6b798cc890
7 changed files with 393 additions and 336 deletions


@@ -1,12 +1,15 @@
# Jetson Nano — ROS2 Humble dev container
# Base: JetPack 4.6 (L4T R32.6.1) + CUDA 10.2
# Arch: ARM64 (aarch64)
# Jetson Orin Nano Super — ROS2 Humble dev container
# Base: JetPack 6 (L4T R36.2.0) + CUDA 12.x / Ubuntu 22.04
#
# Hardware: Jetson Orin Nano Super 8GB (67 TOPS, 1024-core Ampere)
# Previous: Jetson Nano 4GB (JetPack 4.6 / L4T R32.6.1) — see git history
FROM nvcr.io/nvidia/l4t-base:r32.6.1
FROM nvcr.io/nvidia/l4t-jetpack:r36.2.0
LABEL maintainer="sl-jetson <saltylab>"
LABEL description="ROS2 Humble + SLAM stack for Jetson Nano self-balancing robot"
LABEL jetpack="4.6"
LABEL maintainer="sl-perception <saltylab>"
LABEL description="ROS2 Humble + SLAM stack for Jetson Orin Nano Super self-balancing robot"
LABEL jetpack="6.0"
LABEL l4t="r36.2.0"
LABEL ros_distro="humble"
ENV DEBIAN_FRONTEND=noninteractive
@@ -15,32 +18,29 @@ ENV ROS_ROOT=/opt/ros/${ROS_DISTRO}
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
# ── System deps ────────────────────────────────────────────────────────────────
# ── Locale ─────────────────────────────────────────────────────────────────────
RUN apt-get update && apt-get install -y --no-install-recommends \
locales tzdata \
&& locale-gen en_US.UTF-8 \
&& rm -rf /var/lib/apt/lists/*
ENV LANG=en_US.UTF-8
# ── System deps ─────────────────────────────────────────────────────────────────
RUN apt-get update && apt-get install -y --no-install-recommends \
# Build tools
build-essential cmake git wget curl \
# Python
python3-dev python3-pip python3-setuptools python3-wheel \
# Serial / I2C / SPI
i2c-tools libi2c-dev python3-smbus \
picocom minicom setserial \
# USB
usbutils libusb-1.0-0-dev \
# Misc
locales tzdata htop tmux nano \
# Networking
htop tmux nano \
net-tools iputils-ping \
&& rm -rf /var/lib/apt/lists/*
RUN locale-gen en_US.UTF-8
ENV LANG=en_US.UTF-8
# ── ROS2 Humble (from ROS2 apt repo — ARM64 build) ─────────────────────────────
# Note: official humble debs for ARM64/L4T are provided via NVIDIA Isaac ROS
# or via ros2-apt-source for 20.04 focal.
# ── ROS2 Humble (Ubuntu 22.04 native — standard apt, no ARM64 workarounds) ─────
RUN curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc \
| apt-key add - && \
echo "deb [arch=arm64] http://packages.ros.org/ros2/ubuntu focal main" \
| gpg --dearmor -o /usr/share/keyrings/ros-archive-keyring.gpg && \
echo "deb [arch=arm64 signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] \
http://packages.ros.org/ros2/ubuntu jammy main" \
> /etc/apt/sources.list.d/ros2.list && \
apt-get update && apt-get install -y --no-install-recommends \
ros-humble-ros-base \
@@ -51,50 +51,45 @@ RUN curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc \
&& rosdep init && rosdep update \
&& rm -rf /var/lib/apt/lists/*
# ── Nav / SLAM / sensor packages ──────────────────────────────────────────────
# ── SLAM / Nav / Sensor packages ──────────────────────────────────────────────
RUN apt-get update && apt-get install -y --no-install-recommends \
ros-humble-nav2-bringup \
ros-humble-rtabmap-ros \
ros-humble-slam-toolbox \
ros-humble-nav2-bringup \
ros-humble-robot-localization \
ros-humble-rplidar-ros \
ros-humble-realsense2-camera \
ros-humble-realsense2-description \
ros-humble-tf2-tools \
ros-humble-robot-state-publisher \
ros-humble-rqt \
ros-humble-rqt-common-plugins \
ros-humble-rviz2 \
ros-humble-rosbridge-server \
ros-humble-image-transport-plugins \
ros-humble-v4l2-camera \
&& rm -rf /var/lib/apt/lists/*
# ── GPIO / serial Python libs ──────────────────────────────────────────────────
# ── GPIO / serial Python libs ──────────────────────────────────────────────────
RUN pip3 install --no-cache-dir \
Jetson.GPIO \
pyserial \
smbus2 \
adafruit-blinka \
RPi.GPIO \
numpy \
scipy
Jetson.GPIO pyserial smbus2 numpy scipy
# ── RealSense SDK (librealsense2 ARM64) ───────────────────────────────────────
# Pre-built for L4T — install from Jetson Hacks script or apt source
# ── RealSense SDK (librealsense2 ARM64 for JetPack 6) ─────────────────────────
RUN apt-get update && apt-get install -y --no-install-recommends \
libssl-dev libusb-1.0-0-dev pkg-config libgtk-3-dev \
&& rm -rf /var/lib/apt/lists/*
# librealsense2 Python bindings. Note: PyPI rarely ships aarch64 wheels for
# pyrealsense2; if this install fails, build librealsense from source with
# -DBUILD_PYTHON_BINDINGS=ON instead.
RUN pip3 install --no-cache-dir pyrealsense2
# ── Workspace setup ───────────────────────────────────────────────────────────
# ── Workspace setup ───────────────────────────────────────────────────────────
RUN mkdir -p /ros2_ws/src
WORKDIR /ros2_ws
# Source ROS2 in every shell
RUN echo "source /opt/ros/${ROS_DISTRO}/setup.bash" >> /root/.bashrc && \
echo "source /ros2_ws/install/local_setup.bash 2>/dev/null || true" >> /root/.bashrc && \
echo "export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp" >> /root/.bashrc
# ── Entrypoint ────────────────────────────────────────────────────────────────
# ── Entrypoint ────────────────────────────────────────────────────────────────
COPY scripts/entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh


@@ -1,53 +1,33 @@
# RealSense D435i configuration — Jetson Nano (power-budget tuned)
# RealSense D435i configuration — Jetson Orin Nano Super
#
# Profile format: WxHxFPS
# Constraint: ~10W total budget. 640x480x15 saves ~0.4W vs 30fps.
# Previous Nano 4GB settings (640x480x15, no point cloud) replaced.
# Orin Nano Super has ample Ampere GPU headroom for 848x480x30fps + point cloud.
#
# Reference topics at these settings:
# /camera/color/image_raw 640x480 15 Hz RGB8
# /camera/depth/image_rect_raw 640x480 15 Hz Z16
# /camera/aligned_depth_to_color/image_raw 640x480 15 Hz Z16
# /camera/imu 6-axis ~200 Hz (accel@100Hz + gyro@400Hz fused)
# /camera/color/camera_info 640x480 15 Hz
# /camera/depth/camera_info 640x480 15 Hz
#
# Hardware IMU: Bosch BMI055
# Accelerometer native rate: 100 Hz
# Gyroscope native rate: 400 Hz
# unite_imu_method=2: linearly interpolates accel to match gyro timestamps
# Reference topics:
# /camera/color/image_raw 848x480 30 Hz RGB8
# /camera/depth/image_rect_raw 848x480 30 Hz Z16
# /camera/aligned_depth_to_color/image_raw 848x480 30 Hz Z16
# /camera/color/camera_info 848x480 30 Hz
# /camera/depth/camera_info 848x480 30 Hz
# /camera/imu ~400 Hz (gyro@400Hz + accel@100Hz fused)
# /camera/points PointCloud2 30 Hz
camera:
  ros__parameters:
    # ── Streams ──────────────────────────────────────────────────────────────
    depth_module.profile: "640x480x15"
    rgb_camera.profile: "640x480x15"
    depth_module.profile: "848x480x30"
    rgb_camera.profile: "848x480x30"
    enable_depth: true
    enable_color: true
    enable_infra1: false # not needed for RGB-D SLAM
    enable_infra1: false
    enable_infra2: false
    # ── IMU ──────────────────────────────────────────────────────────────────
    enable_gyro: true
    enable_accel: true
    # 0=none 1=copy 2=linear_interpolation
    # Use 2: aligns accel timestamps to gyro rate, required for sensor fusion
    unite_imu_method: 2
    unite_imu_method: 2 # linear_interpolation
    # ── Alignment ────────────────────────────────────────────────────────────
    # Projects depth pixels into RGB frame — required for rtabmap_ros rgbd input
    align_depth.enable: true
    pointcloud.enable: true # Orin Ampere GPU handles this without issue
    # ── Point cloud ──────────────────────────────────────────────────────────
    # Disabled: rtabmap_ros generates its own from aligned depth.
    # Maxwell GPU cannot handle both simultaneously at budget.
    pointcloud.enable: false
    # ── TF ───────────────────────────────────────────────────────────────────
    publish_tf: true
    tf_publish_rate: 0.0 # 0 = publish static transforms only (no redundant timer)
    # ── Device ───────────────────────────────────────────────────────────────
    # Leave serial_no empty to auto-select first found device
    # serial_no: ''
    # device_type: d435i
    tf_publish_rate: 0.0
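The `unite_imu_method: 2` (linear interpolation) setting can be illustrated with a small standalone Python sketch: accel samples (~100 Hz) are resampled onto gyro timestamps (~400 Hz) so every fused `/camera/imu` message carries both readings. This shows only the resampling idea, not the driver's actual implementation.

```python
def interpolate_accel(accel, t):
    """Linearly interpolate (timestamp, value) accel samples at time t."""
    for (t0, v0), (t1, v1) in zip(accel, accel[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)
    raise ValueError("timestamp outside accel sample range")

# Timestamps in ms: accel at 100 Hz (every 10 ms), gyro at 400 Hz (every 2.5 ms)
accel_x = [(0.0, 0.0), (10.0, 1.0)]
gyro_ts = [0.0, 2.5, 5.0, 7.5]
fused = [interpolate_accel(accel_x, t) for t in gyro_ts]
print(fused)  # [0.0, 0.25, 0.5, 0.75]
```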


@@ -0,0 +1,78 @@
# RTAB-Map configuration — Jetson Orin Nano Super
#
# Hardware context: 67 TOPS, 1024-core Ampere GPU, 8GB RAM
# All Nano-era constraints (2Hz cap, 400 features, no 3D) removed.
#
# Sensor inputs:
# /scan LaserScan from RPLIDAR A1M8 ~5.5 Hz
# /camera/color/image_raw RGB from RealSense D435i 30 Hz
# /camera/depth/image_rect_raw Depth from D435i 30 Hz
# /camera/color/camera_info Camera intrinsics 30 Hz
#
# Outputs:
# /rtabmap/map OccupancyGrid (2D occupancy)
# /rtabmap/cloud_map PointCloud2 (3D point cloud, enabled)
# /rtabmap/odom Odometry (visual-inertial odometry)
# /rtabmap/mapData Map data for serialization/loop closure
#
# Frame assumptions (match sensors.launch.py static TF):
# map → odom → base_link → laser
# → camera_link
rtabmap:
  ros__parameters:
    # ── Core rate and timing ─────────────────────────────────────────────────
    # Process at 10Hz — far below D435i 30fps ceiling, still 5x faster than Nano
    Rtabmap/DetectionRate: "10"
    # No processing time limit — Orin has headroom (Nano was 700ms)
    Rtabmap/TimeThr: "0"
    # Update map every 5cm or ~3° rotation (was 10cm / 5°)
    RGBD/LinearUpdate: "0.05"
    RGBD/AngularUpdate: "0.05"
    # ── Visual features ──────────────────────────────────────────────────────
    # 1000 ORB features per frame (Nano was 400)
    Kp/MaxFeatures: "1000"
    Vis/MaxFeatures: "1000"
    # ── Memory ───────────────────────────────────────────────────────────────
    # Unlimited short-term memory — 8GB RAM (Nano was 30 keyframes)
    Mem/STMSize: "0"
    # Keep full map in working memory
    Mem/IncrementalMemory: "true"
    # ── Map generation ───────────────────────────────────────────────────────
    # 3D occupancy grid enabled — Ampere GPU handles point cloud generation
    Grid/3D: "true"
    Grid/CellSize: "0.05"  # 5cm voxels
    Grid/RangeMin: "0.3"   # ignore points closer than 30cm
    Grid/RangeMax: "8.0"   # clip at A1M8 reliable range
    # ── Sensor subscriptions ─────────────────────────────────────────────────
    # Subscribe to both RPLIDAR scan (fast 2D) and D435i depth (3D)
    subscribe_scan: "true"
    subscribe_rgbd: "true"
    subscribe_odom_info: "false"
    # ── Loop closure ─────────────────────────────────────────────────────────
    # More aggressive loop closure — Orin can handle it
    RGBD/LoopClosureReextractFeatures: "true"
    Kp/IncrementalFlann: "true"
    # ── Odometry ─────────────────────────────────────────────────────────────
    # Use F2M (frame-to-map) visual odometry — more accurate on Orin
    Odom/Strategy: "0"  # 0=F2M, 1=F2F
    Odom/ResetCountdown: "1"
    OdomF2M/MaxSize: "2000"
    # ── RGBD node input topics ───────────────────────────────────────────────
    rgb_topic: /camera/color/image_raw
    depth_topic: /camera/depth/image_rect_raw
    camera_info_topic: /camera/color/camera_info
    depth_camera_info_topic: /camera/depth/camera_info
    scan_topic: /scan
    # ── Output ───────────────────────────────────────────────────────────────
    # Publish 3D point cloud map
    cloud_output_voxelized: "true"
    cloud_voxel_size: "0.05"  # match Grid/CellSize
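For intuition on `Grid/CellSize` and `cloud_voxel_size`, here is a minimal Python sketch of voxel downsampling at 5 cm. RTAB-Map does this internally in C++ (via PCL); this shows only the binning idea.

```python
def voxel_downsample(points, cell=0.05):
    """Keep one representative point per (cell x cell x cell) voxel."""
    voxels = {}
    for p in points:
        key = tuple(int(c // cell) for c in p)  # integer voxel index per axis
        voxels.setdefault(key, p)               # first point in a voxel wins
    return list(voxels.values())

pts = [(0.01, 0.01, 0.0),   # same 5 cm voxel as the next point
       (0.02, 0.03, 0.0),
       (0.30, 0.00, 0.0)]   # different voxel
print(len(voxel_downsample(pts)))  # 2
```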


@@ -1,44 +1,37 @@
# slam_toolbox — online async SLAM configuration
# Tuned for Jetson Nano 4GB (constrained CPU/RAM, indoor mapping)
# Jetson Orin Nano Super — all Nano rate caps removed.
#
# Primary SLAM is now RTAB-Map (slam_rtabmap.launch.py).
# slam_toolbox retained for LIDAR-only localization against pre-built maps.
#
# Input: /scan (LaserScan from RPLIDAR A1M8, ~5.5 Hz)
# Output: /map (OccupancyGrid, updated every map_update_interval seconds)
#
# Frame assumptions (must match sensors.launch.py static TF):
# map → odom → base_link → laser
# (odom not yet published — slam_toolbox handles this via scan matching)
# Output: /map (OccupancyGrid)
slam_toolbox:
  ros__parameters:
    # ── Frames ───────────────────────────────────────────────────────────────
    odom_frame: odom
    map_frame: map
    base_frame: base_link
    scan_topic: /scan
    mode: mapping # 'mapping' or 'localization'
    mode: mapping
    # ── Map params ───────────────────────────────────────────────────────────
    resolution: 0.05 # 5 cm/cell — good balance for A1M8 angular res
    max_laser_range: 8.0 # clip to reliable range of A1M8 (spec: 12m)
    map_update_interval: 5.0 # seconds between full map publishes (saves CPU)
    minimum_travel_distance: 0.3 # only update after moving 30 cm
    minimum_travel_heading: 0.3 # or rotating ~17°
    resolution: 0.05
    max_laser_range: 8.0
    map_update_interval: 1.0 # 1s (was 5s on Nano)
    minimum_travel_distance: 0.2
    minimum_travel_heading: 0.2
    # ── Performance (Nano-specific) ──────────────────────────────────────────
    # Reduce scan processing rate to stay within ~3.5W CPU budget
    minimum_time_interval: 0.5 # max 2 Hz scan processing (A1M8 is ~5.5 Hz)
    minimum_time_interval: 0.1 # process every scan (A1M8 is ~5.5 Hz; was a 0.5s cap on Nano)
    transform_timeout: 0.2
    tf_buffer_duration: 30.0
    stack_size_to_use: 40000000 # 40 MB stack
    enable_interactive_mode: false # disable interactive editing (saves CPU)
    stack_size_to_use: 40000000
    enable_interactive_mode: false
    # ── Scan matching ────────────────────────────────────────────────────────
    use_scan_matching: true
    use_scan_barycenter: true
    scan_buffer_size: 10
    scan_buffer_size: 20
    scan_buffer_maximum_scan_distance: 10.0
    # ── Loop closure ─────────────────────────────────────────────────────────
    do_loop_closing: true
    loop_match_minimum_chain_size: 10
    loop_match_maximum_variance_coarse: 3.0
@@ -46,31 +39,25 @@ slam_toolbox:
    loop_match_minimum_response_fine: 0.45
    loop_search_maximum_distance: 3.0
    # ── Correlation (coarse scan matching) ───────────────────────────────────
    correlation_search_space_dimension: 0.5
    correlation_search_space_resolution: 0.01
    correlation_search_space_smear_deviation: 0.1
    # ── Loop search space ────────────────────────────────────────────────────
    loop_search_space_dimension: 8.0
    loop_search_space_resolution: 0.05
    loop_search_space_smear_deviation: 0.03
    # ── Response expansion ───────────────────────────────────────────────────
    link_match_minimum_response_fine: 0.1
    link_scan_maximum_distance: 1.5
    use_response_expansion: true
    # ── Penalties (scan matcher quality thresholds) ──────────────────────────
    distance_variance_penalty: 0.5
    angle_variance_penalty: 1.0
    fine_search_angle_offset: 0.00349 # ~0.2°
    coarse_search_angle_offset: 0.349 # ~20°
    coarse_angle_resolution: 0.0349 # ~2°
    fine_search_angle_offset: 0.00349
    coarse_search_angle_offset: 0.349
    coarse_angle_resolution: 0.0349
    minimum_angle_penalty: 0.9
    minimum_distance_penalty: 0.5
    # ── Solver ───────────────────────────────────────────────────────────────
    solver_plugin: solver_plugins::CeresSolver
    ceres_linear_solver: SPARSE_NORMAL_CHOLESKY
    ceres_preconditioner: SCHUR_JACOBI
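The angle-valued parameters above are in radians; a quick Python conversion of this file's values to degrees (matching the per-line comments the old Nano config carried):

```python
import math

# fine_search_angle_offset, coarse_search_angle_offset,
# coarse_angle_resolution, minimum_travel_heading (all radians)
angles_rad = [0.00349, 0.349, 0.0349, 0.2]
degrees = [round(math.degrees(r), 1) for r in angles_rad]
print(degrees)  # [0.2, 20.0, 2.0, 11.5]
```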


@@ -8,7 +8,7 @@ services:
# ── Core ROS2 + SLAM node ─────────────────────────────────────────────────
saltybot-ros2:
image: saltybot/ros2-humble:jetson-nano
image: saltybot/ros2-humble:jetson-orin
build:
context: .
dockerfile: Dockerfile
@@ -51,7 +51,7 @@ services:
command: >
bash -c "
source /opt/ros/humble/setup.bash &&
ros2 launch saltybot_bringup slam.launch.py
ros2 launch saltybot_bringup slam_rtabmap.launch.py
"
# ── RPLIDAR driver node ────────────────────────────────────────────────────


@@ -0,0 +1,114 @@
"""
slam_rtabmap.launch.py: RTAB-Map RGB-D + LIDAR SLAM (Orin primary entry point)

Replaces slam.launch.py (slam_toolbox) as the docker-compose default on Orin.

RTAB-Map chosen over slam_toolbox for Orin because:
- Native D435i depth + RPLIDAR sensor fusion
- 3D point cloud output (Ampere GPU handles it)
- Better loop closure with visual features
- Outputs occupancy map directly consumable by Nav2

Stack:
    sensors.launch.py  (RPLIDAR + RealSense D435i + static TF)
    rtabmap            (RTAB-Map node with RGB-D + scan input)

RTAB-Map input topics:
    /camera/color/image_raw       30 Hz (RGB from D435i)
    /camera/depth/image_rect_raw  30 Hz (depth from D435i)
    /camera/color/camera_info     30 Hz
    /scan                         ~5.5 Hz (RPLIDAR A1M8)

RTAB-Map output topics:
    /rtabmap/map        OccupancyGrid (2D Nav2 input)
    /rtabmap/cloud_map  PointCloud2 (3D visualization)
    /rtabmap/odom       Odometry (visual-inertial)

Config: /config/rtabmap_params.yaml (mounted from jetson/config/)

Verify:
    ros2 topic hz /rtabmap/map        # ~1Hz map updates
    ros2 topic hz /rtabmap/cloud_map  # ~1Hz 3D cloud updates
    ros2 topic hz /rtabmap/odom       # ~10Hz odometry
"""
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument, IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node

RTABMAP_PARAMS_FILE = '/config/rtabmap_params.yaml'


def generate_launch_description():
    use_sim_time_arg = DeclareLaunchArgument(
        'use_sim_time',
        default_value='false',
        description='Use simulation clock (set true for rosbag playback)',
    )

    bringup_share = get_package_share_directory('saltybot_bringup')
    sensors_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(
            os.path.join(bringup_share, 'launch', 'sensors.launch.py')
        ),
    )

    # RTAB-Map node. In the ROS2 releases the core node ships in the
    # rtabmap_slam package (rtabmap_ros is a metapackage).
    # RGB-D + LIDAR mode: subscribe_scan=true, subscribe_rgbd=true
    rtabmap_node = Node(
        package='rtabmap_slam',
        executable='rtabmap',
        name='rtabmap',
        output='screen',
        parameters=[
            RTABMAP_PARAMS_FILE,
            {
                'use_sim_time': LaunchConfiguration('use_sim_time'),
                # Frame IDs — must match sensors.launch.py static TF
                'frame_id': 'base_link',
                'odom_frame_id': 'odom',
                'map_frame_id': 'map',
            },
        ],
        remappings=[
            # RGB-D inputs from RealSense
            ('rgb/image', '/camera/color/image_raw'),
            ('rgb/camera_info', '/camera/color/camera_info'),
            ('depth/image', '/camera/depth/image_rect_raw'),
            # 2D LIDAR from RPLIDAR A1M8
            ('scan', '/scan'),
        ],
        arguments=['--delete_db_on_start'],  # fresh map each launch
    )

    # RTAB-Map visualization node (optional; requires a display).
    # In the ROS2 releases the viz node ships in the rtabmap_viz package.
    # Not added to the LaunchDescription below; re-enable it there for
    # desktop debugging.
    rtabmap_viz_node = Node(
        package='rtabmap_viz',
        executable='rtabmap_viz',
        name='rtabmap_viz',
        output='screen',
        parameters=[{
            'use_sim_time': LaunchConfiguration('use_sim_time'),
            'frame_id': 'base_link',
        }],
        remappings=[
            ('rgb/image', '/camera/color/image_raw'),
            ('rgb/camera_info', '/camera/color/camera_info'),
            ('depth/image', '/camera/depth/image_rect_raw'),
            ('scan', '/scan'),
        ],
    )

    return LaunchDescription([
        use_sim_time_arg,
        sensors_launch,
        rtabmap_node,
        # rtabmap_viz_node,  # uncomment for RTAB-Map's Qt visualization
    ])


@@ -1,264 +1,167 @@
# bd-wax: Jetson Nano + RealSense D435i + RPLIDAR SLAM Setup — Technical Plan
# SLAM Setup Plan — Jetson Orin Nano Super
**Bead:** bd-wax
**Phase:** 2 (lower priority than Phase 1 balance)
**Owner:** sl-perception
**Hardware:** Jetson Nano 4GB + Intel RealSense D435i + RPLIDAR A1M8
**Goal:** Indoor SLAM, mapping, and autonomous navigation (person-following)
**Bead:** bd-wax (plan), bd-a2j (sensor drivers, done in PR #17)
**Phase:** 2 | **Owner:** sl-perception
**Updated:** 2026-03-01 — revised for Jetson Orin Nano Super (replaces Nano 4GB)
> All Nano-era constraints (10W cap, 2Hz detection, 400 features, no 3D) are obsolete.
---
## Hardware Summary
## Hardware
| Component | Specs |
|-----------|-------|
| Jetson Nano 4GB | Quad-core ARM Cortex-A57 @ 1.43GHz, 128-core Maxwell GPU, 4GB LPDDR4 |
| RealSense D435i | Stereo depth (0.11-10m), 848×480 @ 90fps, BMI055 IMU (accel+gyro) |
| RPLIDAR A1M8 | 360° 2D LIDAR, 12m range, 8000 samples/s, ~5.5Hz scan rate |
| AI Brain | **Jetson Orin Nano Super 8GB** — 6-core A78AE, 1024-core Ampere, **67 TOPS**, JetPack 6 |
| Depth Cam | Intel RealSense D435i — 848×480 @ 90fps, BMI055 IMU |
| LIDAR | RPLIDAR A1M8 — 360° 2D, 12m range, ~5.5 Hz |
| Wide Cams | 4× IMX219 160° CSI — front/right/rear/left 90° intervals *(arriving)* |
| FC | STM32F722 — UART bridge `/dev/ttyACM0` @ 921600 |
---
## 1. OS & ROS2 Environment
## 1. OS & ROS2
### Recommended: Docker on JetPack 4.6 (Ubuntu 18.04 / L4T 32.x)
JetPack 6 = Ubuntu 22.04 → ROS2 Humble via **native apt** (no Docker workarounds needed).
Jetson Nano ships with JetPack 4.6 (Ubuntu 18.04). Native ROS2 Humble requires Ubuntu 22.04, so **Docker is the correct approach**:
```bash
sudo apt install ros-humble-desktop ros-humble-rtabmap-ros \
ros-humble-rplidar-ros ros-humble-realsense2-camera \
ros-humble-slam-toolbox ros-humble-nav2-bringup
```
- NVIDIA provides official ROS2 containers for Jetson via [dusty-nv/jetson-containers](https://github.com/dusty-nv/jetson-containers)
- Container: `dustynv/ros:humble-ros-base-l4t-r32.7.1` (arm64, CUDA 10.2)
- Compose file pins container, mounts device nodes (`/dev/video*`, `/dev/ttyUSB*`), and handles GPU access
**Alternative: JetPack 5.x (Ubuntu 20.04)** allows native ROS2 Foxy but requires flashing newer JetPack — only worth it if we need direct hardware access outside Docker.
**Decision:** Use JetPack 4.6 + Docker (Humble). Fastest path, avoids full OS reflash, proven for Nano.
Docker still supported for CI. Updated `Dockerfile` uses `nvcr.io/nvidia/l4t-jetpack:r36.2.0`.
---
## 2. SLAM Stack Recommendation
## 2. SLAM Stack
### Recommendation: RTAB-Map (primary) with ORB-SLAM3 as fallback
### Primary: RTAB-Map (RGB-D + 2D LIDAR fusion)
#### Why RTAB-Map
| Parameter | Nano 4GB (old) | **Orin Nano Super** |
|-----------|---------------|---------------------|
| Detection rate | 2 Hz | **10 Hz** |
| Visual features | 400 | **1000** |
| D435i profile | 640×480×15fps | **848×480×30fps** |
| 3D point cloud | disabled | **enabled** |
| Map type | 2D only | **2D + 3D** |
| Processing time limit | 700ms | **none** |
| Short-term memory | 30 keyframes | **unlimited** |
| Criterion | ORB-SLAM3 | RTAB-Map |
|-----------|-----------|----------|
| Sensor fusion (D435i + RPLIDAR) | D435i only | D435i + RPLIDAR natively |
| Output map type | Sparse 3D point cloud | Dense 2D occupancy + 3D point cloud |
| Nav2 compatibility | Needs wrapper | Direct occupancy map output |
| ATE RMSE accuracy | 0.009m | 0.019m |
| Nano resource usage | Lower | Higher (needs tuning) |
| Loop closure robustness | Good | Excellent |
Fusion: RPLIDAR `/scan` (fast 2D loop closure) + D435i depth (3D reconstruction + visual odometry).
RTAB-Map's `subscribe_scan` mode uses the RPLIDAR A1M8 as the primary 2D odometry front-end (fast, low-CPU) with the D435i providing depth for loop closure and 3D reconstruction. This hybrid approach is well-documented for Nano deployments.
**Note:** Cannot simultaneously use `subscribe_scan` (2D LIDAR) and `subscribe_scan_cloud` (3D point cloud) in RTAB-Map — use 2D mode.
#### ORB-SLAM3 Role
Keep ORB-SLAM3 as a lightweight visual odometry alternative if RTAB-Map proves too heavy for real-time operation. Can feed its odometry output into RTAB-Map's map management node.
### Secondary: slam_toolbox (LIDAR-only localization / pre-built map mode)
---
## 3. Software Architecture
## 3. Architecture
```
┌─────────────────────────────────────────────────────┐
│ Jetson Nano (Docker) │
│ │
│ ┌───────────────┐ ┌──────────────────────────┐ │
│ │ realsense2_ │ │ rplidar_ros2 │ │
│ │ camera node │ │ /scan (LaserScan) │ │
│ │ │ │ 5.5Hz, 360°, 12m range │ │
│ │ /depth/image │ └──────────┬───────────────┘ │
│ │ /color/image │ │ │
│ │ /imu │ │ │
│ └──────┬────────┘ │ │
│ │ │ │
│ └──────────┬─────────────┘ │
│ ▼ │
│ ┌──────────────────┐ │
│ │ rtabmap_ros │ │
│ │ │ │
│ │ subscribe_scan=true (RPLIDAR) │
│ │ subscribe_rgbd=true (D435i depth) │
│ │ │ │
│ │ → /map (OccupancyGrid) │
│ │ → /rtabmap/odom │
│ │ → /rtabmap/cloud_map (3D) │
│ └──────────┬────────┘ │
│ │ │
│ ▼ │
│ ┌──────────────────┐ │
│ │ nav2 stack │ (Phase 2b) │
│ │ (reduced freq) │ │
│ │ 5-10Hz costmap │ │
│ └──────────────────┘ │
└─────────────────────────────────────────────────────┘
Jetson Orin Nano Super (Ubuntu 22.04 / JetPack 6 / CUDA 12.x)

  realsense2_camera                rplidar_ros
    848×480×30fps                    /scan ~5.5Hz 360°
    /camera/color                        │
    /camera/depth                        │
    /camera/imu ~400Hz                   │
        │                                │
    USB3 (D435i)                   USB2 (RPLIDAR)
        └──────────────┬─────────────────┘
                       │
                 rtabmap_ros
                 10Hz | 3D cloud | 1000 features
                 → /rtabmap/map       (OccupancyGrid)
                 → /rtabmap/cloud_map (PointCloud2)
                 → /rtabmap/odom      (Odometry)
                       │
                 Nav2 stack (Phase 2b)
                 20Hz costmap
                 /cmd_vel → STM32

  4× IMX219 CSI (Phase 2c — pending hardware)
    front/right/rear/left 160°
    → panoramic stitch, person tracking
```
### ROS2 Node Graph
---
| Node | Package | Key Topics |
|------|---------|------------|
| `camera/realsense2_camera_node` | `realsense2_camera` | `/camera/depth/image_rect_raw`, `/camera/color/image_raw`, `/camera/imu` |
| `rplidar_node` | `rplidar_ros` | `/scan` |
| `rtabmap` | `rtabmap_ros` | `/map`, `/rtabmap/odom`, `/rtabmap/cloud_map` |
| `robot_state_publisher` | `robot_state_publisher` | `/tf` (static transforms) |
## 4. Phases
| Phase | Status | Description |
|-------|--------|-------------|
| 2a | ✅ Done (PR #17) | Sensor drivers — `saltybot_bringup` package |
| 2a+ | This PR | Orin update: Dockerfile JetPack 6, RTAB-Map launch + config |
| 2b | Pending | Nav2 integration (separate bead, after Phase 1 balance) |
| 2c | Pending | 4× IMX219 surround vision (new bead, after hardware arrives) |
---
## 4. Implementation Phases
## 5. RTAB-Map Config (Orin)
### Phase 2a — SLAM Bring-up (this bead, bd-wax)
**Deliverables:**
- [ ] Docker Compose file (`docker/slam/docker-compose.yml`)
- [ ] ROS2 Humble container with RTAB-Map + RealSense + RPLIDAR packages
- [ ] Launch file: `launch/slam.launch.py` (RTAB-Map + both sensor nodes)
- [ ] URDF/static TF: D435i and RPLIDAR positions on robot frame
- [ ] `README.md` for Jetson setup instructions
**Out of scope (Phase 2b):**
- Nav2 autonomous navigation
- Person-following
- Integration with STM32 motor commands (needs Phase 1 balance complete first)
### Phase 2b — Nav2 Integration (separate bead)
- Nav2 stack with pre-built occupancy maps
- 5-10Hz costmap updates (reduced from desktop default 20Hz)
- Velocity command bridge: Nav2 `/cmd_vel` → STM32 via serial/ROS2 bridge
- Person detection (YOLOv5 on Nano, or TensorRT-optimized)
---
## 5. Key Configuration Parameters
### RTAB-Map Tuning for Jetson Nano
Full config: `jetson/config/rtabmap_params.yaml`
```yaml
# rtabmap_ros params — power/performance tradeoffs for Nano
rtabmap:
  Rtabmap/DetectionRate: "2" # Process keyframes at 2Hz (not every frame)
  Rtabmap/TimeThr: "700" # Max processing time 700ms per iteration
  Kp/MaxFeatures: "400" # Reduce keypoints from default 1000
  Vis/MaxFeatures: "400"
  RGBD/LinearUpdate: "0.1" # Only update map if moved 10cm
  RGBD/AngularUpdate: "0.09" # Or rotated ~5 degrees
  Mem/STMSize: "30" # Short-term memory limit (saves RAM)
  Grid/3D: "false" # Use 2D occupancy only (lighter)
  Grid/RangeMin: "0.5" # Ignore points closer than 0.5m
  Grid/RangeMax: "5.0" # Limit to 5m (A1M8 is reliable to ~8m)
```
### RealSense D435i Launch Params
```yaml
# realsense2_camera — reduce load on Nano
depth_module.profile: "640x480x15" # 15fps depth (not 90fps)
rgb_camera.profile: "640x480x15" # 15fps color
enable_gyro: true
enable_accel: true
unite_imu_method: 2 # Publish unified /camera/imu topic
align_depth.enable: true
```
### RPLIDAR
```yaml
serial_port: "/dev/ttyUSB0"
serial_baudrate: 115200
scan_mode: "Standard" # A1M8 only supports Standard mode
frame_id: "laser_frame"
Rtabmap/DetectionRate: "10" # was 2 on Nano
Kp/MaxFeatures: "1000" # was 400
RGBD/LinearUpdate: "0.05" # 5cm (was 10cm)
RGBD/AngularUpdate: "0.05" # ~3° (was 5°)
Grid/3D: "true" # 3D cloud enabled (was false)
Rtabmap/TimeThr: "0" # no limit (was 700ms)
Mem/STMSize: "0" # unlimited (was 30)
```
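One way to read `Rtabmap/DetectionRate` against the camera rate, as a back-of-envelope check (ignoring scheduler jitter):

```python
fps = 30  # D435i color/depth rate
# Frames between processed keyframes at each detection rate (Hz):
frame_stride = {rate: fps // rate for rate in (10, 2)}
print(frame_stride)  # {10: 3, 2: 15}: every 3rd frame on Orin, every 15th on Nano
```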
---
## 6. Static TF / URDF Robot Frame
Robot coordinate frame (`base_link`) transforms needed:
## 6. 4× IMX219 Layout (Phase 2c)
```
base_link
├── laser_frame (RPLIDAR A1M8 — top center of robot)
├── camera_link (RealSense D435i — front, ~camera_height above base)
│ ├── camera_depth_frame
│ └── camera_imu_frame
└── imu_link (STM32 IMU — FC board location)
          FRONT (CSI0) 160°

LEFT (CSI3)      ×      RIGHT (CSI1)

          REAR (CSI2) 160°
```
Transforms will be defined in `urdf/saltybot.urdf.xacro` with measured offsets once hardware is physically mounted.
90° between cameras, 160° FOV → ~70° overlap at each boundary, full 360° coverage.
ROS topics (planned): `/camera/{front,right,rear,left}/image_raw` @ 30Hz,
`/camera/panoramic/image_raw` @ 15Hz (stitched equirectangular).
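The coverage claim checks out arithmetically; a throwaway sketch with the numbers from the layout above:

```python
n_cams, fov_deg, spacing_deg = 4, 160, 90

overlap_deg = fov_deg - spacing_deg  # shared view at each camera boundary
# Full 360° requires the boundaries to close and each FOV to span its gap:
full_360 = n_cams * spacing_deg == 360 and fov_deg >= spacing_deg
print(overlap_deg, full_360)  # 70 True
```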
---
## 7. Power Budget Concern
## 7. Power Budget (Orin Nano Super)
Jetson Nano 4GB maximum draw: **10W** (5V/2A barrel jack mode)
| Scenario | Total |
|----------|-------|
| SLAM active (RTAB-Map + D435i + RPLIDAR) | ~16W |
| + 4× IMX219 | ~17W |
| + Nav2 + TensorRT person detection | ~22W |
| Component | Estimated Draw |
|-----------|---------------|
| Jetson Nano (under SLAM load) | 7-8W |
| RealSense D435i (USB3) | 1.5W |
| RPLIDAR A1M8 (USB) | 0.5W |
| **Total** | **~10W** |
**Risk:** Marginal power budget. Recommend:
1. Use 5V/4A supply (requires Jetson J48 header, not USB-C)
2. Disable Jetson display output (`sudo systemctl disable gdm3`)
3. Use `nvpmodel -m 1` (5W mode) during mapping-only tasks
4. External powered USB hub for peripherals
Orin Nano Super TDP: **25W max**. Recommended PSU: 5V 5A (25W) from robot buck converter.
No power gating needed. Run `sudo nvpmodel -m 0 && sudo jetson_clocks` for full performance.
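Totaling the SLAM-active scenario as a sanity check against the TDP. The Jetson's own share below is an assumed split of the ~16W row in the table, not a measured value:

```python
draws_w = {
    "Orin Nano Super, SLAM load": 14.0,  # assumed: ~16 W row minus peripherals
    "RealSense D435i (USB3)": 1.5,
    "RPLIDAR A1M8 (USB)": 0.5,
}
total_w = sum(draws_w.values())
print(total_w, total_w <= 25.0)  # 16.0 True (inside the 25 W TDP)
```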
---
## 8. Milestone Checklist
## 8. Milestones
- [ ] Flash JetPack 4.6 on Nano (or verify existing install)
- [ ] Install Docker + NVIDIA Container Runtime on Nano
- [ ] Pull `dustynv/ros:humble-ros-base-l4t-r32.7.1` container
- [ ] Verify D435i recognized: `realsense-viewer` or `rs-enumerate-devices`
- [ ] Verify RPLIDAR port: `ls /dev/ttyUSB*`
- [ ] Build/pull `realsense2_camera` ROS2 package in container
- [ ] Build/pull `rplidar_ros` ROS2 package in container
- [ ] Build/pull `rtabmap_ros` ROS2 package in container
- [ ] Test individual sensor topics with `ros2 topic echo`
- [ ] Run SLAM launch file — verify `/map` published
- [ ] Record rosbag for offline tuning
- [ ] Document measured TF offsets (physical mount positions)
- [ ] Write Phase 2b bead for Nav2 integration
- [ ] Flash JetPack 6 on Orin (arriving March 1)
- [ ] `sudo apt install ros-humble-desktop ros-humble-rtabmap-ros ...`
- [ ] Verify D435i: `lsusb | grep "8086:0b3a"`
- [ ] Verify RPLIDAR: `ls /dev/rplidar`
- [ ] `colcon build --packages-select saltybot_bringup`
- [ ] `ros2 launch saltybot_bringup sensors.launch.py` — verify topics
- [ ] `ros2 launch saltybot_bringup slam_rtabmap.launch.py` — verify `/rtabmap/map`
- [ ] `ros2 topic hz /rtabmap/cloud_map` — verify 3D cloud
- [ ] Record rosbag, monitor `tegrastats` for thermal headroom
- [ ] Update static TF with real mount measurements
- [ ] Open bead: Phase 2b Nav2
- [ ] Open bead: Phase 2c IMX219 (after hardware arrives)
---
## 9. Repo Structure (to be created)
## 9. References
```
saltylab-firmware/
└── projects/
└── saltybot/
├── SLAM-SETUP-PLAN.md ← this file
├── SALTYLAB.md ← main design doc (TBD)
└── slam/
├── docker/
│ ├── docker-compose.yml
│ └── Dockerfile.slam
├── launch/
│ └── slam.launch.py
├── config/
│ ├── rtabmap_params.yaml
│ └── realsense_params.yaml
└── urdf/
└── saltybot.urdf.xacro
```
---
## 10. References
- [dusty-nv/jetson-containers](https://github.com/dusty-nv/jetson-containers) — official NVIDIA ROS2 Docker containers for Jetson
- [rtabmap_ros](https://github.com/introlab/rtabmap_ros) — RTAB-Map ROS2 wrapper
- [realsense-ros](https://github.com/IntelRealSense/realsense-ros) — Intel RealSense ROS2 wrapper
- [rplidar_ros](https://github.com/Slamtec/rplidar_ros) — Slamtec RPLIDAR ROS2 package
- [reedjacobp/JetsonBot](https://github.com/reedjacobp/JetsonBot) — Nano + D435i + RPLIDAR reference implementation
- [RTAB-Map D435 + RPLidar discussion](https://answers.ros.org/question/367019/using-2-d435-cameras-and-an-rplidar-with-rtabmap/)
- [Comparative SLAM evaluation (2024)](https://arxiv.org/html/2401.02816v1)
- [Jetson Orin Nano Super](https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/)
- [JetPack 6 / L4T R36](https://developer.nvidia.com/embedded/jetpack)
- [dusty-nv/jetson-containers JetPack 6](https://github.com/dusty-nv/jetson-containers)
- [rtabmap_ros ROS2](https://github.com/introlab/rtabmap_ros)
- [realsense-ros](https://github.com/IntelRealSense/realsense-ros)
- [IMX219 / nvarguscamerasrc on Jetson](https://developer.ridgerun.com/wiki/index.php/NVIDIA_Jetson_ISP_Control)