Fuses wheel, visual, and IMU odometry using complementary filtering. Publishes fused /odom (nav_msgs/Odometry) and broadcasts odom→base_link TF at 50Hz.

Sensor Fusion Strategy:
- Wheel odometry: high-frequency, accurate linear displacement (weight: 0.6)
- Visual odometry: loop closure and long-term drift correction (weight: 0.3)
- IMU: high-frequency attitude and angular velocity (weight: 0.1)

Complementary Filter Architecture:
- Fast loop (IMU): high-frequency attitude updates, angular velocity
- Slow loop (vision): low-frequency position/orientation correction
- Integration: velocity-based position updates with covariance weighting
- Dropout handling: continues with available sources if sensors drop

Fusion Algorithm:
1. Extract velocities from wheel odometry (most reliable linear source)
2. Apply IMU angular velocity (highest-frequency rotation)
3. Update orientation from IMU with blending
4. Integrate velocities to position (wheel odometry frame)
5. Apply visual odometry drift correction (low-frequency)
6. Update covariances based on available measurements
7. Publish fused odometry with full covariance matrices

Published Topics:
- /odom (nav_msgs/Odometry) - fused pose/twist with covariance
- /saltybot/odom_fusion_info (std_msgs/String) - JSON debug info

TF Broadcasts:
- odom→base_link - position (x, y) and orientation (yaw)

Subscribed Topics:
- /saltybot/wheel_odom (nav_msgs/Odometry) - wheel encoder odometry
- /rtab_map/odom (nav_msgs/Odometry) - visual/SLAM odometry
- /imu/data (sensor_msgs/Imu) - IMU data

Package: saltybot_odom_fusion
Entry point: odom_fusion_node
Frequency: 50Hz (20ms cycle)

Features:
✓ Multi-source odometry fusion
✓ Complementary filtering with configurable weights
✓ Full covariance matrices for uncertainty tracking
✓ TF2 transform broadcasting
✓ Sensor dropout handling
✓ JSON telemetry with fusion status
✓ Configurable normalization of weights

Tests: 20+ unit tests covering:
- OdomState initialization and covariances
- Subscription handling for all three sensors
- Position integration from velocity
- Angular velocity updates
- Velocity blending from multiple sources
- Drift correction from visual odometry
- Covariance updates based on measurement availability
- Quaternion-to-Euler angle conversion
- Realistic fusion scenarios (straight line, circles, drift correction)
- Sensor dropout and recovery

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
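The fusion cycle above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the node's actual implementation: the 0.3 vision weight comes from the strategy above, while `fuse_step`, `alpha`, and the flat state dictionary are hypothetical names chosen for the sketch.

```python
import math

# Hypothetical constants mirroring the documented design; the real node's
# parameter names and state layout may differ.
W_VISION = 0.3  # visual-odometry correction gain (weight from strategy above)
DT = 0.02       # 50 Hz cycle (20 ms)

def yaw_from_quaternion(x, y, z, w):
    """Extract yaw from a quaternion (standard ZYX Euler conversion)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

def fuse_step(state, wheel_v, imu_wz, imu_yaw, vision_pose=None, alpha=0.98):
    """One 20 ms fusion cycle: blend yaw, integrate velocity, correct drift."""
    # Steps 2-3 (fast loop): propagate yaw with the IMU angular rate, then
    # blend toward the IMU's absolute yaw -- the complementary filter.
    predicted_yaw = state["yaw"] + imu_wz * DT
    state["yaw"] = alpha * predicted_yaw + (1.0 - alpha) * imu_yaw
    # Step 4: integrate wheel linear velocity in the odom frame.
    state["x"] += wheel_v * math.cos(state["yaw"]) * DT
    state["y"] += wheel_v * math.sin(state["yaw"]) * DT
    # Step 5 (slow loop): nudge the pose toward visual odometry when available.
    if vision_pose is not None:
        vx, vy, vyaw = vision_pose
        state["x"] += W_VISION * (vx - state["x"])
        state["y"] += W_VISION * (vy - state["y"])
        state["yaw"] += W_VISION * (vyaw - state["yaw"])
    return state
```

A real node would additionally propagate the 6×6 covariance matrices (steps 6–7) and handle sensor dropout by renormalizing the weights of whichever sources are still publishing.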
# Jetson Nano — AI/SLAM Platform Setup
Self-balancing robot: Jetson Nano dev environment for ROS2 Humble + SLAM stack.
## Stack
| Component | Version / Part |
|---|---|
| Platform | Jetson Nano 4GB |
| JetPack | 4.6 (L4T R32.6.1, CUDA 10.2) |
| ROS2 | Humble Hawksbill |
| DDS | CycloneDDS |
| SLAM | slam_toolbox |
| Nav | Nav2 |
| Depth camera | Intel RealSense D435i |
| LiDAR | RPLIDAR A1M8 |
| MCU bridge | STM32F722 (USB CDC @ 921600) |
## Quick Start

```bash
# 1. Host setup (once, on fresh JetPack 4.6)
sudo bash scripts/setup-jetson.sh

# 2. Build Docker image
bash scripts/build-and-run.sh build

# 3. Start full stack
bash scripts/build-and-run.sh up

# 4. Open ROS2 shell
bash scripts/build-and-run.sh shell
```
## Docs

- `docs/pinout.md` — GPIO/I2C/UART pinout for all peripherals
- `docs/power-budget.md` — 10W power envelope analysis
## Files

```
jetson/
├── Dockerfile             # L4T base + ROS2 Humble + SLAM packages
├── docker-compose.yml     # Multi-service stack (ROS2, RPLIDAR, D435i, STM32)
├── README.md              # This file
├── docs/
│   ├── pinout.md          # GPIO/I2C/UART pinout reference
│   └── power-budget.md    # Power budget analysis (10W envelope)
└── scripts/
    ├── entrypoint.sh      # Docker container entrypoint
    ├── setup-jetson.sh    # Host setup (udev, Docker, nvpmodel)
    └── build-and-run.sh   # Build/run helper
```
## Power Budget (Summary)
| Scenario | Total |
|---|---|
| Idle | 2.9W |
| Nominal (SLAM active) | ~10.2W |
| Peak | 15.4W |
Target: 10W (the MAXN nvpmodel envelope). Nominal SLAM operation slightly exceeds this, so use RPLIDAR standby and the 640p D435i profile to stay in compliance.
See `docs/power-budget.md` for the full analysis.
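As a sanity check, the envelope arithmetic can be expressed in a short sketch. The scenario totals are taken from the summary table above; `headroom` and the `scenarios` dictionary are hypothetical names for this illustration, not part of the repo:

```python
POWER_BUDGET_W = 10.0  # MAXN nvpmodel envelope

# Scenario totals from the summary table above
scenarios = {"idle": 2.9, "nominal_slam": 10.2, "peak": 15.4}

def headroom(scenario):
    """Watts of margin left under the 10 W envelope (negative = over budget)."""
    return POWER_BUDGET_W - scenarios[scenario]

for name, watts in scenarios.items():
    margin = headroom(name)
    status = "OK" if margin >= 0 else f"OVER by {-margin:.1f} W"
    print(f"{name:13s} {watts:5.1f} W -> {status}")
```

The nominal scenario lands about 0.2 W over budget, which is why the LiDAR-standby and reduced-resolution camera mitigations above are needed.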