Adds rosbridge_suite to the Jetson stack so the browser dashboard can
subscribe to ROS2 topics via roslibjs over ws://jetson:9090.
docker-compose.yml
New service: saltybot-rosbridge
- Runs saltybot_bringup/launch/rosbridge.launch.py
- network_mode: host → port 9090 directly reachable on Jetson LAN
- Depends on saltybot-ros2, stm32-bridge, csi-cameras
saltybot_bringup/launch/rosbridge.launch.py
- rosbridge_websocket node (port 9090, params from rosbridge_params.yaml)
- 4× image_transport/republish nodes: compress CSI camera streams
/camera/<name>/image_raw → /camera/<name>/image_raw/compressed (JPEG 75%)
saltybot_bringup/config/rosbridge_params.yaml
Whitelisted topics:
/map /scan /tf /tf_static
/saltybot/imu /saltybot/balance_state
/cmd_vel
/person/*
/camera/*/image_raw/compressed
max_message_size: 10 MB (OccupancyGrid headroom)
saltybot_bringup/SENSORS.md
Added rosbridge connection section with roslibjs snippet,
topic reference table, bandwidth estimates, and throttle_rate tips.
saltybot_bringup/package.xml
Added exec_depend: rosbridge_server, image_transport,
image_transport_plugins (all already installed in Docker image).
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
# saltybot_bringup — Sensor Topic Reference

ROS2 Humble | ROS_DOMAIN_ID=42 | DDS: CycloneDDS

---

## rosbridge WebSocket (web dashboard)

The web dashboard connects to ROS2 topics via rosbridge WebSocket on **port 9090**.

```bash
# Start rosbridge (and the compressed camera republishers):
ros2 launch saltybot_bringup rosbridge.launch.py

# Or via docker-compose:
docker compose up -d saltybot-rosbridge
```

**Browser connection (roslibjs):**

```javascript
const ros = new ROSLIB.Ros({ url: 'ws://jetson.local:9090' });
ros.on('connection', () => console.log('connected'));
ros.on('error', (e) => console.error('error', e));
ros.on('close', () => console.log('disconnected'));
```
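
Under the hood, roslibjs speaks the rosbridge v2 JSON protocol over this WebSocket. A minimal sketch of the `subscribe` operation it sends (field names per the rosbridge protocol; `throttle_rate` is in milliseconds):

```python
import json

def subscribe_msg(topic, msg_type, throttle_rate=0, queue_length=1):
    """Build the rosbridge v2 'subscribe' operation that roslibjs emits
    when Topic.subscribe() is called. throttle_rate is in milliseconds."""
    return json.dumps({
        "op": "subscribe",
        "topic": topic,
        "type": msg_type,
        "throttle_rate": throttle_rate,  # min ms between messages, 0 = unthrottled
        "queue_length": queue_length,    # server-side buffer when throttling
    })

# Subscribe to the lidar at no more than 5 Hz (one message per 200 ms):
msg = subscribe_msg("/scan", "sensor_msgs/LaserScan", throttle_rate=200)
print(msg)
```

rosbridge applies `throttle_rate` server-side, so the Jetson only ships the messages the dashboard can actually consume.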

**Whitelisted topics** (configured in `config/rosbridge_params.yaml`):

| Topic | Type | Notes |
|-------|------|-------|
| `/map` | `nav_msgs/OccupancyGrid` | Use `throttle_rate: 5000` (0.2 Hz) |
| `/scan` | `sensor_msgs/LaserScan` | 5.5 Hz, 1440 pts |
| `/tf` | `tf2_msgs/TFMessage` | Use `throttle_rate: 100` (10 Hz) |
| `/tf_static` | `tf2_msgs/TFMessage` | One-shot on connect |
| `/saltybot/imu` | `sensor_msgs/Imu` | 50 Hz, pitch/roll/yaw |
| `/saltybot/balance_state` | `std_msgs/String` | JSON balance controller state |
| `/cmd_vel` | `geometry_msgs/Twist` | Subscribe to monitor Nav2 commands |
| `/person/target` | (custom) | Person tracking target |
| `/person/detections` | (custom) | Person detection results |
| `/camera/front/image_raw/compressed` | `sensor_msgs/CompressedImage` | JPEG 75%, 640×480 |
| `/camera/left/image_raw/compressed` | `sensor_msgs/CompressedImage` | JPEG 75%, 640×480 |
| `/camera/rear/image_raw/compressed` | `sensor_msgs/CompressedImage` | JPEG 75%, 640×480 |
| `/camera/right/image_raw/compressed` | `sensor_msgs/CompressedImage` | JPEG 75%, 640×480 |

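For reference, a hedged sketch of what `config/rosbridge_params.yaml` plausibly looks like — `port`, `max_message_size`, and `topics_glob` are real `rosbridge_websocket` parameters, but the exact file contents here are an assumption:

```yaml
# Hypothetical sketch, not the actual shipped file.
rosbridge_websocket:
  ros__parameters:
    port: 9090
    max_message_size: 10000000   # ~10 MB: OccupancyGrid headroom
    # rosbridge glob parameters take a bracketed list inside a string:
    topics_glob: "[/map, /scan, /tf, /tf_static, /saltybot/*, /cmd_vel, /person/*, /camera/*/image_raw/compressed]"
```
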
**Bandwidth tips:**

- Raw 640×480 rgb8 ≈ 900 KB/frame; the republishers serve JPEG-compressed frames at ≈ 15–25 KB each.
- Always set `throttle_rate` for camera subscriptions in roslibjs (e.g. `throttle_rate: 200` = max 5 fps).
- `/map` is large (~100 KB); use `throttle_rate: 5000` or subscribe only on demand.

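The tips above can be sanity-checked with quick arithmetic. Frame sizes are the figures from this section; the 30 fps capture rate is an assumption:

```python
# Per-camera bandwidth estimate using the figures above (assumed values).
RAW_FRAME_KB = 900     # 640*480*3 bytes (rgb8) = 900 KiB per raw frame
JPEG_FRAME_KB = 20     # midpoint of the 15-25 KB JPEG estimate
THROTTLED_FPS = 5      # throttle_rate: 200 -> at most one frame per 200 ms

jpeg_kb_per_s = JPEG_FRAME_KB * THROTTLED_FPS   # per camera
all_cameras_kb_per_s = 4 * jpeg_kb_per_s        # four CSI cameras

# /map at ~100 KB, throttled to one message per 5 s:
map_kb_per_s = 100 / 5

print(jpeg_kb_per_s, all_cameras_kb_per_s, map_kb_per_s)  # 100 400 20.0
```

So a throttled, compressed dashboard stream stays in the hundreds of KB/s total, versus roughly 27 MB/s per camera if raw frames were pushed at an assumed 30 fps.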
**Verify connection:**

```bash
# Check the WebSocket is listening
ss -tlnp | grep 9090

# List topics visible to rosbridge
ros2 topic list | grep -E 'map|scan|imu|camera|person|cmd_vel'
```
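
If the port is open but the dashboard still won't connect, you can hand-check that the listener actually speaks WebSocket. A stdlib-only sketch that builds the HTTP Upgrade request (send it to port 9090 over a plain socket and expect an `HTTP/1.1 101` reply):

```python
import base64, os

def ws_handshake_request(host: str, port: int) -> str:
    """Build a minimal RFC 6455 client handshake for rosbridge's endpoint."""
    key = base64.b64encode(os.urandom(16)).decode()  # random nonce, per RFC 6455
    return (
        f"GET / HTTP/1.1\r\n"
        f"Host: {host}:{port}\r\n"
        "Upgrade: websocket\r\n"
        "Connection: Upgrade\r\n"
        f"Sec-WebSocket-Key: {key}\r\n"
        "Sec-WebSocket-Version: 13\r\n\r\n"
    )

req = ws_handshake_request("jetson.local", 9090)
```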

---

## Build & Run

```bash
# From inside the container (or the ros2_ws root on the host after sourcing ROS2):
cd /ros2_ws
colcon build --packages-select saltybot_bringup --symlink-install
source install/local_setup.bash

# Sensors only (verify data before SLAM):
ros2 launch saltybot_bringup sensors.launch.py

# Full SLAM stack (the saltybot-ros2 docker-compose service uses this):
ros2 launch saltybot_bringup slam.launch.py

# Individual drivers:
ros2 launch saltybot_bringup realsense.launch.py
ros2 launch saltybot_bringup rplidar.launch.py
```

---

## Topic Reference

### RPLIDAR A1M8

| Topic | Type | Rate | Notes |
|-------|------|------|-------|
| `/scan` | `sensor_msgs/LaserScan` | ~5.5 Hz | 360°, 1440 pts/scan, 12 m range |

```
angle_min: -π (-180°)
angle_max: +π (+180°)
range_min: 0.15 m
range_max: 12.00 m
scan_time: ~0.182 s
frame_id: laser
```

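The angular resolution follows directly from those parameters (whether a driver divides the span by N or N−1 rays is a convention detail; over a full revolution of 1440 rays the difference is negligible):

```python
import math

# Derive per-ray spacing and scan rate from the /scan parameters above.
num_points = 1440
angle_increment = 2 * math.pi / num_points   # rad between adjacent rays
scan_rate_hz = 1 / 0.182                     # from scan_time

print(math.degrees(angle_increment))  # 0.25 degrees per ray
print(round(scan_rate_hz, 1))         # 5.5 Hz, matching the table
```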
### Intel RealSense D435i

| Topic | Type | Rate | Notes |
|-------|------|------|-------|
| `/camera/color/image_raw` | `sensor_msgs/Image` | 15 Hz | RGB8, 640×480 |
| `/camera/color/camera_info` | `sensor_msgs/CameraInfo` | 15 Hz | Intrinsics |
| `/camera/depth/image_rect_raw` | `sensor_msgs/Image` | 15 Hz | Z16, 640×480 |
| `/camera/depth/camera_info` | `sensor_msgs/CameraInfo` | 15 Hz | Intrinsics |
| `/camera/aligned_depth_to_color/image_raw` | `sensor_msgs/Image` | 15 Hz | Depth in color frame |
| `/camera/imu` | `sensor_msgs/Imu` | ~200 Hz | Accel + gyro fused (linear_interpolation) |

### TF Tree

```
map
└── odom (published by slam_toolbox once moving)
    └── base_link (robot body frame)
        ├── laser (RPLIDAR A1M8) ← static, 10 cm above base
        └── camera_link (RealSense D435i) ← static, 5 cm fwd, 15 cm up
            ├── camera_depth_frame
            ├── camera_color_frame
            └── camera_imu_frame
```

**Note:** Static TF values in `sensors.launch.py` are placeholders. Update them with real measurements after the physical mount.

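To make those static offsets concrete: with the placeholder mounting values above, and assuming no rotation between the frames (the real launch file may set one), a lidar measurement maps into `base_link` by a pure translation:

```python
# Placeholder laser -> base_link offset from the tree above: 10 cm straight up.
# Assumes identity rotation; the real sensors.launch.py values may differ.
LASER_OFFSET = (0.0, 0.0, 0.10)  # metres, (x, y, z)

def laser_to_base(point):
    """Express a point measured in the laser frame in base_link coordinates."""
    return tuple(p + o for p, o in zip(point, LASER_OFFSET))

# An obstacle 1 m straight ahead of the lidar:
print(laser_to_base((1.0, 0.0, 0.0)))  # (1.0, 0.0, 0.1)
```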
---

## Verification Commands

```bash
# List all active topics
ros2 topic list

# Check data rates
ros2 topic hz /scan
ros2 topic hz /camera/color/image_raw
ros2 topic hz /camera/imu

# Spot-check data
ros2 topic echo /scan --once
ros2 topic echo /camera/imu --once

# TF tree
ros2 run tf2_tools view_frames
ros2 run tf2_ros tf2_echo base_link laser

# SLAM map topic (after slam.launch.py)
ros2 topic hz /map
ros2 topic echo /map --no-arr   # metadata without the raw data flood
```

---

## Troubleshooting

### RPLIDAR not found

```bash
# Check the udev symlink
ls -la /dev/rplidar

# If missing, fall back to:
ros2 launch saltybot_bringup rplidar.launch.py serial_port:=/dev/ttyUSB0

# Install the udev rule (on the host, not in the container):
sudo cp /path/to/jetson/docs/99-saltybot.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules && sudo udevadm trigger
```

### RealSense not detected

```bash
# Check the USB3 port (blue port on the Nano)
lsusb | grep "8086:0b3a"
# Expected: Bus 002 Device 00X: ID 8086:0b3a Intel Corp. Intel RealSense D435i

# Install the udev rules (on the host, not in the container; the rules file
# ships in the librealsense source tree under config/):
sudo cp librealsense/config/99-realsense-libusb.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules && sudo udevadm trigger
```

### IMU not publishing

```bash
# Verify enable_gyro and enable_accel are true
ros2 param get /camera/camera enable_gyro
ros2 param get /camera/camera enable_accel

# Check unite_imu_method (should be 2 = linear_interpolation)
ros2 param get /camera/camera unite_imu_method
```
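
For context on `unite_imu_method = 2`: the driver resamples the slower accel stream onto the gyro timestamps so each `/camera/imu` message carries both readings. A simplified one-axis sketch of that interpolation (sample values are made up):

```python
def interp_accel(t, samples):
    """Linearly interpolate accel readings (sorted (timestamp, value) pairs)
    at time t -- the idea behind unite_imu_method=2 (linear_interpolation)."""
    for (t0, a0), (t1, a1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return a0 + w * (a1 - a0)
    raise ValueError("t outside the buffered sample range")

# Accel arrives more slowly than the ~200 Hz gyro stream; estimate its value
# at a gyro timestamp halfway between two accel samples:
accel_samples = [(0.000, 9.8), (0.016, 10.2)]
print(interp_accel(0.008, accel_samples))
```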