# Issue #469: Terrain Classification Implementation Plan
## Context
SaltyBot currently has good sensor infrastructure (IMU, cameras, RealSense) and a robust velocity control system with the VelocityRamp class. However, it lacks terrain awareness for surface type detection and speed adaptation. This feature will enable:
- Surface detection via IMU vibration analysis and camera texture analysis
- Automatic speed adaptation based on terrain type and roughness
- Terrain logging for mapping and future learning
- Improved robot safety by reducing speed on rough/unstable terrain
## Architecture Overview
The implementation follows the existing ROS2 patterns:
```
IMU/Camera Data
      ↓
[terrain_classifier_node]          ← new node
      ↓
/saltybot/terrain_state (TerrainState.msg)
      ↓
[terrain_speed_adapter_node]       ← new node
      ↓
Adjusted /cmd_vel_terrain → existing cmd_vel_bridge
      ↓
Speed-adapted robot motion

Parallel: [terrain_mapper_node] logs data for mapping
```
## Implementation Components
### 1. Message Definition: TerrainState.msg
File to create: `jetson/ros2_ws/src/saltybot_social_msgs/msg/TerrainState.msg`

Fields:
- `std_msgs/Header header`: timestamp/frame_id
- `uint8 terrain_type`: enum (0=unknown, 1=pavement, 2=grass, 3=gravel, 4=sand, 5=indoor)
- `float32 roughness`: 0.0=smooth, 1.0=very rough
- `float32 confidence`: 0.0-1.0 classification confidence
- `float32 recommended_speed_ratio`: 0.1-1.0 (fraction of max speed)
- `string source`: "imu_vibration", "camera_texture", or "fused"
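Assembled from the fields above, the message file would look like this (a sketch in standard `.msg` syntax):

```
# TerrainState.msg — terrain classification state
std_msgs/Header header            # timestamp / frame_id
uint8 terrain_type                # 0=unknown, 1=pavement, 2=grass, 3=gravel, 4=sand, 5=indoor
float32 roughness                 # 0.0 = smooth, 1.0 = very rough
float32 confidence                # 0.0-1.0 classification confidence
float32 recommended_speed_ratio   # 0.1-1.0 fraction of max speed
string source                     # "imu_vibration", "camera_texture", or "fused"
```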
Update files:
- `jetson/ros2_ws/src/saltybot_social_msgs/CMakeLists.txt`: add TerrainState.msg to rosidl_generate_interfaces()
- `jetson/ros2_ws/src/saltybot_social_msgs/package.xml`: no changes needed (std_msgs is already a dependency)
### 2. Terrain Classifier Node
File to create: `jetson/ros2_ws/src/saltybot_bringup/saltybot_bringup/terrain_classifier_node.py`
Purpose: Analyze IMU and camera data to classify terrain type and estimate roughness.
Subscribes to:
- `/camera/imu` (sensor_msgs/Imu): RealSense IMU at 200 Hz
- `/camera/color/image_raw` (sensor_msgs/Image): camera RGB at 15 Hz
Publishes:
- `/saltybot/terrain_state` (TerrainState.msg): at 5 Hz
Key functions:
- `_analyze_imu_vibration()`: FFT analysis on accel data (window: 200 samples = 1 s)
  - Compute power spectral density in the 0-50 Hz band
  - Extract features: peak frequency, energy distribution, RMS acceleration
  - Roughness = normalized RMS of high-frequency components (>10 Hz)
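As a sketch of the `_analyze_imu_vibration()` roughness step (the 200-sample window, 200 Hz rate, and 10 Hz cutoff come from the plan; the `norm_rms` normalization constant is a tuning assumption):

```python
import numpy as np

def imu_roughness(accel_z: np.ndarray, sample_rate_hz: float = 200.0,
                  high_freq_cutoff_hz: float = 10.0,
                  norm_rms: float = 2.0) -> float:
    """Roughness in [0, 1] from the high-frequency content of one accel window.

    accel_z: 1-D array of vertical acceleration samples (e.g. 200 samples = 1 s).
    norm_rms: RMS value (m/s^2) that maps to roughness 1.0 — a tuning assumption.
    """
    # Remove the DC/gravity component before looking at vibration.
    accel = accel_z - np.mean(accel_z)
    # One-sided FFT and matching frequency bins.
    spectrum = np.fft.rfft(accel)
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / sample_rate_hz)
    # Power spectral density (up to a constant factor — fine for a ratio).
    psd = np.abs(spectrum) ** 2 / len(accel)
    # Energy above the cutoff → RMS of the high-frequency components (Parseval).
    high_band = psd[freqs > high_freq_cutoff_hz]
    rms_high = np.sqrt(2.0 * np.sum(high_band) / len(accel))
    # Normalize and clamp to [0, 1].
    return float(np.clip(rms_high / norm_rms, 0.0, 1.0))
```

A slow 2 Hz sway would score near 0 here, while a strong 30 Hz vibration scores well above the plan's 0.3 default threshold.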
- `_analyze_camera_texture()`: CNN-based texture analysis
  - Uses MobileNetV2 pre-trained on ImageNet as a feature extractor
  - Extracts high-level texture/surface features from the camera image
  - Lightweight model (~3.5M parameters, ~50-100 ms inference on Jetson)
  - Outputs a feature vector fed to the classification layer
- `_classify_terrain()`: decision logic
  - Simple rule-based classifier (can be upgraded to a CNN)
  - Input: [imu_roughness, camera_texture_variance, accel_magnitude]
  - Decision tree or thresholds to classify into the 5 known types
  - Output: terrain_type, roughness, confidence
Node Parameters:
- `imu_window_size` (int, default 200): samples per FFT window
- `publish_rate_hz` (float, default 5.0)
- `roughness_threshold` (float, default 0.3): FFT roughness threshold
- `terrain_timeout_s` (float, default 5.0): how long to keep the previous estimate if no new data arrives
### 3. Speed Adapter Node
File to create: `jetson/ros2_ws/src/saltybot_bringup/saltybot_bringup/terrain_speed_adapter_node.py`
Purpose: Scale `cmd_vel` speed based on the current terrain state; works alongside the existing velocity ramp.
Subscribes to:
- `/cmd_vel` (geometry_msgs/Twist): raw velocity commands
- `/saltybot/terrain_state` (TerrainState.msg): terrain classification
Publishes:
- `/cmd_vel_terrain` (geometry_msgs/Twist): terrain-adapted velocity
Logic:
- Extract the target linear velocity from `cmd_vel`
- Apply the terrain speed ratio: `adapted_speed = target_speed × recommended_speed_ratio`
- Preserve angular velocity (steering is not affected by terrain)
- Publish the adapted command
Node Parameters:
- `enable_terrain_adaptation` (bool, default true)
- `min_speed_ratio` (float, default 0.1): never go below 10% of the requested speed
- `debug_logging` (bool, default false)
Note: This is a lightweight adapter. The existing `velocity_ramp_node` handles acceleration/deceleration smoothing independently.
### 4. Terrain Mapper Node (Logging/Mapping)
File to create: `jetson/ros2_ws/src/saltybot_bringup/saltybot_bringup/terrain_mapper_node.py`
Purpose: Log terrain detections with robot pose for future mapping.
Subscribes to:
- `/saltybot/terrain_state` (TerrainState.msg)
- `/odom` (nav_msgs/Odometry): robot pose
Publishes:
- `/saltybot/terrain_log` (std_msgs/String): CSV-formatted log messages (optional, mainly for logging)
Logic:
- Store terrain observations: (timestamp, pose_x, pose_y, terrain_type, roughness, confidence)
- Log to file: `~/.ros/terrain_map_<timestamp>.csv`
- Resample to 1 Hz to avoid spam
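A sketch of the logging core with the 1 Hz resampling (the `TerrainLogger` class name is an assumption; the column order matches the observation tuple above):

```python
import csv
import io

CSV_FIELDS = ["timestamp", "pose_x", "pose_y", "terrain_type", "roughness", "confidence"]

class TerrainLogger:
    """Writes at most one CSV row per resample period to any open text stream."""

    def __init__(self, stream, resample_rate_hz: float = 1.0):
        self._writer = csv.writer(stream)
        self._writer.writerow(CSV_FIELDS)
        self._period = 1.0 / resample_rate_hz
        self._last_write = float("-inf")

    def observe(self, timestamp: float, pose_x: float, pose_y: float,
                terrain_type: int, roughness: float, confidence: float) -> bool:
        """Write the row if the resample period has elapsed; return True if written."""
        if timestamp - self._last_write < self._period:
            return False
        self._writer.writerow([timestamp, pose_x, pose_y,
                               terrain_type, roughness, confidence])
        self._last_write = timestamp
        return True
```

In the node this would wrap a file opened at `log_dir` with the timestamped name; an `io.StringIO` works for unit tests.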
Node Parameters:
- `log_dir` (string, default "~/.ros/")
- `resample_rate_hz` (float, default 1.0)
### 5. Launch Configuration
File to update: `jetson/ros2_ws/src/saltybot_bringup/launch/full_stack.launch.py`
Add terrain nodes:
```python
terrain_classifier_node = Node(
    package='saltybot_bringup',
    executable='terrain_classifier',
    name='terrain_classifier',
    parameters=[{
        'imu_window_size': 200,
        'publish_rate_hz': 5.0,
    }],
    remappings=[
        ('/imu_in', '/camera/imu'),
        ('/camera_in', '/camera/color/image_raw'),
    ],
)

terrain_speed_adapter_node = Node(
    package='saltybot_bringup',
    executable='terrain_speed_adapter',
    name='terrain_speed_adapter',
    parameters=[{
        'enable_terrain_adaptation': True,
        'min_speed_ratio': 0.1,
    }],
    remappings=[
        ('/cmd_vel_in', '/cmd_vel'),
        ('/cmd_vel_out', '/cmd_vel_terrain'),
    ],
)

terrain_mapper_node = Node(
    package='saltybot_bringup',
    executable='terrain_mapper',
    name='terrain_mapper',
)
```
Update `setup.py` entry points:

```python
'terrain_classifier = saltybot_bringup.terrain_classifier_node:main',
'terrain_speed_adapter = saltybot_bringup.terrain_speed_adapter_node:main',
'terrain_mapper = saltybot_bringup.terrain_mapper_node:main',
```
### 6. Integration with Existing Stack
- The existing velocity ramp (`velocity_ramp_node.py`) processes `/cmd_vel_smooth` or `/cmd_vel`
- Optionally, update `cmd_vel_bridge` to use `/cmd_vel_terrain` if available, else fall back to `/cmd_vel`
- Terrain classification runs independently at 5 Hz (much slower than velocity ramping at 50 Hz)
### 7. Future CNN Enhancement
The Version-1 implementation uses rule-based decision logic over IMU FFT features and pre-trained CNN texture features. A future enhancement could replace the rule-based stage with an end-to-end trained texture classifier by:
- Creating a `terrain_classifier_cnn.py` with a TensorFlow/ONNX model
- Replacing the decision logic in `terrain_classifier_node.py` with CNN inference
- Maintaining the same message interface
## Implementation Tasks
- ✅ Phase 1: Message Definition
  - Create `TerrainState.msg` in saltybot_social_msgs
  - Update `CMakeLists.txt`
- ✅ Phase 2: Terrain Classifier Node
  - Implement `terrain_classifier_node.py` with IMU FFT analysis
  - Implement camera texture analysis
  - Decision logic for classification
- ✅ Phase 3: Speed Adapter Node
  - Implement `terrain_speed_adapter_node.py`
  - Velocity command adaptation
- ✅ Phase 4: Terrain Mapper Node
  - Implement `terrain_mapper_node.py` for logging
- ✅ Phase 5: Integration
  - Update `full_stack.launch.py` with new nodes
  - Update `setup.py` with entry points
  - Test integration
## Testing & Verification
Unit Tests:
- Test IMU FFT feature extraction with synthetic vibration data
- Test terrain classification decision logic
- Test speed ratio application
- Test CSV logging format
Integration Tests:
- Run full stack with simulated IMU/camera data
- Verify terrain messages published at 5 Hz
- Verify cmd_vel_terrain adapts speeds correctly
- Check terrain log file is created and properly formatted
Manual Testing:
- Drive robot on different surfaces
- Verify terrain detection changes appropriately
- Verify speed adaptation is smooth (no jerks from ramping)
- Check terrain log CSV has correct format
## Critical Files Summary
To Create:
- `jetson/ros2_ws/src/saltybot_social_msgs/msg/TerrainState.msg`
- `jetson/ros2_ws/src/saltybot_bringup/saltybot_bringup/terrain_classifier_node.py`
- `jetson/ros2_ws/src/saltybot_bringup/saltybot_bringup/terrain_speed_adapter_node.py`
- `jetson/ros2_ws/src/saltybot_bringup/saltybot_bringup/terrain_mapper_node.py`
To Modify:
- `jetson/ros2_ws/src/saltybot_social_msgs/CMakeLists.txt` (add TerrainState.msg)
- `jetson/ros2_ws/src/saltybot_bringup/launch/full_stack.launch.py` (add nodes)
- `jetson/ros2_ws/src/saltybot_bringup/setup.py` (add entry points)
Key Dependencies:
- `numpy`: FFT analysis (already available in saltybot)
- `scipy.signal`: Butterworth filter (optional, for smoothing)
- `cv2` (OpenCV): image processing (already available)
- `tensorflow` or `tf-lite`: MobileNetV2 pre-trained model for the texture CNN
- `rclpy`: ROS2 Python client
## CNN Model Details
- Model: MobileNetV2 pre-trained on ImageNet
- Input: 224×224 RGB image (downsampled from camera)
- Output: 1280-dim feature vector from last conv layer before classification
- Strategy: Use pre-trained features directly (transfer learning) for quick MVP, no fine-tuning needed initially
- Alternative: Pre-trained weights can be fine-tuned on terrain image dataset in future iterations
- Inference: ~50-100ms on Jetson Xavier (acceptable at 5 Hz publish rate)
## Terrain Classification Logic (IMU + CNN Fusion)
Features extracted:
- `imu_roughness`: normalized RMS of high-frequency (>10 Hz) accel components (0-1)
  - Computed from the FFT power spectral density in the 10-50 Hz band
  - Reflects mechanical vibration from surface contact
- `cnn_texture_features`: 1280-dim feature vector from MobileNetV2
  - Pre-trained features capture texture, edge, and surface characteristics
  - Reduced to 2-3 principal components via PCA or simple aggregation
- `accel_magnitude`: RMS of total acceleration (m/s²)
  - Helps distinguish stationary (9.81 m/s²) from moving
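The 1280-dim → 2-3 component reduction can be sketched with a plain numpy PCA (using numpy's SVD rather than a scikit-learn dependency is a choice made here, not stated in the plan):

```python
import numpy as np

def pca_reduce(features: np.ndarray, n_components: int = 3) -> np.ndarray:
    """Project rows of `features` (n_samples × 1280) onto their top principal axes."""
    # Center each feature dimension before finding principal directions.
    centered = features - features.mean(axis=0)
    # SVD of the centered data: rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T
```

For a single live frame, the mean and `vt` would be fit once on a calibration batch and then reused, rather than recomputed per image.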
Classification approach (Version 1):
- Simple decision tree with IMU-dominant logic plus CNN support:

  ```
  if imu_roughness < 0.2 and accel_magnitude < 9.8:
      terrain = PAVEMENT   # confidence boosted if CNN agrees
  elif imu_roughness < 0.35 and cnn_grainy_score < 0.4:
      terrain = GRASS
  elif imu_roughness > 0.45 and cnn_granular_score > 0.5:
      terrain = GRAVEL
  elif cnn_sand_texture_score > 0.6 and imu_roughness > 0.3:
      terrain = SAND
  else:
      terrain = INDOOR
  ```

- Confidence: weighted combination of IMU and CNN agreement
- Roughness metric: 0.0 = smooth, 1.0 = very rough; derived from the IMU FFT energy ratio
Speed recommendations:
- Pavement: 1.0 (full speed)
- Grass: 0.8 (20% slower)
- Gravel: 0.5 (50% slower)
- Sand: 0.4 (60% slower)
- Indoor: 0.7 (30% slower by default)
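The table above as a lookup (the conservative fallback ratio for unknown terrain is an assumption not stated in the plan):

```python
# Terrain enum values mirror TerrainState.msg; ratios are from the table above.
PAVEMENT, GRASS, GRAVEL, SAND, INDOOR = 1, 2, 3, 4, 5

SPEED_RATIOS = {
    PAVEMENT: 1.0,
    GRASS: 0.8,
    GRAVEL: 0.5,
    SAND: 0.4,
    INDOOR: 0.7,
}

def recommended_speed_ratio(terrain_type: int, unknown_ratio: float = 0.5) -> float:
    """Look up the speed ratio; fall back conservatively for unknown terrain."""
    return SPEED_RATIOS.get(terrain_type, unknown_ratio)
```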
Future improvement: Replace the decision tree with a trained classifier (Random Forest, SVM, or a small dense net) once a labeled terrain dataset has been collected.
This plan follows SaltyBot's established patterns:
- Pure Python libraries for core logic (`_terrain_analysis.py`)
- ROS2 node wrappers for integration
- Parameter-based configuration in YAML
- Message-based pub/sub architecture
- Integration with existing velocity control pipeline