sl-jetson ce6d5ee249
Some checks failed
social-bot integration tests / Lint (flake8 + pep257) (push) Failing after 9s
social-bot integration tests / Core integration tests (mock sensors, no GPU) (push) Has been skipped
social-bot integration tests / Lint (flake8 + pep257) (pull_request) Failing after 2s
social-bot integration tests / Core integration tests (mock sensors, no GPU) (pull_request) Has been skipped
social-bot integration tests / Latency profiling (GPU, Orin) (push) Has been cancelled
social-bot integration tests / Latency profiling (GPU, Orin) (pull_request) Has been cancelled
feat(social): multi-camera gesture recognition — MediaPipe Hands + Pose (Issue #140)
Delivers Issue #140 (P2): real-time gesture detection from 4 CSI cameras via
MediaPipe Hands and Pose, publishing classified gestures on /social/gestures.

New files:
- saltybot_social_msgs/msg/Gesture.msg + GestureArray.msg — ROS2 message types
- saltybot_social/gesture_classifier.py — pure-Python geometric classifier
  (stop_palm, thumbs_up/down, point, come_here, follow, wave, arms_up,
   arms_spread, crouch); WaveDetector temporal sliding-window oscillation tracker
- saltybot_social/gesture_node.py — ROS2 node; round-robin multi-camera
  _FrameBuffer, lazy MediaPipe init, person-ID correlation via PersonState
- saltybot_social/test/test_gesture_classifier.py — 70 unit tests, all passing
- saltybot_social/config/gesture_params.yaml — tuned defaults for Orin Nano
- saltybot_social/launch/gesture.launch.py — all params overridable at launch
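The commit describes WaveDetector as a "temporal sliding-window oscillation tracker". A minimal sketch of that idea — counting direction reversals of the wrist's x-coordinate inside a time window — is shown below. The class name matches the commit message, but the method names, thresholds, and window length here are illustrative assumptions; the tuned values live in `gesture_params.yaml`.

```python
import time
from collections import deque
from typing import Deque, Optional, Tuple


class WaveDetector:
    """Sketch of a sliding-window oscillation tracker for wave gestures.

    Consumes normalized wrist x-positions; a wave is declared when the
    x-coordinate reverses direction enough times within the window.
    (Hypothetical thresholds, not the committed implementation.)
    """

    def __init__(self, window_s: float = 1.5, min_reversals: int = 3,
                 min_amplitude: float = 0.03):
        self.window_s = window_s            # temporal window length (s)
        self.min_reversals = min_reversals  # direction changes needed for a wave
        self.min_amplitude = min_amplitude  # jitter floor in normalized coords
        self._samples: Deque[Tuple[float, float]] = deque()  # (t, x)

    def update(self, x: float, t: Optional[float] = None) -> bool:
        """Add a wrist x-position sample; return True if a wave is detected."""
        t = time.monotonic() if t is None else t
        self._samples.append((t, x))
        # Evict samples that have aged out of the temporal window.
        while self._samples and t - self._samples[0][0] > self.window_s:
            self._samples.popleft()
        return self._count_reversals() >= self.min_reversals

    def _count_reversals(self) -> int:
        reversals = 0
        direction = 0  # +1 moving right, -1 moving left, 0 unknown
        xs = [x for _, x in self._samples]
        for prev, cur in zip(xs, xs[1:]):
            delta = cur - prev
            if abs(delta) < self.min_amplitude:
                continue  # ignore sub-threshold jitter
            new_dir = 1 if delta > 0 else -1
            if direction and new_dir != direction:
                reversals += 1
            direction = new_dir
        return reversals
```

Fed at the node's target rate (default 10 Hz), an oscillating wrist trips the detector within a few hundred milliseconds, while a hand merely drifting in one direction never accumulates reversals.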

Modified:
- saltybot_social_msgs/CMakeLists.txt — register Gesture + GestureArray msgs
- saltybot_social/setup.py — add gesture_node console_scripts entry point
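The commit message says gesture_node round-robins frames from the four CSI cameras through a `_FrameBuffer`. One plausible shape for that structure — a latest-frame-per-camera slot with a round-robin pickup cursor, so a single MediaPipe pipeline is shared fairly and stale frames are dropped rather than queued — is sketched here. This is an assumption about the semantics, not the committed code.

```python
import threading
from typing import Any, List, Optional, Tuple


class FrameBuffer:
    """Sketch of a latest-frame-per-camera buffer with round-robin pickup.

    Each camera callback overwrites its own slot (old frames are dropped,
    never queued), and the processing loop takes one camera's frame at a
    time so no camera can starve the others.
    """

    def __init__(self, num_cameras: int):
        self._frames: List[Optional[Any]] = [None] * num_cameras
        self._lock = threading.Lock()
        self._next = 0  # round-robin cursor

    def put(self, camera_idx: int, frame: Any) -> None:
        """Store the newest frame for a camera, replacing any unprocessed one."""
        with self._lock:
            self._frames[camera_idx] = frame

    def take_next(self) -> Optional[Tuple[int, Any]]:
        """Return (camera_idx, frame) for the next camera with a fresh frame,
        advancing the cursor; None if every slot is empty."""
        with self._lock:
            n = len(self._frames)
            for step in range(n):
                idx = (self._next + step) % n
                if self._frames[idx] is not None:
                    frame = self._frames[idx]
                    self._frames[idx] = None    # consume the slot
                    self._next = (idx + 1) % n  # resume after this camera
                    return idx, frame
            return None
```

The overwrite-on-put design keeps latency bounded on the Orin Nano: if MediaPipe falls behind, the node always processes each camera's most recent frame instead of working through a backlog.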

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-02 10:10:54 -05:00

68 lines · 2.5 KiB · Python

"""gesture.launch.py — Launch gesture_node for multi-camera gesture recognition (Issue #140)."""
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.substitutions import LaunchConfiguration, PathJoinSubstitution
from launch_ros.actions import Node
from launch_ros.substitutions import FindPackageShare


def generate_launch_description() -> LaunchDescription:
    pkg = FindPackageShare("saltybot_social")
    params_file = PathJoinSubstitution([pkg, "config", "gesture_params.yaml"])
    return LaunchDescription([
        DeclareLaunchArgument(
            "params_file",
            default_value=params_file,
            description="Path to gesture_node parameter YAML",
        ),
        DeclareLaunchArgument(
            "process_fps",
            default_value="10.0",
            description="Target gesture detection rate (Hz)",
        ),
        DeclareLaunchArgument(
            "model_complexity",
            default_value="0",
            description="MediaPipe model complexity: 0=lite, 1=full, 2=heavy",
        ),
        DeclareLaunchArgument(
            "pose_enabled",
            default_value="true",
            description="Enable MediaPipe Pose body-gesture detection",
        ),
        DeclareLaunchArgument(
            "hands_enabled",
            default_value="true",
            description="Enable MediaPipe Hands hand-gesture detection",
        ),
        DeclareLaunchArgument(
            "enabled_gestures",
            default_value="",
            description="Comma-separated gesture allowlist; empty = all",
        ),
        DeclareLaunchArgument(
            "min_confidence",
            default_value="0.60",
            description="Minimum detection confidence threshold",
        ),
        Node(
            package="saltybot_social",
            executable="gesture_node",
            name="gesture_node",
            output="screen",
            parameters=[
                LaunchConfiguration("params_file"),
                {
                    "process_fps": LaunchConfiguration("process_fps"),
                    "model_complexity": LaunchConfiguration("model_complexity"),
                    "pose_enabled": LaunchConfiguration("pose_enabled"),
                    "hands_enabled": LaunchConfiguration("hands_enabled"),
                    "enabled_gestures": LaunchConfiguration("enabled_gestures"),
                    "min_confidence": LaunchConfiguration("min_confidence"),
                },
            ],
        ),
    ])
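One caveat with passing `LaunchConfiguration` substitutions straight into the `parameters` dict: how the substituted text is coerced into a parameter type (string vs. int vs. float) has varied across ROS 2 releases. If `model_complexity` or `process_fps` ever arrives at the node with the wrong type, `launch_ros` provides `ParameterValue` to pin the type explicitly. A hedged fragment illustrating the pattern (not part of the committed file):

```python
from launch.substitutions import LaunchConfiguration
from launch_ros.parameter_descriptions import ParameterValue

# Declare the expected type explicitly so the node never has to parse
# "0" or "10.0" out of a string parameter.
typed_params = {
    "process_fps": ParameterValue(
        LaunchConfiguration("process_fps"), value_type=float),
    "model_complexity": ParameterValue(
        LaunchConfiguration("model_complexity"), value_type=int),
}
```

Either way, every argument remains overridable from the command line, e.g. `ros2 launch saltybot_social gesture.launch.py process_fps:=15.0 model_complexity:=1`.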