[P1] Gesture recognition — hand/body gesture commands via RealSense #454

Closed
opened 2026-03-05 09:17:43 -05:00 by sl-jetson · 0 comments
Collaborator

Goal

Recognize hand and body gestures from RealSense RGB for non-verbal commands.

Gestures

  • Wave: greeting trigger (start encounter or acknowledge)
  • Palm up (stop): emergency stop
  • Beckoning: come here
  • Thumbs up: positive feedback / confirm
  • Pointing: look/go in direction
  • Arms crossed: back off / give space
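
Several of the hand gestures above can be distinguished with simple geometry on MediaPipe Hands' 21 normalized landmarks. A minimal sketch for two of them, assuming the standard MediaPipe landmark indexing (0 = wrist, 4 = thumb tip, 8/12/16/20 = fingertips, 6/10/14/18 = PIP joints) and image coordinates where y grows downward; the function name and "tip above PIP" heuristic are illustrative choices, not part of the issue:

```python
# Heuristic sketch: classify "palm up (stop)" vs "thumbs up" from one hand's
# MediaPipe landmarks. Indices follow the MediaPipe Hands convention.
WRIST, THUMB_TIP = 0, 4
FINGER_TIPS = (8, 12, 16, 20)   # index, middle, ring, pinky tips
FINGER_PIPS = (6, 10, 14, 18)   # matching PIP joints

def classify_hand(landmarks):
    """landmarks: list of 21 (x, y) pairs in normalized image coordinates."""
    wrist_y = landmarks[WRIST][1]
    # A finger counts as extended when its tip sits above its PIP joint
    # (smaller y = higher in the image).
    extended = [landmarks[t][1] < landmarks[p][1]
                for t, p in zip(FINGER_TIPS, FINGER_PIPS)]
    thumb_raised = landmarks[THUMB_TIP][1] < wrist_y

    if all(extended):
        return "palm_up"      # open raised hand: emergency stop
    if thumb_raised and not any(extended):
        return "thumbs_up"    # only the thumb raised: confirm
    return "unknown"
```

In practice these thresholds would be tuned per distance band (the 2–5 m requirement) and combined with Pose landmarks for the body gestures (arms crossed, pointing).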

Requirements

  • ROS2 node using MediaPipe Hands + Pose on the Orin (GPU-accelerated)
  • Publish /saltybot/gesture (Gesture.msg: gesture_type, confidence, hand_position)
  • Minimum confidence threshold 0.7 to prevent false positives
  • Integrate with voice command router as alternative input
  • Works at 2–5 m range from the RealSense
  • 10+ fps processing on Orin
  • Configurable: enable/disable specific gestures
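
The confidence floor and per-gesture enable flags above imply a small filtering layer between raw detections and the `/saltybot/gesture` publisher. A minimal sketch of that core logic, with the rclpy node and `Gesture.msg` publishing omitted so it stays self-contained; the class name, the frame-debounce approach, and `hold_frames` are assumptions, not from the issue:

```python
# Sketch of per-frame filtering before publishing /saltybot/gesture.
# Gesture.msg (per the requirements) would carry roughly:
#   string gesture_type, float32 confidence, geometry hand_position.
class GestureRouter:
    def __init__(self, min_confidence=0.7, enabled=None, hold_frames=3):
        self.min_confidence = min_confidence  # requirement: 0.7 floor
        self.enabled = set(enabled or ())     # configurable gesture set
        self.hold_frames = hold_frames        # assumed debounce: N stable frames
        self._last = None
        self._count = 0

    def update(self, gesture_type, confidence):
        """Feed one per-frame detection; return a gesture to publish, or None."""
        if gesture_type not in self.enabled or confidence < self.min_confidence:
            self._last, self._count = None, 0
            return None
        if gesture_type == self._last:
            self._count += 1
        else:
            self._last, self._count = gesture_type, 1
        if self._count == self.hold_frames:
            return gesture_type  # publish exactly once per stable gesture
        return None
```

The debounce keeps a momentary misclassification from triggering an action, which matters most for the emergency-stop gesture; at the required 10+ fps, three stable frames adds well under half a second of latency.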

Reference: seb/saltylab-firmware#454