Classifies facial expressions into neutral/happy/surprised/angry/sad
using geometric rules over MediaPipe Face Mesh landmarks — no ML model
required at runtime.
Rules
-----
surprised: brow_raise > 0.12 AND eye_open > 0.07 AND mouth_open > 0.07
happy: smile > 0.025 (lip corners above lip midpoint)
angry: brow_furl > 0.02 AND smile < 0.01
sad: smile < -0.025 AND brow_furl < 0.015
neutral: default
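The rule cascade above can be sketched in pure Python. This is a hedged illustration: the thresholds and feature names come straight from the rules, but the real `classify_emotion` in `_face_emotion.py` may use a different signature, and the `Features` dataclass here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Features:
    """Normalized geometric features (illustrative container, not the real API)."""
    mouth_open: float
    smile: float
    brow_raise: float
    eye_open: float
    brow_furl: float

def classify_emotion(f: Features) -> str:
    # Rules are checked in order, so 'surprised' wins when several fire.
    if f.brow_raise > 0.12 and f.eye_open > 0.07 and f.mouth_open > 0.07:
        return "surprised"
    if f.smile > 0.025:
        return "happy"
    if f.brow_furl > 0.02 and f.smile < 0.01:
        return "angry"
    if f.smile < -0.025 and f.brow_furl < 0.015:
        return "sad"
    return "neutral"
```

Because all features are ratios of face height, the thresholds are scale-invariant: a face twice as large in the frame produces the same feature values.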
Changes
-------
- saltybot_scene_msgs/msg/FaceEmotion.msg — per-face emotion + features
- saltybot_scene_msgs/msg/FaceEmotionArray.msg
- saltybot_scene_msgs/CMakeLists.txt — register new msgs
- _face_emotion.py — pure-Python: FaceLandmarks, compute_features,
classify_emotion, detect_emotion, from_mediapipe
- face_emotion_node.py — subscribes /camera/color/image_raw,
publishes /saltybot/face_emotions (≤15 fps)
- test/test_face_emotion.py — 48 tests, all passing
- setup.py — add face_emotion entry point
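The ≤15 fps cap on `/saltybot/face_emotions` can be implemented with a simple publish throttle. A minimal sketch, assuming a time-based gate (the class name and exact mechanism in `face_emotion_node.py` may differ):

```python
class FpsThrottle:
    """Gate that admits at most max_fps events per second of wall time."""

    def __init__(self, max_fps: float):
        self.min_period = 1.0 / max_fps
        self.last = float("-inf")  # timestamp of the last admitted event

    def ready(self, now: float) -> bool:
        """Return True (and latch `now`) if min_period has elapsed; else False."""
        if now - self.last >= self.min_period:
            self.last = now
            return True
        return False
```

In the node's image callback, frames arriving faster than 15 fps are classified-and-dropped or skipped entirely; either way the publisher only fires when the gate opens.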
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
9 lines · 457 B · Plaintext
std_msgs/Header header
uint32 face_id       # track ID or 0-based detection index
string emotion       # 'neutral' | 'happy' | 'surprised' | 'angry' | 'sad'
float32 confidence   # 0.0–1.0
float32 mouth_open   # mouth height / face height (0=closed)
float32 smile        # lip-corner elevation (positive=smile, negative=frown)
float32 brow_raise   # inner-brow to eye-top gap / face height (positive=raised)
float32 eye_open     # eye height / face height
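The ratio features in the message can be sketched from a handful of 2-D landmark points (y grows downward, as in MediaPipe image coordinates). The point names below are illustrative placeholders; the real `compute_features` selects specific Face Mesh indices, and `brow_furl` (used by the rules but not published) is omitted here.

```python
def compute_features(pts: dict) -> dict:
    """Compute normalized face features from named (x, y) landmark points."""
    face_h = pts["chin"][1] - pts["forehead"][1]
    # Mouth opening: vertical lip gap as a fraction of face height.
    mouth_open = (pts["lower_lip"][1] - pts["upper_lip"][1]) / face_h
    # Smile: mouth corners above the lip midpoint (smaller y) give a positive value.
    lip_mid_y = (pts["upper_lip"][1] + pts["lower_lip"][1]) / 2
    corner_y = (pts["mouth_left"][1] + pts["mouth_right"][1]) / 2
    smile = (lip_mid_y - corner_y) / face_h
    # Brow raise: gap between inner brow and eye top, normalized.
    brow_raise = (pts["eye_top"][1] - pts["inner_brow"][1]) / face_h
    # Eye opening: vertical eyelid gap, normalized.
    eye_open = (pts["eye_bottom"][1] - pts["eye_top"][1]) / face_h
    return {"mouth_open": mouth_open, "smile": smile,
            "brow_raise": brow_raise, "eye_open": eye_open}
```

Dividing every feature by face height keeps the published values comparable across faces at different distances from the camera.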