feat: Issue #363 — P0 person tracking for follow-me mode #367

Merged
sl-jetson merged 1 commit from sl-perception/issue-363-person-tracking into main 2026-03-03 15:20:27 -05:00

Summary

Implements a real-time person detection + tracking pipeline for the follow-me motion controller on Jetson Orin Nano Super (Intel RealSense D435i).

New message type

  • TargetTrack.msg: bearing_deg, distance_m, confidence, bbox, vel_bearing_dps, vel_dist_mps, depth_quality (0=invalid…3=good)
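For reference, the new message's fields can be mirrored as a plain Python dataclass. This is a sketch only: the field types, defaults, and comments below are assumptions, and the actual `TargetTrack.msg` declares its own ROS2 interface types.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TargetTrackSketch:
    """Illustrative Python mirror of the TargetTrack.msg fields (types assumed)."""
    bearing_deg: float = 0.0      # horizontal angle to target, degrees
    distance_m: float = 0.0       # depth-derived range to target, metres
    confidence: float = 0.0       # detector confidence in [0, 1]
    bbox: List[int] = field(default_factory=lambda: [0, 0, 0, 0])  # x, y, w, h (px)
    vel_bearing_dps: float = 0.0  # bearing rate, degrees per second
    vel_dist_mps: float = 0.0     # range rate, metres per second
    depth_quality: int = 0        # 0=invalid … 3=good
```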

Core library — _person_tracker.py (pure Python, no ROS2/runtime deps)

  • 8-state Kalman filter [cx, cy, w, h, vcx, vcy, vw, vh] with constant-velocity model
  • Greedy IoU data association + two-stage re-ID for LOST tracks
  • HSV torso colour histogram (16H×8S, Bhattacharyya similarity); saturation correctly clamped to [0,1] (no epsilon overflow)
  • FollowTargetSelector: nearest-person auto-lock, configurable hold_frames hysteresis
  • TENTATIVE → ACTIVE after min_hits; LOST track removal after max_lost_frames with per-frame lost_age increment across all LOST tracks
  • bearing_from_pixel (atan2-based), depth_at_bbox (median depth with quality flag)
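The greedy IoU data-association step above can be sketched as follows. Function names and the 0.3 threshold are illustrative, not the library's actual API: all (track, detection) pairs are scored by IoU, then matched greedily from the highest score down.

```python
def iou(a, b):
    # Boxes as (x, y, w, h); returns intersection-over-union in [0, 1].
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    ix = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    iy = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = ix * iy
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def greedy_associate(tracks, detections, iou_min=0.3):
    """Match highest-IoU (track, detection) pairs first; each used at most once."""
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True,
    )
    matches, used_t, used_d = [], set(), set()
    for score, ti, di in pairs:
        if score < iou_min:
            break  # remaining pairs overlap too little to match
        if ti in used_t or di in used_d:
            continue
        matches.append((ti, di))
        used_t.add(ti)
        used_d.add(di)
    return matches
```

Greedy matching is O(T·D log(T·D)) and simpler than Hungarian assignment; for the handful of people typically in frame the difference in match quality is negligible.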

ROS2 node — person_tracking_node.py

  • YOLOv8n via ultralytics (TRT FP16 auto-exported on first run, ≥15 fps on Orin Nano)
  • HOG+SVM fallback (CPU, ~5–10 fps) when ultralytics unavailable
  • Subscribes: /camera/color/image_raw, /camera/depth/image_rect_raw, /camera/depth/camera_info, /saltybot/follow_start, /saltybot/follow_stop
  • Publishes: /saltybot/target_track at ≤30 fps (configurable max_fps)
  • Bearing velocity via d(bearing)/du = fx / (fx² + (u−cx)²) × (180/π)
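The bearing-velocity expression in the last bullet follows from the pinhole model: bearing = atan2(u − cx, fx), so differentiating with respect to the pixel column u gives d(bearing)/du = fx / (fx² + (u − cx)²), converted to degrees. A minimal sketch (helper names assumed, not the node's actual API):

```python
import math

def bearing_deg(u, cx, fx):
    # Horizontal bearing of pixel column u under the pinhole model, in degrees.
    return math.degrees(math.atan2(u - cx, fx))

def bearing_rate_dps(u, du_dt, cx, fx):
    # Chain rule: d(bearing)/dt = d(bearing)/du * du/dt, in degrees per second.
    dbearing_du = fx / (fx ** 2 + (u - cx) ** 2) * (180.0 / math.pi)
    return dbearing_du * du_dt
```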

Tests — test/test_person_tracker.py

  • 59/59 tests passing

Test plan

  • All 59 unit tests pass (python3 -m pytest test/test_person_tracker.py -q)
  • Deploy to Jetson Orin Nano Super, confirm YOLOv8n TRT FP16 ≥15 fps
  • Verify follow_start / follow_stop locking behaviour
  • Test re-ID after momentary occlusion (≥5 frames)
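The occlusion re-ID check in the last item exercises the HSV torso-histogram path described above. A minimal sketch of the two pieces it relies on, under assumed helper names: the clamped saturation computation (s = (cmax − cmin)/cmax, clipped to [0, 1]) and the Bhattacharyya similarity between two histograms.

```python
import math

def saturation(r, g, b):
    # HSV saturation with the clamp from the fix:
    # s = (cmax - cmin) / cmax, clipped to [0, 1] so rounding can't push it past 1.
    cmax = max(r, g, b)
    if cmax <= 0:
        return 0.0
    s = (cmax - min(r, g, b)) / cmax
    return min(max(s, 0.0), 1.0)

def bhattacharyya_similarity(h1, h2):
    # Bhattacharyya coefficient of two histograms: 1.0 = identical shape, 0.0 = disjoint.
    n1, n2 = sum(h1), sum(h2)
    if n1 <= 0 or n2 <= 0:
        return 0.0
    return sum(math.sqrt((a / n1) * (b / n2)) for a, b in zip(h1, h2))
```

A LOST track whose stored torso histogram scores high similarity against a new detection's histogram can be re-identified rather than spawning a fresh TENTATIVE track.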

Closes #363

🤖 Generated with Claude Code

sl-perception added 1 commit 2026-03-03 15:19:27 -05:00
Implements real-time person detection + tracking pipeline for the
follow-me motion controller on Jetson Orin Nano Super (D435i).

Core components
- TargetTrack.msg: bearing_deg, distance_m, confidence, bbox, vel_bearing_dps,
  vel_dist_mps, depth_quality (0-3)
- _person_tracker.py (pure-Python, no ROS2/runtime deps):
  · 8-state constant-velocity Kalman filter [cx,cy,w,h,vcx,vcy,vw,vh]
  · Greedy IoU data association
  · HSV torso colour histogram re-ID (16H×8S, Bhattacharyya similarity)
    with fixed saturation clamping (s = (cmax−cmin)/cmax, clipped to [0,1])
  · FollowTargetSelector: nearest person auto-lock, hold_frames hysteresis
  · TENTATIVE→ACTIVE after min_hits; LOST track removal after max_lost_frames
    with per-frame lost_age increment across all LOST tracks
  · bearing_from_pixel, depth_at_bbox (median, quality flags)
- person_tracking_node.py:
  · YOLOv8n via ultralytics (TRT FP16 on first run) → HOG+SVM fallback
  · Subscribes colour + depth + camera_info + follow_start/stop
  · Publishes /saltybot/target_track at ≤30 fps
- test/test_person_tracker.py: 59/59 tests passing

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
sl-jetson merged commit eac203ecf4 into main 2026-03-03 15:20:27 -05:00
Reference: seb/saltylab-firmware#367