[P1] Voice command interpreter — natural language to robot actions #409

Closed
opened 2026-03-04 15:47:24 -05:00 by sl-jetson · 0 comments
Collaborator

Goal

After wake word, interpret spoken commands and map to robot actions.

Core Commands

  • follow me → activate follow-me controller
  • stop / stay → halt all motion
  • come here → navigate to speaker
  • go home → return to dock
  • what's your battery → speak battery percentage via TTS
  • spin around → rotate 360°
  • quiet mode → reduce volume
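As a sketch, the command set above can be encoded as a phrase-to-intent table. The intent names below are placeholder assumptions (the issue does not define them); note that "stop" and "stay" alias the same halt intent:

```python
from enum import Enum, auto

class Intent(Enum):
    """Placeholder intent names; not defined in the issue."""
    FOLLOW_ME = auto()
    HALT = auto()
    COME_HERE = auto()
    GO_HOME = auto()
    BATTERY_QUERY = auto()
    SPIN = auto()
    QUIET_MODE = auto()

# Spoken phrase -> intent; "stop" and "stay" map to the same halt intent.
PHRASE_TO_INTENT = {
    "follow me": Intent.FOLLOW_ME,
    "stop": Intent.HALT,
    "stay": Intent.HALT,
    "come here": Intent.COME_HERE,
    "go home": Intent.GO_HOME,
    "whats your battery": Intent.BATTERY_QUERY,
    "spin around": Intent.SPIN,
    "quiet mode": Intent.QUIET_MODE,
}
```

Keying on the raw (apostrophe-free) STT transcript avoids normalization surprises, since speech-to-text output typically omits punctuation.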

Architecture

  • Subscribe to /saltybot/speech_text (STT after wake word)
  • Intent classification via keyword + fuzzy match (no cloud LLM)
  • Publish VoiceCommand.msg to /saltybot/voice_command
  • Command router dispatches to controllers
  • TTS confirmation via Piper
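A minimal sketch of the keyword + fuzzy-match classification step, using only the stdlib `difflib` so the pipeline stays fully offline (per the no-cloud-LLM constraint). The keyword table and intent names are assumptions mirroring the core commands above, and the 0.6 cutoff is an illustrative default, not a tuned value:

```python
import difflib
from typing import Optional

# Keyword table mirroring the core commands; intent names are
# placeholder assumptions, not defined in the issue.
KEYWORDS = {
    "follow me": "FOLLOW_ME",
    "stop": "HALT",
    "stay": "HALT",
    "come here": "COME_HERE",
    "go home": "GO_HOME",
    "whats your battery": "BATTERY_QUERY",
    "spin around": "SPIN",
    "quiet mode": "QUIET_MODE",
}

def classify(text: str, cutoff: float = 0.6) -> Optional[str]:
    """Map an STT transcript to an intent.

    Exact keyword hit first; otherwise a fuzzy fallback tolerates
    transcription noise (e.g. "go hom" -> GO_HOME). Returns None
    when nothing matches, so the router can stay silent.
    """
    text = text.lower().strip()
    if text in KEYWORDS:  # fast path: exact keyword hit
        return KEYWORDS[text]
    match = difflib.get_close_matches(text, KEYWORDS, n=1, cutoff=cutoff)
    return KEYWORDS[match[0]] if match else None
```

Returning `None` on no match lets the command router ignore unrecognized speech rather than guessing, which matters for safety-adjacent commands like halting motion.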
Reference: seb/saltylab-firmware#409