[P1] Android/Termux OpenClaw node — phone as SaltyBot sensor + AI compute node #420

Closed
opened 2026-03-04 22:59:14 -05:00 by sl-jetson · 0 comments
Collaborator

Goal

Run an OpenClaw ROS2 node on an Android phone (mounted on SaltyBot) via Termux, providing:

  • Phone camera as additional sensor (wide-angle, different perspective than RealSense)
  • Phone GPS as location source
  • Phone IMU as supplementary motion data
  • Local LLM inference via OpenClaw for conversational AI
  • Microphone array for directional audio

Architecture

  • Termux with termux-api package for sensor access
  • ROS2 bridge: rosbridge (WebSocket) to the Jetson Orin
  • OpenClaw node: local small LLM for conversation, intent understanding
  • Sensor publisher nodes: camera, GPS, IMU, mic
  • Phone connects to the Orin via Wi-Fi Direct or USB tethering
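The bridge link above can be exercised with plain JSON over a WebSocket, since rosbridge speaks a simple JSON protocol (`advertise` and `publish` operations). A minimal sketch of the message builders; the function names are hypothetical, and the endpoint `ws://saltylab-orin:9090` is the one named later in this issue:

```python
import json

def advertise_op(topic: str, msg_type: str) -> str:
    """Build a rosbridge 'advertise' operation declaring a topic and its type."""
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def publish_op(topic: str, msg: dict) -> str:
    """Build a rosbridge 'publish' operation carrying a ROS message as a dict."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})
```

In use, the phone-side node would open a WebSocket to `ws://saltylab-orin:9090` (e.g. with the `websockets` package in Termux's Python), send one `advertise_op` per topic at startup, then stream `publish_op` frames for each sensor reading.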

Requirements

  1. Termux bootstrap script: install Python, ROS2 deps, OpenClaw, termux-api
  2. Camera node: publish phone camera as /phone/camera/image_raw (CompressedImage)
  3. GPS node: publish /phone/gps (NavSatFix) via termux-api
  4. IMU node: publish /phone/imu (Imu) via termux-api sensors
  5. OpenClaw chat node: subscribe to /saltybot/speech_text, run local LLM, publish /saltybot/chat_response
  6. Bridge config: connect to Orin rosbridge at ws://saltylab-orin:9090
  7. Auto-start: Termux:Boot script to launch all nodes on phone boot
  8. Power management: monitor battery, reduce inference when low
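Requirement 3 can be sketched by shelling out to the `termux-location` CLI (part of termux-api) and reshaping its JSON into a sensor_msgs/NavSatFix-shaped dict suitable for rosbridge. The field mapping below is an assumption based on termux-location's JSON keys (`latitude`, `longitude`, `altitude`, `accuracy`); frame id, status codes, and the covariance derivation are illustrative choices, not settled design:

```python
import json
import subprocess
import time

def navsatfix_from_termux(fix: dict) -> dict:
    """Map termux-location JSON to a sensor_msgs/NavSatFix-style dict.

    Horizontal accuracy (meters) is squared into a diagonal covariance,
    marked COVARIANCE_TYPE_APPROXIMATED (= 1). Assumed mapping, not final.
    """
    var = fix.get("accuracy", 0.0) ** 2  # accuracy (m) -> variance (m^2)
    return {
        "header": {
            "frame_id": "phone_gps",  # hypothetical frame name
            "stamp": {"sec": int(time.time()), "nanosec": 0},
        },
        "status": {"status": 0, "service": 1},  # STATUS_FIX, SERVICE_GPS
        "latitude": fix["latitude"],
        "longitude": fix["longitude"],
        "altitude": fix.get("altitude", 0.0),
        "position_covariance": [var, 0, 0, 0, var, 0, 0, 0, var],
        "position_covariance_type": 1,
    }

def read_termux_location() -> dict:
    """Invoke the termux-api location CLI; blocks until a GPS fix returns."""
    out = subprocess.check_output(["termux-location", "-p", "gps"])
    return json.loads(out)
```

A publisher loop would call `read_termux_location()`, convert with `navsatfix_from_termux()`, and send the result on `/phone/gps` through the rosbridge WebSocket.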

Phone Hardware

  • TBD (likely a Pixel or Samsung model with a GPU strong enough for on-device inference)
  • USB-C connection to SaltyBot for power + data

Depends On

  • Voice command interpreter (#409 merged)
  • Speech pipeline on Orin
Reference: seb/saltylab-firmware#420