Compare commits

No commits in common. "27a95dce0adaf02415115e3587291262e89ecabc" and "1530104e22c44f834efe4c62553925111d205645" have entirely different histories.

@@ -1,198 +0,0 @@
# SaltyBot Diagnostic Self-Test System

Comprehensive hardware diagnostics and health monitoring for SaltyBot. Performs startup checks on critical hardware, runs continuous runtime monitoring, and publishes detailed diagnostic data.

## Features

### Startup Hardware Checks

Validates hardware availability and connectivity at boot:

- **RPLIDAR**: Serial port detection, rotation verification
- **RealSense D435i**: USB enumeration, stream availability
- **VESC Motor Controller**: UART connection, firmware status
- **Jabra Microphone**: USB audio device detection
- **STM32 Bridge**: Serial port verification, watchdog status
- **Servo Controller**: I2C bus communication
- **WiFi**: Network interface status
- **GPS Module**: Serial port and fix detection
- **Disk Space**: Storage availability
- **System RAM**: Memory availability
### Runtime Monitoring

Continuous health checks during operation:

- **Sensor FPS**: RealSense and RPLIDAR frame rates
- **Motor Stall**: Encoder latency detection
- **Temperature**: Orin GPU (>80°C warn, >85°C error), VESC (>60°C warn, >70°C error)
- **Network Latency**: Ping time monitoring
- **System Resources**: CPU and RAM usage trends
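The temperature rules above amount to a small threshold classifier; a minimal sketch, with the `classify` helper and threshold values mirroring the defaults above (the helper itself is illustrative, not part of the package API):

```python
def classify(temp_c: float, warn_c: float, error_c: float) -> str:
    """Map a temperature reading to a diagnostic status string."""
    if temp_c > error_c:
        return "ERROR"
    if temp_c > warn_c:
        return "WARN"
    return "OK"

# Orin GPU thresholds from the defaults above
print(classify(65.0, 80, 85))  # well below warn
print(classify(82.5, 80, 85))  # between warn and error
print(classify(86.0, 80, 85))  # above error
```

Note that the higher threshold must be checked first, otherwise a reading above `error_c` would be reported as only WARN.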
### Notifications & Feedback

- **TTS Announcements**: Boot result via text-to-speech
- **Face Animations**: Boot success/error display
- **Diagnostic Publishing**: `/saltybot/diagnostics` (DiagnosticArray)
- **JSON Logging**: Detailed logs to `/home/seb/saltybot-data/diagnostics/`
## Topics

### Published

- `/saltybot/diagnostics` (diagnostic_msgs/DiagnosticArray): System health status
- `/saltybot/tts_say` (std_msgs/String): Boot result announcement
- `/saltybot/face/boot_animation` (std_msgs/String): Boot animation trigger

### Subscribed

- Implicitly monitors system topics for FPS/latency data
## Configuration

Edit `config/diagnostic_checks.yaml`:

```yaml
startup_checks:
  enabled: true
  checks: [rplidar, realsense, vesc, jabra_microphone, ...]

runtime_monitoring:
  enabled: true
  frequency_hz: 1
  temperatures:
    jetson_gpu: {warn_c: 80, error_c: 85, critical_c: 90}
    vesc_motor: {warn_c: 60, error_c: 70, critical_c: 80}

logging:
  directory: /home/seb/saltybot-data/diagnostics
  retention_days: 30
```
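One way to consume this file is `yaml.safe_load` with `.get` fallbacks, so that missing keys degrade to defaults instead of raising; a hedged sketch, where the inline string stands in for the real file:

```python
import yaml

CONFIG_TEXT = """
runtime_monitoring:
  enabled: true
  frequency_hz: 1
  temperatures:
    jetson_gpu: {warn_c: 80, error_c: 85, critical_c: 90}
"""

config = yaml.safe_load(CONFIG_TEXT) or {}
monitoring = config.get("runtime_monitoring", {})
gpu_limits = monitoring.get("temperatures", {}).get("jetson_gpu", {})
print(gpu_limits.get("warn_c", 80))       # -> 80
print(monitoring.get("frequency_hz", 1))  # -> 1
```

The chained `.get(..., {})` calls keep the lookup safe even when an entire section is absent from the file.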
## Launch

```bash
# Default launch with startup checks + runtime monitoring
ros2 launch saltybot_diagnostics diagnostics.launch.py

# Startup checks only
ros2 launch saltybot_diagnostics diagnostics.launch.py \
    enable_runtime_monitoring:=false

# Custom config
ros2 launch saltybot_diagnostics diagnostics.launch.py \
    config_file:=/path/to/custom_checks.yaml
```
## Diagnostic Array Format

Published to `/saltybot/diagnostics`:

```yaml
diagnostic_msgs/DiagnosticArray:
  header:
    stamp: <timestamp>
  status:
    - name: "saltybot/rplidar"
      level: 0  # OK=0, WARN=1, ERROR=2, STALE=3
      message: "RPLIDAR detected on /dev/ttyUSB0"
      values:
        - key: "port"
          value: "/dev/ttyUSB0"
    - name: "saltybot/realsense"
      level: 2  # ERROR
      message: "RealSense not found on expected USB bus"
      values: []
    - name: "saltybot/gpu_temp"
      level: 1  # WARN
      message: "Runtime check"
      values:
        - key: "temperature_c"
          value: "82.5"
        - key: "threshold_warn"
          value: "80"
```
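A consumer can walk the `status` entries and collect failing components; a sketch over the dict shape shown above, where plain dicts stand in for the `diagnostic_msgs` objects and `failing_components` is an illustrative helper:

```python
ERROR = 2  # DiagnosticStatus ERROR level

def failing_components(diag_array: dict) -> list:
    """Return names of all statuses at ERROR level."""
    return [
        s["name"] for s in diag_array.get("status", [])
        if s.get("level", 0) == ERROR
    ]

sample = {
    "status": [
        {"name": "saltybot/rplidar", "level": 0},
        {"name": "saltybot/realsense", "level": 2},
        {"name": "saltybot/gpu_temp", "level": 1},
    ]
}
print(failing_components(sample))  # ['saltybot/realsense']
```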
## JSON Diagnostics Log Format

Files saved to `/home/seb/saltybot-data/diagnostics/diagnostics_YYYYMMDD_HHMMSS.json`:

```json
{
  "timestamp": "2025-03-05T10:00:00.123456",
  "check_type": "startup_checks",
  "hardware_checks": {
    "rplidar": {
      "status": "OK",
      "message": "RPLIDAR detected on /dev/ttyUSB0",
      "details": {"port": "/dev/ttyUSB0"}
    },
    "realsense": {
      "status": "ERROR",
      "message": "RealSense not found",
      "details": {}
    }
  },
  "runtime_metrics": {
    "gpu_temp": {
      "status": "OK",
      "temperature_c": 65.0,
      "threshold_warn": 80
    },
    "network_latency": {
      "status": "WARN",
      "latency_ms": 150
    }
  }
}
```
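Because the logs are plain JSON, a short script can summarize a boot record; a sketch assuming the schema above (`load_problems` is illustrative, not part of the package):

```python
import json

def load_problems(log_text: str) -> dict:
    """Return hardware checks whose status is not OK."""
    record = json.loads(log_text)
    checks = record.get("hardware_checks", {})
    return {name: c["status"] for name, c in checks.items() if c.get("status") != "OK"}

sample = json.dumps({
    "check_type": "startup_checks",
    "hardware_checks": {
        "rplidar": {"status": "OK"},
        "realsense": {"status": "ERROR"},
    },
})
print(load_problems(sample))  # {'realsense': 'ERROR'}
```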
## TTS Announcements

Boot result messages published to `/saltybot/tts_say`:

- Success: "Boot complete. All systems online."
- With errors: "Boot complete with errors. RPLIDAR, RealSense offline."
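The error announcement can be assembled from the names of the failed checks; a sketch mirroring the wording above (`boot_message` is an illustrative helper):

```python
def boot_message(errors: list) -> str:
    """Compose the TTS boot announcement from failed component names."""
    if not errors:
        return "Boot complete. All systems online."
    return f"Boot complete with errors. {', '.join(errors)} offline."

print(boot_message([]))                       # Boot complete. All systems online.
print(boot_message(["RPLIDAR", "RealSense"]))
```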
## Status Levels

- **OK** (0): System healthy, no action needed
- **WARN** (1): Minor issues, monitor closely
- **ERROR** (2): Critical failure, may affect operation
- **STALE** (3): No data, check unavailable
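When reading a `DiagnosticArray`, it is handy to turn the numeric level back into its label; a small sketch of that mapping (the helper name is illustrative):

```python
LEVEL_NAMES = {0: "OK", 1: "WARN", 2: "ERROR", 3: "STALE"}

def level_name(level: int) -> str:
    """Translate a DiagnosticStatus level into its label."""
    return LEVEL_NAMES.get(level, "UNKNOWN")

print(level_name(2))  # ERROR
```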
## Logs and Data

Diagnostic logs are stored in `/home/seb/saltybot-data/diagnostics/`:

- Auto-rotated at 100 MB or after 30 days
- JSON format for easy parsing
- Full boot record + runtime metrics
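Retention can be enforced by deleting logs older than the configured window; a hedged sketch based on file modification times (the `prune_logs` helper is illustrative, not the package's actual rotation code):

```python
import time
from pathlib import Path

def prune_logs(directory: str, retention_days: int = 30) -> int:
    """Delete diagnostics_*.json files older than retention_days; return count removed."""
    cutoff = time.time() - retention_days * 86400
    removed = 0
    for path in Path(directory).glob("diagnostics_*.json"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed += 1
    return removed
```

Run periodically (or at node startup), this keeps the log directory within the `retention_days` setting; size-based rotation would need a separate pass over file sizes.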
## Integration with Full Stack

Add to `full_stack.launch.py` at t=0s:

```python
IncludeLaunchDescription(
    PythonLaunchDescriptionSource([...diagnostics.launch.py]),
    launch_arguments={'enable_startup_check': 'true'}.items(),
)
```
## Debugging

Check current diagnostics:

```bash
ros2 topic echo /saltybot/diagnostics
```

View the latest diagnostic log:

```bash
tail -f "$(ls -t /home/seb/saltybot-data/diagnostics/diagnostics_*.json | head -1)" | jq
```

Run with all checks enabled (for testing):

```bash
ros2 launch saltybot_diagnostics diagnostics.launch.py \
    enable_startup_check:=true enable_runtime_monitoring:=true
```
## Hardware Requirements

- Jetson Orin (for temperature monitoring)
- Linux with psutil (for system resources)
- Standard ROS2 diagnostic_msgs package

@@ -1,83 +0,0 @@
# Diagnostic System Configuration

startup_checks:
  # Hardware checks performed at boot
  enabled: true
  timeout_s: 30  # Maximum time allowed for startup checks
  checks:
    - rplidar
    - realsense
    - vesc
    - jabra_microphone
    - stm32_bridge
    - servos
    - wifi
    - gps
    - disk_space
    - system_ram

  # Serial device mappings
  serial_devices:
    rplidar: /dev/ttyUSB0
    vesc: [/dev/ttyUSB1, /dev/ttyUSB2, /dev/ttyACM0]
    stm32_bridge: /dev/ttyUSB0
    gps: [/dev/ttyUSB3, /dev/ttyUSB4, /dev/ttyACM1]

  # Device thresholds
  disk_space:
    warn_percent: 80
    error_percent: 90
    critical_percent: 95

  system_ram:
    warn_percent: 80
    error_percent: 90
    critical_percent: 95

runtime_monitoring:
  # Continuous runtime health checks
  enabled: true
  frequency_hz: 1  # Check frequency

  # Temperature thresholds
  temperatures:
    jetson_gpu:
      warn_c: 80
      error_c: 85
      critical_c: 90
    vesc_motor:
      warn_c: 60
      error_c: 70
      critical_c: 80

  # Network monitoring
  network:
    ping_target: 8.8.8.8
    latency_warn_ms: 100
    latency_error_ms: 200

  # Sensor monitoring
  sensors:
    realsense_min_fps: 15
    rplidar_min_fps: 5
    motor_stall_timeout_s: 2

# TTS notifications
notifications:
  tts_boot_success: "Boot complete. All systems online."
  tts_boot_error: "Boot complete with errors. Check diagnostics."
  tts_on_critical: "System critical. {error_components} offline."

# Face animations
animations:
  boot_success: boot_success
  boot_error: boot_error
  critical_alert: critical_alert

# Logging
logging:
  directory: /home/seb/saltybot-data/diagnostics
  enable_json_logs: true
  enable_csv_logs: false
  retention_days: 30
  max_file_size_mb: 100

@@ -1,65 +0,0 @@
"""Launch diagnostic self-test node."""

import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    """Generate launch description for diagnostics."""
    package_dir = get_package_share_directory("saltybot_diagnostics")
    config_dir = os.path.join(package_dir, "config")

    # Launch arguments
    config_file_arg = DeclareLaunchArgument(
        "config_file",
        default_value=os.path.join(config_dir, "diagnostic_checks.yaml"),
        description="Path to diagnostic checks configuration YAML file",
    )

    enable_startup_arg = DeclareLaunchArgument(
        "enable_startup_check",
        default_value="true",
        description="Enable startup hardware checks",
    )

    enable_runtime_arg = DeclareLaunchArgument(
        "enable_runtime_monitoring",
        default_value="true",
        description="Enable continuous runtime monitoring",
    )

    log_dir_arg = DeclareLaunchArgument(
        "log_directory",
        default_value="/home/seb/saltybot-data/diagnostics",
        description="Directory for diagnostic logs",
    )

    # Diagnostics node
    diagnostics_node = Node(
        package="saltybot_diagnostics",
        executable="diagnostics_node",
        name="diagnostics",
        output="screen",
        parameters=[
            {
                "config_file": LaunchConfiguration("config_file"),
                "enable_startup_check": LaunchConfiguration("enable_startup_check"),
                "enable_runtime_monitoring": LaunchConfiguration("enable_runtime_monitoring"),
                "log_directory": LaunchConfiguration("log_directory"),
                "monitoring_frequency": 1.0,  # Hz
            }
        ],
    )

    return LaunchDescription([
        config_file_arg,
        enable_startup_arg,
        enable_runtime_arg,
        log_dir_arg,
        diagnostics_node,
    ])

@@ -1,32 +0,0 @@
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
  <name>saltybot_diagnostics</name>
  <version>0.1.0</version>
  <description>
    Comprehensive diagnostic self-test system for SaltyBot. Performs startup hardware
    checks (RPLIDAR, RealSense, VESC, Jabra, servos, WiFi, GPS, disk, RAM) and
    continuous monitoring (sensor FPS, motor stall, temperatures, network latency).
    Publishes /saltybot/diagnostics (DiagnosticArray), triggers the TTS boot result
    and face animation, and logs to /home/seb/saltybot-data/diagnostics/.
  </description>
  <maintainer email="sl-controls@saltylab.local">sl-controls</maintainer>
  <license>MIT</license>

  <depend>rclpy</depend>
  <depend>diagnostic_msgs</depend>
  <depend>std_msgs</depend>
  <depend>geometry_msgs</depend>
  <depend>sensor_msgs</depend>

  <buildtool_depend>ament_python</buildtool_depend>

  <test_depend>ament_copyright</test_depend>
  <test_depend>ament_flake8</test_depend>
  <test_depend>ament_pep257</test_depend>
  <test_depend>python3-pytest</test_depend>

  <export>
    <build_type>ament_python</build_type>
  </export>
</package>

@@ -1 +0,0 @@

# Diagnostics package

@@ -1,645 +0,0 @@
#!/usr/bin/env python3
"""Comprehensive diagnostic self-test for SaltyBot.

Performs startup hardware checks and continuous runtime monitoring.

Startup checks:
- RPLIDAR (serial connection, rotation)
- RealSense D435i (USB enumeration, streams)
- VESC motor controller (UART connection, firmware)
- Jabra microphone (USB audio device)
- STM32 bridge (serial port, watchdog)
- Servo controller (PWM channels)
- WiFi (network interface, connectivity)
- GPS module (serial port, fix)
- Disk space (/home/seb/saltybot-data)
- System RAM (available memory)

Runtime monitoring:
- Sensor FPS drops (RealSense, RPLIDAR)
- Motor stall detection (wheel encoder latency)
- Temperature thresholds (Orin GPU >80C, VESC >60C)
- Network latency (ping time)

Published topics:
    /saltybot/diagnostics (diagnostic_msgs/DiagnosticArray) - System diagnostics

Topics triggered:
    /saltybot/tts_say - TTS boot result announcement
    /saltybot/face/boot_animation - Face boot animation

Logging:
    /home/seb/saltybot-data/diagnostics/ - JSON diagnostic logs
"""
import json
import os
import subprocess
import threading
import time
from dataclasses import asdict, dataclass
from datetime import datetime
from enum import Enum
from pathlib import Path
from typing import Dict, List, Optional, Tuple

import psutil
import rclpy
import yaml
from diagnostic_msgs.msg import DiagnosticArray, DiagnosticStatus
from rclpy.node import Node
from rclpy.timer import Timer
from std_msgs.msg import String

class DiagnosticLevel(Enum):
    """Diagnostic severity levels matching ROS2 DiagnosticStatus."""

    OK = 0
    WARN = 1
    ERROR = 2
    STALE = 3


@dataclass
class HardwareCheck:
    """Result of a hardware check."""

    name: str
    status: str  # "OK", "WARN", "ERROR", "UNKNOWN"
    message: str
    details: Optional[Dict] = None

    def __post_init__(self):
        if self.details is None:
            self.details = {}

class DiagnosticsNode(Node):
    """ROS2 node for system diagnostics and self-test."""

    def __init__(self):
        super().__init__("diagnostics")

        # Parameters
        self.declare_parameter("enable_startup_check", True)
        self.declare_parameter("enable_runtime_monitoring", True)
        self.declare_parameter("monitoring_frequency", 1.0)  # Hz
        self.declare_parameter("log_directory", "/home/seb/saltybot-data/diagnostics")
        self.declare_parameter("config_file", "diagnostic_checks.yaml")

        self.enable_startup_check = self.get_parameter("enable_startup_check").value
        self.enable_runtime_monitoring = self.get_parameter("enable_runtime_monitoring").value
        self.monitoring_freq = self.get_parameter("monitoring_frequency").value
        self.log_dir = Path(self.get_parameter("log_directory").value)
        config_file = self.get_parameter("config_file").value

        # Create log directory
        self.log_dir.mkdir(parents=True, exist_ok=True)

        # Load configuration
        self.config = self._load_config(config_file)

        # State
        self.hardware_checks: Dict[str, HardwareCheck] = {}
        self.runtime_metrics: Dict[str, dict] = {}
        self.startup_complete = False
        self.startup_time = time.time()
        self.last_sensor_timestamps: Dict[str, float] = {}

        # Publishers
        self.pub_diagnostics = self.create_publisher(DiagnosticArray, "/saltybot/diagnostics", 1)
        self.pub_tts = self.create_publisher(String, "/saltybot/tts_say", 1)
        self.pub_face = self.create_publisher(String, "/saltybot/face/boot_animation", 1)

        # Run startup checks in background
        if self.enable_startup_check:
            check_thread = threading.Thread(target=self._run_startup_checks, daemon=True)
            check_thread.start()

        # Runtime monitoring timer
        if self.enable_runtime_monitoring:
            period = 1.0 / self.monitoring_freq
            self.timer: Timer = self.create_timer(period, self._runtime_check)

        self.get_logger().info(
            f"Diagnostics initialized. Startup checks: {self.enable_startup_check}, "
            f"Runtime monitoring: {self.enable_runtime_monitoring} @ {self.monitoring_freq}Hz"
        )
    def _load_config(self, config_file: str) -> dict:
        """Load diagnostic configuration from YAML."""
        try:
            if not Path(config_file).exists():
                share_dir = Path(__file__).parent.parent / "config"
                config_file = str(share_dir / config_file)

            with open(config_file, "r") as f:
                return yaml.safe_load(f) or {}
        except Exception as e:
            self.get_logger().warn(f"Failed to load config: {e}. Using defaults.")
            return {}
    def _run_startup_checks(self) -> None:
        """Run all startup hardware checks in the background."""
        try:
            self.get_logger().info("Starting hardware diagnostic checks...")

            # Perform checks
            self._check_rplidar()
            self._check_realsense()
            self._check_vesc()
            self._check_jabra()
            self._check_stm32_bridge()
            self._check_servos()
            self._check_wifi()
            self._check_gps()
            self._check_disk_space()
            self._check_ram()

            # Generate summary
            self._summarize_startup_checks()

            # Announce boot result via TTS
            self._announce_boot_result()

            # Trigger face boot animation
            self._trigger_face_animation()

            self.startup_complete = True
            self.get_logger().info("Startup checks complete")

            # Log results
            self._log_diagnostics("startup_checks")

        except Exception as e:
            self.get_logger().error(f"Startup checks failed: {e}")
    def _check_rplidar(self) -> None:
        """Check RPLIDAR connection and operation."""
        check = HardwareCheck("RPLIDAR", "UNKNOWN", "No check performed")

        try:
            # Check if /dev/ttyUSB0 exists (typical RPLIDAR port)
            if Path("/dev/ttyUSB0").exists():
                # Rotation itself is verified downstream by the scan subscriber
                check.status = "OK"
                check.message = "RPLIDAR detected on /dev/ttyUSB0"
                check.details = {"port": "/dev/ttyUSB0"}
            else:
                check.status = "ERROR"
                check.message = "RPLIDAR not found on /dev/ttyUSB0"
                check.details = {"expected_port": "/dev/ttyUSB0"}
        except Exception as e:
            check.status = "ERROR"
            check.message = f"RPLIDAR check failed: {e}"

        self.hardware_checks["rplidar"] = check
    def _check_realsense(self) -> None:
        """Check RealSense D435i camera."""
        check = HardwareCheck("RealSense D435i", "UNKNOWN", "No check performed")

        try:
            # Check for RealSense USB device
            result = subprocess.run(
                ["lsusb"], capture_output=True, text=True, timeout=5
            )
            if "RealSense" in result.stdout or "Intel" in result.stdout:
                check.status = "OK"
                check.message = "RealSense D435i detected via USB"
                check.details = {"device": "Intel RealSense"}
            else:
                check.status = "WARN"
                check.message = "RealSense D435i not detected via lsusb"
        except subprocess.TimeoutExpired:
            check.status = "WARN"
            check.message = "lsusb check timed out"
        except Exception as e:
            check.status = "ERROR"
            check.message = f"RealSense check failed: {e}"

        self.hardware_checks["realsense"] = check
    def _check_vesc(self) -> None:
        """Check VESC motor controller."""
        check = HardwareCheck("VESC Motor Controller", "UNKNOWN", "No check performed")

        try:
            # Check for VESC serial port (typically /dev/ttyUSB1)
            vesc_ports = ["/dev/ttyUSB1", "/dev/ttyUSB2", "/dev/ttyACM0"]
            found_port = None

            for port in vesc_ports:
                if Path(port).exists():
                    found_port = port
                    break

            if found_port:
                check.status = "OK"
                check.message = f"VESC detected on {found_port}"
                check.details = {"port": found_port}
            else:
                check.status = "ERROR"
                check.message = "VESC not found on expected ports"
                check.details = {"checked_ports": vesc_ports}
        except Exception as e:
            check.status = "ERROR"
            check.message = f"VESC check failed: {e}"

        self.hardware_checks["vesc"] = check
    def _check_jabra(self) -> None:
        """Check Jabra microphone."""
        check = HardwareCheck("Jabra Microphone", "UNKNOWN", "No check performed")

        try:
            result = subprocess.run(
                ["arecord", "-l"], capture_output=True, text=True, timeout=5
            )
            if "jabra" in result.stdout.lower():
                check.status = "OK"
                check.message = "Jabra microphone detected"
                check.details = {"device": "Jabra"}
            else:
                check.status = "WARN"
                check.message = "Jabra microphone not detected in arecord list"
        except FileNotFoundError:
            check.status = "WARN"
            check.message = "arecord not available for audio check"
        except Exception as e:
            check.status = "WARN"
            check.message = f"Jabra check failed: {e}"

        self.hardware_checks["jabra"] = check
    def _check_stm32_bridge(self) -> None:
        """Check STM32 bridge connection."""
        check = HardwareCheck("STM32 Bridge", "UNKNOWN", "No check performed")

        try:
            # Check serial port exists
            stm32_port = "/dev/ttyUSB0"  # May vary
            if Path(stm32_port).exists():
                check.status = "OK"
                check.message = f"STM32 bridge detected on {stm32_port}"
                check.details = {"port": stm32_port}
            else:
                check.status = "WARN"
                check.message = "STM32 bridge serial port not found"
        except Exception as e:
            check.status = "ERROR"
            check.message = f"STM32 check failed: {e}"

        self.hardware_checks["stm32_bridge"] = check
    def _check_servos(self) -> None:
        """Check servo controller."""
        check = HardwareCheck("Servo Controller", "UNKNOWN", "No check performed")

        try:
            # Check for I2C servo controller
            result = subprocess.run(
                ["i2cdetect", "-y", "1"], capture_output=True, text=True, timeout=5
            )
            if result.returncode == 0:
                check.status = "OK"
                check.message = "I2C bus responsive (servo controller likely present)"
                check.details = {"bus": "I2C-1"}
            else:
                check.status = "WARN"
                check.message = "I2C bus check failed"
        except FileNotFoundError:
            check.status = "WARN"
            check.message = "i2cdetect not available"
        except Exception as e:
            check.status = "WARN"
            check.message = f"Servo check failed: {e}"

        self.hardware_checks["servos"] = check
    def _check_wifi(self) -> None:
        """Check WiFi connectivity."""
        check = HardwareCheck("WiFi", "UNKNOWN", "No check performed")

        try:
            result = subprocess.run(
                ["iwconfig"], capture_output=True, text=True, timeout=5
            )
            if "ESSID" in result.stdout and "Frequency" in result.stdout:
                check.status = "OK"
                check.message = "WiFi interface active"
                check.details = {"status": "connected"}
            else:
                check.status = "WARN"
                check.message = "WiFi interface not connected"
        except Exception as e:
            check.status = "WARN"
            check.message = f"WiFi check failed: {e}"

        self.hardware_checks["wifi"] = check
    def _check_gps(self) -> None:
        """Check GPS module."""
        check = HardwareCheck("GPS Module", "UNKNOWN", "No check performed")

        try:
            # Probe the common GPS serial ports directly rather than globbing
            gps_device = None
            for port in ["/dev/ttyUSB3", "/dev/ttyUSB4", "/dev/ttyACM1"]:
                if Path(port).exists():
                    gps_device = port
                    break

            if gps_device:
                check.status = "OK"
                check.message = f"GPS device detected on {gps_device}"
                check.details = {"port": gps_device}
            else:
                check.status = "WARN"
                check.message = "GPS module not detected"
        except Exception as e:
            check.status = "WARN"
            check.message = f"GPS check failed: {e}"

        self.hardware_checks["gps"] = check
    def _check_disk_space(self) -> None:
        """Check disk space in the data directory."""
        check = HardwareCheck("Disk Space", "UNKNOWN", "No check performed")

        try:
            disk_usage = psutil.disk_usage(str(self.log_dir.parent))
            percent_used = disk_usage.percent
            free_gb = disk_usage.free / (1024**3)

            if percent_used > 90:
                check.status = "ERROR"
                check.message = f"Disk full: {percent_used:.1f}% used"
            elif percent_used > 80:
                check.status = "WARN"
                check.message = f"Disk usage high: {percent_used:.1f}% used"
            else:
                check.status = "OK"
                check.message = f"Disk OK: {free_gb:.2f} GB free"

            check.details = {
                "percent_used": percent_used,
                "free_gb": free_gb,
                "total_gb": disk_usage.total / (1024**3),
            }
        except Exception as e:
            check.status = "WARN"
            check.message = f"Disk check failed: {e}"

        self.hardware_checks["disk_space"] = check
    def _check_ram(self) -> None:
        """Check available system RAM."""
        check = HardwareCheck("System RAM", "UNKNOWN", "No check performed")

        try:
            memory = psutil.virtual_memory()
            percent_used = memory.percent
            available_gb = memory.available / (1024**3)

            if percent_used > 90:
                check.status = "ERROR"
                check.message = f"RAM critical: {percent_used:.1f}% used"
            elif percent_used > 80:
                check.status = "WARN"
                check.message = f"RAM high: {percent_used:.1f}% used"
            else:
                check.status = "OK"
                check.message = f"RAM OK: {available_gb:.2f} GB available"

            check.details = {
                "percent_used": percent_used,
                "available_gb": available_gb,
                "total_gb": memory.total / (1024**3),
            }
        except Exception as e:
            check.status = "WARN"
            check.message = f"RAM check failed: {e}"

        self.hardware_checks["ram"] = check
    def _summarize_startup_checks(self) -> None:
        """Generate a summary of startup checks."""
        errors = [name for name, check in self.hardware_checks.items() if check.status == "ERROR"]
        warnings = [name for name, check in self.hardware_checks.items() if check.status == "WARN"]

        summary = f"Startup checks: {len(self.hardware_checks)} items checked"
        if errors:
            summary += f", {len(errors)} errors: {', '.join(errors)}"
        if warnings:
            summary += f", {len(warnings)} warnings: {', '.join(warnings)}"

        self.get_logger().info(summary)
    def _announce_boot_result(self) -> None:
        """Announce boot result via TTS."""
        errors = [name for name, check in self.hardware_checks.items() if check.status == "ERROR"]

        if not errors:
            message = "Boot complete. All systems online."
        else:
            message = f"Boot complete with errors. {', '.join(errors)} offline."

        # Publish TTS message
        try:
            self.pub_tts.publish(String(data=message))
        except Exception as e:
            self.get_logger().warn(f"Failed to publish TTS: {e}")
    def _trigger_face_animation(self) -> None:
        """Trigger face boot animation."""
        errors = [name for name, check in self.hardware_checks.items() if check.status == "ERROR"]

        animation_type = "boot_error" if errors else "boot_success"

        try:
            self.pub_face.publish(String(data=animation_type))
        except Exception as e:
            self.get_logger().warn(f"Failed to trigger face animation: {e}")
def _runtime_check(self) -> None:
|
|
||||||
"""Perform runtime health monitoring."""
|
|
||||||
if not self.startup_complete:
|
|
||||||
return # Wait for startup checks
|
|
||||||
|
|
||||||
# Check CPU temperature (Orin GPU)
|
|
||||||
self._check_gpu_temperature()
|
|
||||||
|
|
||||||
# Check network latency
|
|
||||||
self._check_network_latency()
|
|
||||||
|
|
||||||
# Publish diagnostic array
|
|
||||||
self._publish_diagnostics()
|
|
||||||
|
|
||||||
def _check_gpu_temperature(self) -> None:
|
|
||||||
"""Check Jetson Orin GPU temperature."""
|
|
||||||
try:
|
|
||||||
result = subprocess.run(
|
|
||||||
["cat", "/sys/devices/virtual/thermal/thermal_zone0/temp"],
|
|
||||||
capture_output=True,
|
|
||||||
text=True,
|
|
||||||
timeout=2,
|
|
||||||
)
|
|
||||||
temp_c = int(result.stdout.strip()) / 1000.0
|
|
||||||
|
|
||||||
            if temp_c > 85:
                status = "ERROR"
            elif temp_c > 80:
                status = "WARN"
            else:
                status = "OK"

            self.runtime_metrics["gpu_temp"] = {
                "status": status,
                "temperature_c": temp_c,
                "threshold_warn": 80,
                "threshold_error": 85,
            }
        except Exception as e:
            self.runtime_metrics["gpu_temp"] = {
                "status": "UNKNOWN",
                "error": str(e),
            }

    def _check_network_latency(self) -> None:
        """Check network latency (ping to 8.8.8.8)."""
        try:
            result = subprocess.run(
                ["ping", "-c", "1", "-W", "1", "8.8.8.8"],
                capture_output=True,
                text=True,
                timeout=3,
            )

            if result.returncode == 0:
                # Parse latency from output
                for line in result.stdout.split("\n"):
                    if "time=" in line:
                        parts = line.split("time=")[-1].split(" ")
                        latency_ms = float(parts[0])

                        if latency_ms > 200:
                            status = "ERROR"
                        elif latency_ms > 100:
                            status = "WARN"
                        else:
                            status = "OK"

                        self.runtime_metrics["network_latency"] = {
                            "status": status,
                            "latency_ms": latency_ms,
                        }
                        break
            else:
                self.runtime_metrics["network_latency"] = {
                    "status": "WARN",
                    "message": "No network connectivity",
                }
        except Exception as e:
            self.runtime_metrics["network_latency"] = {
                "status": "UNKNOWN",
                "error": str(e),
            }

    def _publish_diagnostics(self) -> None:
        """Publish diagnostic array."""
        array = DiagnosticArray()
        array.header.stamp = self.get_clock().now().to_msg()

        # Add startup checks
        for name, check in self.hardware_checks.items():
            status = DiagnosticStatus()
            status.name = f"saltybot/{name}"
            status.level = self._get_diagnostic_level(check.status)
            status.message = check.message

            if check.details:
                for key, value in check.details.items():
                    status.values.append(
                        self._create_key_value(key, str(value))
                    )

            array.status.append(status)

        # Add runtime metrics
        for metric_name, metric_data in self.runtime_metrics.items():
            status = DiagnosticStatus()
            status.name = f"saltybot/{metric_name}"
            status.level = self._get_diagnostic_level(metric_data.get("status", "UNKNOWN"))
            status.message = metric_data.get("message", "Runtime check")

            for key, value in metric_data.items():
                if key != "status" and key != "message":
                    status.values.append(
                        self._create_key_value(key, str(value))
                    )

            array.status.append(status)

        self.pub_diagnostics.publish(array)

    def _get_diagnostic_level(self, status: str) -> int:
        """Convert status string to DiagnosticStatus level."""
        mapping = {
            "OK": DiagnosticStatus.OK,
            "WARN": DiagnosticStatus.WARN,
            "ERROR": DiagnosticStatus.ERROR,
            "UNKNOWN": DiagnosticStatus.STALE,
        }
        return mapping.get(status, DiagnosticStatus.STALE)

    def _create_key_value(self, key: str, value: str):
        """Create a KeyValue for diagnostic status."""
        from diagnostic_msgs.msg import KeyValue
        kv = KeyValue()
        kv.key = key
        kv.value = value
        return kv

    def _log_diagnostics(self, check_type: str) -> None:
        """Log diagnostics to JSON file."""
        try:
            log_data = {
                "timestamp": datetime.now().isoformat(),
                "check_type": check_type,
                "hardware_checks": {
                    name: {
                        "status": check.status,
                        "message": check.message,
                        "details": check.details or {},
                    }
                    for name, check in self.hardware_checks.items()
                },
                "runtime_metrics": self.runtime_metrics,
            }

            filename = self.log_dir / f"diagnostics_{datetime.now().strftime('%Y%m%d_%H%M%S')}.json"
            with open(filename, "w") as f:
                json.dump(log_data, f, indent=2)

            self.get_logger().info(f"Diagnostics logged to {filename}")
        except Exception as e:
            self.get_logger().error(f"Failed to log diagnostics: {e}")


def main(args=None):
    rclpy.init(args=args)
    node = DiagnosticsNode()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
@ -1,2 +0,0 @@
[develop]
script-dir=$base/lib/saltybot_diagnostics
@ -1,30 +0,0 @@
from setuptools import setup

package_name = "saltybot_diagnostics"

setup(
    name=package_name,
    version="0.1.0",
    packages=[package_name],
    data_files=[
        ("share/ament_index/resource_index/packages", [f"resource/{package_name}"]),
        (f"share/{package_name}", ["package.xml"]),
        (f"share/{package_name}/launch", ["launch/diagnostics.launch.py"]),
        (f"share/{package_name}/config", ["config/diagnostic_checks.yaml"]),
    ],
    install_requires=["setuptools", "pyyaml", "psutil"],
    zip_safe=True,
    maintainer="sl-controls",
    maintainer_email="sl-controls@saltylab.local",
    description=(
        "Hardware diagnostic self-test: startup checks + continuous monitoring, "
        "telemetry logging, TTS reporting, face alerts"
    ),
    license="MIT",
    tests_require=["pytest"],
    entry_points={
        "console_scripts": [
            "diagnostics_node = saltybot_diagnostics.diagnostics_node:main",
        ],
    },
)
@ -1 +0,0 @@
# Test module
@ -1,99 +0,0 @@
"""Unit tests for diagnostics system."""

import unittest
import json
from datetime import datetime


class TestDiagnostics(unittest.TestCase):
    """Test cases for diagnostics node."""

    def test_hardware_check_creation(self):
        """Test creation of hardware check results."""
        checks = {
            "rplidar": {"status": "OK", "message": "RPLIDAR detected"},
            "realsense": {"status": "ERROR", "message": "RealSense not found"},
            "vesc": {"status": "WARN", "message": "VESC connection uncertain"},
        }

        self.assertEqual(len(checks), 3)
        self.assertEqual(checks["rplidar"]["status"], "OK")
        self.assertEqual(checks["realsense"]["status"], "ERROR")

    def test_diagnostic_json_logging(self):
        """Test JSON logging of diagnostics."""
        log_data = {
            "timestamp": datetime.now().isoformat(),
            "check_type": "startup_checks",
            "hardware_checks": {
                "rplidar": {
                    "status": "OK",
                    "message": "Device OK",
                    "details": {"port": "/dev/ttyUSB0"},
                },
                "realsense": {
                    "status": "ERROR",
                    "message": "Device not found",
                    "details": {},
                },
            },
            "runtime_metrics": {
                "gpu_temp": {"status": "OK", "temperature_c": 65.0},
                "network_latency": {"status": "WARN", "latency_ms": 150},
            },
        }

        # Should be JSON serializable
        json_str = json.dumps(log_data)
        parsed = json.loads(json_str)

        self.assertIn("timestamp", parsed)
        self.assertEqual(len(parsed["hardware_checks"]), 2)
        self.assertEqual(parsed["hardware_checks"]["rplidar"]["status"], "OK")

    def test_temperature_threshold_detection(self):
        """Test temperature threshold detection."""
        thresholds = {
            "gpu_temp": {"warn": 80, "error": 85},
            "vesc_temp": {"warn": 60, "error": 70},
        }

        test_temps = [
            (65, "OK"),
            (82, "WARN"),
            (88, "ERROR"),
        ]

        for temp, expected_status in test_temps:
            if temp < thresholds["gpu_temp"]["warn"]:
                status = "OK"
            elif temp < thresholds["gpu_temp"]["error"]:
                status = "WARN"
            else:
                status = "ERROR"

            self.assertEqual(status, expected_status)

    def test_diagnostic_aggregation(self):
        """Test aggregation of multiple diagnostics."""
        hardware_checks = {
            "rplidar": "OK",
            "realsense": "OK",
            "vesc": "ERROR",
            "wifi": "OK",
            "gps": "WARN",
        }

        errors = [name for name, status in hardware_checks.items() if status == "ERROR"]
        warnings = [name for name, status in hardware_checks.items() if status == "WARN"]
        ok_items = [name for name, status in hardware_checks.items() if status == "OK"]

        self.assertEqual(len(errors), 1)
        self.assertEqual(len(warnings), 1)
        self.assertEqual(len(ok_items), 3)
        self.assertIn("vesc", errors)
        self.assertIn("gps", warnings)


if __name__ == "__main__":
    unittest.main()
@ -1,7 +0,0 @@
build/
install/
log/
*.egg-info/
__pycache__/
*.pyc
.pytest_cache/
@ -1,21 +0,0 @@
cmake_minimum_required(VERSION 3.8)
project(saltybot_social_memory)

if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

find_package(ament_cmake REQUIRED)
find_package(rosidl_default_generators REQUIRED)
find_package(std_msgs REQUIRED)
find_package(builtin_interfaces REQUIRED)

rosidl_generate_interfaces(${PROJECT_NAME}
  "srv/PersonLookup.srv"
  "srv/PersonUpdate.srv"
  "srv/EncounterRecord.srv"
  "srv/DatabaseStats.srv"
  DEPENDENCIES std_msgs builtin_interfaces
)

ament_package()
@ -1,30 +0,0 @@
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
  <name>saltybot_social_memory</name>
  <version>0.1.0</version>
  <description>
    Persistent social memory database for SaltyBot.
    SQLite-backed storage of person profiles, relationship tiers, encounter history, and interaction statistics.
    ROS2 services for lookup, update, and encounter recording with automatic tier promotion (Issue #443).
  </description>
  <maintainer email="seb@vayrette.com">seb</maintainer>
  <license>MIT</license>

  <depend>rclpy</depend>
  <depend>std_msgs</depend>
  <depend>rosidl_default_generators</depend>
  <depend>builtin_interfaces</depend>
  <depend>ament_index_python</depend>

  <exec_depend>rosidl_default_runtime</exec_depend>

  <test_depend>ament_copyright</test_depend>
  <test_depend>ament_flake8</test_depend>
  <test_depend>ament_pep257</test_depend>
  <test_depend>python3-pytest</test_depend>

  <export>
    <build_type>ament_cmake</build_type>
  </export>
</package>
@ -1,9 +0,0 @@
---
int32 total_persons
int32 total_encounters
int32 favorites_count
int32 regulars_count
int32 strangers_count
float32 database_size_mb
builtin_interfaces/Time last_backup
string database_path
@ -1,10 +0,0 @@
int32 person_id
string transcript            # What was said or observed
string mood                  # Detected emotion during encounter
float32 engagement_score     # 0.0–1.0; how engaged the person was
bool positive_interaction    # Whether interaction went well
---
bool success
string message
int32 encounter_id
string tier_changed_to       # If relationship tier was promoted
@ -1,12 +0,0 @@
string query_type            # "id", "name", "recent"
string query_value           # person_id, name, or days_back
---
bool found
int32 person_id
string name
string relationship_tier     # "stranger", "regular", "favorite"
float32 relationship_score   # 0.0–1.0
int32 interaction_count
string[] custom_greetings    # Per-tier personalized greetings
string notes                 # Free-form notes about person
builtin_interfaces/Time last_seen
@ -1,11 +0,0 @@
int32 person_id
string name
string relationship_tier     # "stranger", "regular", "favorite"
float32 relationship_score   # 0.0–1.0; auto-updated by encounters
string[] custom_greetings    # Tier-specific greetings
string notes                 # Custom notes/observations
float32[] embedding          # Optional face embedding vector
---
bool success
string message
int32 updated_person_id
@ -1,7 +0,0 @@
build/
install/
log/
*.egg-info/
__pycache__/
*.pyc
.pytest_cache/
@ -1,274 +0,0 @@
# SaltyBot Social Memory Database (Issue #443)

Persistent SQLite-backed social knowledge system for maintaining person profiles, relationship history, and interaction statistics.

## Features

### 1. Person Profile Management
- **Person Records**: Name, relationship tier, custom greetings, notes, face embeddings
- **Relationship Tiers**:
  - **Stranger**: 0 encounters (default), cold greetings
  - **Regular**: 5+ encounters, familiar greetings
  - **Favorite**: 20+ encounters (or 80%+ positive interactions), warm greetings
- **Custom Greetings**: Tier-specific greeting strings
- **Face Embeddings**: Optional vector storage for face recognition
- **Notes**: Free-form observations about the person

### 2. Encounter Tracking
- **Encounter Recording**:
  - Timestamp, transcript of conversation, detected mood
  - Engagement score (0.0–1.0)
  - Positive/negative interaction flag
- **Automatic Tier Promotion**:
  - Encounter count-based: 5 → regular, 20 → favorite
  - Quality-based: 80%+ positive interactions required
  - Cooldown: 1 hour minimum between promotions
- **Statistics Aggregation**:
  - Total encounter count per person
  - Positive interaction ratio
  - Relationship score (computed from interactions)

### 3. ROS2 Service Interfaces

#### `/saltybot/social_memory/lookup`
```
Request:
  query_type: "id" | "name" | "recent"
  query_value: person_id | name_pattern | days_back

Response:
  found: boolean
  person_id: int32
  name: string
  relationship_tier: string
  relationship_score: float32 (0.0–1.0)
  interaction_count: int32
  custom_greetings: string[3]  # [stranger, regular, favorite]
  notes: string
  last_seen: Time
```

#### `/saltybot/social_memory/update`
```
Request:
  person_id: int32
  name: string
  relationship_tier: string
  custom_greetings: string[3]
  notes: string
  embedding: float32[]

Response:
  success: boolean
  message: string
  updated_person_id: int32
```

#### `/saltybot/social_memory/encounter`
```
Request:
  person_id: int32
  transcript: string
  mood: string
  engagement_score: float32
  positive_interaction: boolean

Response:
  success: boolean
  message: string
  encounter_id: int32
  tier_changed_to: string (or empty if no change)
```

#### `/saltybot/social_memory/stats`
```
Request:
  (empty)

Response:
  total_persons: int32
  total_encounters: int32
  favorites_count: int32
  regulars_count: int32
  strangers_count: int32
  database_size_mb: float32
  last_backup: Time
  database_path: string
```

### 4. Statistics Publishing
- **Topic**: `/saltybot/social_memory/stats_update` (std_msgs/String, JSON)
- **Frequency**: Configurable (default 60s)
- **Content**: Database statistics snapshot
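Any subscriber can consume the JSON snapshot. A minimal parsing sketch; the field names here are an assumption (mirroring the `DatabaseStats` service response), and the payload is a hypothetical example:

```python
import json

# Hypothetical payload as it might arrive on stats_update; field names
# are assumed to mirror the DatabaseStats service response.
msg_data = json.dumps({
    "total_persons": 42,
    "total_encounters": 310,
    "favorites_count": 3,
    "regulars_count": 9,
    "strangers_count": 30,
    "database_size_mb": 0.8,
})

stats = json.loads(msg_data)
summary = (
    f"{stats['total_persons']} persons "
    f"({stats['favorites_count']} favorites, {stats['regulars_count']} regulars)"
)
print(summary)  # 42 persons (3 favorites, 9 regulars)
```

In a node, the same parsing would sit inside a `std_msgs/String` subscription callback on `/saltybot/social_memory/stats_update`.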

### 5. Backup & Restore
- **Automatic Backups**: Gzipped snapshots stored in backup directory
- **Backup Tracking**: Recorded in database with timestamp and size
- **Manual Restore**: Restore from any backup file

## Database Schema

### `persons` Table
| Column | Type | Notes |
|--------|------|-------|
| person_id | INTEGER PRIMARY KEY | Auto-increment |
| name | TEXT UNIQUE | Person's name |
| embedding | BLOB | JSON-encoded face embedding |
| relationship_tier | TEXT | stranger/regular/favorite |
| relationship_score | REAL | 0.0–1.0 computed from encounters |
| interaction_count | INTEGER | Total encounters |
| positive_interactions | INTEGER | Count of positive encounters |
| greeting_stranger | TEXT | Greeting for strangers |
| greeting_regular | TEXT | Greeting for regular contacts |
| greeting_favorite | TEXT | Greeting for favorites |
| notes | TEXT | Custom observations |
| created_at | TIMESTAMP | When person was first recorded |
| last_seen | TIMESTAMP | Last encounter timestamp |
| last_tier_promotion | TIMESTAMP | Cooldown tracking |

### `encounters` Table
| Column | Type | Notes |
|--------|------|-------|
| encounter_id | INTEGER PRIMARY KEY | Auto-increment |
| person_id | INTEGER | Foreign key to persons |
| transcript | TEXT | Conversation text/summary |
| detected_mood | TEXT | Emotion detected during encounter |
| engagement_score | REAL | 0.0–1.0 |
| positive_interaction | BOOLEAN | Was interaction positive? |
| encounter_timestamp | TIMESTAMP | When encounter occurred |

### `backups` Table
| Column | Type | Notes |
|--------|------|-------|
| backup_id | INTEGER PRIMARY KEY | Auto-increment |
| backup_path | TEXT UNIQUE | Full path to backup file |
| backup_timestamp | TIMESTAMP | When backup was created |
| file_size_mb | REAL | Compressed size in MB |

## Configuration

Edit `config/social_memory.yaml`:

```yaml
database_path: "/home/seb/saltybot-data/social_memory.db"
backup_dir: "/home/seb/saltybot-data/backups"

publish_stats: true
stats_publish_interval: 60.0  # seconds

tier_promotion:
  regular_threshold: 5      # Encounters for regular tier
  favorite_threshold: 20    # Encounters for favorite tier
  positive_ratio_min: 0.8   # 80% positive required
  cooldown_hours: 1         # Minimum between promotions
```

## Running

### Launch the service node
```bash
ros2 launch saltybot_social_memory_node social_memory.launch.py
```

### Direct execution
```bash
ros2 run saltybot_social_memory_node social_memory
```

## Usage Examples

### Add a new person
```bash
ros2 service call /saltybot/social_memory/update saltybot_social_memory/PersonUpdate \
  "{name: 'Alice', relationship_tier: 'stranger', \
  custom_greetings: ['Hello there!', 'Nice to see you!', 'Great to see you again!']}"
```

### Record an encounter
```bash
ros2 service call /saltybot/social_memory/encounter saltybot_social_memory/EncounterRecord \
  "{person_id: 1, transcript: 'Alice: Hi! How are you?', \
  mood: 'happy', engagement_score: 0.85, positive_interaction: true}"
```

### Lookup a person
```bash
ros2 service call /saltybot/social_memory/lookup saltybot_social_memory/PersonLookup \
  "{query_type: 'name', query_value: 'Alice'}"
```

### Get database statistics
```bash
ros2 service call /saltybot/social_memory/stats saltybot_social_memory/DatabaseStats
```

### Monitor statistics stream
```bash
ros2 topic echo /saltybot/social_memory/stats_update
```

## Integration Points

1. **Emotion Engine** (`saltybot_emotion_engine`)
   - Queries person familiarity for warmth modifier
   - Records encounters to build relationship history

2. **Person Tracker** (`sl-perception`)
   - Provides person_id from face recognition
   - Supplies engagement_score and detected_mood

3. **Voice Recognition** (`sl-jetson`)
   - Provides transcript of conversations
   - Records as encounter transcript

4. **Greeting System**
   - Queries custom_greetings based on tier
   - Personalizes robot responses

## Tier Promotion Logic

```
When encounter recorded:
  1. Increment interaction_count
  2. Update positive_interactions if positive_interaction=true
  3. Compute positive_ratio = positive_interactions / interaction_count
  4. Check promotion eligibility:
     - current_tier == "stranger" && interaction_count >= 5 && positive_ratio >= 0.8
       → Promote to "regular"
     - current_tier == "regular" && interaction_count >= 20 && positive_ratio >= 0.8
       → Promote to "favorite"
  5. If promoted and cooldown_hours passed since last promotion:
     - Update relationship_tier
     - Set last_tier_promotion = now()
     - Return tier_changed_to in response
```
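The steps above can be sketched as a pure helper. Names here are illustrative (the node's own implementation differs in detail); the thresholds come from the `tier_promotion` config:

```python
from datetime import datetime, timedelta

TIERS = ["stranger", "regular", "favorite"]
THRESHOLDS = {"regular": 5, "favorite": 20}  # from tier_promotion config
POSITIVE_RATIO_MIN = 0.8
COOLDOWN = timedelta(hours=1)


def next_tier(current, interaction_count, positive_interactions, last_promotion, now):
    """Return (tier, changed) following promotion steps 3-5 (illustrative helper)."""
    idx = TIERS.index(current)
    if idx >= len(TIERS) - 1:
        return current, False  # already at the top tier
    candidate = TIERS[idx + 1]
    positive_ratio = positive_interactions / max(interaction_count, 1)
    eligible = (
        interaction_count >= THRESHOLDS[candidate]
        and positive_ratio >= POSITIVE_RATIO_MIN
    )
    cooled_down = last_promotion is None or (now - last_promotion) >= COOLDOWN
    if eligible and cooled_down:
        return candidate, True
    return current, False
```

For example, `next_tier("stranger", 5, 4, None, datetime.now())` promotes to `"regular"` (5 encounters, 80% positive), while the same person promoted 30 minutes earlier stays put because of the cooldown.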

## Backup Strategy

- **Manual backups** via `backup_database()` method
- **Automatic tracking** in backups table with timestamp
- **Restore** from any backup: `restore_database(backup_path)`
- **Format**: gzip-compressed SQLite database (.db.gz)
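A minimal sketch of that snapshot format. The function names follow the bullets above, but the bodies are illustrative only: the real methods also record each backup in the `backups` table, and for a live database it would be safer to copy via SQLite's backup API first.

```python
import gzip
import shutil
from datetime import datetime
from pathlib import Path


def backup_database(db_path: Path, backup_dir: Path) -> Path:
    """Write a gzip-compressed snapshot of the SQLite file (sketch only)."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_path = backup_dir / f"social_memory_{stamp}.db.gz"
    with open(db_path, "rb") as src, gzip.open(backup_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return backup_path


def restore_database(backup_path: Path, db_path: Path) -> None:
    """Decompress a snapshot back into place."""
    with gzip.open(backup_path, "rb") as src, open(db_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
```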

## Thread Safety

- All database operations protected by `threading.Lock()`
- Safe for concurrent ROS2 service calls
- Sequential writes to ensure data integrity
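The pattern is the standard one: a single `threading.Lock` wraps every database call so concurrent service callbacks serialize. A stripped-down illustration (not the node's actual class):

```python
import threading


class LockedStore:
    """Illustration of the one-lock-per-database pattern described above."""

    def __init__(self):
        self.lock = threading.Lock()
        self.writes = 0

    def record_encounter(self):
        with self.lock:  # serialize access, as the node does around SQLite calls
            self.writes += 1


store = LockedStore()
workers = [
    threading.Thread(target=lambda: [store.record_encounter() for _ in range(1000)])
    for _ in range(8)
]
for t in workers:
    t.start()
for t in workers:
    t.join()
print(store.writes)  # 8000: no updates lost under concurrency
```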

## Performance Characteristics

- **Lookup**: O(1) indexed by person_id, O(n) by name pattern
- **Encounter recording**: O(1) insertion + O(1) stats update
- **Database size**: ~1KB per person + ~0.5KB per encounter
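Using the per-row rule of thumb above, a back-of-envelope capacity check (the counts are illustrative):

```python
# Back-of-envelope size estimate from the per-row figures above.
persons = 200
encounters = 10_000

size_kb = persons * 1.0 + encounters * 0.5  # ~1 KB/person, ~0.5 KB/encounter
size_mb = size_kb / 1024
print(f"{size_mb:.1f} MB")  # about 5.1 MB for 200 persons / 10k encounters
```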

## Future Enhancements

1. Persistent memory integration with LLM (e.g., "Alice loves plants")
2. Group relationships (track person groups/families)
3. Preference learning (dietary restrictions, favorite topics)
4. Sentiment analysis over time
5. Web UI for person management
6. Integration with voice recognition confidence
7. Automated cleanup of old encounters (>1 year)
8. Graph-based relationship networks
@ -1,18 +0,0 @@
/**:
  ros__parameters:
    # Database path
    database_path: "/home/seb/saltybot-data/social_memory.db"

    # Backup directory
    backup_dir: "/home/seb/saltybot-data/backups"

    # Publish periodic statistics
    publish_stats: true
    stats_publish_interval: 60.0  # seconds

    # Relationship tier promotion thresholds (encounters)
    tier_promotion:
      regular_threshold: 5      # 5 encounters to "regular"
      favorite_threshold: 20    # 20 encounters to "favorite"
      positive_ratio_min: 0.8   # 80% positive interactions required
      cooldown_hours: 1         # Minimum time between promotions
@ -1,28 +0,0 @@
#!/usr/bin/env python3

from launch import LaunchDescription
from launch_ros.actions import Node
from launch.substitutions import PathJoinSubstitution
from launch_ros.substitutions import FindPackageShare


def generate_launch_description():
    """Launch social memory service node."""

    package_share_dir = FindPackageShare("saltybot_social_memory_node")
    config_file = PathJoinSubstitution(
        [package_share_dir, "config", "social_memory.yaml"]
    )

    social_memory_node = Node(
        package="saltybot_social_memory_node",
        executable="social_memory",
        name="social_memory",
        output="screen",
        parameters=[config_file],
        remappings=[],
    )

    return LaunchDescription([
        social_memory_node,
    ])
@ -1,26 +0,0 @@
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
  <name>saltybot_social_memory_node</name>
  <version>0.1.0</version>
  <description>
    ROS2 service node for persistent social memory database.
    Implements person lookup, update, encounter recording, and automatic relationship tier promotion.
  </description>
  <maintainer email="seb@vayrette.com">seb</maintainer>
  <license>MIT</license>

  <depend>rclpy</depend>
  <depend>std_msgs</depend>
  <depend>saltybot_social_memory</depend>
  <depend>ament_index_python</depend>

  <test_depend>ament_copyright</test_depend>
  <test_depend>ament_flake8</test_depend>
  <test_depend>ament_pep257</test_depend>
  <test_depend>python3-pytest</test_depend>

  <export>
    <build_type>ament_python</build_type>
  </export>
</package>
@ -1,583 +0,0 @@
#!/usr/bin/env python3

import sqlite3
import json
import shutil
import hashlib
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional, List, Dict, Tuple
import threading
import gzip

import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy
from std_msgs.msg import String
from builtin_interfaces.msg import Time

from saltybot_social_memory.srv import (
    PersonLookup,
    PersonUpdate,
    EncounterRecord,
    DatabaseStats,
)


class RelationshipTier:
    """Relationship tier definitions and promotion logic."""

    TIERS = ["stranger", "regular", "favorite"]
    TIER_THRESHOLDS = {
        "stranger": 0,
        "regular": 5,  # 5+ encounters
        "favorite": 20,  # 20+ encounters or high positive score
    }
    TIER_COOLDOWN = 3600  # 1 hour between promotions

    @staticmethod
    def promote_tier(current_tier: str, encounter_count: int, positive_score: float) -> Tuple[str, bool]:
        """Determine if person should be promoted to higher tier."""
        current_idx = RelationshipTier.TIERS.index(current_tier)

        if current_idx >= len(RelationshipTier.TIERS) - 1:
            return current_tier, False  # Already at max tier

        next_tier = RelationshipTier.TIERS[current_idx + 1]
        threshold = RelationshipTier.TIER_THRESHOLDS[next_tier]

        # Promote if encounter count meets threshold and positive ratio is at least 80%
        if encounter_count >= threshold and positive_score >= 0.8:
            return next_tier, True

        return current_tier, False
|
|
||||||
|
|
||||||
|
|
||||||
class SocialMemoryDatabase:
    """SQLite-backed social memory database."""

    def __init__(self, db_path: Path):
        self.db_path = db_path
        self.db_path.parent.mkdir(parents=True, exist_ok=True)
        self.lock = threading.Lock()
        self._init_schema()

    def _init_schema(self):
        """Initialize database schema if not exists."""
        with sqlite3.connect(self.db_path) as conn:
            conn.execute("""
                CREATE TABLE IF NOT EXISTS persons (
                    person_id INTEGER PRIMARY KEY AUTOINCREMENT,
                    name TEXT UNIQUE NOT NULL,
                    embedding BLOB,
                    relationship_tier TEXT DEFAULT 'stranger',
                    relationship_score REAL DEFAULT 0.5,
                    interaction_count INTEGER DEFAULT 0,
                    positive_interactions INTEGER DEFAULT 0,
                    greeting_stranger TEXT,
                    greeting_regular TEXT,
                    greeting_favorite TEXT,
                    notes TEXT,
                    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                    last_seen TIMESTAMP,
                    last_tier_promotion TIMESTAMP
                )
            """)

            conn.execute("""
                CREATE TABLE IF NOT EXISTS encounters (
                    encounter_id INTEGER PRIMARY KEY AUTOINCREMENT,
                    person_id INTEGER NOT NULL,
                    transcript TEXT,
                    detected_mood TEXT,
                    engagement_score REAL,
                    positive_interaction BOOLEAN,
                    encounter_timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                    FOREIGN KEY (person_id) REFERENCES persons(person_id) ON DELETE CASCADE
                )
            """)

            conn.execute("""
                CREATE TABLE IF NOT EXISTS backups (
                    backup_id INTEGER PRIMARY KEY AUTOINCREMENT,
                    backup_path TEXT UNIQUE NOT NULL,
                    backup_timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
                    file_size_mb REAL
                )
            """)

            conn.commit()

    def add_or_update_person(
        self,
        name: str,
        relationship_tier: str = "stranger",
        custom_greetings: Optional[Dict[str, str]] = None,
        notes: str = "",
        embedding: Optional[List[float]] = None,
    ) -> int:
        """Add or update a person profile."""
        with self.lock:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()

                # Check if person exists
                cursor.execute("SELECT person_id FROM persons WHERE name = ?", (name,))
                result = cursor.fetchone()

                if result:
                    person_id = result[0]
                    # Update existing person
                    cursor.execute("""
                        UPDATE persons SET
                            relationship_tier = ?,
                            greeting_stranger = ?,
                            greeting_regular = ?,
                            greeting_favorite = ?,
                            notes = ?
                        WHERE person_id = ?
                    """, (
                        relationship_tier,
                        custom_greetings.get("stranger", "") if custom_greetings else "",
                        custom_greetings.get("regular", "") if custom_greetings else "",
                        custom_greetings.get("favorite", "") if custom_greetings else "",
                        notes,
                        person_id,
                    ))
                else:
                    # Insert new person
                    embedding_blob = None
                    if embedding:
                        embedding_blob = json.dumps(embedding).encode()

                    cursor.execute("""
                        INSERT INTO persons (
                            name, relationship_tier, greeting_stranger,
                            greeting_regular, greeting_favorite, notes, embedding
                        ) VALUES (?, ?, ?, ?, ?, ?, ?)
                    """, (
                        name,
                        relationship_tier,
                        custom_greetings.get("stranger", "") if custom_greetings else "",
                        custom_greetings.get("regular", "") if custom_greetings else "",
                        custom_greetings.get("favorite", "") if custom_greetings else "",
                        notes,
                        embedding_blob,
                    ))
                    person_id = cursor.lastrowid

                conn.commit()

        return person_id

    def lookup_person(self, query_type: str, query_value: str) -> Optional[Dict]:
        """Look up a person by ID, name, or most recent encounter."""
        with self.lock:
            with sqlite3.connect(self.db_path) as conn:
                conn.row_factory = sqlite3.Row
                cursor = conn.cursor()

                if query_type == "id":
                    cursor.execute("SELECT * FROM persons WHERE person_id = ?", (int(query_value),))
                elif query_type == "name":
                    cursor.execute("SELECT * FROM persons WHERE name LIKE ?", (f"%{query_value}%",))
                elif query_type == "recent":
                    # Persons with encounters in the last N days, most recently seen first;
                    # only the first match is returned below
                    days = int(query_value) if query_value else 7
                    cursor.execute("""
                        SELECT DISTINCT p.* FROM persons p
                        JOIN encounters e ON p.person_id = e.person_id
                        WHERE e.encounter_timestamp > datetime('now', '-' || ? || ' days')
                        ORDER BY p.last_seen DESC
                        LIMIT 5
                    """, (days,))
                else:
                    return None

                row = cursor.fetchone()
                if not row:
                    return None

                # Parse embedding if present
                embedding = None
                if row["embedding"]:
                    embedding = json.loads(row["embedding"])

                return {
                    "person_id": row["person_id"],
                    "name": row["name"],
                    "relationship_tier": row["relationship_tier"],
                    "relationship_score": row["relationship_score"],
                    "interaction_count": row["interaction_count"],
                    "custom_greetings": {
                        "stranger": row["greeting_stranger"] or "",
                        "regular": row["greeting_regular"] or "",
                        "favorite": row["greeting_favorite"] or "",
                    },
                    "notes": row["notes"] or "",
                    "last_seen": row["last_seen"],
                    "embedding": embedding,
                }

    def record_encounter(
        self,
        person_id: int,
        transcript: str,
        mood: str,
        engagement_score: float,
        positive_interaction: bool,
    ) -> Tuple[int, Optional[str]]:
        """Record an encounter and check for tier promotion."""
        with self.lock:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()

                # Insert encounter
                cursor.execute("""
                    INSERT INTO encounters (
                        person_id, transcript, detected_mood, engagement_score, positive_interaction
                    ) VALUES (?, ?, ?, ?, ?)
                """, (person_id, transcript, mood, engagement_score, positive_interaction))
                encounter_id = cursor.lastrowid

                # Update person stats
                cursor.execute("""
                    SELECT interaction_count, positive_interactions, relationship_tier, last_tier_promotion
                    FROM persons WHERE person_id = ?
                """, (person_id,))
                current_count, positive_count, current_tier, last_promotion = cursor.fetchone()

                new_interaction_count = current_count + 1
                new_positive_count = positive_count + (1 if positive_interaction else 0)
                positive_score = new_positive_count / max(1, new_interaction_count)

                # Check for tier promotion; TIER_COOLDOWN is in seconds
                new_tier = current_tier
                tier_changed = None
                now = datetime.now().isoformat()

                cooldown_elapsed = (
                    last_promotion is None
                    or (datetime.now() - datetime.fromisoformat(last_promotion)).total_seconds()
                    >= RelationshipTier.TIER_COOLDOWN
                )
                if cooldown_elapsed:
                    new_tier, promoted = RelationshipTier.promote_tier(
                        current_tier,
                        new_interaction_count,
                        positive_score,
                    )
                    if promoted:
                        tier_changed = new_tier

                # Update person record; preserve the previous promotion timestamp
                # when no promotion happened so the cooldown stays meaningful
                cursor.execute("""
                    UPDATE persons SET
                        interaction_count = ?,
                        positive_interactions = ?,
                        relationship_score = ?,
                        relationship_tier = ?,
                        last_seen = CURRENT_TIMESTAMP,
                        last_tier_promotion = ?
                    WHERE person_id = ?
                """, (
                    new_interaction_count,
                    new_positive_count,
                    positive_score,
                    new_tier,
                    now if tier_changed else last_promotion,
                    person_id,
                ))

                conn.commit()

        return encounter_id, tier_changed

    def get_stats(self) -> Dict:
        """Get database statistics."""
        with self.lock:
            with sqlite3.connect(self.db_path) as conn:
                cursor = conn.cursor()

                cursor.execute("SELECT COUNT(*) FROM persons")
                total_persons = cursor.fetchone()[0]

                cursor.execute("SELECT COUNT(*) FROM encounters")
                total_encounters = cursor.fetchone()[0]

                cursor.execute(
                    "SELECT COUNT(*) FROM persons WHERE relationship_tier = ?",
                    ("favorite",)
                )
                favorites = cursor.fetchone()[0]

                cursor.execute(
                    "SELECT COUNT(*) FROM persons WHERE relationship_tier = ?",
                    ("regular",)
                )
                regulars = cursor.fetchone()[0]

                cursor.execute(
                    "SELECT COUNT(*) FROM persons WHERE relationship_tier = ?",
                    ("stranger",)
                )
                strangers = cursor.fetchone()[0]

                # Database file size
                db_size_mb = self.db_path.stat().st_size / (1024 * 1024)

                # Most recent backup time
                cursor.execute(
                    "SELECT backup_timestamp FROM backups ORDER BY backup_timestamp DESC LIMIT 1"
                )
                last_backup = cursor.fetchone()
                last_backup_time = last_backup[0] if last_backup else None

                return {
                    "total_persons": total_persons,
                    "total_encounters": total_encounters,
                    "favorites": favorites,
                    "regulars": regulars,
                    "strangers": strangers,
                    "database_size_mb": db_size_mb,
                    "last_backup": last_backup_time,
                }

    def backup_database(self, backup_dir: Path) -> Path:
        """Create a gzipped backup of the database."""
        with self.lock:
            backup_dir.mkdir(parents=True, exist_ok=True)
            timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
            backup_path = backup_dir / f"social_memory_backup_{timestamp}.db.gz"

            try:
                with open(self.db_path, "rb") as f_in:
                    with gzip.open(backup_path, "wb") as f_out:
                        f_out.write(f_in.read())

                # Record backup in database
                with sqlite3.connect(self.db_path) as conn:
                    cursor = conn.cursor()
                    size_mb = backup_path.stat().st_size / (1024 * 1024)
                    cursor.execute(
                        "INSERT INTO backups (backup_path, file_size_mb) VALUES (?, ?)",
                        (str(backup_path), size_mb),
                    )
                    conn.commit()

                return backup_path
            except Exception as e:
                raise RuntimeError(f"Backup failed: {e}") from e

    def restore_database(self, backup_path: Path):
        """Restore database from gzipped backup."""
        with self.lock:
            try:
                with gzip.open(backup_path, "rb") as f_in:
                    with open(self.db_path, "wb") as f_out:
                        f_out.write(f_in.read())
                self._init_schema()  # Ensure schema is valid
            except Exception as e:
                raise RuntimeError(f"Restore failed: {e}") from e


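# Illustrative use of the database layer outside ROS (hypothetical paths):
#
#   db = SocialMemoryDatabase(Path("/tmp/social_memory_smoke.db"))
#   pid = db.add_or_update_person("Alice", notes="met at the park")
#   eid, promoted = db.record_encounter(pid, "hello!", "happy", 0.9, True)
#   db.get_stats()["total_persons"]  # 1 in a fresh database
#   db.backup_database(Path("/tmp/backups"))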
class SocialMemoryNode(Node):
    """ROS2 service server for the social memory database."""

    def __init__(self):
        super().__init__("saltybot_social_memory")

        # Configuration
        self.declare_parameter("database_path", "/home/seb/saltybot-data/social_memory.db")
        self.declare_parameter("backup_dir", "/home/seb/saltybot-data/backups")
        self.declare_parameter("publish_stats", True)
        self.declare_parameter("stats_publish_interval", 60.0)

        db_path = Path(self.get_parameter("database_path").value)
        backup_dir = Path(self.get_parameter("backup_dir").value)

        # Initialize database
        self.db = SocialMemoryDatabase(db_path)
        self.backup_dir = backup_dir

        # Create service servers
        self.lookup_service = self.create_service(
            PersonLookup,
            "/saltybot/social_memory/lookup",
            self.handle_lookup,
        )

        self.update_service = self.create_service(
            PersonUpdate,
            "/saltybot/social_memory/update",
            self.handle_update,
        )

        self.encounter_service = self.create_service(
            EncounterRecord,
            "/saltybot/social_memory/encounter",
            self.handle_encounter,
        )

        self.stats_service = self.create_service(
            DatabaseStats,
            "/saltybot/social_memory/stats",
            self.handle_stats,
        )

        # Publisher for stats (if enabled)
        if self.get_parameter("publish_stats").value:
            self.stats_pub = self.create_publisher(
                String,
                "/saltybot/social_memory/stats_update",
                10,
            )
            interval = self.get_parameter("stats_publish_interval").value
            self.stats_timer = self.create_timer(interval, self.publish_stats_callback)

        self.get_logger().info(
            f"Social memory node initialized: {db_path}, "
            f"backup dir: {backup_dir}"
        )

    def handle_lookup(self, request: PersonLookup.Request, response: PersonLookup.Response) -> PersonLookup.Response:
        """Handle person lookup requests."""
        try:
            person = self.db.lookup_person(request.query_type, request.query_value)

            if person:
                response.found = True
                response.person_id = person["person_id"]
                response.name = person["name"]
                response.relationship_tier = person["relationship_tier"]
                response.relationship_score = person["relationship_score"]
                response.interaction_count = person["interaction_count"]
                response.notes = person["notes"]

                # Build custom greetings list (stranger, regular, favorite order)
                greetings = person["custom_greetings"]
                response.custom_greetings = [
                    greetings.get("stranger", ""),
                    greetings.get("regular", ""),
                    greetings.get("favorite", ""),
                ]

                if person["last_seen"]:
                    last_seen = datetime.fromisoformat(person["last_seen"])
                    response.last_seen.sec = int(last_seen.timestamp())
            else:
                response.found = False

        except Exception as e:
            self.get_logger().error(f"Lookup error: {e}")
            response.found = False

        return response

    def handle_update(self, request: PersonUpdate.Request, response: PersonUpdate.Response) -> PersonUpdate.Response:
        """Handle person update requests."""
        try:
            custom_greetings = {
                "stranger": request.custom_greetings[0] if len(request.custom_greetings) > 0 else "",
                "regular": request.custom_greetings[1] if len(request.custom_greetings) > 1 else "",
                "favorite": request.custom_greetings[2] if len(request.custom_greetings) > 2 else "",
            }

            person_id = self.db.add_or_update_person(
                name=request.name,
                relationship_tier=request.relationship_tier,
                custom_greetings=custom_greetings,
                notes=request.notes,
                # ROS array fields are not directly JSON-serializable; convert to list
                embedding=list(request.embedding) if request.embedding else None,
            )

            response.success = True
            response.message = f"Person '{request.name}' updated successfully"
            response.updated_person_id = person_id

        except Exception as e:
            response.success = False
            response.message = f"Update error: {e}"
            self.get_logger().error(f"Update error: {e}")

        return response

    def handle_encounter(self, request: EncounterRecord.Request, response: EncounterRecord.Response) -> EncounterRecord.Response:
        """Handle encounter recording requests."""
        try:
            encounter_id, tier_changed = self.db.record_encounter(
                person_id=request.person_id,
                transcript=request.transcript,
                mood=request.mood,
                engagement_score=request.engagement_score,
                positive_interaction=request.positive_interaction,
            )

            response.success = True
            response.encounter_id = encounter_id
            response.tier_changed_to = tier_changed if tier_changed else ""
            response.message = f"Encounter recorded (ID: {encounter_id})"

            if tier_changed:
                self.get_logger().info(
                    f"Person {request.person_id} promoted to {tier_changed}"
                )

        except Exception as e:
            response.success = False
            response.message = f"Encounter error: {e}"
            self.get_logger().error(f"Encounter error: {e}")

        return response

    def handle_stats(self, request: DatabaseStats.Request, response: DatabaseStats.Response) -> DatabaseStats.Response:
        """Handle statistics requests."""
        try:
            stats = self.db.get_stats()

            response.total_persons = stats["total_persons"]
            response.total_encounters = stats["total_encounters"]
            response.favorites_count = stats["favorites"]
            response.regulars_count = stats["regulars"]
            response.strangers_count = stats["strangers"]
            response.database_size_mb = stats["database_size_mb"]
            response.database_path = str(self.db.db_path)

            if stats["last_backup"]:
                last_backup = datetime.fromisoformat(stats["last_backup"])
                response.last_backup.sec = int(last_backup.timestamp())

        except Exception as e:
            self.get_logger().error(f"Stats error: {e}")

        return response

    def publish_stats_callback(self):
        """Periodically publish database statistics."""
        try:
            stats = self.db.get_stats()
            msg = String()
            msg.data = json.dumps(stats, default=str)
            self.stats_pub.publish(msg)
        except Exception as e:
            self.get_logger().error(f"Stats publish error: {e}")


def main(args=None):
    rclpy.init(args=args)
    node = SocialMemoryNode()

    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
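# Hypothetical client-side check of the services declared above, e.g.:
#
#   ros2 service call /saltybot/social_memory/stats saltybot_social_memory/srv/DatabaseStats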
@ -1,4 +0,0 @@
|
|||||||
[develop]
|
|
||||||
script_dir=$base/lib/saltybot_social_memory_node
|
|
||||||
[install]
|
|
||||||
install_lib=$base/lib/saltybot_social_memory_node
|
|
||||||
@ -1,32 +0,0 @@
from setuptools import setup
import os
from glob import glob

package_name = 'saltybot_social_memory_node'

setup(
    name=package_name,
    version='0.1.0',
    packages=[package_name],
    data_files=[
        ('share/ament_index/resource_index/packages',
            ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
        (os.path.join('share', package_name, 'launch'),
            glob('launch/*.py')),
        (os.path.join('share', package_name, 'config'),
            glob('config/*.yaml')),
    ],
    install_requires=['setuptools'],
    zip_safe=True,
    maintainer='seb',
    maintainer_email='seb@vayrette.com',
    description='ROS2 service server for persistent social memory database',
    license='MIT',
    tests_require=['pytest'],
    entry_points={
        'console_scripts': [
            'social_memory = saltybot_social_memory_node.social_memory_node:main',
        ],
    },
)