Some checks failed:
- social-bot integration tests / Lint (flake8 + pep257) (push) — failing after 2s
- social-bot integration tests / Lint (flake8 + pep257) (pull_request) — failing after 2s
- social-bot integration tests / Core integration tests (mock sensors, no GPU) (push) — skipped
- social-bot integration tests / Core integration tests (mock sensors, no GPU) (pull_request) — skipped
- social-bot integration tests / Latency profiling (GPU, Orin) (push) — cancelled
- social-bot integration tests / Latency profiling (GPU, Orin) (pull_request) — cancelled
- Add Expression.msg / ExpressionArray.msg ROS2 message definitions
- Add emotion_classifier.py: 7-class CNN (happy/sad/angry/surprised/fearful/disgusted/neutral) via TensorRT FP16 with landmark-geometry fallback; EMA per-person smoothing; opt-out registry
- Add emotion_node.py: subscribes /social/faces/detections, runs TRT crop inference (<5ms), publishes /social/faces/expressions and /social/emotion/context JSON for the LLM
- Wire emotion context into conversation_node.py: emotion hint injected into the LLM prompt when the speaker shows non-neutral affect; subscribes /social/emotion/context
- Add emotion_params.yaml config and emotion.launch.py launch file
- Add 67-test suite (test_emotion_classifier.py): classifier, tracker, opt-out, heuristic

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
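The commit message mentions "EMA per-person smoothing" of classifier output. A minimal sketch of what such a smoother might look like, assuming per-person probability vectors over the seven classes; the class name, `alpha` default, and `update` signature are illustrative assumptions, not the actual emotion_classifier.py API:

```python
# Hypothetical sketch of per-person EMA smoothing of emotion probabilities.
# Names and defaults are assumptions; the real implementation may differ.
EMOTIONS = ["happy", "sad", "angry", "surprised", "fearful", "disgusted", "neutral"]

class EmaSmoother:
    """Smooths each person's emotion probability vector with an EMA."""

    def __init__(self, alpha=0.4):
        self.alpha = alpha   # weight of the newest observation
        self.state = {}      # person_id -> smoothed probability list

    def update(self, person_id, probs):
        prev = self.state.get(person_id)
        if prev is None:
            smoothed = list(probs)  # first frame: take the raw vector
        else:
            smoothed = [self.alpha * p + (1 - self.alpha) * q
                        for p, q in zip(probs, prev)]
        self.state[person_id] = smoothed
        # Report the currently dominant smoothed emotion.
        return EMOTIONS[smoothed.index(max(smoothed))]
```

With a smoother like this, a single noisy frame (e.g. one "happy" spike during a run of "neutral" frames) does not immediately flip the reported label; the dominant emotion only changes after the new class persists for a few frames.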
45 lines
1.8 KiB
Python
import os
from glob import glob

from setuptools import find_packages, setup

package_name = 'saltybot_social'

setup(
    name=package_name,
    version='0.1.0',
    packages=find_packages(exclude=['test']),
    data_files=[
        ('share/ament_index/resource_index/packages',
         ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
        (os.path.join('share', package_name, 'launch'),
         glob(os.path.join('launch', '*launch.[pxy][yma]*'))),
        (os.path.join('share', package_name, 'config'),
         glob(os.path.join('config', '*.yaml'))),
    ],
    install_requires=['setuptools'],
    zip_safe=True,
    maintainer='seb',
    maintainer_email='seb@vayrette.com',
    description='Social interaction layer — person tracking, speech, LLM, TTS, orchestrator',
    license='MIT',
    tests_require=['pytest'],
    entry_points={
        'console_scripts': [
            'person_state_tracker = saltybot_social.person_state_tracker_node:main',
            'expression_node = saltybot_social.expression_node:main',
            'attention_node = saltybot_social.attention_node:main',
            'speech_pipeline_node = saltybot_social.speech_pipeline_node:main',
            'conversation_node = saltybot_social.conversation_node:main',
            'tts_node = saltybot_social.tts_node:main',
            'orchestrator_node = saltybot_social.orchestrator_node:main',
            # Voice command NLU bridge (Issue #137)
            'voice_command_node = saltybot_social.voice_command_node:main',
            # Multi-camera gesture recognition (Issue #140)
            'gesture_node = saltybot_social.gesture_node:main',
            # Facial expression recognition (Issue #161)
            'emotion_node = saltybot_social.emotion_node:main',
        ],
    },
)
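The launch-file glob `*launch.[pxy][yma]*` in `data_files` is the usual ament_python idiom for installing all three ROS 2 launch flavors at once: `[pxy]` matches the first letter of the extension and `[yma]` the second, so `.py`, `.xml`, and `.yaml` all match. A small illustration of the same pattern via the stdlib `fnmatch` (the filenames are made up for the example):

```python
from fnmatch import fnmatch

# Same character-class pattern as in setup.py's data_files glob.
pattern = "*launch.[pxy][yma]*"

# Hypothetical filenames to illustrate what the pattern does and doesn't match.
names = [
    "emotion.launch.py",    # [pxy]='p', [yma]='y'
    "emotion.launch.xml",   # [pxy]='x', [yma]='m'
    "emotion.launch.yaml",  # [pxy]='y', [yma]='a'
    "notes.txt",            # no 'launch.' segment: rejected
]
matches = [n for n in names if fnmatch(n, pattern)]
```

This is why adding `emotion.launch.py` (from this commit) requires no `setup.py` change: it already falls under the existing glob.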