Merge pull request 'feat: Orin Nano Super platform update + 4x IMX219 CSI cameras' (#51) from sl-jetson/orin-platform-cameras into main

This commit is contained in:
seb 2026-02-28 23:08:49 -05:00
commit 5008e03cc4
12 changed files with 707 additions and 216 deletions

View File

@ -60,7 +60,7 @@ services:
  # ── RPLIDAR driver node ────────────────────────────────────────────────────
  rplidar:
    image: saltybot/ros2-humble:jetson-nano
    image: saltybot/ros2-humble:jetson-orin
    build:
      context: .
      dockerfile: Dockerfile
@ -72,18 +72,18 @@ services:
      - ROS_DOMAIN_ID=42
      - RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
    devices:
      - /dev/ttyUSB0:/dev/ttyUSB0
      - /dev/rplidar:/dev/rplidar
    command: >
      bash -c "
        source /opt/ros/humble/setup.bash &&
        ros2 launch rplidar_ros rplidar_a1_launch.py
          serial_port:=/dev/ttyUSB0
          serial_port:=/dev/rplidar
          frame_id:=laser
      "
  # ── RealSense D435i driver node ────────────────────────────────────────────
  realsense:
    image: saltybot/ros2-humble:jetson-nano
    image: saltybot/ros2-humble:jetson-orin
    build:
      context: .
      dockerfile: Dockerfile
@ -111,9 +111,9 @@ services:
          rgb_camera.profile:=640x480x30
      "
  # ── STM32 bridge node (custom serial↔ROS2 bridge) ─────────────────────────
  # ── STM32 bridge node (bidirectional serial↔ROS2) ─────────────────────────
  stm32-bridge:
    image: saltybot/ros2-humble:jetson-nano
    image: saltybot/ros2-humble:jetson-orin
    build:
      context: .
      dockerfile: Dockerfile
@ -125,14 +125,43 @@ services:
      - ROS_DOMAIN_ID=42
      - RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
    devices:
      - /dev/ttyUSB1:/dev/ttyUSB1
      - /dev/stm32-bridge:/dev/stm32-bridge
    command: >
      bash -c "
        source /opt/ros/humble/setup.bash &&
        ros2 run saltybot_stm32_bridge bridge_node
          --ros-args
          -p serial_port:=/dev/ttyUSB1
          -p baud_rate:=921600
        ros2 launch saltybot_bridge bridge.launch.py
          mode:=bidirectional
          serial_port:=/dev/stm32-bridge
      "
  # ── 4× IMX219 CSI cameras ─────────────────────────────────────────────────
  csi-cameras:
    image: saltybot/ros2-humble:jetson-orin
    build:
      context: .
      dockerfile: Dockerfile
    container_name: saltybot-csi-cameras
    restart: unless-stopped
    runtime: nvidia
    network_mode: host
    privileged: true  # CSI camera access requires elevated perms
    environment:
      - ROS_DOMAIN_ID=42
      - RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
      - NVIDIA_VISIBLE_DEVICES=all
      - NVIDIA_DRIVER_CAPABILITIES=all
    devices:
      - /dev/video0:/dev/video0
      - /dev/video2:/dev/video2
      - /dev/video4:/dev/video4
      - /dev/video6:/dev/video6
    command: >
      bash -c "
        source /opt/ros/humble/setup.bash &&
        ros2 launch saltybot_cameras csi_cameras.launch.py
          width:=640
          height:=480
          fps:=30
      "
  # ── Nav2 autonomous navigation stack ────────────────────────────────────────

View File

@ -1,31 +1,31 @@
# Jetson Nano — GPIO / I2C / UART Pinout Reference
## Self-Balancing Robot: STM32F722 Bridge + RealSense D435i + RPLIDAR A1M8
# Jetson Orin Nano Super — GPIO / I2C / UART / CSI Pinout Reference
## Self-Balancing Robot: STM32F722 Bridge + RealSense D435i + RPLIDAR A1M8 + 4× IMX219
Last updated: 2026-02-28
JetPack version: 4.6 (L4T R32.6.1)
JetPack version: 6.x (L4T R36.x / Ubuntu 22.04)
---
## 40-Pin Header Overview
The Jetson Nano 40-pin header is physically compatible with Raspberry Pi HATs.
Pin numbering below follows **physical board pin** (1–40) and the Jetson GPIO BCM-equivalent name.
The Jetson Orin Nano Super 40-pin header is physically compatible with Raspberry Pi HATs.
Pin numbering follows **physical board pin** (1–40).
```
3.3V [ 1] [ 2] 5V
SDA1 [ 3] [ 4] 5V ← I2C SDA (i2c-1)
SCL1 [ 5] [ 6] GND ← I2C SCL (i2c-1)
GPIO [ 7] [ 8] TXD0 ← UART TX (ttyTHS1)
GND [ 9] [10] RXD0 ← UART RX (ttyTHS1)
SDA1 [ 3] [ 4] 5V ← I2C SDA (i2c-7 on Orin Nano)
SCL1 [ 5] [ 6] GND ← I2C SCL (i2c-7 on Orin Nano)
GPIO [ 7] [ 8] TXD0 ← UART TX (ttyTHS0)
GND [ 9] [10] RXD0 ← UART RX (ttyTHS0)
GPIO [11] [12] GPIO
GPIO [13] [14] GND
GPIO [15] [16] GPIO
3.3V [17] [18] GPIO
MOSI [19] [20] GND ← SPI0 MOSI
MISO [21] [22] GPIO ← SPI0 MISO
SCLK [23] [24] CE0 ← SPI0 CLK / CS0
GND [25] [26] CE1 ← SPI0 CS1
ID_SD[27] [28] ID_SC ← I2C ID EEPROM (reserved)
MOSI [19] [20] GND ← SPI1 MOSI
MISO [21] [22] GPIO ← SPI1 MISO
SCLK [23] [24] CE0 ← SPI1 CLK / CS0
GND [25] [26] CE1 ← SPI1 CS1
ID_SD[27] [28] ID_SC ← I2C ID EEPROM (reserved)
GPIO [29] [30] GND
GPIO [31] [32] GPIO
GPIO [33] [34] GND
@ -34,165 +34,262 @@ SCL1 [ 5] [ 6] GND ← I2C SCL (i2c-1)
GND [39] [40] GPIO
```
**Note on Orin Nano I2C bus numbering:** The 40-pin header I2C pins (3/5) map to
`/dev/i2c-7` on Orin Nano (not i2c-1 as on the older Nano). Verify with:
```bash
ls /dev/i2c-*
i2cdetect -l
```
---
## 1. STM32F722 Bridge (UART)
## 1. STM32F722 Bridge (USB CDC — Primary)
The STM32 acts as a real-time motor + IMU controller. Communication to Jetson is via **USB CDC serial** (primary) with hardware UART as fallback.
The STM32 acts as a real-time motor + IMU controller. Communication is via **USB CDC serial**.
### USB CDC (Primary — Recommended)
### USB CDC Connection
| Connection | Detail |
|-----------|--------|
| Interface | USB Micro-B on STM32 dev board → USB-A on Jetson |
| Device node | `/dev/ttyACM0` or `/dev/ttyUSB1` |
| Device node | `/dev/ttyACM0` → symlink `/dev/stm32-bridge` (via udev) |
| Baud rate | 921600 (configured in STM32 firmware) |
| Protocol | Custom binary framing (see `src/comm/`) |
| Power | Powered via Jetson USB 5V (500mA max from host) |
| Protocol | JSON telemetry RX + ASCII command TX (see bridge docs) |
| Power | Powered via robot 5V bus (data-only via USB) |
### Hardware UART (Fallback)
### Hardware UART (Fallback — 40-pin header)
| Jetson Pin | Signal | STM32 Pin | Notes |
|-----------|--------|-----------|-------|
| Pin 8 (TXD0) | TX → | PA10 (UART1 RX) | Cross-connect TX→RX |
| Pin 10 (RXD0) | RX ← | PA9 (UART1 TX) | Cross-connect RX→TX |
| Pin 6 (GND) | GND | GND | Common ground **required** |
**Jetson device node:** `/dev/ttyTHS1`
**Jetson device node:** `/dev/ttyTHS0`
**Baud rate:** 921600, 8N1
**Voltage level:** 3.3V — STM32F722 is 3.3V tolerant; Jetson GPIO is 3.3V
**Do NOT use 5V** — Jetson GPIO max is 3.3V
**Voltage level:** 3.3V — both Jetson Orin and STM32F722 are 3.3V GPIO
```bash
# Verify UART on Jetson
ls /dev/ttyTHS1
# Check permissions (add user to dialout group)
# Verify UART
ls /dev/ttyTHS0
sudo usermod -aG dialout $USER
# Quick loopback test (connect TX→RX)
picocom -b 921600 /dev/ttyTHS1
# Quick test
picocom -b 921600 /dev/ttyTHS0
```
**ROS2 topic mapping (STM32 bridge node):**
**ROS2 topics (STM32 bridge node):**
| ROS2 Topic | Direction | Content |
|-----------|-----------|---------|
| `/stm32/imu_raw` | STM32→Jetson | IMU data (accel, gyro) at 500Hz |
| `/stm32/motor_state` | STM32→Jetson | Motor RPM, current, temperature |
| `/cmd_vel` | Jetson→STM32 | Velocity commands (m/s, rad/s) |
| `/stm32/estop` | Jetson→STM32 | Emergency stop signal |
|-----------|-----------|---------|
| `/saltybot/imu` | STM32→Jetson | IMU data (accel, gyro) at 50Hz |
| `/saltybot/balance_state` | STM32→Jetson | Motor cmd, pitch, state |
| `/cmd_vel` | Jetson→STM32 | Velocity commands `C<spd>,<str>\n` |
| `/saltybot/estop` | Jetson→STM32 | Emergency stop |
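The `C<spd>,<str>\n` command framing above can be exercised directly from Python. A minimal sketch, assuming pyserial and three-decimal field formatting (`format_cmd_vel` and `send_cmd_vel` are hypothetical helper names; match the precision to whatever the STM32 firmware's parser actually expects):

```python
def format_cmd_vel(speed_mps: float, steer_radps: float) -> bytes:
    """Encode one velocity command in the bridge's ASCII framing: C<spd>,<str>\\n."""
    # Three decimal places is an assumption -- check the STM32 parser.
    return f"C{speed_mps:.3f},{steer_radps:.3f}\n".encode("ascii")


def send_cmd_vel(port: str, speed_mps: float, steer_radps: float) -> None:
    """Write a single command frame to the bridge's serial device."""
    import serial  # pyserial

    with serial.Serial(port, baudrate=921600, timeout=0.1) as ser:
        ser.write(format_cmd_vel(speed_mps, steer_radps))


# e.g. send_cmd_vel("/dev/stm32-bridge", 0.5, -0.25)
```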
---
## 2. RealSense D435i (USB3)
The D435i provides RGB-D (depth + color) and IMU (accelerometer + gyroscope).
## 2. RealSense D435i (USB 3.1)
### Connection
| Parameter | Value |
|-----------|-------|
| Interface | USB 3.1 Gen 1 (USB-A on Jetson) |
| Interface | USB 3.1 Gen 1 (USB-A on Jetson — use blue port) |
| Device node | `/dev/bus/usb/...` (udev-managed) |
| USB PID:VID | `0x8086:0x0b3a` (D435i) |
| Power draw | ~1.5W active, 3.5W peak during init |
| Cable | USB 3.1 — use **short cable ≤1m** for stability |
**Note:** The Jetson Nano has **4× USB-A ports** — use a USB3 port (blue) for D435i.
**Note:** Orin Nano Developer Kit has **2× USB-A + 1× USB-C**. Use USB-A (blue = USB 3.1) for D435i.
```bash
# Verify detection
lsusb | grep Intel
# Expected: Bus 002 Device 003: ID 8086:0b3a Intel Corp. Intel RealSense D435i
# Install udev rules (required for non-root access)
# (copy from a librealsense source checkout's config/ directory)
sudo cp config/99-realsense-libusb.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules && sudo udevadm trigger
# Test with realsense-viewer (if installed)
realsense-viewer
```
**ROS2 topics published:**
**ROS2 topics:**
| Topic | Type | Rate |
|-------|------|------|
| `/camera/color/image_raw` | `sensor_msgs/Image` | 30Hz |
| `/camera/depth/image_rect_raw` | `sensor_msgs/Image` | 30Hz |
| `/camera/aligned_depth_to_color/image_raw` | `sensor_msgs/Image` | 30Hz |
| `/camera/imu` | `sensor_msgs/Imu` | 400Hz |
| `/camera/color/camera_info` | `sensor_msgs/CameraInfo` | 30Hz |
---
## 3. RPLIDAR A1M8 (UART via USB adapter)
The A1M8 uses a CP2102/CH340 USB-UART adapter (included in kit).
### Connection
| Parameter | Value |
|-----------|-------|
| Interface | USB Micro-B (via included USB-UART adapter) |
| Device node | `/dev/ttyUSB0` (first USB-UART device) |
| Interface | USB Micro-B (via included CP2102 USB-UART adapter) |
| Device node | `/dev/ttyUSB0` → symlink `/dev/rplidar` (via udev) |
| Baud rate | 115200 |
| Power draw | ~2.6W motor on, 0.4W idle |
| Motor control | DTR line (handled by rplidar_ros driver) |
```bash
# Verify detection
ls /dev/ttyUSB*
# Expected: /dev/ttyUSB0
# Set permissions
sudo usermod -aG dialout $USER
# Test — should output scan data
ros2 launch rplidar_ros rplidar_a1_launch.py serial_port:=/dev/ttyUSB0
ls /dev/rplidar # should exist after udev rule applied
ros2 launch rplidar_ros rplidar_a1_launch.py serial_port:=/dev/rplidar
```
**ROS2 topics published:**
**ROS2 topics:**
| Topic | Type | Rate |
|-------|------|------|
| `/scan` | `sensor_msgs/LaserScan` | 10Hz |
**udev rule (set consistent device name):**
---
## 4. 4× IMX219 CSI Cameras (MIPI CSI-2)
The IMX219 (Sony 8MP) cameras connect via MIPI CSI-2 FFC cables to the Jetson Orin Nano.
### CSI Connector Layout (Orin Nano Developer Kit)
The Orin Nano Developer Kit has two MIPI CSI-2 connectors:
- **CSI-A (J5):** 15-pin FFC — connects to ArduCam adapter A
- **CSI-B (J8):** 15-pin FFC — connects to ArduCam adapter B
Each ArduCam multi-camera adapter multiplexes 2× IMX219 cameras onto one CSI lane.
### ArduCam Multi-Camera Adapter Wiring
| Adapter | Cameras | CSI Connector | V4L2 Devices |
|---------|---------|---------------|--------------|
| Adapter A (CSI-A) | front + left | J5 | `/dev/video0`, `/dev/video2` |
| Adapter B (CSI-B) | rear + right | J8 | `/dev/video4`, `/dev/video6` |
### IMX219 15-pin FFC Pinout (each camera module)
| Pin | Signal | Notes |
|-----|--------|-------|
| 1 | GND | |
| 2 | CSI D0- | MIPI data lane 0 negative |
| 3 | CSI D0+ | MIPI data lane 0 positive |
| 4 | GND | |
| 5 | CSI D1- | MIPI data lane 1 negative |
| 6 | CSI D1+ | MIPI data lane 1 positive |
| 7 | GND | |
| 8 | CSI CLK- | MIPI clock negative |
| 9 | CSI CLK+ | MIPI clock positive |
| 10 | GND | |
| 11 | CAM_GPIO | Camera enable (active high) |
| 12 | CAM_CLK | I2C / control clock |
| 13 | CAM_SDA | I2C data |
| 14 | GND | |
| 15 | 3.3V | Power |
### V4L2 Device Nodes
```bash
# /etc/udev/rules.d/99-rplidar.rules
KERNEL=="ttyUSB*", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", \
SYMLINK+="rplidar", MODE="0666"
# List all video devices
v4l2-ctl --list-devices
# Expected output:
# vi-output, imx219 2-0010 (platform:tegra-capture-vi:0):
# /dev/video0
# vi-output, imx219 2-0010 (platform:tegra-capture-vi:1):
# /dev/video2
# vi-output, imx219 4-0010 (platform:tegra-capture-vi:2):
# /dev/video4
# vi-output, imx219 4-0010 (platform:tegra-capture-vi:3):
# /dev/video6
# Capture test (GStreamer)
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! \
'video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1' ! \
nvvidconv ! xvimagesink
# Capture via V4L2
v4l2-ctl --device=/dev/video0 --stream-mmap --stream-count=1 \
--set-fmt-video=width=640,height=480,pixelformat=RG10
```
### ROS2 Topics (saltybot_cameras package)
| Topic | Camera | Type | Rate |
|-------|--------|------|------|
| `/camera/front/image_raw` | front (video0) | `sensor_msgs/Image` | 30Hz |
| `/camera/left/image_raw` | left (video2) | `sensor_msgs/Image` | 30Hz |
| `/camera/rear/image_raw` | rear (video4) | `sensor_msgs/Image` | 30Hz |
| `/camera/right/image_raw` | right (video6) | `sensor_msgs/Image` | 30Hz |
### TF Frames
| Camera | Frame ID | Offset from sensor_head_link |
|--------|----------|------------------------------|
| front | `camera_front_link` | x=+0.05m, yaw=0° |
| left | `camera_left_link` | y=+0.05m, yaw=+90° |
| rear | `camera_rear_link` | x=-0.05m, yaw=180° |
| right | `camera_right_link` | y=-0.05m, yaw=-90° |
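The offsets in the table fall out of placing each camera on the 5 cm sensor-head ring; a quick sketch (the `camera_offset` helper is hypothetical, not part of the package):

```python
import math

def camera_offset(radius_m: float, yaw_deg: float) -> tuple:
    """Body-frame (x, y) of a camera mounted on the ring at the given yaw."""
    yaw = math.radians(yaw_deg)
    # round() removes the ~1e-17 residue cos/sin leave at multiples of 90 deg
    return (round(radius_m * math.cos(yaw), 6), round(radius_m * math.sin(yaw), 6))

for name, yaw in [("front", 0.0), ("left", 90.0), ("rear", 180.0), ("right", -90.0)]:
    x, y = camera_offset(0.05, yaw)
    print(f"{name}: x={x:+.2f} m, y={y:+.2f} m")
```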
---
## 4. I2C Bus (i2c-1) — Pin 3 / Pin 5
## 5. I2C Bus (i2c-7) — Pin 3 / Pin 5
Available for future peripherals (IMU breakout, OLED display, etc.).
| Parameter | Value |
|-----------|-------|
| Jetson I2C bus | i2c-1 (pins 3 = SDA, 5 = SCL) |
| Jetson I2C bus | i2c-7 (pins 3 = SDA, 5 = SCL) on Orin Nano |
| Voltage | 3.3V pull-up |
| Max clock | 400kHz (Fast Mode) |
| Current source | Jetson 3.3V rail (max ~500mA shared) |
```bash
# Scan i2c-1 bus
i2cdetect -y -r 1
# Scan i2c-7 bus
i2cdetect -y -r 7
```
**Note:** i2c-0 (pins 27/28) is reserved for EEPROM ID — do not use.
---
## 5. GPIO Summary Table
## 6. M.2 NVMe Storage
| Physical Pin | Jetson GPIO | Voltage | Current Used For |
|-------------|-------------|---------|-----------------|
| 3 | SDA1 | 3.3V | I2C data (i2c-1) |
| 5 | SCL1 | 3.3V | I2C clock (i2c-1) |
| 8 | TXD0 | 3.3V | UART TX → STM32 (fallback) |
| 10 | RXD0 | 3.3V | UART RX ← STM32 (fallback) |
| USB-A (×4) | — | 5V | D435i, RPLIDAR adapter, STM32 USB |
The Orin Nano Developer Kit includes an M.2 Key M slot.
| Parameter | Value |
|-----------|-------|
| Interface | PCIe Gen 3 ×4 |
| Form factor | M.2 2230 / 2242 / 2280 |
| Recommended | 256GB+ NVMe SSD (e.g., WD SN530, Samsung PM991) |
| Mount point | `/mnt/nvme` |
```bash
# Verify NVMe detected
lsblk | grep nvme
nvme list
# Partition + format (one-time setup — see setup-jetson.sh)
sudo parted /dev/nvme0n1 mklabel gpt
sudo parted /dev/nvme0n1 mkpart primary ext4 0% 100%
sudo mkfs.ext4 /dev/nvme0n1p1
sudo mkdir -p /mnt/nvme
```
---
## 6. Device Enumeration Notes
## 7. USB Ports Summary
USB devices may enumerate differently across reboots. Use udev rules for stable names:
| Port | Type | Used For |
|------|------|----------|
| USB-A (top, blue) | USB 3.1 Gen 1 | RealSense D435i |
| USB-A (bottom) | USB 2.0 | RPLIDAR (via USB-UART adapter) |
| USB-C | USB 3.1 Gen 1 (+ DP) | STM32 CDC or host flash |
| Micro-USB | Debug/flash | JetPack flash only |
---
## 8. GPIO Summary Table
| Physical Pin | Function | Voltage | Used For |
|-------------|----------|---------|----------|
| 3 | SDA1 | 3.3V | I2C data (i2c-7) |
| 5 | SCL1 | 3.3V | I2C clock (i2c-7) |
| 8 | TXD0 | 3.3V | UART TX → STM32 (fallback) |
| 10 | RXD0 | 3.3V | UART RX ← STM32 (fallback) |
| USB-A ×2 | — | 5V | D435i, RPLIDAR |
| USB-C | — | 5V | STM32 CDC |
| CSI-A (J5) | MIPI CSI-2 | — | Cameras front + left |
| CSI-B (J8) | MIPI CSI-2 | — | Cameras rear + right |
| M.2 Key M | PCIe Gen3 ×4 | — | NVMe SSD |
---
## 9. udev Rules
Apply stable device names:
```bash
# /etc/udev/rules.d/99-saltybot.rules
@ -204,9 +301,15 @@ KERNEL=="ttyUSB*", ATTRS{idVendor}=="10c4", ATTRS{idProduct}=="ea60", \
# STM32 USB CDC (STMicroelectronics)
KERNEL=="ttyACM*", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="5740", \
SYMLINK+="stm32-bridge", MODE="0666"
# Intel RealSense D435i
SUBSYSTEM=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3a", \
MODE="0666"
# IMX219 CSI cameras (V4L2)
KERNEL=="video[0246]", SUBSYSTEM=="video4linux", MODE="0666"
```
Apply rules:
```bash
sudo cp docs/99-saltybot.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules && sudo udevadm trigger
```

View File

@ -1,177 +1,188 @@
# Jetson Nano Power Budget Analysis
## Self-Balancing Robot — 10W Envelope
# Jetson Orin Nano Super Power Budget Analysis
## Self-Balancing Robot — 25W Envelope
Last updated: 2026-02-28
Target: Operate within 10W SoC power envelope (MAXN 10W mode)
Target: Operate within 25W SoC power envelope (MAXN 25W mode)
---
## Power Modes
Jetson Nano supports two NVPModel power modes:
Jetson Orin Nano Super supports multiple NVPModel power modes:
| Mode | GPU | CPU cores | CPU freq | Memory freq | TDP |
|------|-----|-----------|----------|-------------|-----|
| **MAXN (Mode 0)** | 128 core | 4 | 1.43GHz | 1600MHz | **10W** |
| 5W (Mode 1) | 128 core | 2 | 0.92GHz | 1600MHz | 5W |
| Mode | ID | CPU | GPU | TDP |
|------|-----|-----|-----|-----|
| **MAXN** | 0 | 6× A78AE @ 1.5GHz | 1024-core Ampere | **25W** |
| 15W | 1 | 6× A78AE @ 1.2GHz | 1024-core Ampere | 15W |
| 10W | 2 | 4× A78AE @ 1.2GHz | 1024-core Ampere | 10W |
| 7W | 3 | 4× A78AE @ 0.8GHz | 1024-core Ampere | 7W |
For this robot, we target **MAXN 10W mode** with careful peripheral management.
For this robot, we target **MAXN 25W mode** — a significant upgrade from the previous Nano 10W budget.
```bash
# Check current mode
sudo nvpmodel -q
# Set 10W MAXN mode
# Set 25W MAXN mode
sudo nvpmodel -m 0
# Set 5W mode (thermal/battery save)
# Set 15W mode (thermal / battery save)
sudo nvpmodel -m 1
# Monitor power in real time
sudo tegrastats
# or via jtop
sudo jtop
```
---
## Component Power Budget
### SoC (Jetson Nano Module)
### SoC (Jetson Orin Nano Super Module)
| Component | Idle (W) | Load (W) | Peak (W) | Notes |
|-----------|----------|----------|----------|-------|
| CPU (4× Cortex-A57) | 1.0 | 3.5 | 4.0 | ROS2 + SLAM compute |
| GPU (128-core Maxwell) | 0.5 | 2.5 | 3.0 | Depth processing, ML inference |
| DDR4 RAM (4GB) | 0.3 | 0.6 | 0.8 | |
| eMMC / SD | 0.1 | 0.2 | 0.3 | |
| **SoC Subtotal** | **1.9** | **6.8** | **8.1** | |
| CPU (6× A78AE) | 1.5 | 6.0 | 8.0 | ROS2, SLAM, Nav2 |
| GPU (1024-core Ampere) | 0.8 | 5.0 | 7.0 | Depth processing, DNN inference |
| LPDDR5 RAM (8GB) | 0.4 | 0.8 | 1.0 | |
| NVMe SSD (M.2) | 0.2 | 0.5 | 0.8 | Map storage, rosbags |
| Video encoder / ISP | 0.0 | 1.5 | 2.5 | 4× IMX219 ISP processing |
| **SoC Subtotal** | **2.9** | **13.8** | **19.3** | |
### Peripherals (USB / GPIO)
### Peripherals
| Peripheral | Idle (W) | Active (W) | Peak (W) | Interface | Notes |
|-----------|----------|------------|----------|-----------|-------|
| RealSense D435i | 0.3 | 1.5 | 3.5 | USB 3.1 | Peak during boot/init |
| RPLIDAR A1M8 | 0.4 | 2.6 | 3.0 | USB (UART adapter) | Motor spinning |
| STM32F722 bridge | 0.3 | 0.5 | 0.8 | USB CDC | Powered from Jetson USB |
| **Peripheral Subtotal** | **1.0** | **4.6** | **7.3** | | |
| STM32F722 bridge | 0.0 | 0.0 | 0.0 | USB CDC | Self-powered from robot 5V |
| 4× IMX219 cameras | 0.2 | 2.0 | 2.4 | MIPI CSI-2 | ~0.5W per camera active |
| **Peripheral Subtotal** | **0.9** | **6.1** | **8.9** | | |
### Total System (from Jetson 5V barrel jack)
| Scenario | SoC (W) | Peripherals (W) | **Total (W)** | Margin |
|----------|---------|-----------------|---------------|--------|
| Idle | 1.9 | 1.0 | **2.9** | +7.1W |
| Nominal (SLAM running) | 6.8 | 4.6 | **11.4** | **-1.4W ⚠️** |
| Peak (all active, ML) | 8.1 | 7.3 | **15.4** | **-5.4W ❌** |
| Scenario | SoC (W) | Peripherals (W) | **Total (W)** | Margin vs 25W |
|----------|---------|-----------------|---------------|----------------|
| Idle | 2.9 | 0.9 | **3.8** | +21.2W |
| Nominal (SLAM + cameras) | 13.8 | 6.1 | **19.9** | **+5.1W ✅** |
| Peak (DNN + all sensors) | 19.3 | 8.9 | **28.2** | **-3.2W ⚠️** |
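The totals and margins above can be re-derived mechanically; a sanity-check sketch using the SoC and peripheral subtotal figures from the two tables:

```python
ENVELOPE_W = 25.0

# (scenario, SoC subtotal W, peripheral subtotal W) from the tables above
scenarios = [
    ("idle", 2.9, 0.9),
    ("nominal", 13.8, 6.1),
    ("peak", 19.3, 8.9),
]

for name, soc_w, periph_w in scenarios:
    total = round(soc_w + periph_w, 1)
    margin = round(ENVELOPE_W - total, 1)
    print(f"{name:8s} total={total:5.1f} W  margin={margin:+5.1f} W")
```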
---
## Budget Compliance Strategy
## Budget Analysis vs Previous Platform
The nominal load of **11.4W exceeds the 10W envelope** — mitigation required:
| Metric | Jetson Nano | Jetson Orin Nano Super |
|--------|------------|------------------------|
| TDP | 10W | 25W |
| CPU | 4× Cortex-A57 @ 1.43GHz | 6× A78AE @ 1.5GHz |
| GPU | 128-core Maxwell | 1024-core Ampere |
| RAM | 4GB LPDDR4 | 8GB LPDDR5 |
| AI TOPS | ~0.5 | 67 |
| Nominal load | 11.4W (over budget) | 19.9W (5W headroom) |
| Cameras | 0 CSI | 4× IMX219 CSI |
| Storage | microSD | NVMe M.2 |
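The headline ratios in the table (including the "+13,300% AI TOPS" figure quoted in this doc's summary) are plain percentage increases:

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100.0

print(f"TDP: +{pct_increase(10, 25):.0f}%")           # 10 W -> 25 W
print(f"AI compute: +{pct_increase(0.5, 67):,.0f}%")  # ~0.5 -> 67 TOPS
```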
### Mitigation 1: RPLIDAR Power Gating
The RPLIDAR motor can be stopped when not scanning. The ROS2 driver handles this via DTR line.
**The Orin Nano Super has 2.5× more thermal headroom at nominal load.** No aggressive power-gating needed for normal operation.
| Mode | Savings |
|------|---------|
| RPLIDAR motor off | 2.2W |
| RPLIDAR idle | 0.4W vs 2.6W |
---
### Mitigation 2: RealSense Resolution Reduction
Lower RGB-D resolution reduces USB bandwidth and D435i processing:
## Power Compliance Strategy
| Profile | Power |
|---------|-------|
| 1280×720 @ 30fps | 1.5W |
| 640×480 @ 30fps | 1.1W ← **Recommended** |
| 424×240 @ 30fps | 0.8W |
### Nominal Operation (SLAM + cameras) — ✅ Within 25W
### Mitigation 3: Jetson GPU Workload Scheduling
Avoid running depth inference and SLAM simultaneously at full throttle:
At 19.9W nominal, we have 5W headroom. No mitigation required for normal robot operation.
### Peak Operation (DNN inference) — ⚠️ Briefly exceeds 25W
When running DNN inference (e.g., object detection) simultaneously with full sensor suite:
**Mitigation 1: Thermal throttling (automatic)**
The Orin's DVFS will automatically throttle CPU/GPU when temperature exceeds threshold.
No explicit action needed — the Orin handles this gracefully.
**Mitigation 2: Switch to 15W mode during high-load phases**
```bash
# Cap GPU frequency (reduce from max 921.6MHz)
sudo jetson_clocks --show
# Set conservative clocks
echo 614400000 | sudo tee /sys/devices/17000000.gp10b/devfreq/17000000.gp10b/min_freq
sudo nvpmodel -m 1 # 15W mode: reduces peak to ~22W
sudo nvpmodel -m 0 # return to MAXN when cooling
```
### Mitigation 4: STM32 Self-Powered
Power STM32 from robot's 5V bus (separate from Jetson USB rail):
**Mitigation 3: RPLIDAR motor gating**
Stop RPLIDAR motor between scan cycles: saves ~2.2W average.
Handled automatically by `rplidar_ros` driver via DTR line control.
| Option | Jetson USB load |
|--------|----------------|
| STM32 powered from Jetson USB | 0.5W |
| STM32 powered from robot 5V | **0W** (data only via USB) |
**Mitigation 4: Camera resolution reduction**
For compute-heavy phases, drop from 640×480 to 424×240 per camera: saves ~0.6W.
---
## Revised Budget with Mitigations
## CSI Camera Bandwidth
Applying: 640×480 D435i + RPLIDAR gating + STM32 self-powered:
4× IMX219 cameras at 640×480@30fps:
| Component | Power (W) |
|-----------|-----------|
| CPU (SLAM, 4 cores) | 3.5 |
| GPU (depth processing) | 2.0 |
| RAM + misc SoC | 1.0 |
| RealSense D435i (640×480) | 1.1 |
| RPLIDAR A1M8 (active) | 2.6 |
| STM32 bridge (self-powered) | 0.0 |
| **Total** | **10.2W** |
| Parameter | Value |
|-----------|-------|
| Per-camera raw bandwidth | 640×480×30×10bpp = 92.16 Mb/s |
| Total 4 cameras | ~369 Mb/s |
| MIPI CSI-2 capacity (Orin) | ~20 Gb/s total (8 lanes) |
| ISP processing overhead | ~1.5W (all 4 cameras active) |
**Near-compliant at 10.2W.** Further savings achievable by:
- Enabling RPLIDAR standby between scan cycles (0.5W avg)
- Using 5W nvpmodel during motor-heavy phases
**CSI bandwidth is well within capacity.** The Orin Nano Super's ISP handles 4 cameras simultaneously.
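The bandwidth rows above are straightforward to recompute; a sketch, taking RAW10 as 10 bits per pixel as the table does:

```python
def csi_bandwidth_mbps(width: int, height: int, fps: int, bits_per_px: int) -> float:
    """Raw sensor bandwidth in Mb/s for one camera stream."""
    return width * height * fps * bits_per_px / 1e6

per_cam = csi_bandwidth_mbps(640, 480, 30, 10)  # RAW10 Bayer
print(f"per camera: {per_cam:.2f} Mb/s, 4 cameras: {4 * per_cam:.2f} Mb/s")
```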
---
## Input Power Requirements
### Jetson Nano Power Input
### Jetson Orin Nano Super Power Input
| Spec | Value |
|------|-------|
| Input connector | 5.5mm / 2.1mm barrel jack |
| Input connector | 5.5mm / 2.5mm barrel jack |
| Input voltage | 5V DC |
| Input voltage | 9–20V DC (19V typical) |
| Recommended current | ≥4A (20W supply for headroom) |
| Recommended supply | 19V, ≥30W for headroom |
| Absolute max | 5.25V |
| Absolute max | 20V at the DC jack |
> **Use a 5V 4A supply minimum.** A 2A supply will brownout under load.
> **The Orin devkit's DC jack takes 9–20V, not 5V.** Size the supply for ≥30W so DNN peak load cannot brown it out.
### Robot Power Architecture (Recommended)
```
LiPo 3S (12.6V max)
LiPo 4S (16.8V max)
├─► DC-DC Buck → 5V 5A ──► Jetson Nano barrel jack
│ (e.g., XL4016)
├─► 4S pack direct (12.8–16.8V) ──► Jetson Orin barrel jack (9–20V input)
│
├─► DC-DC Buck → 5V 3A ──► STM32 + logic 5V rail
└─► Hoverboard ESC ──► Hub motors (48V loop)
```
This isolates the Jetson 5V supply from motor switching noise.
Using a 4S LiPo (vs 3S previously) keeps the pack voltage inside the Orin DC
jack's 9–20V input window, so the Jetson can be fed directly with no buck stage.
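For sizing the battery side of this architecture, the pack current at a given load is a one-liner; a sketch, assuming ~90% conversion efficiency (the exact figure depends on the regulator chosen):

```python
def battery_current_a(p_out_w: float, v_batt: float, efficiency: float = 0.90) -> float:
    """Battery-side current drawn by a supply stage delivering p_out_w."""
    return p_out_w / efficiency / v_batt

# Full 25 W Jetson envelope from a 4S pack at ~14.8 V nominal
print(f"{battery_current_a(25.0, 14.8):.2f} A")
```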
---
## Real-Time Monitoring
```bash
# Live power telemetry
# Live power telemetry (tegrastats)
sudo tegrastats --interval 500
# Key fields:
# POM_5V_IN X/Y — total input power (current W / average W)
# VDD_IN X/Y — total input power (current mW / average mW; Orin rail name)
# VDD_CPU_GPU_CV X/Y — CPU+GPU power (Orin)
# VDD_SOC X/Y — SoC core power (Orin)
# Log to file
# Interactive monitoring (jtop — recommended)
sudo pip3 install jetson-stats
sudo jtop
# Log power to file
sudo tegrastats --interval 1000 --logfile /tmp/power_log.txt &
# Parse log
grep "VDD_IN" /tmp/power_log.txt | \
awk '{for(i=1;i<=NF;i++) if($i=="VDD_IN") print $(i+1)}' | \
awk -F'/' '{sum+=$1; count++} END {print "Avg:", sum/count, "mW"}'
awk -F'/' '{sum+=$1; count++} END {print "Avg:", sum/count/1000, "W"}'
```
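For anything more structured than an awk pipeline, a tiny parser helps. A sketch; note the rail name differs by platform (`VDD_IN` on Orin, `POM_5V_IN` on the older Nano), and the sample line is illustrative, not captured output:

```python
import re

def parse_rail_mw(line: str, rail: str = "VDD_IN"):
    """Return (current_mW, average_mW) for one power rail, or None if absent."""
    # Orin prints values as e.g. 'VDD_IN 4952mW/5013mW'; older boards omit 'mW'
    m = re.search(rf"{re.escape(rail)} (\d+)(?:mW)?/(\d+)(?:mW)?", line)
    return (int(m.group(1)), int(m.group(2))) if m else None

sample = "... VDD_IN 4952mW/5013mW VDD_CPU_GPU_CV 1324mW/1401mW ..."
print(parse_rail_mw(sample))  # (4952, 5013)
```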
---
@ -180,9 +191,10 @@ grep "POM_5V_IN" /tmp/power_log.txt | \
| Metric | Value |
|--------|-------|
| Target envelope | 10W |
| Nominal (no mitigation) | 11.4W |
| Nominal (with mitigations) | ~10.2W |
| Compliant scenario | RPLIDAR standby + 640p D435i |
| Recommended PSU | 5V 4A (20W) |
| Target envelope | 25W (MAXN) |
| Nominal (SLAM + 4 cameras) | ~19.9W |
| Peak (DNN inference) | ~28.2W (briefly) |
| Compliant scenario | All sensors + SLAM (no DNN) |
| Recommended PSU | 19V DC, ≥30W (9–20V jack input) |
| Power mode | nvpmodel MAXN (Mode 0) |
| Upgrade from Nano | +150% TDP, +13,300% AI TOPS |

View File

@ -0,0 +1,46 @@
# saltybot_cameras — CSI camera parameters
# Applied to each v4l2_camera node via launch args

# Image dimensions (per camera)
image_width: 640
image_height: 480
framerate: 30.0

# V4L2 pixel format for IMX219 raw Bayer
# Use YUYV for processed output (default from v4l2_camera)
# Use RG10 for raw Bayer (requires ISP processing)
pixel_format: "YUYV"

# Camera info URL template (set per-camera in launch file)
# camera_info_url: "file:///config/camera_{name}_info.yaml"

# Sensor head geometry (metres)
# Used by camera_tf.launch.py for static TF
sensor_head_radius: 0.05

# Camera positions (yaw degrees, CCW positive)
cameras:
  front:
    device: "/dev/video0"
    frame_id: "camera_front_link"
    x_offset: 0.05
    y_offset: 0.00
    yaw_deg: 0.0
  left:
    device: "/dev/video2"
    frame_id: "camera_left_link"
    x_offset: 0.00
    y_offset: 0.05
    yaw_deg: 90.0
  rear:
    device: "/dev/video4"
    frame_id: "camera_rear_link"
    x_offset: -0.05
    y_offset: 0.00
    yaw_deg: 180.0
  right:
    device: "/dev/video6"
    frame_id: "camera_right_link"
    x_offset: 0.00
    y_offset: -0.05
    yaw_deg: -90.0

View File

@ -0,0 +1,66 @@
"""
camera_tf.launch.py Static TF transforms for 4× IMX219 surround camera array
Sensor head geometry:
All cameras mount on a circular sensor head (radius 5cm) at the same height.
Cameras point outward at 90° intervals.
The sensor_head_link frame is the centre of the ring.
camera_front_link: x=+0.05, yaw= 0° (faces robot +X / forward)
camera_left_link: y=+0.05, yaw=+90° (faces robot +Y / left)
camera_rear_link: x=-0.05, yaw=180° (faces robot -X / rear)
camera_right_link: y=-0.05, yaw=-90° (faces robot -Y / right)
Camera optical frame convention: z=forward, x=right, y=down
We publish the body frame here; optical frame offset is per-camera calibration.
Parent frame: sensor_head_link (expected to be published by robot URDF/state publisher)
"""
import math
from launch import LaunchDescription
from launch_ros.actions import Node
def _static_tf(parent, child, x=0.0, y=0.0, z=0.0, yaw_deg=0.0):
"""Return a static_transform_publisher node (x y z yaw pitch roll parent child)."""
yaw = math.radians(yaw_deg)
# static_transform_publisher args: x y z yaw pitch roll frame_id child_frame_id
return Node(
package="tf2_ros",
executable="static_transform_publisher",
name=f"tf_{child.replace('_link', '')}",
arguments=[
str(x), str(y), str(z),
str(yaw), "0.0", "0.0",
parent, child,
],
output="screen",
)
def generate_launch_description():
r = 0.05 # sensor head radius (metres) — adjust to actual hardware
return LaunchDescription([
# Front camera: faces +X (robot forward)
_static_tf(
"sensor_head_link", "camera_front_link",
x=r, y=0.0, z=0.0, yaw_deg=0.0,
),
# Left camera: faces +Y (robot left)
_static_tf(
"sensor_head_link", "camera_left_link",
x=0.0, y=r, z=0.0, yaw_deg=90.0,
),
# Rear camera: faces -X (robot rear)
_static_tf(
"sensor_head_link", "camera_rear_link",
x=-r, y=0.0, z=0.0, yaw_deg=180.0,
),
# Right camera: faces -Y (robot right)
_static_tf(
"sensor_head_link", "camera_right_link",
x=0.0, y=-r, z=0.0, yaw_deg=-90.0,
),
])

View File

@ -0,0 +1,96 @@
"""
csi_cameras.launch.py 4× IMX219 CSI surround camera launch
Hardware: Jetson Orin Nano Super
CSI-A (connector 0): Camera 0 (front) + Camera 1 (left) via multi-camera adapter
CSI-B (connector 1): Camera 2 (rear) + Camera 3 (right) via multi-camera adapter
V4L2 device assignment (with ArduCam multi-camera adapter or equivalent):
/dev/video0 front (sensor_id=0, CSI-A, cam-select=0)
/dev/video2 left (sensor_id=1, CSI-A, cam-select=1)
/dev/video4 rear (sensor_id=2, CSI-B, cam-select=0)
/dev/video6 right (sensor_id=3, CSI-B, cam-select=1)
Topics published per camera:
/camera/<name>/image_raw sensor_msgs/Image 30Hz
/camera/<name>/camera_info sensor_msgs/CameraInfo 30Hz
TF frames:
camera_front_link, camera_left_link, camera_rear_link, camera_right_link
(all children of sensor_head_link see camera_tf.launch.py)
Usage:
ros2 launch saltybot_cameras csi_cameras.launch.py
ros2 launch saltybot_cameras csi_cameras.launch.py width:=1280 height:=720 fps:=15
"""
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument, GroupAction
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node, PushRosNamespace
_CAMERAS = [
{"name": "front", "device": "/dev/video0", "frame_id": "camera_front_link"},
{"name": "left", "device": "/dev/video2", "frame_id": "camera_left_link"},
{"name": "rear", "device": "/dev/video4", "frame_id": "camera_rear_link"},
{"name": "right", "device": "/dev/video6", "frame_id": "camera_right_link"},
]
def generate_launch_description():
width_arg = DeclareLaunchArgument("width", default_value="640")
height_arg = DeclareLaunchArgument("height", default_value="480")
fps_arg = DeclareLaunchArgument("fps", default_value="30")
# Pixel format for IMX219 via V4L2: YUYV or MJPG
# On Orin with Argus ISP path: use 'RG10' (raw Bayer) or let V4L2 negotiate
fmt_arg = DeclareLaunchArgument(
"pixel_format", default_value="YUYV",
description="V4L2 pixel format: YUYV | MJPG | RG10"
)
nodes = [width_arg, height_arg, fps_arg, fmt_arg]
for cam in _CAMERAS:
group = GroupAction([
PushRosNamespace(f"camera/{cam['name']}"),
Node(
package="v4l2_camera",
executable="v4l2_camera_node",
name=f"cam_{cam['name']}",
output="screen",
parameters=[{
"video_device": cam["device"],
"image_size": [
LaunchConfiguration("width"),
LaunchConfiguration("height"),
],
"time_per_frame": [1, LaunchConfiguration("fps")],
"pixel_format": LaunchConfiguration("pixel_format"),
"camera_frame_id": cam["frame_id"],
"camera_name": cam["name"],
# Disable auto-exposure for consistent surround-view balance
"brightness": 0,
"auto_exposure": 1, # 1 = manual on most V4L2 drivers
}],
# Remap within namespace so topic is /camera/<name>/image_raw
remappings=[("image_raw", "image_raw")],
),
])
nodes.append(group)
# Include static TF for all cameras
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
import os
tf_launch = IncludeLaunchDescription(
PythonLaunchDescriptionSource([
os.path.join(
os.path.dirname(os.path.realpath(__file__)),
"camera_tf.launch.py",
)
])
)
nodes.append(tf_launch)
return LaunchDescription(nodes)

View File

@ -0,0 +1,25 @@
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>saltybot_cameras</name>
<version>0.1.0</version>
<description>4× IMX219 CSI surround camera integration for saltybot (Jetson Orin Nano Super)</description>
<maintainer email="seb@vayrette.com">seb</maintainer>
<license>MIT</license>
<depend>rclpy</depend>
<depend>v4l2_camera</depend>
<depend>tf2_ros</depend>
<depend>sensor_msgs</depend>
  <exec_depend>launch</exec_depend>
  <exec_depend>launch_ros</exec_depend>
<test_depend>ament_copyright</test_depend>
<test_depend>ament_flake8</test_depend>
<test_depend>ament_pep257</test_depend>
<test_depend>python3-pytest</test_depend>
<export>
<build_type>ament_python</build_type>
</export>
</package>

View File

@ -0,0 +1,4 @@
[develop]
script_dir=$base/lib/saltybot_cameras
[install]
install_scripts=$base/lib/saltybot_cameras

View File

@ -0,0 +1,30 @@
from setuptools import setup
import os
from glob import glob
package_name = 'saltybot_cameras'
setup(
name=package_name,
version='0.1.0',
packages=[package_name],
data_files=[
('share/ament_index/resource_index/packages',
['resource/' + package_name]),
('share/' + package_name, ['package.xml']),
(os.path.join('share', package_name, 'launch'),
glob('launch/*.py')),
(os.path.join('share', package_name, 'config'),
glob('config/*.yaml')),
],
install_requires=['setuptools'],
zip_safe=True,
maintainer='seb',
maintainer_email='seb@vayrette.com',
description='4x IMX219 CSI surround camera integration for saltybot',
license='MIT',
tests_require=['pytest'],
entry_points={
'console_scripts': [],
},
)
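The `data_files` entries above are what make the launch and config files discoverable after `colcon build`: each glob is installed into the package's share directory. A sketch of the resulting install paths, assuming the standard Humble prefix (`camera_tf.launch.py` is the TF launch file included by the camera launch file; the YAML name is illustrative):

```python
import os

package_name = "saltybot_cameras"
prefix = "/opt/ros/humble"  # illustrative install prefix

# (destination, sources) pairs mirroring the setup() data_files above
data_files = [
    (os.path.join("share", package_name, "launch"), ["launch/camera_tf.launch.py"]),
    (os.path.join("share", package_name, "config"), ["config/cameras.yaml"]),
]

installed = [
    os.path.join(prefix, dest, os.path.basename(src))
    for dest, sources in data_files
    for src in sources
]
print(installed)
```

This is why launch files land at `share/<package>/launch/...` and can be found via `ros2 launch saltybot_cameras <file>`.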

View File

@ -1,41 +1,56 @@
#!/usr/bin/env bash
# Jetson Nano host setup script
# Run once on fresh JetPack 4.6 installation
# Jetson Orin Nano Super host setup script
# Run once on fresh JetPack 6 installation (Ubuntu 22.04)
# Usage: sudo bash setup-jetson.sh
set -euo pipefail
echo "=== Jetson Nano Host Setup — saltybot ==="
echo "JetPack 4.6 / L4T R32.6.1 expected"
echo "=== Jetson Orin Nano Super Host Setup — saltybot ==="
echo "JetPack 6.x / L4T R36.x / Ubuntu 22.04 expected"
# ── Verify we're on Jetson ────────────────────────────────────────────────────
# ── Verify we're on Jetson Orin ───────────────────────────────────────────────
if ! uname -m | grep -q aarch64; then
echo "ERROR: Must run on Jetson (aarch64). Got: $(uname -m)"
exit 1
fi
if [ -f /etc/nv_tegra_release ]; then
L4T_VER=$(head -1 /etc/nv_tegra_release | grep -o 'R[0-9]*' | head -1)
echo "[i] Detected L4T: $L4T_VER"
if [[ "$L4T_VER" != "R36" ]]; then
echo "WARNING: Expected L4T R36 (JetPack 6). Got $L4T_VER"
echo " This script is tuned for Orin Nano Super / JetPack 6."
read -rp "Continue anyway? [y/N] " ans
[[ "${ans,,}" == "y" ]] || exit 1
fi
fi
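The version gate above extracts the major L4T release from the first line of `/etc/nv_tegra_release`. The same parse in Python, assuming the usual first-line format (the sample line's field values are made up; only the leading `# R36` matters):

```python
import re

# Illustrative first line of /etc/nv_tegra_release on JetPack 6.
sample = "# R36 (release), REVISION: 4.3, GCID: 38968081, BOARD: generic"

def l4t_major(line):
    """First 'R'+digits token, as the grep -o 'R[0-9]*' | head -1
    pipeline extracts from this line."""
    m = re.search(r"R[0-9]+", line)
    return m.group(0) if m else None

print(l4t_major(sample))  # R36
```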
# ── System update ─────────────────────────────────────────────────────────────
apt-get update && apt-get upgrade -y
# ── Install Docker + NVIDIA runtime ──────────────────────────────────────────
# ── Install Docker + NVIDIA Container Toolkit ─────────────────────────────────
if ! command -v docker &>/dev/null; then
echo "[+] Installing Docker..."
curl -fsSL https://get.docker.com | sh
usermod -aG docker "$SUDO_USER"
fi
# NVIDIA container runtime
if ! dpkg -l | grep -q nvidia-container-runtime; then
echo "[+] Installing NVIDIA Container Runtime..."
distribution=$(. /etc/os-release; echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/libnvidia-container/gpgkey | apt-key add -
# NVIDIA Container Toolkit (JetPack 6 method — replaces legacy nvidia-docker2)
if ! dpkg -l | grep -q nvidia-container-toolkit; then
echo "[+] Installing NVIDIA Container Toolkit..."
distribution=$(. /etc/os-release; echo "${ID}${VERSION_ID}")
# JetPack 6 / Ubuntu 22.04 uses the new toolkit keyring
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey \
| gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
curl -s -L "https://nvidia.github.io/libnvidia-container/$distribution/libnvidia-container.list" \
> /etc/apt/sources.list.d/nvidia-container-runtime.list
| sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' \
> /etc/apt/sources.list.d/nvidia-container-toolkit.list
apt-get update
apt-get install -y nvidia-container-runtime
apt-get install -y nvidia-container-toolkit
nvidia-ctk runtime configure --runtime=docker
fi
# Configure Docker daemon for NVIDIA runtime
# Configure Docker daemon for NVIDIA runtime + NVMe data root
cat > /etc/docker/daemon.json << 'EOF'
{
"runtimes": {
@ -44,16 +59,47 @@ cat > /etc/docker/daemon.json << 'EOF'
"runtimeArgs": []
}
},
"default-runtime": "nvidia"
"default-runtime": "nvidia",
"data-root": "/mnt/nvme/docker"
}
EOF
systemctl restart docker
# ── Set Jetson power mode to MAXN 10W ────────────────────────────────────────
echo "[+] Setting MAXN 10W power mode..."
# ── Set Jetson power mode to MAXN 25W ─────────────────────────────────────────
echo "[+] Setting MAXN 25W power mode (Orin Nano Super)..."
nvpmodel -m 0
jetson_clocks
# ── NVMe SSD setup ────────────────────────────────────────────────────────────
echo "[+] Setting up NVMe SSD..."
if lsblk | grep -q nvme; then
NVME_DEV=$(lsblk -d -n -o NAME | grep nvme | head -1)
NVME_PATH="/dev/$NVME_DEV"
if ! lsblk "${NVME_PATH}" | grep -q "${NVME_DEV}p1"; then
echo " [+] Partitioning NVMe at ${NVME_PATH}..."
parted "${NVME_PATH}" --script mklabel gpt
parted "${NVME_PATH}" --script mkpart primary ext4 0% 100%
mkfs.ext4 -F "${NVME_PATH}p1"
fi
mkdir -p /mnt/nvme
if ! grep -q "/mnt/nvme" /etc/fstab; then
NVME_UUID=$(blkid -s UUID -o value "${NVME_PATH}p1")
echo "UUID=${NVME_UUID} /mnt/nvme ext4 defaults,noatime 0 2" >> /etc/fstab
fi
mount -a
# Create saltybot directories on NVMe
mkdir -p /mnt/nvme/{saltybot,docker,rosbags,slam-maps}
mkdir -p /mnt/nvme/saltybot/maps
chown -R "$SUDO_USER":"$SUDO_USER" /mnt/nvme/saltybot /mnt/nvme/rosbags /mnt/nvme/slam-maps
echo " [+] NVMe mounted at /mnt/nvme"
else
echo " [!] No NVMe detected. Skipping NVMe setup."
echo " Install an M.2 NVMe SSD in the Key M slot for best performance."
fi
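The partition paths above hardcode the `p1` suffix, which is correct for NVMe: the kernel inserts a `p` separator when the base device name ends in a digit (`nvme0n1` → `nvme0n1p1`, `mmcblk0` → `mmcblk0p1`) but not otherwise (`sda` → `sda1`). A small helper capturing that rule, should the script ever be generalized (the function is mine, not part of the script):

```python
def first_partition(dev_name):
    """Kernel naming: devices whose name ends in a digit take a 'p'
    separator before the partition number; others do not."""
    sep = "p" if dev_name[-1].isdigit() else ""
    return f"{dev_name}{sep}1"

print(first_partition("nvme0n1"))  # nvme0n1p1
print(first_partition("sda"))     # sda1
```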
# ── Install udev rules ────────────────────────────────────────────────────────
echo "[+] Installing udev rules..."
cat > /etc/udev/rules.d/99-saltybot.rules << 'EOF'
@ -68,6 +114,9 @@ KERNEL=="ttyACM*", ATTRS{idVendor}=="0483", ATTRS{idProduct}=="5740", \
# Intel RealSense D435i
SUBSYSTEM=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0b3a", \
MODE="0666"
# IMX219 CSI cameras via V4L2
KERNEL=="video[0246]", SUBSYSTEM=="video4linux", MODE="0666", GROUP="video"
EOF
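udev's `KERNEL` field is matched with shell-style globs, so `video[0246]` covers only the even-numbered nodes. On a carrier board where the four IMX219s enumerate as `/dev/video0`–`/dev/video3`, that pattern would skip `video1` and `video3`; whether even-only enumeration applies depends on the platform's video-node layout, so it is worth verifying against `v4l2-ctl --list-devices`. Python's `fnmatch` uses the same glob syntax:

```python
from fnmatch import fnmatchcase

pattern = "video[0246]"
nodes = [f"video{i}" for i in range(8)]
matched = [n for n in nodes if fnmatchcase(n, pattern)]
print(matched)  # ['video0', 'video2', 'video4', 'video6']
# A broader glob such as "video[0-3]" or "video*" would also cover the
# odd-numbered nodes if the cameras enumerate contiguously.
```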
udevadm control --reload-rules
@ -75,10 +124,7 @@ udevadm trigger
# ── Install RealSense udev rules ──────────────────────────────────────────────
echo "[+] Installing RealSense udev rules..."
if [ -f /etc/udev/rules.d/99-realsense-libusb.rules ]; then
echo " Already installed."
else
# Download from librealsense repo
if [ ! -f /etc/udev/rules.d/99-realsense-libusb.rules ]; then
wget -q -O /etc/udev/rules.d/99-realsense-libusb.rules \
https://raw.githubusercontent.com/IntelRealSense/librealsense/master/config/99-realsense-libusb.rules
udevadm control --reload-rules
@ -88,32 +134,63 @@ fi
# ── Enable I2C + UART ─────────────────────────────────────────────────────────
echo "[+] Enabling I2C and UART..."
modprobe i2c-dev
# Add user to i2c and dialout groups
usermod -aG i2c,dialout,gpio "$SUDO_USER"
# Add user to required groups (i2c-7 on Orin Nano)
usermod -aG i2c,dialout,gpio,video "$SUDO_USER"
# ── Configure UART (disable console on ttyTHS1) ───────────────────────────────
# ttyTHS1 is the 40-pin header UART — disable serial console to free it
if grep -q "console=ttyS0" /boot/extlinux/extlinux.conf; then
echo "[+] UART ttyTHS1 already free for application use."
# ── Configure UART (disable console on ttyTHS0) ───────────────────────────────
# ttyTHS0 is the 40-pin header UART on Orin — disable serial console to free it
if grep -q "console=ttyTCU0" /boot/extlinux/extlinux.conf 2>/dev/null; then
echo "[i] Serial console is on ttyTCU0 (debug UART) — ttyTHS0 is free."
else
echo "[!] Warning: Check /boot/extlinux/extlinux.conf if serial console"
echo " is using ttyTHS1. Disable it to use UART for STM32 bridge."
echo "[!] Check /boot/extlinux/extlinux.conf — ensure ttyTHS0 is not used"
echo " as a serial console if you need it for STM32 UART fallback."
fi
# ── Check CSI camera drivers ──────────────────────────────────────────────────
echo "[+] Checking CSI camera drivers..."
if modprobe imx219 2>/dev/null; then
echo " [+] IMX219 driver loaded."
else
echo " [!] IMX219 driver not available — may need JetPack camera driver package."
echo " Install: sudo apt-get install nvidia-jetpack"
fi
if command -v v4l2-ctl &>/dev/null; then
echo " [i] V4L2 devices:"
v4l2-ctl --list-devices 2>/dev/null || echo " (no cameras detected yet)"
else
apt-get install -y v4l-utils
fi
# ── Docker Compose ────────────────────────────────────────────────────────────
if ! command -v docker-compose &>/dev/null && ! docker compose version &>/dev/null; then
echo "[+] Installing docker-compose..."
pip3 install docker-compose
echo "[+] Installing docker-compose plugin..."
apt-get install -y docker-compose-plugin
fi
# ── Swap (improve stability under memory pressure) ────────────────────────────
# ── Swap (prefer NVMe if available) ──────────────────────────────────────────
if [ "$(swapon --show | wc -l)" -le 1 ]; then
echo "[+] Creating 4GB swap file..."
fallocate -l 4G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
echo '/swapfile none swap sw 0 0' >> /etc/fstab
if [ -d /mnt/nvme ]; then
echo "[+] Creating 8GB swap on NVMe..."
fallocate -l 8G /mnt/nvme/swapfile
chmod 600 /mnt/nvme/swapfile
mkswap /mnt/nvme/swapfile
swapon /mnt/nvme/swapfile
echo '/mnt/nvme/swapfile none swap sw 0 0' >> /etc/fstab
else
echo "[+] Creating 4GB swap file on eMMC..."
fallocate -l 4G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile
echo '/swapfile none swap sw 0 0' >> /etc/fstab
fi
fi
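The guard on this block works because `swapon --show` prints nothing when no swap is active, and a header line plus one line per device otherwise, so a line count of at most 1 means no swap. The check and the placement decision as pure functions (the function names are mine):

```python
def swap_absent(swapon_show_output):
    """<= 1 line of `swapon --show` output means no active swap device."""
    return len(swapon_show_output.splitlines()) <= 1

def swap_plan(nvme_mounted):
    """Mirror the script: 8G file on NVMe when present, else 4G fallback."""
    return ("/mnt/nvme/swapfile", "8G") if nvme_mounted else ("/swapfile", "4G")

print(swap_absent(""))   # True: no swap configured yet
print(swap_plan(True))   # ('/mnt/nvme/swapfile', '8G')
```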
# ── Install jtop for power monitoring ────────────────────────────────────────
if ! command -v jtop &>/dev/null; then
echo "[+] Installing jetson-stats (jtop)..."
pip3 install jetson-stats
fi
echo ""
@ -125,3 +202,6 @@ echo " 1. cd jetson/"
echo " 2. docker compose build"
echo " 3. docker compose up -d"
echo " 4. docker compose logs -f"
echo ""
echo "Monitor power: sudo jtop"
echo "Check cameras: v4l2-ctl --list-devices"