Project: OpenAssistive — Modular Assistive Mobility System
Document: Technical Specification v1.0
Author: Milton Rodolfo Amador Zúniga
Copyright (C) 2026 Milton Rodolfo Amador Zúniga

License: GNU General Public License v3 (GPLv3)

This project is Free Software and Open Hardware.
This is a non-profit open social technology initiative.

------------------------------------------------------------

# OpenAssistive — Technical Specification v1.0

## 1. Purpose and Scope

OpenAssistive defines a modular, low-cost assistive mobility system intended to support blind and visually impaired persons in safe independent movement and environmental interaction.

The system is designed to:

- complement (not replace) the traditional white cane
- provide early obstacle detection
- enable QR / marker reading
- provide haptic and audio feedback
- operate with low-cost components
- be repairable locally
- function offline where possible

This specification defines the Version 1 prototype architecture.

---

## 2. Design Principles

- Low cost
- Modular hardware
- Textile-integrated cabling
- Repairable components
- Non-invasive wearable layout
- Offline-capable core functions
- Open documentation
- GPL licensed software
- Social purpose first

---

## 3. System Overview

The OpenAssistive Mobility System consists of five main modules:

1. Smart Cane Sensor Module
2. Shoulder Camera Module
3. Central Backpack Compute Module
4. Haptic Feedback Module
5. Audio + Voice Interaction Module

System flow:

Sensors → Microcontroller → Central Compute → Decision → Haptic / Audio Output

Camera → Compute → QR/Text Recognition → Audio Output

User → Button / Voice → Compute → Function Trigger

---

## 4. Module Architecture

### 4.1 Cane Sensor Module (Adaptable Sleeve)

Physical design:

- detachable sleeve (“sock”) mounted on lower cane segment
- secured with a hook-and-loop (Velcro) or elastic strap
- does not modify cane structure permanently

Components:

- 2× ToF distance sensors (front + angled)
- microcontroller (ESP32 class recommended)
- splash-resistant housing
- upward cable routing

Functions:

- detect obstacles before contact
- measure distance bands
- send distance data to central unit
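The distance-band measurement above can be sketched as a small classifier running on the microcontroller (MicroPython) or on the central unit. The thresholds below are illustrative assumptions, not calibrated values:

```python
def distance_band(distance_mm: int) -> str:
    """Classify a ToF reading (in mm) into coarse alert bands.

    Thresholds are placeholder values to be tuned in field testing.
    """
    if distance_mm < 0:
        return "invalid"      # sensor error or out-of-range reading
    if distance_mm < 500:
        return "danger"       # imminent contact
    if distance_mm < 1200:
        return "near"
    if distance_mm < 2500:
        return "caution"
    return "clear"

# Example readings from the front and angled sensors
print(distance_band(300))   # danger
print(distance_band(1800))  # caution
```

Coarse bands (rather than raw millimetre values) keep the downstream haptic and audio encoding simple and predictable for the user.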

---

### 4.2 Cane Handle Module

Components:

- large tactile button
- primary vibration motor
- magnetic safety connector
- strain-relief cable

Functions:

- immediate alerts
- manual trigger input
- confirmation feedback

---

### 4.3 Shoulder Camera Module

Placement:

- backpack shoulder strap
- upper chest height
- slight downward angle

Components:

- compact wide-angle camera
- small directional microphone
- protective housing

Functions:

- QR code reading (1–5 m with large codes)
- large text capture
- scene snapshot input
- voice command capture

Design constraint:

Must not depend on eyeglasses.

---

### 4.4 Central Backpack Compute Module

Core unit:

- Raspberry Pi 4 or 5 class SBC

Supporting components:

- USB audio interface
- Bluetooth module
- vibration driver board
- power distribution board
- removable storage

Functions:

- sensor fusion
- QR recognition
- text recognition (large print)
- rule-based navigation alerts
- audio generation
- logging (optional)
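The sensor-fusion and rule-based alert functions could be combined as in this sketch: the more urgent of the two cane sensors drives the alert, and the sensor that produced it supplies a direction hint. Sensor names, thresholds, and the urgency ordering are assumptions for illustration:

```python
# Urgency ranking for the coarse distance bands (higher = more urgent).
BAND_URGENCY = {"clear": 0, "invalid": 0, "caution": 1, "near": 2, "danger": 3}

def fuse_alert(front_mm: int, angled_mm: int) -> dict:
    """Fuse the front and angled ToF readings into one alert."""
    def band(d: int) -> str:
        if d < 0:
            return "invalid"
        if d < 500:
            return "danger"
        if d < 1200:
            return "near"
        if d < 2500:
            return "caution"
        return "clear"

    front, angled = band(front_mm), band(angled_mm)
    # The more urgent reading wins; ties go to the front sensor.
    if BAND_URGENCY[front] >= BAND_URGENCY[angled]:
        return {"band": front, "direction": "ahead"}
    return {"band": angled, "direction": "low/side"}

print(fuse_alert(2000, 400))  # {'band': 'danger', 'direction': 'low/side'}
```

A pure rule table like this is deliberately simple: it is auditable, testable without hardware, and degrades gracefully when one sensor returns an invalid reading.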

---

### 4.5 Haptic Feedback Module

Primary:

- handle vibration motor

Optional extensions:

- chest-strap vibration motors
- shoulder vibration motors

Feedback encoding:

- distance bands
- direction hints
- warning urgency
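The feedback encoding could be expressed as a lookup table from alert band to a vibration pattern. The pattern values (pulse count, pulse length, gap) are placeholder assumptions to be tuned with users:

```python
# Pattern = (number of pulses, pulse length in ms, gap in ms).
# Values are illustrative defaults, not validated settings.
PATTERNS = {
    "danger":  (5, 80, 80),    # rapid pulses signal high urgency
    "near":    (3, 120, 150),
    "caution": (1, 200, 0),    # single long pulse
    "clear":   (0, 0, 0),      # no vibration
}

def haptic_pattern(band: str) -> tuple:
    """Return the vibration pattern for a band; silent if unknown."""
    return PATTERNS.get(band, (0, 0, 0))

print(haptic_pattern("danger"))  # (5, 80, 80)
```

Keeping the encoding in one table makes it easy to retune urgency levels during field testing without touching the driver code.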

---

### 4.6 Audio + Voice Module

Components:

- mono earbud or bone-conduction headset
- inline microphone
- simple button control

Functions:

- spoken alerts
- QR content reading
- status messages
- optional voice commands

Voice input is secondary; physical buttons remain the primary control.
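Spoken alerts could be generated as in this sketch: a terse message formatter plus a call to an offline TTS engine. It assumes the `espeak-ng` command-line tool is installed on the Pi; any offline TTS engine could be substituted:

```python
import subprocess

def format_alert(band: str, direction: str) -> str:
    """Compose a short spoken alert; terse phrasing avoids audio clutter."""
    if band == "clear":
        return ""
    return f"Obstacle {direction}, {band}"

def speak(text: str) -> None:
    # Assumes the espeak-ng CLI is available; runs fully offline.
    if text:
        subprocess.run(["espeak-ng", text], check=False)

print(format_alert("near", "ahead"))  # Obstacle ahead, near
```

Short, fixed phrasings are easier to learn and interrupt less than full sentences, which matters when alerts arrive while the user is moving.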

---

## 5. Recommended Components (Prototype v1)

### Sensors
- ToF distance sensors (VL53L1X class)

### Microcontroller
- ESP32 or RP2040 class

### Compute
- Raspberry Pi 4 / 5

### Camera
- Pi Camera Module or USB camera

### Haptics
- coin vibration motors (3–5 V)

### Power
- USB power bank 20,000 mAh

### Connectors
- magnetic breakaway connectors preferred

---

## 6. Electrical Topology

Recommended architecture:

Sensors → Microcontroller → USB/UART → Raspberry Pi

Reasons:

- isolates timing-sensitive sensor reads
- reduces Pi GPIO load
- improves robustness
- allows future wireless cane module

Communication options:

- USB serial
- UART
- BLE (future version)
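A minimal line-based framing for the MCU-to-Pi serial link could look like this sketch. The `D,<sensor>,<mm>` message format is an assumption for illustration, not a defined OpenAssistive protocol:

```python
def parse_line(line: str):
    """Parse one telemetry line, e.g. 'D,front,412' -> ('front', 412).

    Returns None for malformed input so the reader loop can skip it
    instead of crashing on serial noise.
    """
    parts = line.strip().split(",")
    if len(parts) != 3 or parts[0] != "D":
        return None
    try:
        return parts[1], int(parts[2])
    except ValueError:
        return None

# On the Pi side, lines would typically be read with pyserial, e.g.:
#   ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
#   reading = parse_line(ser.readline().decode("ascii", "ignore"))
print(parse_line("D,front,412"))   # ('front', 412)
print(parse_line("garbage"))       # None
```

A human-readable text protocol simplifies debugging with a plain serial terminal and carries no meaningful overhead at these data rates.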

---

## 7. Power Budget (Estimated)

Typical consumption:

- Raspberry Pi: 5–7 W
- Camera: 1–2 W
- Sensors + MCU: <1 W
- Audio + haptics: 1–2 W peaks

Estimated total:

~8–12 W average

Battery:

20,000 mAh power bank (~74 Wh nominal) → roughly 5–8 hours of typical use, allowing for conversion losses
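The runtime estimate works out as follows; the 3.7 V nominal cell voltage is standard for power-bank cells, and the 85% conversion efficiency is an assumption:

```python
# Rough runtime estimate for a 20,000 mAh (3.7 V nominal) power bank.
CAPACITY_WH = 20.0 * 3.7          # 74 Wh nominal capacity
USABLE_WH = CAPACITY_WH * 0.85    # ~63 Wh after boost-converter losses

for load_w in (8, 12):
    print(f"{load_w} W load -> {USABLE_WH / load_w:.1f} h")
```

Actual runtime will vary with the power bank's real conversion efficiency and with how often the camera and audio subsystems are active.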

---

## 8. Prototype Build Order (v1)

Phase 1:
- cane sensor sleeve
- microcontroller distance readout
- vibration feedback only

Phase 2:
- connect to Raspberry Pi
- add audio alerts

Phase 3:
- add camera QR reader

Phase 4:
- integrate backpack layout

Phase 5:
- user field testing

---

## 9. Safety Notes

This system:

- is assistive only
- does not replace mobility training
- must be tested gradually
- must not be sole navigation tool initially

All prototypes must be tested in controlled environments first.

---

## 10. Open Contribution Model

Contributors may:

- modify hardware
- improve software
- redesign modules
- localize documentation

Requirements:

- derivatives must remain GPL compatible
- modifications must be documented
- safety-impacting changes must be declared

---

## 11. Future Extensions

- GPS assisted routing
- offline map hints
- object recognition
- indoor beacon navigation
- wireless cane module
- low-power compute variants

---

End of Technical Specification v1.0
OpenAssistive Project
