Agentic Unified Robotics Architecture (AURA): Mind-controlled robotic arm application

AURA is a brain-computer interface (BCI) project built around OpenBCI EEG hardware to enable direct control of a robotic arm through thought. The system integrates two EEG paradigms, P300 and Motor Imagery (MI), each with its own tailored signal processing and machine learning pipeline. Together, these modules allow a user to both select actions (P300) and continuously guide motion (MI). The framework transforms neural intent into agentic robotic arm control, with the goal of providing assistive functionality for patients with paralysis, stroke, or severe motor impairments.

AURA EEG setup with 16-channel OpenBCI headset and real-time signal monitoring

Demo Videos

Watch AURA in action as it demonstrates mind-controlled robotic arm movements for assistive applications.

Robotic Arm Test with OpenBCI Integration

GUI-Only Control Logic Demo

The Challenge

Stroke survivors, patients with paralysis, and individuals with spinal cord injuries face devastating loss of arm movement, making even basic tasks like eating, drinking, or reaching for objects impossible. Traditional assistive devices require physical input methods that these patients simply cannot use.

"Imagine losing the ability to feed yourself or pick up a glass of water. AURA aims to restore independence by enabling patients to control robotic arms through thought alone, using their brain signals to perform daily tasks."

Our Solution

Dual EEG Paradigm System

P300 Module

  • Discrete action selection (activate, stop, select mode)
  • 80% offline accuracy on OpenBCI data
  • Reliable intent detection with a low false-positive rate (see the sketch below)
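
A minimal sketch of what the P300 side of the pipeline could look like, assuming stimulus-locked epochs are cut from already-filtered OpenBCI data and classified with LDA. Function names, the 0-800 ms window, and the decimation factor are illustrative, not AURA's exact settings.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 125  # OpenBCI Cyton + Daisy (16 channels) samples at 125 Hz

def extract_p300_epochs(eeg, stim_onsets, fs=FS, window_s=0.8):
    """Slice stimulus-locked windows (0-800 ms) from continuous, filtered EEG.

    eeg:         (n_channels, n_samples) array
    stim_onsets: sample indices where a stimulus/flash occurred
    """
    n = int(window_s * fs)
    epochs = [eeg[:, t:t + n] for t in stim_onsets if t + n <= eeg.shape[1]]
    return np.stack(epochs)                      # (n_epochs, n_channels, n)

def erp_features(epochs, decimate=8):
    """Downsample each epoch in time and flatten into an ERP feature vector."""
    return epochs[:, :, ::decimate].reshape(len(epochs), -1)

# Hypothetical offline evaluation: X from erp_features(), y = target/non-target labels
clf = LinearDiscriminantAnalysis()
# scores = cross_val_score(clf, X, y, cv=5)     # offline accuracy estimate (cf. ~80%)
```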

Motor Imagery (MI) Module

  • Continuous directional control (left/right, up/down)
  • Mu and beta rhythm detection
  • Naturalistic arm movement control (see the band-power sketch below)
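
A sketch of one common way to turn mu (8-12 Hz) and beta (13-30 Hz) rhythms into MI features using Welch band power. The band edges and the reliance on sensorimotor channels such as C3/C4 are typical choices, not necessarily AURA's exact configuration.

```python
import numpy as np
from scipy.signal import welch

MU_BAND, BETA_BAND = (8.0, 12.0), (13.0, 30.0)   # Hz

def band_power(epoch, band, fs=125):
    """Mean Welch PSD power inside a band, per channel.

    epoch: (n_channels, n_samples) window over sensorimotor channels (e.g. C3, C4).
    """
    freqs, psd = welch(epoch, fs=fs, nperseg=min(fs, epoch.shape[-1]))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=1)

def mi_features(epoch, fs=125):
    """Concatenate mu and beta power; left vs. right hand imagery appears as
    lateralized event-related desynchronization over C3 vs. C4."""
    return np.concatenate([band_power(epoch, MU_BAND, fs),
                           band_power(epoch, BETA_BAND, fs)])
```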

System Architecture

🧠

EEG Data Acquisition

Real-time signal streaming from a 16-channel OpenBCI headset with central-parietal and sensorimotor electrode placements for high-resolution brain signal capture
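
A minimal acquisition sketch using the BrainFlow Python API, which supports the 16-channel Cyton + Daisy board; the serial port and the two-second window are placeholders, and the actual AURA streaming loop may be organized differently.

```python
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"               # placeholder; depends on the host

board_id = BoardIds.CYTON_DAISY_BOARD.value       # 16-channel OpenBCI board, 125 Hz
board = BoardShim(board_id, params)

eeg_channels = BoardShim.get_eeg_channels(board_id)
fs = BoardShim.get_sampling_rate(board_id)

board.prepare_session()
board.start_stream()
# ... later, pull the most recent window for the processing pipeline
window = board.get_current_board_data(2 * fs)     # last ~2 s, all board rows
eeg = window[eeg_channels, :]                     # keep only the EEG rows
board.stop_stream()
board.release_session()
```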

⚡

Signal Processing

Bandpass filtering, artifact removal, and feature extraction for both P300 and MI paradigms
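
An illustrative preprocessing step with SciPy: a power-line notch followed by a zero-phase Butterworth bandpass. The cutoffs shown are typical for P300 and MI work, not necessarily the exact values used in AURA.

```python
from scipy.signal import butter, filtfilt, iirnotch

def preprocess(eeg, fs=125, band=(1.0, 40.0), notch_hz=60.0):
    """Notch out mains noise, then zero-phase bandpass the EEG.

    eeg: (n_channels, n_samples) raw signal from the headset.
    """
    # Power-line notch (60 Hz in North America, 50 Hz elsewhere)
    b_n, a_n = iirnotch(notch_hz, Q=30.0, fs=fs)
    eeg = filtfilt(b_n, a_n, eeg, axis=-1)

    # 4th-order Butterworth bandpass covering P300, mu, and beta activity
    b, a = butter(4, band, btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)
```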

🤖

ML Classification

Separate trained models for P300 (ERP features) and MI (spectral power analysis) with confidence scoring
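
A sketch of how confidence scoring might be layered on top of the two classifiers so that uncertain predictions are dropped before reaching the arm; the model choices and the 0.7 threshold are assumptions, not AURA's published configuration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# One model per paradigm: ERP features for P300, band-power features for MI
p300_clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
mi_clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

def classify_with_confidence(clf, features, threshold=0.7):
    """Return (label, confidence); label is None when the model is unsure,
    so downstream control can ignore low-confidence predictions."""
    proba = clf.predict_proba(features.reshape(1, -1))[0]
    label = int(np.argmax(proba))
    confidence = float(proba[label])
    return (label if confidence >= threshold else None), confidence
```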

🎯

Agentic Control

Intent fusion system that maps neural signals to safe robotic arm trajectories and actions
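
One possible shape for the intent-fusion layer: P300 selections gate the controller on and off, the MI direction drives end-effector velocity, and a speed clamp acts as the safety check. The ArmCommand type, action names, and speed limit are all illustrative, not the project's actual interfaces.

```python
from dataclasses import dataclass

import numpy as np

@dataclass
class ArmCommand:
    velocity: np.ndarray                 # (x, y, z) end-effector velocity, m/s
    gripper_close: bool = False

class IntentFusion:
    """Fuse discrete P300 selections with continuous MI output into safe commands."""

    MAX_SPEED = 0.05                     # m/s cap applied before anything reaches the arm

    def __init__(self):
        self.active = False              # toggled on/off by P300 commands

    def update(self, p300_action, mi_direction):
        """p300_action: 'activate' | 'stop' | 'grasp' | None (from the P300 module)
        mi_direction: direction vector from the MI module, or None."""
        if p300_action == "activate":
            self.active = True
        elif p300_action == "stop":
            self.active = False

        if not self.active or mi_direction is None:
            return ArmCommand(velocity=np.zeros(3))     # hold still when idle or uncertain

        # Safety check: bound the commanded speed regardless of classifier output
        direction = np.asarray(mi_direction, dtype=float)
        norm = np.linalg.norm(direction)
        velocity = (direction / norm * self.MAX_SPEED) if norm > 0 else np.zeros(3)
        return ArmCommand(velocity=velocity, gripper_close=(p300_action == "grasp"))
```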

Key Features

🎯 Unified Framework

Combines discrete and continuous control in one integrated system

🔬 OpenBCI Integration

Leverages open-source, accessible EEG hardware for neuroprosthetics

♿ Assistive Focus

Designed specifically for patients with motor impairments

Technical Implementation

Hardware

  • 16-Channel OpenBCI EEG Headset
  • Electrodes
  • Robotic Arm

Signal Processing

  • Bandpass Filtering
  • Artifact Removal
  • Feature Extraction
  • Epoching

Machine Learning

  • P300 Classifier
  • Motor Imagery Classifier
  • Confidence Scoring
  • Cross-validation

Control Systems

  • Intent Fusion
  • Trajectory Planning
  • Safety Checks
  • Real-time Processing

Impact & Applications

Assistive Robotics

Enable paralyzed patients to manipulate objects and perform daily tasks through thought control

Neurorehabilitation

MI training to re-engage motor pathways during arm control therapy