
NeuroPhone - Neurosymbolic AI Android Application

Tip

AI-Assisted Install: Just tell any AI:
Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone
It reads this repo, asks a few questions, and does everything. Details below.

AI-Assisted Installation (Recommended)

Just Say It

You don’t need to read this README. Just say this to any AI assistant:

Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone

That’s it. You don’t type commands, install packages, or configure anything. The AI fetches this repo, reads the installation guide inside it, figures out your device, and does everything. You just answer a few questions and confirm the privacy notice.

The URL is the key — it points the AI to this repo where docs/AI_INSTALLATION_GUIDE.adoc contains the complete step-by-step recipe. Any AI that can read a URL and run commands (or generate commands for you to paste) can do this.

The AI handles all of this automatically:

  • Checking your device and storage

  • Installing Termux (if needed), Rust, Git, and dependencies

  • Cloning and building NeuroPhone for your specific hardware

  • Downloading the right LLM model for your device’s RAM/storage

  • Creating your configuration with sensible defaults

  • Running the setup wizard

  • Giving you a working NeuroPhone

Other Ways to Say It

If your AI already knows about NeuroPhone (for example, because it can search the web), even shorter versions work:

  • "Make my phone a NeuroPhone"

  • "Install NeuroPhone on my Android"

  • "Turn my Oppo Reno 13 into a NeuroPhone"

If it doesn’t know the project, just include the URL:

"Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone"

What You’ll Be Asked

Your AI will ask you:

  1. What device? (so it picks the right thread count and model size)

  2. Privacy confirmation — what sensors are used and how data stays on-device

  3. Cloud fallback? (optional Claude API for complex queries — default is local-only)

That’s it. Everything else is automatic. No package managers, no build flags, no config files.

Privacy & Security Notice

Important

What NeuroPhone does:

  • Reads phone sensors (accelerometer, gyroscope, magnetometer, light, proximity)

  • Processes everything on-device using Rust neural networks + local Llama LLM

  • Stores neural states locally in ~/.local/share/neurophone/ (never uploaded)

  • Optionally uses Claude API for complex queries (you control this)

What NeuroPhone does NOT do:

  • Upload sensor data to any server (unless you enable cloud fallback)

  • Track you or collect analytics

  • Access camera, microphone, contacts, or personal data

You control everything: cloud fallback toggle, all config in ~/.config/neurophone/, uninstall anytime.

After Install

Once your AI finishes setup, just use it:

neurophone                                       # Start NeuroPhone
neurophone query "What am I doing right now?"    # Ask a question
neurophone status                                # Check system status

Uninstall

Tell your AI: "Uninstall NeuroPhone from my phone"

Troubleshooting

Tell your AI what went wrong — it can read the troubleshooting docs in this repo. Common issues:

Problem                   Solution
"Termux not found"        AI will guide you to install from F-Droid (NOT Google Play)
Build takes too long      Normal for first build (5-10 min). AI adjusts thread count for your device.
"Model download failed"   AI will try alternate download methods or suggest adb push from PC
"LSM crashes"             Low RAM. AI will reduce model size or neuron count for your device.

For manual installation without AI assistance, see the Getting Started section below.


What This Is

neurophone is a complete Android application for neurosymbolic AI on mobile devices. It combines spiking neural networks with large language models for on-device intelligence.

Important
This is an application, NOT a library. For the underlying platform-agnostic routing library, see mobile-ai-orchestrator.

Target Device

Primary target: Oppo Reno 13 (MediaTek Dimensity 8350)

  • 12GB RAM

  • NPU acceleration available

  • Android 14+

Also compatible with Android 8.0+ devices with 4GB+ RAM.

Core Purpose

┌─────────────────────────────────────────────────────────────────┐
│                      NEUROPHONE                                 │
│                   (THIS APPLICATION)                            │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌─────────────┐      ┌─────────────┐      ┌─────────────┐     │
│  │  Sensors    │─────▶│    LSM      │─────▶│   Bridge    │     │
│  │ Accel/Gyro  │      │  (spiking   │      │  (state     │     │
│  │ Light/Prox  │      │  reservoir) │      │  encoding)  │     │
│  └─────────────┘      └─────────────┘      └──────┬──────┘     │
│                                                   │            │
│                                                   ▼            │
│  ┌─────────────┐      ┌─────────────┐      ┌─────────────┐     │
│  │   Output    │◀─────│    ESN      │◀────▶│    LLM      │     │
│  │  (actions)  │      │  (echo      │      │ (Llama 3.2) │     │
│  │             │      │  reservoir) │      │             │     │
│  └─────────────┘      └─────────────┘      └─────────────┘     │
│                                                                 │
│  Processes: Sensor data → Neural interpretation → LLM query    │
│  Runs: ON THE DEVICE, with cloud fallback                      │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
                              │
                              ▼ (cloud fallback)
                    ┌─────────────────────┐
                    │      Claude API     │
                    │  (complex queries)  │
                    └─────────────────────┘

Key Differentiators

Feature              This App                                   Typical Mobile AI Apps
Neural Processing    On-device LSM + ESN (spiking networks)     Cloud-only or simple TFLite
Sensor Integration   Real-time sensor → neural → LLM pipeline   Separate sensor and AI components
LLM                  Local Llama 3.2 + Claude fallback          Cloud-only
Latency              <100ms local inference                     500ms+ network round-trip
Privacy              Sensor data stays on device                Often sent to cloud

Architecture

Rust Crates (8 modules)

Crate                Purpose                Key Features
lsm                  Liquid State Machine   512 spiking neurons, 3D grid, 1kHz processing
esn                  Echo State Network     300-neuron reservoir, ridge regression
bridge               Neural ↔ Symbolic      State encoding, context generation
sensors              Phone Sensors          Accel, gyro, magnetometer, light, proximity
llm                  Local Inference        Llama 3.2 via llama.cpp, streaming
claude-client        Cloud Fallback         Claude API, retry logic, context injection
neurophone-core      Orchestration          Main coordinator, query routing
neurophone-android   Android JNI            Kotlin ↔ Rust bridge

Android App (Kotlin)

android/
├── app/src/main/
│   ├── java/ai/neurophone/
│   │   ├── MainActivity.kt
│   │   ├── NativeLib.kt          # JNI interface
│   │   ├── SensorManager.kt      # Sensor collection
│   │   └── ui/                   # Compose UI
│   └── res/
└── build.gradle.kts

Components

LSM (Liquid State Machine)

Spiking neural network for temporal sensor processing:

  • 3D grid: 8×8×8 = 512 Leaky Integrate-and-Fire neurons

  • Distance-dependent connectivity

  • Excitatory/inhibitory balance

  • Real-time spike processing at 1kHz
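
As a concrete illustration of the neuron model, here is a minimal sketch of a leaky integrate-and-fire update; the struct, fields, and constants are illustrative assumptions, not the lsm crate's actual API.

// Minimal leaky integrate-and-fire (LIF) update (illustrative names/constants).
struct LifNeuron {
    v: f32,        // membrane potential (mV)
    v_rest: f32,   // resting potential
    v_thresh: f32, // spike threshold
    tau_m: f32,    // membrane time constant (ms)
}

impl LifNeuron {
    /// Advance the neuron by `dt` milliseconds with input current `i_in`.
    /// Returns true if the neuron spiked on this step.
    fn step(&mut self, i_in: f32, dt: f32) -> bool {
        // Leaky integration toward rest, driven by the input current.
        self.v += (-(self.v - self.v_rest) + i_in) * (dt / self.tau_m);
        if self.v >= self.v_thresh {
            self.v = self.v_rest; // reset after a spike
            true
        } else {
            false
        }
    }
}

fn main() {
    // At the 1kHz processing rate, dt = 1.0 ms per step.
    let mut n = LifNeuron { v: -65.0, v_rest: -65.0, v_thresh: -50.0, tau_m: 20.0 };
    println!("spiked: {}", n.step(30.0, 1.0));
}

In the actual LSM, 512 such neurons are wired with distance-dependent excitatory and inhibitory synapses across the 8×8×8 grid.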

ESN (Echo State Network)

Reservoir for state prediction:

  • 300-neuron reservoir

  • Spectral radius: 0.95

  • Leaky integrator dynamics

  • Ridge regression output
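
For intuition, a minimal sketch of the leaky-integrator reservoir update is shown below; matrix names and shapes are illustrative, and the reservoir weights are assumed to be pre-scaled to the 0.95 spectral radius.

// Leaky-integrator echo state network step (illustrative, not the esn crate's API):
// x(t+1) = (1 - a) * x(t) + a * tanh(W_in * u + W * x(t))
struct Esn {
    w: Vec<Vec<f32>>,    // reservoir weights, n x n (spectral radius <= 0.95)
    w_in: Vec<Vec<f32>>, // input weights, n x k
    state: Vec<f32>,     // reservoir state x(t), length n
    leak: f32,           // leak rate a in (0, 1]
}

impl Esn {
    fn step(&mut self, input: &[f32]) {
        let n = self.state.len();
        let mut pre = vec![0.0f32; n];
        for i in 0..n {
            let mut s = 0.0;
            for (j, &u) in input.iter().enumerate() {
                s += self.w_in[i][j] * u;
            }
            for j in 0..n {
                s += self.w[i][j] * self.state[j];
            }
            pre[i] = s.tanh();
        }
        for i in 0..n {
            self.state[i] = (1.0 - self.leak) * self.state[i] + self.leak * pre[i];
        }
    }
}

fn main() {
    // Tiny 2-neuron reservoir driven by one input, purely for illustration.
    let mut esn = Esn {
        w: vec![vec![0.0, 0.5], vec![0.5, 0.0]],
        w_in: vec![vec![0.3], vec![-0.3]],
        state: vec![0.0, 0.0],
        leak: 0.3,
    };
    esn.step(&[1.0]);
    println!("{:?}", esn.state);
}

The readout itself is trained with ridge regression on these reservoir states, as noted above.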

Sensors

Phone sensor integration:

  • Accelerometer, gyroscope, magnetometer

  • Light and proximity sensors

  • IIR filtering (low-pass, high-pass)

  • Feature extraction at 50Hz
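
As a sketch of the filtering stage, here is a first-order IIR low-pass filter of the kind that could smooth raw samples before feature extraction; the cutoff, struct, and function names are illustrative, not the sensors crate's API.

// First-order IIR low-pass filter: y[n] = y[n-1] + alpha * (x[n] - y[n-1])
struct LowPass {
    alpha: f32, // smoothing factor derived from cutoff frequency and sample rate
    y: f32,     // previous output
}

impl LowPass {
    fn new(f_cutoff_hz: f32, sample_rate_hz: f32) -> Self {
        // alpha = dt / (rc + dt), with rc = 1 / (2 * pi * f_cutoff)
        let dt = 1.0 / sample_rate_hz;
        let rc = 1.0 / (2.0 * std::f32::consts::PI * f_cutoff_hz);
        Self { alpha: dt / (rc + dt), y: 0.0 }
    }

    fn filter(&mut self, x: f32) -> f32 {
        self.y += self.alpha * (x - self.y);
        self.y
    }
}

fn main() {
    // 50Hz sampling with a 5Hz cutoff: keeps gross motion, drops jitter.
    let mut lp = LowPass::new(5.0, 50.0);
    for x in [0.0, 1.0, 1.0, 1.0] {
        println!("{:.3}", lp.filter(x));
    }
}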

Bridge

Neural ↔ Symbolic translation:

  • Integrates LSM + ESN states

  • Generates natural language context for LLMs

  • Temporal pattern detection

  • Salience and urgency computation
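
To show what the bridge hands to the language models, here is an illustrative sketch that renders a summarized neural state into the [NEURAL_STATE] block format referenced in the Kotlin API below; the struct, fields, and wording are assumptions, not the bridge crate's API.

// Render an interpreted neural state as a context block for the LLM (illustrative).
struct NeuralSummary {
    activity: String, // e.g. "walking", inferred from the LSM/ESN readout
    salience: f32,    // 0.0 - 1.0
    urgency: f32,     // 0.0 - 1.0
}

fn encode_context(s: &NeuralSummary) -> String {
    format!(
        "[NEURAL_STATE] Description: user appears to be {} (salience {:.2}, urgency {:.2}) [/NEURAL_STATE]",
        s.activity, s.salience, s.urgency
    )
}

fn main() {
    let summary = NeuralSummary {
        activity: "walking".to_string(),
        salience: 0.72,
        urgency: 0.10,
    };
    // The encoded block is prepended to the user's prompt before inference.
    println!("{}", encode_context(&summary));
}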

Local LLM

On-device language model:

  • Llama 3.2 1B/3B via llama.cpp

  • Optimized for Dimensity 8350

  • Q4_K_M quantization (~700MB)

  • Neural context injection

Claude Client

Cloud fallback for complex queries:

  • Messages API integration

  • Automatic retry with exponential backoff

  • Hybrid inference (local/cloud decision)

  • Neural state context injection
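
The retry behaviour can be sketched with a generic exponential-backoff helper; the actual claude-client presumably wraps an async HTTP client, so call_api below is only a stand-in closure and all names are illustrative.

// Retry a fallible call with exponential backoff (illustrative, std-only sketch).
use std::{thread, time::Duration};

fn retry_with_backoff<T, E>(
    max_attempts: u32,
    base_delay: Duration,
    mut call_api: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        match call_api() {
            Ok(v) => return Ok(v),
            Err(e) if attempt + 1 >= max_attempts => return Err(e),
            Err(_) => {
                // Delay doubles on every failure: base, 2x base, 4x base, ...
                thread::sleep(base_delay * 2u32.pow(attempt));
                attempt += 1;
            }
        }
    }
}

fn main() {
    let mut calls = 0;
    let result: Result<&str, &str> = retry_with_backoff(4, Duration::from_millis(250), || {
        calls += 1;
        if calls < 3 { Err("transient network error") } else { Ok("response") }
    });
    println!("{result:?} after {calls} calls");
}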

Getting Started

Prerequisites

  • Rust 1.75+

  • Android NDK 26+

  • Android Studio (for app development)

  • Oppo Reno 13 or Android 8.0+ device

Build

# Clone
git clone https://github.com/hyperpolymath/neurophone
cd neurophone

# Setup
./scripts/setup.sh

# Build native libraries for Android
./scripts/build-android.sh

# Open android/ in Android Studio

Download LLM Model

# Download Llama 3.2 1B Instruct (Q4_K_M, ~700MB)
# From: https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF

# Push to device
adb push llama-3.2-1b-instruct-q4_k_m.gguf /data/local/tmp/

Configure

Set Claude API key (for cloud fallback):

export ANTHROPIC_API_KEY="your-api-key"

Or in config/default.toml:

[claude]
api_key = "your-api-key"
model = "claude-sonnet-4-20250514"

[llm]
model_path = "/data/local/tmp/llama-3.2-1b-instruct-q4_k_m.gguf"
n_threads = 4
context_size = 2048
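
If you want to read this file from Rust, a minimal sketch looks like the following, assuming the serde (with derive) and toml crates; the struct names mirror the example above and are not necessarily neurophone-core's actual config types.

// Deserialize config/default.toml into plain Rust structs (illustrative types).
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct Config {
    claude: ClaudeConfig,
    llm: LlmConfig,
}

#[derive(Debug, Deserialize)]
struct ClaudeConfig {
    api_key: String,
    model: String,
}

#[derive(Debug, Deserialize)]
struct LlmConfig {
    model_path: String,
    n_threads: u32,
    context_size: u32,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let text = std::fs::read_to_string("config/default.toml")?;
    let cfg: Config = toml::from_str(&text)?;
    println!("model: {}, threads: {}", cfg.llm.model_path, cfg.llm.n_threads);
    Ok(())
}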

Usage

Kotlin API

// Initialize
NativeLib.init()
NativeLib.start()

// Query with neural context
val response = NativeLib.query("What's my current activity?", preferLocal = true)

// Get raw neural state
val context = NativeLib.getNeuralContext()
// Returns: [NEURAL_STATE] Description: ... [/NEURAL_STATE]

// Cleanup
NativeLib.stop()

Rust API

use neurophone_core::{NeuroSymbolicSystem, SystemConfig};

// `config` is a SystemConfig (values as in config/default.toml above)
let mut system = NeuroSymbolicSystem::with_config(config)?;
let _rx = system.start().await?;

// Send sensor data
system.send_sensor(reading).await?;

// Query
let response = system.query("What's happening?", true).await?;

// Get neural context
let context = system.get_neural_context().await;

Performance

Optimized for Oppo Reno 13 (Dimensity 8350):

Component            Latency           Notes
Sensor processing    <1ms              50Hz loop
LSM step             <2ms              512 neurons
ESN step             <1ms              300 neurons
Bridge integration   <1ms              Per step
Local LLM (1B)       50-100ms/token    Q4 quantized
Claude API           500-2000ms        Network dependent
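
As a rough end-to-end budget, the neural pipeline contributes only a few milliseconds per step, so a fully local reply is dominated by token generation; the arithmetic below uses the table's figures and a hypothetical 40-token answer.

fn main() {
    // Illustrative latency budget built from the figures above.
    let neural_per_step_ms = 1.0 + 2.0 + 1.0 + 1.0; // sensors + LSM + ESN + bridge
    let tokens = 40.0;                              // hypothetical answer length
    let (lo, hi) = (tokens * 50.0, tokens * 100.0); // 50-100 ms/token on the 1B model
    println!("neural step ≈ {neural_per_step_ms} ms; {tokens}-token reply ≈ {lo}-{hi} ms");
}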

Relationship to mobile-ai-orchestrator

This application and mobile-ai-orchestrator are complementary:

             neurophone                       mobile-ai-orchestrator
Type         Application                      Library
Platform     Android-specific                 Platform-agnostic
Focus        Sensor → Neural → LLM pipeline   AI routing decisions
Neural       LSM, ESN (spiking networks)      MLP, Reservoir (routing)
Use Case     Run on phone, process sensors    Embed in any app for routing

Future integration: neurophone may adopt mobile-ai-orchestrator for its routing decisions, combining:

  • neurophone’s sensor processing + neural interpretation

  • mobile-ai-orchestrator’s intelligent local/cloud routing

Project                  Relationship            Description
mobile-ai-orchestrator   Complementary library   Platform-agnostic AI routing (may integrate)
echomesh                 Related                 Conversation context preservation
oblibeny                 Related                 Safety-critical programming concepts

RSR Compliance

Bronze-level RSR (Rhodium Standard Repository) compliance:

  • Type safety (Rust)

  • Memory safety (ownership model)

  • Comprehensive documentation

  • Build automation

  • Security policy

Development

# Run tests
cargo test

# Build for Android
./scripts/build-android.sh

# Generate docs
cargo doc --open

Contributing

Contributions welcome! See CONTRIBUTING.md.

License

Palimpsest-MPL-1.0 License - See LICENSE file

Citation

@software{neurophone_2025,
  author = {Jewell, Jonathan D.A.},
  title = {NeuroPhone: Neurosymbolic AI Android Application},
  year = {2025},
  url = {https://github.com/hyperpolymath/neurophone},
  note = {On-device LSM + ESN + LLM}
}

Contact


Android Application • On-Device Neural Processing • Spiking Networks • Local LLM

Architecture

See TOPOLOGY.md for a visual architecture map and completion dashboard.
