> [!TIP]
> **AI-Assisted Install**
>
> You don’t need to read this README. Just say this to any AI assistant:
>
> > Set up NeuroPhone on my Android from https://github.com/hyperpolymath/neurophone
>
> That’s it. You don’t type commands, install packages, or configure anything. The AI fetches this repo, reads the installation guide inside it, figures out your device, and does everything. You just answer a few questions and confirm the privacy notice.
>
> The URL is the key — it points the AI to this repo, where docs/AI_INSTALLATION_GUIDE.adoc contains the complete step-by-step recipe. Any AI that can read a URL and run commands (or generate commands for you to paste) can do this.

The AI handles all of this automatically:

- Checking your device and storage
- Installing Termux (if needed), Rust, Git, and dependencies
- Cloning and building NeuroPhone for your specific hardware
- Downloading the right LLM model for your device’s RAM/storage
- Creating your configuration with sensible defaults
- Running the setup wizard
- Giving you a working NeuroPhone

If your AI already knows about NeuroPhone (e.g. it can search the web), even shorter versions work:

- "Make my phone a NeuroPhone"
- "Install NeuroPhone on my Android"
- "Turn my Oppo Reno 13 into a NeuroPhone"

If it doesn’t know the project, just include the URL:

- "Set up https://github.com/hyperpolymath/neurophone on my phone"
- "I want neurosymbolic AI on my phone — install from https://github.com/hyperpolymath/neurophone"

Your AI will ask you:

- What device? (so it picks the right thread count and model size)
- Privacy confirmation — what sensors are used and how data stays on-device
- Cloud fallback? (optional Claude API for complex queries — default is local-only)

That’s it. Everything else is automatic. No package managers, no build flags, no config files.

> [!IMPORTANT]
> **What NeuroPhone does** and **what NeuroPhone does NOT do** are spelled out in the privacy notice you confirm during setup. You control everything: the cloud fallback toggle and all configuration.

Once your AI finishes setup, just use it:

```bash
neurophone                                      # Start NeuroPhone
neurophone query "What am I doing right now?"   # Ask a question
neurophone status                               # Check system status
```

If something goes wrong, tell your AI — it can read the troubleshooting docs in this repo. Common issues:

| Problem | Solution |
|---|---|
| "Termux not found" | AI will guide you to install from F-Droid (NOT Google Play) |
| Build takes too long | Normal for first build (5-10 min). AI adjusts thread count for your device. |
| "Model download failed" | AI will try alternate download methods or suggest an alternative |
| "LSM crashes" | Low RAM. AI will reduce model size or neuron count for your device. |

For manual installation without AI assistance, see the Getting Started section below.

neurophone is a complete Android application for neurosymbolic AI on mobile devices. It combines spiking neural networks with large language models for on-device intelligence.
> [!IMPORTANT]
> This is an application, NOT a library. For the underlying platform-agnostic routing library, see mobile-ai-orchestrator.

Primary target: Oppo Reno 13 (MediaTek Dimensity 8350)

- 12GB RAM
- NPU acceleration available
- Android 14+

Also compatible with Android 8.0+ devices with 4GB+ RAM.

```
┌───────────────────────────────────────────────────────────────┐
│                          NEUROPHONE                            │
│                      (THIS APPLICATION)                        │
├───────────────────────────────────────────────────────────────┤
│                                                                │
│   ┌─────────────┐      ┌─────────────┐      ┌─────────────┐   │
│   │   Sensors   │─────▶│     LSM     │─────▶│   Bridge    │   │
│   │ Accel/Gyro  │      │  (spiking   │      │   (state    │   │
│   │ Light/Prox  │      │ reservoir)  │      │  encoding)  │   │
│   └─────────────┘      └─────────────┘      └──────┬──────┘   │
│                                                     │          │
│                                                     ▼          │
│   ┌─────────────┐      ┌─────────────┐      ┌─────────────┐   │
│   │   Output    │◀─────│     ESN     │◀────▶│     LLM     │   │
│   │  (actions)  │      │   (echo     │      │ (Llama 3.2) │   │
│   │             │      │ reservoir)  │      │             │   │
│   └─────────────┘      └─────────────┘      └─────────────┘   │
│                                                                │
│  Processes: Sensor data → Neural interpretation → LLM query   │
│  Runs: ON THE DEVICE, with cloud fallback                      │
│                                                                │
└───────────────────────────────────────────────────────────────┘
                                │
                                ▼  (cloud fallback)
                    ┌─────────────────────┐
                    │     Claude API      │
                    │  (complex queries)  │
                    └─────────────────────┘
```

| Feature | This App | Typical Mobile AI Apps |
|---|---|---|
| Neural Processing | On-device LSM + ESN (spiking networks) | Cloud-only or simple TFLite |
| Sensor Integration | Real-time sensor → neural → LLM pipeline | Separate sensor and AI components |
| LLM | Local Llama 3.2 + Claude fallback | Cloud-only |
| Latency | <100ms local inference | 500ms+ network round-trip |
| Privacy | Sensor data stays on device | Often sent to cloud |

| Crate | Purpose | Key Features |
|---|---|---|
|  | Liquid State Machine | 512 spiking neurons, 3D grid, 1kHz processing |
|  | Echo State Network | 300-neuron reservoir, ridge regression |
|  | Neural ↔ Symbolic | State encoding, context generation |
|  | Phone Sensors | Accel, gyro, magnetometer, light, proximity |
|  | Local Inference | Llama 3.2 via llama.cpp, streaming |
|  | Cloud Fallback | Claude API, retry logic, context injection |
|  | Orchestration | Main coordinator, query routing |
|  | Android JNI | Kotlin ↔ Rust bridge |
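
To make the data flow concrete, here is a minimal sketch of how these pieces fit together. All type names and method signatures below are illustrative placeholders, not the actual crate APIs (the real entry point is `NeuroSymbolicSystem`, shown under Getting Started):

```rust
// Illustrative sketch only: hypothetical types standing in for the real crates.
struct SensorReading { accel: [f32; 3], gyro: [f32; 3], light: f32 }

struct Lsm;    // spiking reservoir (512 LIF neurons)
struct Esn;    // echo state reservoir (300 neurons)
struct Bridge; // neural -> symbolic state encoding

impl Lsm {
    fn step(&mut self, _r: &SensorReading) -> Vec<f32> { vec![0.0; 512] } // spike activity
}
impl Esn {
    fn step(&mut self, _lsm_state: &[f32]) -> Vec<f32> { vec![0.0; 300] } // predicted state
}
impl Bridge {
    fn encode(&self, _lsm: &[f32], _esn: &[f32]) -> String {
        "[NEURAL_STATE] Description: walking, low light [/NEURAL_STATE]".to_string()
    }
}

// One pass through the pipeline: sensors -> LSM -> ESN -> bridge -> context for the LLM.
fn pipeline_step(lsm: &mut Lsm, esn: &mut Esn, bridge: &Bridge, r: &SensorReading) -> String {
    let lsm_state = lsm.step(r);
    let esn_state = esn.step(&lsm_state);
    bridge.encode(&lsm_state, &esn_state)
}
```
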
Spiking neural network for temporal sensor processing (a minimal neuron sketch follows the list):

- 3D grid: 8×8×8 = 512 Leaky Integrate-and-Fire neurons
- Distance-dependent connectivity
- Excitatory/inhibitory balance
- Real-time spike processing at 1kHz
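
For intuition, a single Leaky Integrate-and-Fire neuron update at the 1kHz step rate looks roughly like this; the parameter names and values are illustrative, not the crate's actual constants:

```rust
/// Minimal Leaky Integrate-and-Fire neuron (illustrative, not the real implementation).
struct LifNeuron {
    v: f32,        // membrane potential
    tau: f32,      // membrane time constant in seconds
    v_thresh: f32, // spike threshold
    v_reset: f32,  // potential after a spike
}

impl LifNeuron {
    /// One step with dt = 0.001 s (1kHz): leak toward rest, integrate input, spike on threshold.
    fn step(&mut self, input_current: f32, dt: f32) -> bool {
        self.v += (dt / self.tau) * (-self.v + input_current);
        if self.v >= self.v_thresh {
            self.v = self.v_reset;
            true // spike emitted
        } else {
            false
        }
    }
}
```
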
Reservoir for state prediction (update sketched below):

- 300-neuron reservoir
- Spectral radius: 0.95
- Leaky integrator dynamics
- Ridge regression output
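
The update is the standard leaky echo-state recurrence, x ← (1 − a)·x + a·tanh(W_in·u + W·x), with the recurrent weights rescaled to a spectral radius of 0.95 and a ridge-regression readout trained on the reservoir state. A minimal sketch with plain vectors; names are illustrative:

```rust
/// Sketch of a leaky echo-state reservoir update (illustrative, not the crate's API).
struct EchoState {
    x: Vec<f32>,         // reservoir state (300 neurons)
    w_in: Vec<Vec<f32>>, // input weights (300 x n_inputs)
    w: Vec<Vec<f32>>,    // recurrent weights, scaled so the spectral radius is ~0.95
    leak: f32,           // leaky-integrator rate a in (0, 1]
}

impl EchoState {
    fn step(&mut self, u: &[f32]) {
        // Pre-activation for every neuron: input drive + recurrent drive, squashed by tanh.
        let pre: Vec<f32> = (0..self.x.len())
            .map(|i| {
                let inp: f32 = self.w_in[i].iter().zip(u).map(|(w, u)| w * u).sum();
                let rec: f32 = self.w[i].iter().zip(&self.x).map(|(w, x)| w * x).sum();
                (inp + rec).tanh()
            })
            .collect();
        // Leaky integration toward the new pre-activation.
        for (x, p) in self.x.iter_mut().zip(&pre) {
            *x = (1.0 - self.leak) * *x + self.leak * p;
        }
    }
}
```
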
Phone sensor integration (filter example below):

- Accelerometer, gyroscope, magnetometer
- Light and proximity sensors
- IIR filtering (low-pass, high-pass)
- Feature extraction at 50Hz
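
As an example of the filtering stage, a first-order IIR low-pass over the 50Hz sample stream can be written in a few lines; the cutoff value below is illustrative:

```rust
/// First-order IIR low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1]) (illustrative).
struct LowPass {
    alpha: f32, // smoothing factor derived from cutoff frequency and sample rate
    y: f32,     // previous output
}

impl LowPass {
    fn new(cutoff_hz: f32, sample_rate_hz: f32) -> Self {
        let dt = 1.0 / sample_rate_hz;
        let rc = 1.0 / (2.0 * std::f32::consts::PI * cutoff_hz);
        Self { alpha: dt / (rc + dt), y: 0.0 }
    }

    fn filter(&mut self, x: f32) -> f32 {
        self.y += self.alpha * (x - self.y);
        self.y
    }
}

// e.g. smooth an accelerometer magnitude sampled at 50Hz with a 5Hz cutoff (made-up numbers):
// let mut lp = LowPass::new(5.0, 50.0);
// let smoothed = lp.filter(accel_magnitude);
```
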
Neural ↔ Symbolic translation (example encoding below):

- Integrates LSM + ESN states
- Generates natural language context for LLMs
- Temporal pattern detection
- Salience and urgency computation
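
A rough illustration of the kind of string the bridge might hand to the LLM. The field names and wording are guesses; only the `[NEURAL_STATE] ... [/NEURAL_STATE]` framing is taken from the Kotlin example later in this README:

```rust
/// Illustrative only: turn reservoir statistics into an LLM-ready context string.
struct NeuralSummary {
    activity: String, // e.g. "walking", decoded from the ESN readout
    salience: f32,    // how unusual the current pattern is, 0.0..1.0
    urgency: f32,     // how fast the pattern is changing, 0.0..1.0
}

fn encode_context(s: &NeuralSummary) -> String {
    format!(
        "[NEURAL_STATE] Description: {} (salience {:.2}, urgency {:.2}) [/NEURAL_STATE]",
        s.activity, s.salience, s.urgency
    )
}
```
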
On-device language model (prompt sketch below):

- Llama 3.2 1B/3B via llama.cpp
- Optimized for Dimensity 8350
- Q4_K_M quantization (~700MB)
- Neural context injection
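
Context injection then amounts to prepending the encoded neural state to the user's question before it reaches the local model. A minimal sketch; the prompt template below is illustrative, not the crate's actual format:

```rust
/// Illustrative prompt assembly for the local model (not the real template).
fn build_prompt(neural_context: &str, user_query: &str) -> String {
    format!(
        "{neural_context}\nYou are an on-device assistant. Use the neural state above as context.\nUser: {user_query}\nAssistant:"
    )
}

// e.g.
// let prompt = build_prompt(
//     "[NEURAL_STATE] Description: walking, outdoors [/NEURAL_STATE]",
//     "What am I doing right now?",
// );
```
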
- Rust 1.75+
- Android NDK 26+
- Android Studio (for app development)
- Oppo Reno 13 or Android 8.0+ device

```bash
# Clone
git clone https://github.com/hyperpolymath/neurophone
cd neurophone

# Setup
./scripts/setup.sh

# Build native libraries for Android
./scripts/build-android.sh

# Open android/ in Android Studio
```

```bash
# Download Llama 3.2 1B Instruct (Q4_K_M, ~700MB)
# From: https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF

# Push to device
adb push llama-3.2-1b-instruct-q4_k_m.gguf /data/local/tmp/
```

```kotlin
// Initialize
NativeLib.init()
NativeLib.start()
// Query with neural context
val response = NativeLib.query("What's my current activity?", preferLocal = true)
// Get raw neural state
val context = NativeLib.getNeuralContext()
// Returns: [NEURAL_STATE] Description: ... [/NEURAL_STATE]
// Cleanup
NativeLib.stop()
```

```rust
use neurophone_core::{NeuroSymbolicSystem, SystemConfig};

let config = SystemConfig::default(); // default settings (assuming a Default impl); adjust as needed
let mut system = NeuroSymbolicSystem::with_config(config)?;
let _rx = system.start().await?;
// Send sensor data
system.send_sensor(reading).await?;
// Query
let response = system.query("What's happening?", true).await?;
// Get neural context
let context = system.get_neural_context().await;
```

Optimized for Oppo Reno 13 (Dimensity 8350):

| Component | Latency | Notes |
|---|---|---|
| Sensor processing | <1ms | 50Hz loop |
| LSM step | <2ms | 512 neurons |
| ESN step | <1ms | 300 neurons |
| Bridge integration | <1ms | Per step |
| Local LLM (1B) | 50-100ms/token | Q4 quantized |
| Claude API | 500-2000ms | Network dependent |
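
As a rough worked example (assuming the per-step figures above simply add up): one sensor-to-context pass costs under 1 + 2 + 1 + 1 = 5ms, comfortably inside the 20ms period of the 50Hz sensor loop, and a 20-token local answer at 50-100ms/token then adds roughly 1-2 seconds, versus a single 500-2000ms round trip when the Claude fallback is used.
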
This application and mobile-ai-orchestrator are complementary:

|  | neurophone | mobile-ai-orchestrator |
|---|---|---|
| Type | Application | Library |
| Platform | Android-specific | Platform-agnostic |
| Focus | Sensor → Neural → LLM pipeline | AI routing decisions |
| Neural | LSM, ESN (spiking networks) | MLP, Reservoir (routing) |
| Use Case | Run on phone, process sensors | Embed in any app for routing |

Future integration: neurophone may adopt mobile-ai-orchestrator for its routing decisions, combining:

- neurophone’s sensor processing + neural interpretation
- mobile-ai-orchestrator’s intelligent local/cloud routing

| Project | Relationship | Description |
|---|---|---|
| mobile-ai-orchestrator | Complementary library | Platform-agnostic AI routing (may integrate) |
|  | Related | Conversation context preservation |
|  | Related | Safety-critical programming concepts |

Bronze-level RSR (Rhodium Standard Repository) compliance:

- Type safety (Rust)
- Memory safety (ownership model)
- Comprehensive documentation
- Build automation
- Security policy

```bash
# Run tests
cargo test

# Build for Android
./scripts/build-android.sh

# Generate docs
cargo doc --open
```

Contributions welcome! See CONTRIBUTING.md.

```bibtex
@software{neurophone_2025,
  author = {Jewell, Jonathan D.A.},
  title  = {NeuroPhone: Neurosymbolic AI Android Application},
  year   = {2025},
  url    = {https://github.com/hyperpolymath/neurophone},
  note   = {On-device LSM + ESN + LLM}
}
```

- Author: Jonathan D.A. Jewell
- Email: hyperpolymath@protonmail.com

Android Application • On-Device Neural Processing • Spiking Networks • Local LLM

See TOPOLOGY.md for a visual architecture map and completion dashboard.