Upstream docs: https://github.com/OpenSecretCloud/maple-proxy#readme
Everything not listed in this document should behave the same as upstream Maple Proxy v0.1.6. If a feature, setting, or behavior is not mentioned here, the upstream documentation is accurate and fully applicable.
Maple Proxy is a lightweight OpenAI-compatible proxy server for Maple/OpenSecret's TEE (Trusted Execution Environment) infrastructure. It provides privacy-preserving AI inference through secure enclaves.
- Image and Container Runtime
- Volume and Data Layout
- Installation and First-Run Flow
- Configuration Management
- Network Access and Interfaces
- Actions (StartOS UI)
- Dependencies
- Backups and Restore
- Health Checks
- Limitations and Differences
- What Is Unchanged from Upstream
- Contributing
- Quick Reference for AI Consumers
## Image and Container Runtime
| Property | Value |
|---|---|
| maple-proxy | ghcr.io/opensecretcloud/maple-proxy:0.1.6 (pre-built) |
| maple-ui | nginx:stable-alpine (custom build from assets/ui/) |
| Architectures | x86_64, aarch64 |
The service runs two containers:

- maple-proxy — the Rust API server (upstream image, unmodified)
- maple-ui — nginx serving a chat web UI and reverse-proxying `/v1/*` to the API
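The reverse-proxy relationship between the two containers can be pictured with a minimal nginx location block. This is a sketch, not the bundled config; the upstream hostname `maple-proxy` and the exact directives are assumptions:

```nginx
# Sketch only: the bundled nginx config may differ.
location /v1/ {
    proxy_pass http://maple-proxy:8080;  # assumed container hostname
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_buffering off;  # let streamed (SSE) completions flow through
}
```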
## Volume and Data Layout

| Volume | Mount Point | Purpose |
|---|---|---|
| main | /data | All Maple Proxy data |
Key paths on the main volume:

- `store.json` — persists API key and backend URL configuration
## Installation and First-Run Flow

| Step | Upstream | StartOS |
|---|---|---|
| Install | `docker run` | Sideload or install from marketplace |
| Configure | Environment variables | StartOS Configure action |
| First run | Set `MAPLE_API_KEY` env var | Action prompt to set API key |
On first install, `store.json` is seeded with default values and a task is created prompting you to configure your Maple API key.
## Configuration Management
All configuration is managed through the Configure action in the StartOS UI.
| Setting | Description | Default |
|---|---|---|
| API Key | Your Maple API key (optional — clients can provide their own) | (empty) |
| Backend URL | The Maple/OpenSecret backend URL | https://enclave.trymaple.ai |
Configuration is stored in `store.json` on the main volume and read by the maple-proxy binary at startup.
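As an illustration of how that file might look and be consumed, here is a sketch in Python. The key names (`api_key`, `backend_url`) are hypothetical, not the actual schema used by the binary:

```python
import json
import os
import tempfile

# Hypothetical store.json contents; the real key names may differ.
config = {
    "api_key": "",  # empty by default; clients may supply their own key
    "backend_url": "https://enclave.trymaple.ai",  # upstream default
}

# Write it the way the Configure action would persist it...
path = os.path.join(tempfile.mkdtemp(), "store.json")
with open(path, "w") as f:
    json.dump(config, f, indent=2)

# ...and read it back the way the proxy does once at startup.
with open(path) as f:
    loaded = json.load(f)
```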
## Network Access and Interfaces

| Interface | Type | Port | Protocol | Description |
|---|---|---|---|---|
| API | api | 8080 | HTTP | OpenAI-compatible API endpoint |
| Web UI | ui | 80 | HTTP | Chat interface for Maple Proxy |
API endpoints:

- `POST /v1/chat/completions` — chat completions (streaming supported)
- `GET /v1/models` — list available models
- `POST /v1/embeddings` — text embeddings
- `GET /health` — health check
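To make the endpoint list concrete, here is a sketch of assembling a non-streaming chat completion request against the proxy. The hostname, API key, and model name are placeholders; the request shape follows the standard OpenAI chat schema:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str,
                       model: str, prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-style chat completion request (not yet sent)."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # set True to receive a streamed response
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            # Optional if a server-side key was set via Configure.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Placeholder host, key, and model name for illustration:
req = build_chat_request("http://localhost:8080", "example-key",
                         "example-model", "Hello!")
# urllib.request.urlopen(req) would send it; omitted here.
```

Because the proxy is OpenAI-compatible, any standard OpenAI client library should also work when pointed at port 8080.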
## Actions (StartOS UI)

| Action | Description | Allowed States |
|---|---|---|
| Configure | Set API key and backend URL | Any |
| Proxy Properties | Display API and UI port numbers | Any |
## Dependencies

None.
## Backups and Restore

The main volume is included in backups, preserving your `store.json` configuration.
## Health Checks

| Check | Daemon | Method | Grace Period |
|---|---|---|---|
| API Interface | primary | Port listening (8080) | 10s |
| Web Interface | ui | Port listening (80) | — |
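A "port listening" check reduces to opening a TCP connection. A minimal Python equivalent (the function name is ours, not StartOS code), demonstrated against a throwaway local listener:

```python
import socket

def port_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demonstrate against a throwaway listener on an ephemeral port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

up = port_listening("127.0.0.1", port)    # listener present
server.close()
down = port_listening("127.0.0.1", port)  # listener gone
```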
## Limitations and Differences

| Area | Upstream | StartOS |
|---|---|---|
| Configuration | Environment variables | store.json via Configure action |
| Web UI | Not included | Bundled nginx chat interface |
| Networking | Direct port binding | StartOS multi-host interfaces |
## What Is Unchanged from Upstream
- All API endpoints behave identically
- Model routing and TEE integration
- Streaming support
- API key authentication
## Contributing

To build locally:

```shell
npm install
npm run build
```

```shell
make        # builds both x86_64 and aarch64 s9pk packages
make arm    # aarch64 only
make x86    # x86_64 only
```

Requires start-cli v0.4.0+ and Docker.
## Quick Reference for AI Consumers

```yaml
package_id: maple-proxy
upstream_version: 0.1.6
wrapper_version: 0.4-beta.1
images:
  maple-proxy: ghcr.io/opensecretcloud/maple-proxy:0.1.6
  maple-ui: nginx:stable-alpine (custom)
architectures: [x86_64, aarch64]
volumes:
  main: /data
interfaces:
  api:
    port: 8080
    type: api
    endpoints: [/v1/chat/completions, /v1/models, /v1/embeddings, /health]
  ui:
    port: 80
    type: ui
dependencies: none
actions: [configure, proxy-properties]
health_checks:
  primary: { port_listening: 8080, grace_period: 10s }
  ui: { port_listening: 80 }
backup_volumes: [main]
```