A universal text translator built in Rust. Uses llama.cpp (via the llama-cpp-2 Rust crate) for fast, fully local inference of TranslateGemma 4B (Gemma 3 4B instruction-tuned), covering 55 languages, and Lingua for automatic source-language detection.
No API keys required. No network calls at runtime. Everything runs on your machine.
- License: MIT OR Apache-2.0 — see LICENSE-MIT and LICENSE-APACHE.
- Model license: TranslateGemma 4B weights are subject to the Gemma Terms of Use — see LICENSE-GEMMA and NOTICE.
- Model attributions: see ATTRIBUTIONS.md — covers TranslateGemma 4B and the runtime libraries used for inference.
55 supported languages:
| Code | Language | Code | Language | Code | Language |
|---|---|---|---|---|---|
| af | Afrikaans | hr | Croatian | pt | Portuguese |
| am | Amharic | hu | Hungarian | ro | Romanian |
| ar | Arabic | id | Indonesian | ru | Russian |
| bg | Bulgarian | it | Italian | si | Sinhala |
| bn | Bengali | ja | Japanese | sk | Slovak |
| ca | Catalan | kn | Kannada | sl | Slovenian |
| cs | Czech | ko | Korean | sr | Serbian |
| da | Danish | lt | Lithuanian | sv | Swedish |
| de | German | lv | Latvian | sw | Swahili |
| el | Greek | ml | Malayalam | ta | Tamil |
| en | English | mr | Marathi | te | Telugu |
| es | Spanish | ms | Malay | th | Thai |
| et | Estonian | mt | Maltese | tr | Turkish |
| fa | Persian | ne | Nepali | uk | Ukrainian |
| fi | Finnish | nl | Dutch | ur | Urdu |
| fr | French | no | Norwegian | vi | Vietnamese |
| gu | Gujarati | pa | Punjabi | yi | Yiddish |
| ha | Hausa | pl | Polish | zh | Chinese |
| hi | Hindi | | | | |
Source language is detected automatically — no configuration required.
Repository layout:

```
universal-translator/
├── translator-core/   # Core library: engine, language detector, types
├── translator-api/    # Axum HTTP API server
├── translator-cli/    # Command-line interface
└── docs/models.md     # Model management guide
```
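The three crates above form a standard Cargo workspace. A sketch of what the root Cargo.toml would look like (member names are taken from the tree above; the rest is an assumption for illustration):

```toml
[workspace]
resolver = "2"
members = [
    "translator-core", # engine, language detection, shared types
    "translator-api",  # Axum HTTP server binary
    "translator-cli",  # CLI binary
]
```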
- Rust toolchain (stable) — rustup.rs
- CMake ≥ 3.14 and a C++17 compiler (required to build the vendored llama.cpp)
  - macOS: `xcode-select --install` (provides both)
  - Ubuntu/Debian: `sudo apt install cmake g++`
  - Fedora: `sudo dnf install cmake gcc-c++`
- Tested on Linux (x86_64, arm64) and macOS (Apple Silicon)
```
cargo build --release
cargo run -p translator-cli -- setup
```

This downloads TranslateGemma 4B Q8_0 (~4.1 GB) directly from HuggingFace into the default model directory. No Python or HuggingFace CLI required.
For the smaller Q4_K_M variant (~2.6 GB):
```
cargo run -p translator-cli -- setup --url https://huggingface.co/mradermacher/translategemma-4b-it-GGUF/resolve/main/translategemma-4b-it.Q4_K_M.gguf
```

Use `--model-path <path>` to select a specific model file at runtime. Q8_0 is the default.
See docs/models.md for details and alternative hosting options.
```
cargo run -p translator-api
```

The server listens on http://localhost:3000. See API.md for the endpoint reference and request/response schemas.
To run with full observability (traces, metrics, logs pushed to a local monitoring stack):
```
docker compose -f docker/docker-compose.yml up -d

OTLP_ENDPOINT=http://localhost:4317 \
  cargo run -p translator-api --features opentelemetry
```

Grafana opens at http://localhost:3001 with a pre-built dashboard. See METRICS.md for the full setup guide and metrics catalogue.
```
cargo run -p translator-cli -- translate -t "Hello world" -l fr
```

See CLI.md for the full command reference (`translate`, `detect-language`, `detect`, `languages`), including all flags and output formats.
- CLI Reference — all subcommands, flags, and output formats
- API Reference — HTTP endpoints, request/response schemas, examples
- Engine Internals — inference limits, concurrency, sampling, language detection
- Observability — OTel traces/metrics/logs, Grafana dashboard, metrics catalogue
The source code in this repository is dual-licensed under the Apache License, Version 2.0 and the MIT License. You may choose either license for your use of the source code.
The TranslateGemma model weights and any model derivatives are not covered by the Apache or MIT licenses. They are subject to the Gemma Terms of Use and Gemma Prohibited Use Policy. By downloading or using the model, you agree to abide by those terms. See LICENSE-GEMMA and NOTICE.