
docs + fix: rewrite README for discoverability and compact JSON responses#42

Merged
KochC merged 10 commits into main from dev
Mar 27, 2026

Conversation

Owner

@KochC KochC commented Mar 27, 2026

Summary

  • Rewrites README with badges, architecture diagram, feature table, quickstart, SDK examples (OpenAI JS/Python, Anthropic JS/Python, Gemini JS, LangChain), and UI integration guides (Open WebUI, Chatbox, Continue, Zed)
  • Expands package.json description and keywords (20 keywords) for npm/GitHub discoverability
  • Removes pretty-printing (`JSON.stringify(data, null, 2)` → `JSON.stringify(data)`) from all JSON API responses to reduce payload size

KochC added 10 commits March 27, 2026 15:44
- Add 17 new integration tests: CORS edge cases (disallowed origins,
  no-origin header, OPTIONS for disallowed origin), auth (401/pass-through),
  and error handling (400/502/404) for /v1/chat/completions
  Closes #14, closes #16
- Add ESLint with flat config, npm run lint script, and Lint job in CI
  Closes #15
- Improve README with quickstart section, npm install instructions, and
  corrected package name; add type column to env vars table
  Closes #17
- Implement streaming for POST /v1/chat/completions (issue #11):
  subscribe to opencode event stream, pipe message.part.updated deltas
  as SSE chat.completion.chunk events, finish on session.idle
- Implement streaming for POST /v1/responses (issue #11):
  emit response.created / output_text.delta / response.completed events
- Fix provider-agnostic system prompt hint (issue #12): remove
  'OpenAI-compatible' wording so non-OpenAI models are not confused
- Add TextEncoder and ReadableStream to ESLint globals
- Add streaming integration tests (happy path, unknown model, session.error)
- Extract createSseQueue() helper, eliminating duplicated SSE queue pattern
  in /v1/chat/completions and /v1/responses streaming branches (closes #34)
- Add tests for GET /v1/models happy path, empty providers, and error path (closes #33)
- Add tests for POST /v1/responses: happy path, validation, streaming, session.error (closes #32)
- Fix package.json description to be provider-agnostic (closes #35)
- Add engines field declaring bun >=1.0.0 requirement (closes #35)
- Line coverage: 55% -> 89%, function coverage: 83% -> 94%
- POST /v1/messages — Anthropic Messages API with streaming (SSE)
- POST /v1beta/models/:model:generateContent — Gemini non-streaming
- POST /v1beta/models/:model:streamGenerateContent — Gemini NDJSON streaming
- New helpers: normalizeAnthropicMessages, normalizeGeminiContents,
  extractGeminiSystemInstruction, mapFinishReasonToAnthropic/Gemini
- 35 new tests (77 -> 112 total, all passing)
- Update README to document all supported API formats

Closes #38, #39
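A plausible shape for the finish-reason helper named above, assuming OpenAI-style reasons in and Anthropic stop reasons out (the PR's actual `mapFinishReasonToAnthropic` may differ in detail):

```typescript
// Illustrative sketch, not the PR's exact code: translate an OpenAI-style
// finish_reason into the corresponding Anthropic stop_reason value.
function mapFinishReasonToAnthropic(reason: string | null): string {
  switch (reason) {
    case "length":
      return "max_tokens"; // output hit the token limit
    case "tool_calls":
      return "tool_use";   // model requested a tool invocation
    case "stop":
    default:
      return "end_turn";   // normal completion
  }
}
```

`end_turn`, `max_tokens`, and `tool_use` are all valid Anthropic `stop_reason` values; the Gemini variant would map to `finishReason` strings such as `STOP` and `MAX_TOKENS` instead.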
- Lead with value proposition, ASCII diagram, and feature table
- Quickstart reduced to 4 steps; works in under 60 seconds
- SDK examples for OpenAI, Anthropic, Gemini (JS+Python), LangChain
- UI integration guides: Open WebUI, Chatbox, Continue, Zed
- Reference section kept concise; full prose docs moved inline
- package.json: sharper description, 20 keywords covering all search terms
  (openai-compatible, anthropic, gemini, ollama, langchain, open-webui,
   llm-proxy, ai-gateway, local-llm, github-copilot, model-router, …)
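The `createSseQueue()` helper extracted above can be sketched roughly as follows — a push/close wrapper around a `ReadableStream` that frames each event as an SSE `data:` line. The name mirrors the PR's helper, but this implementation is only an assumed shape:

```typescript
// Sketch of an SSE queue (assumed shape, not the PR's exact code):
// push() enqueues a JSON event framed as an SSE message, close() ends
// the stream. Both /v1/chat/completions and /v1/responses streaming
// branches can share this instead of duplicating the queue plumbing.
function createSseQueue() {
  const encoder = new TextEncoder();
  let controller!: ReadableStreamDefaultController<Uint8Array>;
  const stream = new ReadableStream<Uint8Array>({
    start(c) {
      controller = c; // captured synchronously during construction
    },
  });
  return {
    stream, // hand this to the HTTP response body
    push(event: unknown): void {
      controller.enqueue(encoder.encode(`data: ${JSON.stringify(event)}\n\n`));
    },
    close(): void {
      controller.close();
    },
  };
}

// Usage: pipe a delta through as an OpenAI-style chat.completion.chunk.
const queue = createSseQueue();
queue.push({
  object: "chat.completion.chunk",
  choices: [{ index: 0, delta: { content: "Hello" }, finish_reason: null }],
});
queue.close();
```

Because `start()` runs during `new ReadableStream(...)`, the controller is available immediately, and events pushed before a reader attaches are simply buffered in the stream's internal queue.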
@KochC KochC merged commit 04111cc into main Mar 27, 2026
6 checks passed