commit c2e0414d56
parent 491ff811c3
2025-06-23 17:27:37 +02:00

@@ -1,10 +1,7 @@
# Enchanted Proxy for OpenRouter
This repository is specifically made for use with the [Enchanted project](https://github.com/gluonfield/enchanted/tree/main).
The original author of this proxy is [marknefedov](https://github.com/marknefedov/ollama-openrouter-proxy).
## Free Model Proxy for OpenRouter
This proxy automatically selects and uses free models from OpenRouter, exposing them through Ollama-compatible and OpenAI-compatible APIs. It is a work in progress, designed to make use of OpenRouter's free capacity automatically, without requiring manual model selection.
## Description
This repository provides a proxy server that emulates [Ollama's REST API](https://github.com/ollama/ollama) but forwards requests to [OpenRouter](https://openrouter.ai/). It uses the [sashabaranov/go-openai](https://github.com/sashabaranov/go-openai) library under the hood, with minimal code changes to keep the Ollama API calls the same. This allows you to use Ollama-compatible tooling and clients, but run your requests on OpenRouter-managed models.
Currently, it works well enough for use with the [JetBrains AI Assistant](https://blog.jetbrains.com/ai/2024/11/jetbrains-ai-assistant-2024-3/#more-control-over-your-chat-experience-choose-between-gemini,-openai,-and-local-models).
It hasn't been extensively tested with paid models, but it should work with any OpenRouter model that is compatible with the OpenAI API.
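Because the proxy emulates Ollama's REST API, a quick sanity check once it is running is Ollama's standard model-listing endpoint (a minimal sketch, assuming the default port used throughout this README):
```bash
# List the models the proxy exposes through the Ollama-compatible API
curl http://localhost:11434/api/tags
```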
## Features
- **Free Mode (Default)**: Automatically selects and uses free models from OpenRouter with intelligent fallback. Enabled by default unless `FREE_MODE=false` is set.
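For example, free mode can be switched off through the environment when you want to pick models yourself (a sketch, assuming the `ollama-proxy` binary from the Installation section and the `OPENAI_API_KEY` variable used in the Docker examples):
```bash
# Opt out of automatic free-model selection; FREE_MODE defaults to true
export FREE_MODE=false
export OPENAI_API_KEY="your-openrouter-api-key"
./ollama-proxy
```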
@@ -116,23 +113,10 @@ curl -X POST http://localhost:11434/v1/chat/completions \
}'
```
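For reference, a complete request to the OpenAI-compatible chat endpoint might look like this (the model name is purely illustrative; in free mode the proxy selects a free model automatically):
```bash
curl -X POST http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "some-free-model",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```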
## Installation
1. **Clone the Repository**:
   ```bash
   git clone https://github.com/your-username/ollama-openrouter-proxy.git
   cd ollama-openrouter-proxy
   ```
2. **Install Dependencies**:
   ```bash
   go mod tidy
   ```
3. **Build**:
   ```bash
   go build -o ollama-proxy
   ```
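Once built, the proxy can be started with your OpenRouter key in the environment (a minimal sketch; `OPENAI_API_KEY` is the same variable the Docker examples below use):
```bash
# Run the proxy built above; it listens on port 11434, matching the Docker mapping
OPENAI_API_KEY="your-openrouter-api-key" ./ollama-proxy
```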
## Docker Usage
### Using Docker Compose
1. **Clone the repository and create environment file**:
```bash
@@ -149,7 +133,7 @@ curl -X POST http://localhost:11434/v1/chat/completions \
3. **Run with Docker Compose**:
```bash
docker compose up -d
```
The service will be available at `http://localhost:11434`.
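To confirm the service is up, you can query the model list (a sketch assuming the proxy mirrors OpenAI's standard `/v1/models` route alongside the `/v1/chat/completions` route shown above):
```bash
curl http://localhost:11434/v1/models
```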
@@ -160,3 +144,7 @@ The service will be available at `http://localhost:11434`.
```bash
docker build -t ollama-proxy .
docker run -p 11434:11434 -e OPENAI_API_KEY="your-openrouter-api-key" ollama-proxy
```
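The same environment switches apply inside the container; for instance, disabling free mode (see Features) is just another `-e` flag (illustrative sketch):
```bash
docker run -p 11434:11434 \
  -e OPENAI_API_KEY="your-openrouter-api-key" \
  -e FREE_MODE=false \
  ollama-proxy
```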
## Acknowledgements
This project was inspired by [xsharov/enchanted-ollama-openrouter-proxy](https://github.com/xsharov/enchanted-ollama-openrouter-proxy), which in turn took inspiration from [marknefedov](https://github.com/marknefedov/ollama-openrouter-proxy).