Ollama Proxy for OpenRouter
Description
This repository provides a proxy server that emulates Ollama's REST API but forwards requests to OpenRouter. It uses the sashabaranov/go-openai library under the hood, with minimal code changes so the Ollama API surface stays the same. This lets you use Ollama-compatible tooling and clients while running your requests on OpenRouter-managed models. It currently works well enough to be used with the JetBrains AI Assistant.
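At its core, the proxy only has to point an OpenAI-style client at OpenRouter's OpenAI-compatible endpoint. Here is a minimal sketch of that setup with sashabaranov/go-openai (the helper name and the use of the OPENAI_API_KEY variable are illustrative assumptions, not the repository's actual code):

```go
package proxy

import (
	"os"

	openai "github.com/sashabaranov/go-openai"
)

// newOpenRouterClient builds a go-openai client that talks to OpenRouter.
// go-openai speaks the OpenAI wire format, and OpenRouter exposes a
// compatible endpoint, so only the base URL and API key need to change.
func newOpenRouterClient() *openai.Client {
	cfg := openai.DefaultConfig(os.Getenv("OPENAI_API_KEY")) // assumed key source
	cfg.BaseURL = "https://openrouter.ai/api/v1"
	return openai.NewClientWithConfig(cfg)
}
```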
Features
- Ollama-like API: The server listens on port 8080 and exposes endpoints similar to Ollama's (e.g., /api/chat, /api/tags).
- Model Listing: Fetch a list of available models from OpenRouter.
- Model Details: Retrieve metadata about a specific model.
- Streaming Chat: Forward streaming responses from OpenRouter in a chunked JSON format that is compatible with Ollama’s expectations.
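The streaming feature above is where the translation actually happens: OpenRouter returns OpenAI-style deltas, while Ollama clients expect newline-delimited JSON chunks ending in a `"done": true` message. A rough sketch of that relay loop, assuming the request body has already been decoded into a model name and message list (the function name and chunk fields follow Ollama's public chat protocol, not necessarily this repository's exact code):

```go
package proxy

import (
	"context"
	"encoding/json"
	"errors"
	"io"
	"net/http"

	openai "github.com/sashabaranov/go-openai"
)

// streamChat relays an OpenAI-style completion stream as Ollama-style
// NDJSON chunks. model and msgs are assumed to come from a decoded
// /api/chat request body.
func streamChat(w http.ResponseWriter, client *openai.Client, model string, msgs []openai.ChatCompletionMessage) error {
	stream, err := client.CreateChatCompletionStream(context.Background(), openai.ChatCompletionRequest{
		Model:    model,
		Messages: msgs,
		Stream:   true,
	})
	if err != nil {
		return err
	}
	defer stream.Close()

	enc := json.NewEncoder(w) // Encode writes one JSON object per line
	flusher, _ := w.(http.Flusher)
	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break // upstream finished streaming
		}
		if err != nil {
			return err
		}
		if len(resp.Choices) == 0 {
			continue // some providers send keep-alive chunks with no choices
		}
		// One Ollama-shaped chunk per OpenAI delta.
		enc.Encode(map[string]any{
			"model":   model,
			"message": map[string]string{"role": "assistant", "content": resp.Choices[0].Delta.Content},
			"done":    false,
		})
		if flusher != nil {
			flusher.Flush() // push each chunk to the client immediately
		}
	}
	// Final chunk signals completion, matching Ollama's protocol.
	return enc.Encode(map[string]any{"model": model, "done": true})
}
```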
Usage
You can provide your OpenRouter (OpenAI-compatible) API key through an environment variable or a command-line argument:
1. Environment Variable
export OPENAI_API_KEY="your-openrouter-api-key"
./ollama-proxy
2. Command Line Argument
./ollama-proxy "your-openrouter-api-key"
Once running, the proxy listens on port 8080. You can make requests to http://localhost:8080 with your Ollama-compatible tooling.
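For example, you can check the model list and send a chat request with curl (the model ID below is just an illustration; use any ID that OpenRouter exposes):

```
curl http://localhost:8080/api/tags

curl http://localhost:8080/api/chat -d '{
  "model": "openai/gpt-4o-mini",
  "messages": [{"role": "user", "content": "Hello!"}]
}'
```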
Installation
- Clone the Repository:

  git clone https://github.com/your-username/ollama-openrouter-proxy.git
  cd ollama-openrouter-proxy

- Install Dependencies:

  go mod tidy

- Build:

  go build -o ollama-proxy