mirror of
https://github.com/aljazceru/ollama-free-model-proxy.git
synced 2025-12-17 05:04:20 +01:00
Merge branch 'main' of https://github.com/xsharov/ollama-openrouter-proxy
@@ -11,7 +11,7 @@ Currently, it is enough for usage with [Jetbrains AI assistant](https://blog.jet
 **Note**: OpenRouter model names may sometimes include a vendor prefix, for example `deepseek/deepseek-chat-v3-0324:free`. To make sure filtering works correctly, remove the vendor part when adding the name to your `models-filter` file, e.g. `deepseek-chat-v3-0324:free`.
-- **Ollama-like API**: The server listens on `8080` and exposes endpoints similar to Ollama (e.g., `/api/chat`, `/api/tags`).
+- **Ollama-like API**: The server listens on `11434` and exposes endpoints similar to Ollama (e.g., `/api/chat`, `/api/tags`).
 - **Model Listing**: Fetch a list of available models from OpenRouter.
 - **Model Details**: Retrieve metadata about a specific model.
 - **Streaming Chat**: Forward streaming responses from OpenRouter in a chunked JSON format that is compatible with Ollama’s expectations.
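The "Streaming Chat" bullet above can be illustrated with a small sketch. This is an assumption about the wire format, not taken from this commit: Ollama-style chat streaming emits newline-delimited JSON objects, each carrying a partial `message.content` and a `done` flag, and a client reassembles the text by concatenating the pieces. The model name and chunk contents below are made-up examples.

```python
import json

# Hypothetical example of two newline-delimited JSON chunks as an
# Ollama-style streaming /api/chat endpoint would emit them; each line
# is one complete JSON object.
raw_stream = (
    '{"model":"deepseek-chat-v3-0324:free","message":{"role":"assistant","content":"Hel"},"done":false}\n'
    '{"model":"deepseek-chat-v3-0324:free","message":{"role":"assistant","content":"lo"},"done":true}\n'
)

# Parse each line separately, then concatenate the partial contents.
chunks = [json.loads(line) for line in raw_stream.splitlines()]
text = "".join(c["message"]["content"] for c in chunks)
print(text)  # → Hello
```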
@@ -28,7 +28,7 @@ You can provide your **OpenRouter** (OpenAI-compatible) API key through an envir
 
     ./ollama-proxy "your-openrouter-api-key"
 
-Once running, the proxy listens on port `8080`. You can make requests to `http://localhost:8080` with your Ollama-compatible tooling.
+Once running, the proxy listens on port `11434`. You can make requests to `http://localhost:11434` with your Ollama-compatible tooling.
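As a rough sketch of what "making requests" to the proxy looks like, the snippet below builds an Ollama-style `/api/chat` request body. This is an assumption based on Ollama's public chat API shape, not code from this repository; the model name is only an example, and in practice you would use one of the models the proxy actually lists.

```python
import json

# A minimal, hypothetical Ollama-style chat request body for POSTing to
# http://localhost:11434/api/chat; field names follow Ollama's chat API.
payload = {
    "model": "deepseek-chat-v3-0324:free",  # example name; pick one from /api/tags
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,  # set True to receive chunked streaming responses
}
body = json.dumps(payload)
print(body)
```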
 
 ## Installation
 
 1. **Clone the Repository**: