
signallama

WARNING - this is under heavy development

A Signal bot for chatting with your Ollama instance

Description

Signallama is a bridge between Signal messenger and LLM models. It allows you to chat with AI models through Signal messages. While primarily designed for use with Ollama, it supports multiple LLM providers through LiteLLM.

Requirements

  • Python 3
  • A running signal-cli-rest-api instance linked to your Signal account
  • An LLM backend (Ollama by default; other providers are supported through LiteLLM)
  • (Optional) A Whisper ASR webservice for voice message transcription

Setup

  1. Deploy signal-cli-rest-api and set it up with your Signal account
  2. Clone this repository
  3. Copy example_settings.py to settings.py and configure your settings:
    • Set your Signal phone number
    • Configure your preferred LLM provider (Ollama by default)
    • Set API details for your LLM
    • (Optional) Set up Whisper ASR for voice message transcription
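Step 1 can be done with Docker. The sketch below is one assumed deployment using the bbernhard/signal-cli-rest-api image; adjust the volume path, port, and MODE to your environment:

```shell
# Assumed Docker deployment of signal-cli-rest-api (not the only option).
# The volume persists the linked Signal account across container restarts.
docker run -d --name signal-api \
  -p 8080:8080 \
  -v $HOME/.local/share/signal-api:/home/.local/share/signal-cli \
  -e MODE=normal \
  bbernhard/signal-cli-rest-api:latest
```

Once the container is up, you can link your Signal account by opening the QR-code link endpoint (e.g. http://localhost:8080/v1/qrcodelink?device_name=signallama) and scanning it from the Signal app.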

Whisper ASR Webservice (Voice Transcription)

To enable transcription of voice messages, you need to run the onerahmet/openai-whisper-asr-webservice Docker image:

docker run -d --name whisper-asr -p 9000:9000 onerahmet/openai-whisper-asr-webservice:latest

In your settings.py, set the Whisper ASR API URL:

WHISPER_URL = 'http://localhost:9000'  # Whisper ASR webservice endpoint

If WHISPER_URL is set, voice messages sent to the bot will be transcribed and the transcript will be sent as a reply.
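The webservice exposes a POST /asr endpoint that accepts the audio as a multipart upload. A minimal sketch of how a voice attachment could be sent to it (the `transcribe` helper is hypothetical, not part of signallama's actual code):

```python
import requests

WHISPER_URL = 'http://localhost:9000'  # Whisper ASR webservice endpoint


def transcribe(audio_path, whisper_url=WHISPER_URL):
    """Upload an audio file to the Whisper ASR webservice and return the transcript.

    Assumes the webservice's POST /asr endpoint with task=transcribe and
    plain-text output.
    """
    with open(audio_path, 'rb') as f:
        resp = requests.post(
            f'{whisper_url}/asr',
            params={'task': 'transcribe', 'output': 'txt'},
            files={'audio_file': f},
        )
    resp.raise_for_status()
    return resp.text.strip()
```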

Configuration

The settings.py file contains all necessary configuration:

# Signal API Configuration
SIGNAL_URL = 'http://localhost:8080'  # signal-cli-rest-api endpoint
SIGNAL_NUMBER = '+123456789'  # Your Signal number

# LLM Configuration
LLM_PROVIDER = 'ollama'  # or 'openai', 'anthropic', etc.
LLM_MODEL = 'ollama/model-name'  # Format: provider/model
LLM_API_BASE = 'http://localhost:11434'  # Ollama API endpoint
LLM_API_KEY = None  # API key if required
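With those settings, a message can be forwarded to the configured model through LiteLLM's `completion` call. The helper below is a hedged sketch (the function name and the example model are assumptions, not signallama's actual code):

```python
# Mirrors the names in settings.py; the model below is an assumed example.
LLM_MODEL = 'ollama/llama3'
LLM_API_BASE = 'http://localhost:11434'
LLM_API_KEY = None


def ask_llm(prompt, history=None):
    """Send a chat prompt (plus optional prior turns) through LiteLLM."""
    import litellm  # imported lazily so settings can be loaded first

    messages = (history or []) + [{'role': 'user', 'content': prompt}]
    resp = litellm.completion(
        model=LLM_MODEL,
        messages=messages,
        api_base=LLM_API_BASE,
        api_key=LLM_API_KEY,
    )
    # LiteLLM responses follow the OpenAI shape: choices[0].message.content
    return resp.choices[0].message.content
```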

Running

python signallama.py

The bot will listen for messages through Signal and respond using your configured LLM.
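The listen-and-respond cycle rests on two signal-cli-rest-api endpoints: GET /v1/receive/{number} to fetch pending envelopes and POST /v2/send to reply. A minimal polling sketch, assuming the settings shown above (the helpers themselves are hypothetical, not the project's actual code):

```python
import requests

SIGNAL_URL = 'http://localhost:8080'   # signal-cli-rest-api endpoint
SIGNAL_NUMBER = '+123456789'           # Your Signal number


def receive_messages(signal_url=SIGNAL_URL, number=SIGNAL_NUMBER):
    """Fetch pending message envelopes for the bot's number."""
    resp = requests.get(f'{signal_url}/v1/receive/{number}', timeout=30)
    resp.raise_for_status()
    return resp.json()


def send_message(recipient, text, signal_url=SIGNAL_URL, number=SIGNAL_NUMBER):
    """Send a text message back through signal-cli-rest-api."""
    resp = requests.post(
        f'{signal_url}/v2/send',
        json={'number': number, 'recipients': [recipient], 'message': text},
    )
    resp.raise_for_status()
    return resp.json()
```

A main loop would then repeatedly call `receive_messages`, pass each incoming text to the LLM, and hand the answer to `send_message`.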
