
mmai-bot

A Mattermost bot that uses Ollama to generate responses to user input.

Installation

To run the bot, follow these steps:

  1. Set up your environment variables in the docker-compose.yml file (an example snippet follows these steps):

    • MATTERMOST_URL: The URL of your Mattermost instance (default "https://mm.example.com")
    • MATTERMOST_PORT: The port number where your Mattermost instance is listening (default 443)
    • MATTERMOST_TOKEN: Your Mattermost bot token
    • MATTERMOST_TEAM: The name of the team you want to interact with in Mattermost (default "my-team")
    • OLLAMA_URL: The URL of the Ollama API endpoint (default "http://localhost:11434")
    • OLLAMA_MODEL: The name of the Ollama model you want to use (default "llama2")
    • OLLAMA_CTX_TIMEOUT: Time, in minutes, before the context is automatically flushed (default 60)
  2. Run the bot using Docker Compose:

docker compose up -d

This will start the bot in the background and listen for incoming requests on your Mattermost instance.
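
For reference, the variables from step 1 go in the environment section of the bot service in docker-compose.yml. A minimal sketch (the service name and the token value are placeholders; the other values are the defaults listed above):

services:
  mmai-bot:
    environment:
      MATTERMOST_URL: "https://mm.example.com"
      MATTERMOST_PORT: 443
      MATTERMOST_TOKEN: "<your-bot-token>"
      MATTERMOST_TEAM: "my-team"
      OLLAMA_URL: "http://localhost:11434"
      OLLAMA_MODEL: "llama2"
      OLLAMA_CTX_TIMEOUT: 60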

You can also run it alongside an Ollama container, but make sure your host has at least 8 GB of free memory and sufficient CPU resources:

docker compose -f docker-compose-ollama.yml up -d
docker compose exec ollama ollama pull <model>
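
For example, to pull the default model listed above:

docker compose exec ollama ollama pull llama2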

Usage

You can simply send a direct message to the bot to start a conversation.
It will keep track of the context until you send it the flush command to reset its memory.
The context is also cleared automatically after a configurable timeout (default 60 minutes, see OLLAMA_CTX_TIMEOUT).
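
As a rough sketch of how per-user context expiry could work (the ContextStore class and its method names are illustrative, not the bot's actual code):

import time

class ContextStore:
    """Illustrative per-user context store with timeout-based flushing."""

    def __init__(self, timeout_minutes=60):
        self.timeout = timeout_minutes * 60
        self.contexts = {}  # user_id -> (last_activity_timestamp, messages)

    def get(self, user_id):
        entry = self.contexts.get(user_id)
        if entry is None or time.time() - entry[0] > self.timeout:
            # Context is missing or stale: start fresh
            self.contexts[user_id] = (time.time(), [])
        return self.contexts[user_id][1]

    def append(self, user_id, message):
        messages = self.get(user_id)
        messages.append(message)
        self.contexts[user_id] = (time.time(), messages)

    def flush(self, user_id):
        # Equivalent to the "flush" command: drop the user's context
        self.contexts.pop(user_id, None)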

There are also special commands you can send to the bot:

  • web: Followed by a prompt and a URL, this command tells the bot to fetch the webpage and process it as requested by the prompt (see the sketch after this list).
  • flush: Clears the bot's context for your Mattermost user.
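
As an illustration of what the web command could do under the hood (a sketch only, assuming the requests library and Ollama's /api/generate endpoint; the function name and the plain-text page handling are not the bot's actual code):

import requests

OLLAMA_URL = "http://localhost:11434"  # defaults from the installation section
OLLAMA_MODEL = "llama2"

def handle_web_command(prompt: str, url: str) -> str:
    # Illustrative only: fetch the page, prepend the user's prompt, and ask Ollama
    page = requests.get(url, timeout=30).text
    full_prompt = f"{prompt}\n\nPage content:\n{page}"
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": OLLAMA_MODEL, "prompt": full_prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]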

Example

Usr> web Summarize in a few words the scope of this project: https://git.rznet.fr/tchivert/mattermost-ai-bot/src/branch/main/README.md
Bot> This project is a Mattermost bot that uses Ollama to generate responses to user input. The bot can be run [...]
Usr> flush
Bot> Context flushed.

Acknowledgments

This bot was created using the following libraries and tools:

License

This project is licensed under the MIT License. See the LICENSE file for more information.
