# mmai-bot

Mattermost bot that uses [Ollama](https://ollama.ai/) to generate responses to user input.

## Installation

To run the bot, follow these steps:

1. Set up your environment variables in the `docker-compose.yml` file:

* `MATTERMOST_URL`: The URL of your Mattermost instance (default "https://mm.example.com")
* `MATTERMOST_PORT`: The port your Mattermost instance listens on (default 443)
* `MATTERMOST_TOKEN`: Your Mattermost bot token
* `MATTERMOST_TEAM`: The name of the Mattermost team the bot should interact with (default "my-team")
* `OLLAMA_URL`: The URL of the Ollama API endpoint (default "http://localhost:11434")
* `OLLAMA_MODEL`: The name of the Ollama model to use (default "llama2")
* `OLLAMA_CTX_TIMEOUT`: Time, in minutes, before the context is automatically flushed (default 60)

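For reference, the `environment` section of your `docker-compose.yml` might look roughly like this (the service name, image, and all values are illustrative placeholders — adjust them to your setup):

```yaml
services:
  mmai-bot:
    # Service name and image are placeholders; match your actual compose file.
    image: mmai-bot
    environment:
      MATTERMOST_URL: "https://mm.example.com"
      MATTERMOST_PORT: "443"
      MATTERMOST_TOKEN: "your-bot-token"
      MATTERMOST_TEAM: "my-team"
      OLLAMA_URL: "http://localhost:11434"
      OLLAMA_MODEL: "llama2"
      OLLAMA_CTX_TIMEOUT: "60"
```
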
2. Run the bot using Docker Compose:

```
docker compose up -d
```

This will start the bot in the background, listening for incoming requests on your Mattermost instance.

You can also run it alongside an Ollama container, but make sure your host has at least 8 GB of free memory and enough CPU capacity:

```
docker compose -f docker-compose-ollama.yml up -d
docker compose exec ollama ollama pull <model>
```

## Usage

You can simply send a direct message to the bot to start a conversation.<br>
It will keep track of the context until you send it the command `flush` to reset its memory.<br>
The context is also cleared after a [given time](https://git.rznet.fr/tchivert/mattermost-ai-bot/src/branch/main/docker-compose.yml#L19) (default 60m).

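The timed context flush described above can be sketched roughly like this (class and method names are hypothetical — the bot's actual internals may differ):

```python
import time

# Hypothetical sketch of per-user context with a timed flush;
# the real bot's implementation may differ.
class ContextStore:
    def __init__(self, timeout_minutes=60):
        self.timeout = timeout_minutes * 60  # seconds
        self.contexts = {}  # user_id -> (messages, last_activity)

    def append(self, user_id, message):
        messages, last = self.contexts.get(user_id, ([], time.monotonic()))
        # Drop the stored context if it has gone stale.
        if time.monotonic() - last > self.timeout:
            messages = []
        messages.append(message)
        self.contexts[user_id] = (messages, time.monotonic())
        return messages

    def flush(self, user_id):
        # Equivalent of the `flush` command: forget this user's context.
        self.contexts.pop(user_id, None)

store = ContextStore(timeout_minutes=60)
store.append("alice", "hello")
store.append("alice", "how are you?")
store.flush("alice")
```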
There are also special commands you can send to the bot:

- `web`: Followed by a prompt and a URL, this command tells the bot to read a webpage and process it as requested by the prompt.
- `flush`: Clears the bot's context for your Mattermost user.

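The fetch-and-prompt step behind the `web` command might look roughly like this (a sketch only — the function name and HTML handling are assumptions, not the bot's actual code):

```python
from urllib.request import urlopen
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text content of an HTML page, discarding tags."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def build_web_prompt(prompt, url, max_chars=4000):
    # Fetch the page, strip markup, and prepend the user's prompt.
    # Truncation keeps the request within a reasonable context size.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    extractor = TextExtractor()
    extractor.feed(html)
    text = " ".join(" ".join(extractor.parts).split())[:max_chars]
    return f"{prompt}\n\n{text}"

# Example using a data: URL, so no network access is needed:
print(build_web_prompt("Summarize:", "data:text/html,<p>hello%20world</p>"))
```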
### Example

```
Usr> web Summarize in a few words the scope of this project: https://git.rznet.fr/tchivert/mattermost-ai-bot/src/branch/main/README.md
Bot> This project is a Mattermost bot that uses OllaMA to generate responses to user input. The bot can be run [...]
Usr> flush
Bot> Context flushed.
```

## Acknowledgments

This bot was created using the following libraries and tools:

* [mmpy-bot](https://github.com/attzonko/mmpy_bot)
* [ollama](https://ollama.ai/)

## License

This project is licensed under the MIT License. See the [LICENSE](https://git.rznet.fr/tchivert/mattermost-ai-bot/src/branch/main/LICENSE) file for more information.