A Docker Compose setup for Open WebUI, Ollama and Fabric services with GPU support.
- webui: the Open WebUI front end. It is exposed on port 3000 of the host and depends on the ollama service.
- ollama: the Ollama model server, required by webui.
- fabric: the Fabric service, which depends on ollama. It is used for managing Obsidian files and is configured to run indefinitely.
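The three services above could be wired together along these lines. This is a hedged sketch, not the project's actual docker-compose.yml — the image tags, the fabric build step, and the volume layout are assumptions; check the file in the repository:

```yaml
services:
  webui:
    image: ghcr.io/open-webui/open-webui:main   # assumed image tag
    ports:
      - "3000:8080"            # Open WebUI listens on 8080 inside the container
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist pulled models across restarts

  fabric:
    build: ./fabric            # hypothetical; the project may build or pull Fabric differently
    depends_on:
      - ollama
    command: sleep infinity    # keeps the container running indefinitely

volumes:
  ollama:
```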
- Clone the project with Git: `git clone https://github.com/aliuosio/ai.git`
- Navigate into the project directory: `cd ai`
- Copy `.env.temp` to `.env`
- Run `docker compose up -d` to create and start your environment.
- Pull the default model: `docker compose exec ollama ollama pull llama3.2:3b` (llama3.2:3b is the default set in `.env.temp`). Find more models at https://ollama.com/library
The Web UI is accessible at http://localhost:3000
To open a shell in the Fabric container, run `docker compose exec fabric bash`
Please note that this setup is configured for GPU acceleration and requires NVIDIA drivers that work with Docker.
To enable Docker to use NVIDIA drivers on a Linux system:
- Install the NVIDIA Container Toolkit: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
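Once the toolkit is installed, the compose file grants a service GPU access through a device reservation, following the Compose `deploy` syntax. A sketch of the relevant fragment — attaching it to the ollama service is an assumption about this project:

```yaml
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # or a number, to limit how many GPUs are exposed
              capabilities: [gpu]
```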