Ollama Install

I started with the Docker Compose file from https://github.com/open-webui/open-webui. I edited it to point the volumes at mounts on my NAS and to change the port. I then installed a range of models, some designed for general conversation and others for coding.
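As a rough sketch of the kind of edits involved (the NAS paths, host ports, and image tags below are placeholders, not my actual values):

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - /mnt/nas/ollama:/root/.ollama        # placeholder NAS mount for model storage
    ports:
      - "11434:11434"

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      - /mnt/nas/open-webui:/app/backend/data  # placeholder NAS mount for app data
    ports:
      - "8080:8080"                           # change the host-side port here if needed
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
```

Models can then be pulled into the running container with something like `docker exec -it ollama ollama pull llama3`.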

I then wanted to add tab autocomplete to VS Code, so I used a section of TechnoTim's guide (https://technotim.live/posts/ai-stack-tutorial/). This worked fine and was overall a simple process. I look forward to trying to customise some of the models.
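For reference, tab autocomplete with the Continue extension (which that guide covers) boils down to pointing a completion model at the Ollama server in Continue's config. The host and model name here are assumptions for illustration, not necessarily what the guide or I used:

```json
{
  "tabAutocompleteModel": {
    "title": "Ollama autocomplete",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b",
    "apiBase": "http://my-server:11434"
  }
}
```

A small, fast coding model tends to work better here than a large chat model, since completions need to come back quickly as you type.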
