In many conversations with people at BSD conferences, I noticed that they would love to see a chatbot that provides precise information on FreeBSD, for users, admins, and developers alike.

I strongly believe that there should not be an official chat.freebsd.org. Local chatbots work well and can be tweaked to fit personal needs.

This documentation is written for macOS with Apple Silicon (because of the GPU support), but should work on other OSes as well.


Step 1: Install Ollama (API for Multiple LLMs)

brew install ollama
ollama pull gemma3:latest

You can try deepseek-r1:latest or even deepseek-r1:70b on more powerful GPUs.
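Ollama runs as a local server on port 11434. If it is not already running, start it (or let Homebrew run it as a background service) and do a quick smoke test. This is a minimal check, assuming the default port:

ollama serve                      # or: brew services start ollama

# in another terminal: ask the local API a question
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3:latest",
  "prompt": "Name three FreeBSD package management tools.",
  "stream": false
}'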

Step 2: Install Open-WebUI for a UI and Built-in Vector Database

curl -LsSf https://astral.sh/uv/install.sh | sh
# --port 5000 so the URLs below match; open-webui defaults to port 8080
DATA_DIR=~/.open-webui uvx --python 3.11 open-webui@latest serve --port 5000

Now browse to http://localhost:5000/

Welcome to your own, local chatbot!
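Open-WebUI should detect the local Ollama instance automatically. If no models show up, point it at the Ollama API explicitly via the OLLAMA_BASE_URL environment variable (a sketch, assuming Ollama listens on its default port 11434):

OLLAMA_BASE_URL=http://localhost:11434 DATA_DIR=~/.open-webui \
  uvx --python 3.11 open-webui@latest serve --port 5000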

Step 3: Feed Knowledge to the Model

This is where the real work begins: You need to tell the LLM what’s right or wrong, what’s necessary, and how FreeBSD differs from Linux.

3.1 Download the FreeBSD Documentation

Follow the FreeBSD Documentation Project Primer for more details (the link is at the end of this section).

Install the dependencies (the commands below use Homebrew; use apt, yum, etc. according to your OS):

brew install hugo ruby git bmake

Update your shell configuration:

echo 'export PATH="$(brew --prefix ruby)/bin:$PATH"' >> ~/.zshrc
echo 'export PATH="$(brew --prefix hugo)/bin:$PATH"' >> ~/.zshrc
echo 'export GEM_PATH="$(gem environment gemdir)"' >> ~/.zshrc
echo 'export PATH="${GEM_PATH}/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc
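A quick sanity check that the Homebrew Ruby and Hugo now come first in your PATH (exact versions will differ):

command -v ruby hugo
ruby --version
hugo version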

Install required gems:

sudo gem install rouge asciidoctor asciidoctor-pdf asciidoctor-epub3
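To confirm that the gems landed in the Homebrew Ruby rather than the system one, check where asciidoctor resolves from:

command -v asciidoctor
asciidoctor --version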

Clone and build the FreeBSD documentation:

git clone https://git.FreeBSD.org/doc.git ~/doc
cd ~/doc/documentation
bmake run USE_RUBYGEMS=YES RUBY_CMD=$(brew --prefix ruby)/bin/ruby

More information can be found here: https://docs.freebsd.org/en/books/fdp-primer/overview/#mac-os-installation-process
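Before moving on, make sure the rendered books actually ended up where the next step expects them (assuming the build completed without errors):

ls ~/doc/documentation/public/en/books
# expect one directory per book: handbook, porters-handbook, developers-handbook, ...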

3.2 Upload Documentation to Open-WebUI

Go to: http://localhost:5000/workspace/knowledge

Click +

Add a knowledge base called "FreeBSD Official Docs"

Add the folder: ~/doc/documentation/public/en/books
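Clicking through the upload dialog works fine, but it can also be scripted. Open-WebUI exposes REST endpoints for files and knowledge bases; the following is only a sketch based on those endpoints, where $OPEN_WEBUI_API_KEY is an API key created under Settings > Account and $KNOWLEDGE_ID is the ID visible in the knowledge base's URL:

# upload one rendered page (loop over the files you want to add)
curl -X POST http://localhost:5000/api/v1/files/ \
  -H "Authorization: Bearer $OPEN_WEBUI_API_KEY" \
  -F "file=@$HOME/doc/documentation/public/en/books/handbook/index.html"

# attach the uploaded file (use the id from the previous response) to the knowledge base
curl -X POST http://localhost:5000/api/v1/knowledge/$KNOWLEDGE_ID/file/add \
  -H "Authorization: Bearer $OPEN_WEBUI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"file_id": "<id-from-upload-response>"}'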

3.3 Create a Model Workspace

Go to: http://localhost:5000/workspace/models

Add a new workspace called "FreeBSD Helper"

Base Model: gemma3:latest
(Or whatever you downloaded via `ollama pull`)

System Prompt

You are a bot that helps to use, administer, and develop everything that has to do with FreeBSD.
You give technical responses and are very precise. Make sure you use proper FreeBSD tools and don't mistake them for their Linux counterparts.
You only use the knowledge provided.

Select Knowledge: “FreeBSD Official Docs”

Step 4: Chat Away

Now open a new chat window and select “FreeBSD Helper” as your model.
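If you would rather script your questions than type them into the browser, Open-WebUI also speaks an OpenAI-compatible chat API. A minimal sketch, where $OPEN_WEBUI_API_KEY again comes from Settings > Account and freebsd-helper is assumed to be the slug Open-WebUI gave your workspace model:

curl http://localhost:5000/api/chat/completions \
  -H "Authorization: Bearer $OPEN_WEBUI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "freebsd-helper",
    "messages": [
      {"role": "user", "content": "How do I update the ports tree on FreeBSD?"}
    ]
  }'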

You might be tempted to add more data, but remember:

The more unstructured the data, the more likely the LLM will give poor results. 
For example, if you feed it your PDF invoices, it will simply fail.
(There are tools for parsing those correctly, but that's not part of this post. It wouldn't fit into ten posts. :-))

You can use the slider at the top left to lower the Temperature value; this gives you more precise (less random) answers.
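The same knob exists outside the UI: when talking to Ollama directly, temperature is passed as a request option (0.8 is the usual default; lower means less randomness), roughly like this:

curl http://localhost:11434/api/generate -d '{
  "model": "gemma3:latest",
  "prompt": "Which command lists installed packages on FreeBSD?",
  "options": { "temperature": 0.2 },
  "stream": false
}'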

The hardest part is getting the source data into proper shape—which, thankfully, has already been done by the amazing FreeBSD documentation team.