If you want to try this on macOS, you can use this tutorial: https://hackacad.net/post/2025-07-12-local-chatbot-rag-with-freebsd-knowledge/

From conversations with people at BSD conferences, I noticed that many would love to see a chatbot that provides precise information on FreeBSD, for users, admins, and developers alike.

I strongly believe that there should not be an official chat.freebsd.org. Local chatbots work well and can be tweaked to fit personal needs.

This how-to covers a demo setup only. Proper authentication, firewalling, jail separation, etc. are mandatory for production!
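
If you expose anything beyond localhost anyway, a few pf rules can at least restrict the demo ports used later in this how-to (1234 for Ollama, 8080 for Open-WebUI) to your LAN. This is only a sketch: the interface name and network below are assumptions you must adapt, and it is no substitute for real hardening.

# /etc/pf.conf - sketch only; adjust interface and network to your setup
ext_if = "em0"
lan = "192.168.1.0/24"
pass in quick on $ext_if proto tcp from $lan to any port { 1234, 8080 }
block in quick on $ext_if proto tcp from any to any port { 1234, 8080 }

# Load the rules
pfctl -f /etc/pf.conf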

Please make sure you have your GPU drivers installed (e.g. nvidia-driver-570.169).
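
On FreeBSD that boils down to something like the following sketch. The package name depends on your GPU generation, so verify it with pkg search nvidia first.

# Install the NVIDIA driver and load its kernel module now and on every boot
pkg install nvidia-driver
kldload nvidia-modeset
sysrc kld_list+="nvidia-modeset"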


Step 1: Install Ollama (API for Multiple LLMs)

The Ollama port does not support the latest models, so try Gemma2 first.


pkg install ollama

# Expose the Ollama host on the network - THIS IS INSECURE AND FOR DEMO PURPOSES ONLY
OLLAMA_HOST=0.0.0.0:1234 ollama serve
OLLAMA_HOST='host IP':1234 ollama ls
ollama pull gemma2:latest
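
Before moving on, you can check that the API answers over the network; /api/tags and /api/generate are Ollama's standard REST endpoints.

# List the models Ollama knows about
curl http://'host IP':1234/api/tags

# Send a one-shot test prompt (stream disabled to get a single JSON response)
curl http://'host IP':1234/api/generate -d '{"model": "gemma2", "prompt": "What is FreeBSD?", "stream": false}'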

Step 2: Install Open-WebUI for a UI and Built-in Vector Database

Currently there is no support for onnxruntime and playwright on FreeBSD, so a Linux jail is the easiest way to get around that.

If you don't know how to run debootstrap, you can use Bastille for easier deployment.

bastille bootstrap jammy
bastille create -L jammy jammy 'jail IP' 'interface'
bastille console jammy
apt update && apt install git curl
curl -LsSf https://astral.sh/uv/install.sh | sh
OLLAMA_BASE_URL=http://'Host IP':1234 DATA_DIR=~/.open-webui uvx --python 3.11 open-webui@latest serve --host 0.0.0.0 --port 8080
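
From the host you can verify that Open-WebUI is up before opening a browser; recent versions expose a /health endpoint (an assumption worth verifying against your version).

curl http://'jail IP':8080/health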

Now browse to http://'jail IP':8080/

Welcome to your own, local chatbot!

Step 3: Feed Knowledge to the Model

This is where the real work begins: You need to tell the LLM what’s right or wrong, what’s necessary, and how FreeBSD differs from Linux.

3.1 Download the FreeBSD Documentation

Follow the FreeBSD Documentation Project Primer (linked below) for more details.

Install the dependencies (use apt, yum, etc. according to your OS):

pkg install gohugo ruby git rubygem-rouge rubygem-asciidoctor rubygem-asciidoctor-epub3 rubygem-asciidoctor-pdf

Clone and build the FreeBSD documentation:

git clone https://git.FreeBSD.org/doc.git ~/doc
cd ~/doc/documentation

# Build the HTML into documentation/public/ (on macOS, per the FDP Primer linked
# below, add USE_RUBYGEMS=YES RUBY_CMD=$(brew --prefix ruby)/bin/ruby)
make

More information can be found here: https://docs.freebsd.org/en/books/fdp-primer/overview/#mac-os-installation-process
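
Before uploading anything, check that the build actually produced the HTML books:

find ~/doc/documentation/public/en/books -name '*.html' | head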

3.2 Upload Documentation to Open-WebUI

Go to: http://'jail IP':8080/workspace/knowledge

Click +

Add a knowledge base called "FreeBSD Official Docs"

Add the folder: ~/doc/documentation/public/en/books
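
Uploading a whole directory through the UI can be tedious; Open-WebUI also offers a REST API for this. The sketch below follows the upstream API docs, but treat the exact paths, the API-key setup, and the placeholder ids as assumptions to check against your version.

# Create an API key under Settings > Account in Open-WebUI first
TOKEN='your API key'

# Upload a single file into Open-WebUI's file store (returns a file id)
curl -X POST http://'jail IP':8080/api/v1/files/ \
  -H "Authorization: Bearer $TOKEN" \
  -F "file=@$HOME/doc/documentation/public/en/books/handbook/index.html"

# Attach the uploaded file to the knowledge base by its id
curl -X POST http://'jail IP':8080/api/v1/knowledge/'knowledge id'/file/add \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"file_id": "id returned by the upload"}'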

3.3 Create a Model Workspace

Go to: http://'jail IP':8080/workspace/models

Add a new workspace called "FreeBSD Helper"

Base Model: gemma2:latest
(Or whatever you downloaded via `ollama pull`)

System Prompt

You are a bot that helps with using, administering, and developing everything related to FreeBSD.
You give technical responses and are very precise. Make sure you use proper FreeBSD tools and don't confuse them with Linux tools.
You only use the knowledge provided.

Select Knowledge: “FreeBSD Official Docs”

Step 4: Chat Away

Now open a new chat window and select “FreeBSD Helper” as your model.

You might be tempted to add more data, but remember:

The more unstructured the data, the more likely the LLM will give poor results.
For example, if you feed it your PDF invoices, it will just fail.
(There are tools for parsing them correctly, but that's not part of this post. It wouldn't fit into 10 posts :-) )

You can use the slider at the top left to lower the Temperature value; this will give you more precise answers.
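
The same knob exists when talking to Ollama directly: temperature is a standard option of its generate API, and values near 0 make answers more deterministic.

curl http://'host IP':1234/api/generate -d '{"model": "gemma2", "prompt": "How do I update the FreeBSD ports tree?", "stream": false, "options": {"temperature": 0.2}}'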

The hardest part is getting the source data into proper shape—which, thankfully, has already been done by the amazing FreeBSD documentation team.