How to Fix CORS Issues with ThunderAI and Ollama on Docker
In Gnoppix we use the ThunderAI BetterBird Email add-on together with a local Ollama instance running inside a Docker container. With this setup you might encounter connection errors, which are usually due to Cross-Origin Resource Sharing (CORS) restrictions.
By default, Ollama only allows requests from specific origins. Since ThunderAI runs as a browser extension within Thunderbird, you need to explicitly permit its “origin” so the two can communicate.
The Fix: Setting OLLAMA_ORIGINS
To allow ThunderAI to access your Ollama API, you must set the OLLAMA_ORIGINS environment variable when starting your container.
Using the Docker Command Line
If you are launching your Ollama container from the terminal, add a -e flag to set the environment variable.
Run the following command to start a new container with the correct permissions:
```shell
docker run -d \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  -e "OLLAMA_ORIGINS=moz-extension://*" \
  ollama/ollama
```
Why this works
- `-e "OLLAMA_ORIGINS=moz-extension://*"`: tells Ollama to accept requests coming from any Mozilla-based extension (which includes Thunderbird add-ons).
- `-p 11434:11434`: maps the standard Ollama port so Thunderbird can find it at `http://localhost:11434`.
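To confirm the setting took effect, you can send a request with an explicit Origin header; when the origin is allowed, Ollama's response includes an Access-Control-Allow-Origin header. A quick sketch, assuming Ollama is listening on the default port (the `moz-extension://test` origin is just a stand-in for what ThunderAI sends):

```shell
# Query the model list while pretending to be a Mozilla extension.
# Look for "Access-Control-Allow-Origin" in the response headers:
# if it is present, CORS is configured correctly.
curl -i -H "Origin: moz-extension://test" http://localhost:11434/api/tags
```

If the header is missing, double-check that the container was started with the OLLAMA_ORIGINS variable and restart it if necessary.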
Alternative: Granting “All URLs” Permission
If you cannot modify your Docker startup command or do not have access to the server configuration, ThunderAI provides a built-in workaround:
- Open the ThunderAI Options page.
- Look for the button to grant “All URLs” permission.
- Once granted, the extension can bypass standard CORS checks for the Ollama API.
Notes
The Ollama CORS configuration can vary slightly depending on your setup, for example if you run Ollama as a systemd service, which you do in Gnoppix. See OLLAMA on Linux: A Guide to CPU/GPU-Centric Computing for details.
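For a systemd-managed Ollama (no Docker), the same variable can be set through a drop-in override. A minimal sketch, assuming the service is installed under the name ollama.service (as the official Linux installer does):

```shell
# Create a drop-in override that sets OLLAMA_ORIGINS for the service.
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_ORIGINS=moz-extension://*"
EOF

# Reload systemd and restart Ollama so the new environment applies.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

Alternatively, `sudo systemctl edit ollama` opens an editor for the same override file.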