Connecting to a local LLM with Ollama

Run a local language model with Ollama and connect Mundi to it for free, offline, AI-powered geospatial analysis on your laptop.

Mundi is an open-source web GIS that can connect to local language models via Ollama, enabling free, offline use of AI features without data leaving your machine. Mundi speaks the Chat Completions API that Ollama also exposes, so it can talk to local models such as orieg/gemma3-tools:1b. This guide demonstrates running that 1B-parameter model on a MacBook Pro with an M1 Pro chip; it is lightweight enough to respond quickly on a laptop while fitting within modest disk and memory constraints.
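Before wiring up Mundi, you can confirm the Chat Completions compatibility directly. This is a minimal sketch assuming Ollama is running on its default port (11434) and the model has already been downloaded; the prompt text is illustrative:

```shell
# Hit the OpenAI-compatible endpoint that Ollama exposes.
# Assumes `ollama serve` is running on the host.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "orieg/gemma3-tools:1b",
    "messages": [{"role": "user", "content": "What is a GeoPackage?"}]
  }'
```

A JSON response with a `choices` array confirms the endpoint works; any client that targets the Chat Completions API, Mundi included, can then use the same base URL.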

Preparation requires a self-hosted Mundi instance built with Docker Compose, Ollama installed on the host, and at least one Ollama model downloaded. The tutorial uses orieg/gemma3-tools:1b as an example because it supports tool calling. To wire Mundi to the local server, edit docker-compose.yml and set environment variables such as OPENAI_BASE_URL to http://host.docker.internal:11434/v1, OPENAI_API_KEY to a non-empty string like ollama, and OPENAI_MODEL to the model name. Adding extra_hosts with host.docker.internal mapped to host-gateway ensures the container can reach services on the host at the Ollama default port.
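The wiring described above might look like the following docker-compose.yml fragment. The service name `app` and the surrounding layout are assumptions based on a typical Compose file; the variable names and values come from the guide:

```yaml
services:
  app:
    environment:
      # Point Mundi's OpenAI-compatible client at the host's Ollama server
      - OPENAI_BASE_URL=http://host.docker.internal:11434/v1
      # Ollama ignores the key, but it must be a non-empty string
      - OPENAI_API_KEY=ollama
      - OPENAI_MODEL=orieg/gemma3-tools:1b
    extra_hosts:
      # Let the container resolve the Docker host at its gateway address
      - "host.docker.internal:host-gateway"
```

The `extra_hosts` entry matters on Linux, where `host.docker.internal` is not defined by default; mapping it to `host-gateway` lets the container reach Ollama listening on the host.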

After saving configuration changes, start the Ollama server on the host with ollama serve and, if needed, launch the model with ollama run orieg/gemma3-tools:1b. Then bring up the Mundi application using docker compose up app from the mundi.ai project directory. Once migrations complete and services report ready, open http://localhost:8000 to access Mundi and its assistant, Kue. Upload a GIS file such as a GeoPackage and ask Kue about your layer; Kue will use the local model to summarize layer type, geometry, feature count, and more, and can invoke tools if the model supports tool calling.
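The startup sequence above, sketched as shell commands (the Compose service name `app` and the port follow the source; `ollama run` downloads the model on first use, so a separate pull is not required):

```shell
# On the host: start the Ollama server (listens on port 11434 by default)
ollama serve &

# Optionally launch the model; this opens an interactive session and
# downloads the model first if it is not already present
ollama run orieg/gemma3-tools:1b

# From the mundi.ai project directory: bring up Mundi
docker compose up app

# Once migrations complete, open http://localhost:8000 in a browser
```

With both services up, uploading a GeoPackage and asking Kue about the layer exercises the local model end to end.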

The documentation also emphasizes caution: local 1B-parameter models will often give useful, concise summaries but may hallucinate details or misinterpret data relationships, as shown in an example where the model incorrectly suggested a PostGIS connection for a standalone vector file. Users should verify outputs and treat model responses as aids rather than authoritative answers.
