OllamaChat 1.0

October 30th, 2025
Filed under: General | Huibert @ 9:29 am

Today I released a new application, OllamaChat. It is a lightweight native Ollama client for macOS that provides a GUI for chatting with any LLM installed on your local (or, if you prefer, remote) Ollama instance.

This is a project I started because I wanted my team at IBM to learn more about how AI works, and to let them easily demonstrate IBM’s Granite models even when an Internet connection is unavailable or can’t be trusted (which is frequently the case at customer sites and public events).

OllamaChat doesn’t just support regular or reasoning models. It also works with vision models (just drag and drop an image to start chatting about it). You can even use embedding models to see how text is converted into vectors.

Finally, OllamaChat also supports MCP. In fact, MCP support is the reason I built this app: I couldn’t find an easy way to demonstrate agentic AI on my computer without installing a lot of bloated software. Right now OllamaChat works with local STDIO MCP servers and with remote (TCP/IP) unsecured servers. I plan to support secured servers in the future, but I decided to release this version now because it is already very useful at this stage.

You can download OllamaChat here.
