Ever wished you could fully explore the AI capabilities of Ollama from a native macOS app? That is exactly what OllamaChat provides.
OllamaChat lets you interact with all the AI models supported by Ollama, with no need to use the command line or write Python programs. Beyond chatting with LLMs, it also supports embedding models and vision models such as IBM's Granite Vision: just drag and drop your pictures and you are ready to ask anything you want.
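For comparison, here is roughly the boilerplate OllamaChat spares you: a minimal Swift sketch of a raw chat request against Ollama's local HTTP API. It assumes Ollama is running on its default port (11434) and that a model named `llama3.2` has already been pulled; it illustrates the underlying API, not OllamaChat's actual code.

```swift
import Foundation

// Request/response shapes for Ollama's /api/chat endpoint.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
    let stream: Bool
}

struct ChatResponse: Codable {
    let message: ChatMessage
}

let payload = ChatRequest(
    model: "llama3.2",  // example model; any pulled model works
    messages: [ChatMessage(role: "user", content: "Why is the sky blue?")],
    stream: false       // request a single JSON response instead of a stream
)

var request = URLRequest(url: URL(string: "http://localhost:11434/api/chat")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try JSONEncoder().encode(payload)

// Top-level await works in main.swift (Swift 5.7+).
let (data, _) = try await URLSession.shared.data(for: request)
let reply = try JSONDecoder().decode(ChatResponse.self, from: data)
print(reply.message.content)
```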
OllamaChat also fully supports local (STDIO) MCP servers; support for secure TCP/IP MCP servers will be added in a future release. Adding a new MCP server is trivial and can be done from the Settings menu.
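For reference, most MCP clients describe a local stdio server as little more than a command and its arguments. A typical definition looks like this, using the real `@modelcontextprotocol/server-filesystem` package as an example; OllamaChat's exact settings fields may differ:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"]
    }
  }
}
```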
With MCP servers, your models can use all kinds of external tools, for example to pull in real-time data or execute remote procedure calls.
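Under the hood, tool use with Ollama works by advertising tool schemas in the chat request: the model replies with a `tool_calls` entry, the client executes the call (for example through an MCP server), and feeds the result back as a `tool` message. A sketch of such a request, with a hypothetical `get_current_weather` function:

```json
{
  "model": "llama3.2",
  "messages": [
    { "role": "user", "content": "What's the weather in Paris right now?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "The city name" }
          },
          "required": ["city"]
        }
      }
    }
  ]
}
```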
Best of all, OllamaChat is always just a click away: no need to install bloated JavaScript-wrapped applications, web servers, or additional libraries. Enjoy!
Download OllamaChat 1.0