OllamaChat is now LocalIntelligence

November 20th, 2025
Filed under: General | Huibert @ 1:54 pm

I have recently released version 1.1 of my Ollama front-end for macOS. The new version fixes some minor bugs and adds several new features:

  • Performance measurement window
  • Share sheet to export chats in both text and Markdown format
  • Model customization (temperature, top_p, top_k, and seed)
  • Context window size control (to avoid truncation)
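
For readers curious how these settings relate to Ollama itself: temperature, top_p, top_k, seed, and the context window size correspond to fields in the `options` object of Ollama's REST API. A minimal sketch in Python of such a request body (the model name and parameter values here are placeholders, not the app's defaults):

```python
import json

# Build a request payload for Ollama's chat endpoint
# (POST http://localhost:11434/api/chat on a default install).
payload = {
    "model": "llama3",  # any locally installed model
    "messages": [{"role": "user", "content": "Hello!"}],
    "options": {
        "temperature": 0.7,  # sampling randomness
        "top_p": 0.9,        # nucleus sampling cutoff
        "top_k": 40,         # top-k sampling cutoff
        "seed": 42,          # fixed seed for reproducible output
        "num_ctx": 8192,     # context window size, to avoid truncation
    },
}

body = json.dumps(payload)
print(body)
```

A front-end like this one essentially exposes those fields as UI controls and sends the resulting payload to the local Ollama server.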

However, the real news is that I have decided to release the app on the Apple App Store to ensure wider distribution. The sandboxed version comes with one significant limitation: it cannot connect to STDIO MCP servers. Since this feature is important to many developers, I have decided to continue offering the notarized version on my site.

The decision to offer my latest app on the App Store also forced me to change the app’s name and icon. Ollama has been contacting developers who have used its name and asking them to stop, and I didn’t want that to happen to me. So now the app is called LocalIntelligence. I would have preferred something else, but it seems that all the cool names were already taken.
