Ollama adopts MLX for faster AI performance on Apple silicon Macs

Published: Mar 31st 2026 11:18am on 9to5Mac

One of the best tools to run AI models locally on a Mac just got even better. Here’s why, and how to run it.
