Apple Silicon GPUs, Docker and Ollama: Pick two.
If you’ve tried to use Ollama with Docker on an Apple Silicon Mac lately, you may have discovered that the GPU isn’t available inside the container. But you can still get Ollama running with GPU acceleration on a Mac. This article will explain the problem, how to detect it, and how to get your Ollama workflow running with all of your VRAM (which, on a Mac with unified memory, is your DRAM too)!
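As a quick preview of the detection step, a simple way to see whether a running Ollama instance is actually using the GPU is `ollama ps`, which reports the processor each loaded model is running on. The model name below is just an example; use whatever model you have pulled.

```sh
# Load a model so it appears in the process list (llama3.2 is just an example).
ollama run llama3.2 "Say hi" > /dev/null

# List running models. The PROCESSOR column shows "GPU" when Metal
# acceleration is active, and "CPU" when it is not (for instance, inside
# a Docker container on a Mac).
ollama ps
```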