
Ask HN: Anyone Using a Mac Studio for Local AI/LLM?

44 points | 28 comments | 1 day ago | by UmYeahNo
Curious to hear your experience running local LLMs on a well-specced Mac Studio with an M3 Ultra or M4 Max. I don't see much discussion of the Mac Studio for local LLMs, but it seems like you could fit big models in memory thanks to the unified memory shared with the GPU. I assume token generation would be slow, but you might get higher-quality results because you can load larger models.
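
For concreteness, this is the kind of setup I have in mind: a minimal sketch using llama-cpp-python with full Metal offload, so the whole model sits in unified memory. The model file, quant, and prompt below are just placeholder assumptions, not recommendations.

    # Sketch: run a large quantized GGUF model on Apple Silicon with
    # llama-cpp-python, offloading all layers to the GPU via Metal.
    # Model path and quantization are hypothetical placeholders.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/llama-3.3-70b-instruct.Q4_K_M.gguf",  # local GGUF file (placeholder)
        n_gpu_layers=-1,  # -1 = offload every layer to Metal / unified memory
        n_ctx=8192,       # context window; bigger contexts use more memory
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Explain unified memory on Apple Silicon in two sentences."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])

The appeal is that n_gpu_layers=-1 plus a large unified memory pool lets you run 70B-class quantized models that wouldn't fit on a typical consumer GPU, at the cost of lower tokens/sec than dedicated VRAM.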