Building an LLM Workstation Using Apple Silicon

We are building a Large Language Model (LLM) workstation and server around a Mac computer with Apple Silicon. Our current machine is a Mac Mini M4 Pro. The specs for this machine are –

  • M4 Pro Chip
  • 48 GB of Unified Memory
  • 16 CPU Cores
  • 20 GPU Cores
  • 16-Core Neural Engine
  • 2 TB SSD Storage

We have a new Mac Studio M3 Ultra coming. This upgrade should give our LLM Workstation considerably more processing power. The specs for the new machine are –

  • M3 Ultra Chip
  • 512 GB of Unified Memory
  • 32 CPU Cores
  • 80 GPU Cores
  • 32-Core Neural Engine
  • 2 TB SSD Storage
  • 5 TB external Thunderbolt 5 SSD Storage
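Unified memory is the main constraint on which models the machine can run, since the GPU shares it with the rest of the system. A rough rule of thumb (an assumption for illustration, not an Apple or vendor figure) is that a model's weights need about one byte per parameter per 8 bits of quantization, plus extra headroom for the KV cache and runtime:

```python
def weights_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight memory (GB) for a quantized model.

    Rule of thumb only: params * (bits / 8) bytes, ignoring KV cache
    and runtime overhead, which add to the real footprint.
    """
    bytes_per_param = bits_per_param / 8
    return params_billions * bytes_per_param  # 1e9 params * bytes / 1e9 bytes-per-GB

# A 70B model at 4-bit quantization needs roughly 35 GB of weights:
# tight on the 48 GB Mini, comfortable on the 512 GB Studio.
print(weights_gb(70, 4))   # 35.0
print(weights_gb(405, 4))  # 202.5
```

This is why the jump from 48 GB to 512 GB matters more than the extra cores: it moves the workstation from mid-size quantized models into very large ones.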

The setup includes a pair of Apple 5K Studio Displays, which let the workstation double as our primary desktop machine.

Our LLM Workstation is a good platform for learning about Artificial Intelligence and Machine Learning, and for our planned AI projects.

We will set up the Workstation to run LLMs continuously and expose them over our home network. A web interface (Open WebUI), running in our existing Docker setup, will make them available to other computers and smart devices around the home.
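A setup like this is often expressed as a Docker Compose file. The sketch below is one plausible shape, not our exact configuration: it assumes Ollama as the model runtime (the source names only Open WebUI), and the port mappings and volume names are placeholders.

```yaml
# Sketch only: service names, ports, and image tags are assumptions.
services:
  ollama:
    image: ollama/ollama                  # LLM runtime, serves an API on 11434
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama              # persist downloaded model weights

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                       # web UI reachable from the home network
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```

One caveat worth noting on macOS: Docker runs Linux containers in a virtual machine, which cannot use the Apple GPU, so a common pattern is to run the model server natively on macOS (for Metal acceleration) and keep only Open WebUI in Docker, pointing it at the host.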

Anita's and Fred's Home Lab
