How to Run DeepSeek LLM Locally on Mac

Apr 10, 2025

Run DeepSeek locally on a Mac easily with LM Studio

If you follow AI news, or even tech news, you might have heard of DeepSeek by now, the powerful Chinese Large Language Model that has capabilities rivaling ChatGPT while having dramatically lower training costs. DeepSeek is designed for advanced reasoning, with general purpose natural language and chatbot abilities, task competency, and research skills, while also being excellent for coding, code generation, and logic, making it a powerful AI tool in general, and potentially for your workflow. While you can use DeepSeek anywhere through the web, and you can also download and run a DeepSeek app on your iPhone or iPad that relies on the DeepSeek cloud, another option is available for Mac users: running the DeepSeek LLM locally on your Mac.

Running DeepSeek locally can be useful for many Mac users, whether you're a developer, a researcher, or someone simply curious about exploring AI and LLM utilities. One of the most significant benefits of running DeepSeek locally is that it works offline, giving you the benefits of the DeepSeek LLM without relying on cloud services (which are linked to China, whatever you make of that), while offering more privacy and the potential freedom to customize and fine-tune the model for your particular use cases.

You must have an Apple Silicon Mac to be able to run DeepSeek locally. In this case we're going to use a free tool for Mac called LM Studio to quickly set up and use DeepSeek on a Mac. While all of this may sound complicated, and AI and LLM tech can be overwhelming to newcomers, this setup process is really quite easy, and we will walk you through it.
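If you're not sure whether your Mac is Apple Silicon, you can check About This Mac from the Apple menu, or, if you'd rather check from a script, here's a minimal Python sketch that reports the chip architecture along with free disk space (one caveat: a Python build running under Rosetta can report x86_64 even on an Apple Silicon Mac):

    # Quick requirements check: Apple Silicon and free disk space.
    import platform
    import shutil

    # Apple Silicon Macs report "arm64"; Intel Macs report "x86_64".
    is_apple_silicon = platform.machine() == "arm64"

    # Free space on the startup volume, in gigabytes.
    free_gb = shutil.disk_usage("/").free / 1e9

    print(f"Apple Silicon: {is_apple_silicon}")
    print(f"Free disk space: {free_gb:.1f} GB")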

How to Run DeepSeek Locally on Mac

Again, you must have an Apple Silicon Mac with an M-series chip to be able to run DeepSeek locally. While this hardware requirement is strict, it's also standard for running most other local models, including Apple Intelligence, Llama, and the other increasingly common local LLM tools. You'll also want at least 10GB of available disk space. If you do not have an Apple Silicon Mac, you can still use AI tools through your iPhone or iPad using various apps, or on any Mac via the web. For our purposes here, we're assuming you have an Apple Silicon Mac, in which case here's how to get DeepSeek running locally:

  1. Get LM Studio for free from lmstudio.ai
  2. Mount the disk image for LM Studio and copy the 'LM Studio' app from the disk image into your Applications folder to install it, then launch LM Studio directly from your Applications folder
  3. On first launch, you'll be presented with an onboarding splash screen; click the green button that says "Get your first LLM"
  4. At the "download your first local LLM" screen, click the green "Download" button (optionally, uncheck the box for "Enable local LLM service on login" if you don't want a daemon to run for DeepSeek every time you boot your Mac)
  5. Let the DeepSeek LLM download; it's several GB and may take a while, but once DeepSeek has downloaded locally you'll be ready to interact with it directly on your Mac

Now you're free to use and interact with DeepSeek locally on your Mac.
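Beyond chatting in the LM Studio app itself, LM Studio can also serve the downloaded model through a local OpenAI-compatible API (started from the app's Developer tab, listening on http://localhost:1234 by default). As a rough sketch of what scripting against the local model can look like, here's a minimal Python example using the openai package; the model identifier below is hypothetical, so substitute whatever name LM Studio displays for your DeepSeek download:

    # A minimal sketch: chat with the local DeepSeek model through
    # LM Studio's OpenAI-compatible server, assuming the server is
    # running on its default port (1234).
    from openai import OpenAI

    # No real API key is needed; the client just requires some value.
    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    response = client.chat.completions.create(
        model="deepseek-r1-distill-qwen-7b",  # hypothetical name; use yours
        messages=[
            {"role": "user", "content": "Explain recursion in one short paragraph."}
        ],
    )

    print(response.choices[0].message.content)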

If you wish to confirm that your experience is entirely local, turn off your Wi-Fi or otherwise disconnect from the internet, and you'll see that you can still interact with LM Studio and DeepSeek as you wish.
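If you started the local server described above, another quick offline check is to ask it which models it's serving; with Wi-Fi off, a request to localhost should still succeed since nothing leaves your Mac. Here's a small Python sketch using only the standard library, again assuming the server's default port of 1234:

    import json
    import urllib.request

    # List the models LM Studio is serving via its /v1/models endpoint;
    # this works with no internet connection because it never leaves localhost.
    with urllib.request.urlopen("http://localhost:1234/v1/models") as resp:
        models = json.load(resp)

    for model in models["data"]:
        print(model["id"])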

Something you'll likely notice is that running any LLM locally is probably going to be slower than using a cloud-based LLM. That's because cloud LLMs have huge amounts of powerful resources dedicated exclusively to running the models, whereas an LLM running locally is limited by your Mac's resources, including its processor, its memory, and whatever else is running on the machine. If you're interested exclusively in speed and performance, you'll likely want to use a cloud provider instead, though the latest and most powerful Macs are also quite speedy.

You can also use LM Studio on the Mac to run other LLMs locally; aside from DeepSeek this includes Llama, Mistral, and Phi, but we're obviously focusing on DeepSeek here. We have discussed other options along the lines of local models in the past, including running the Llama LLM locally on the Mac (including an uncensored model!), but this approach to running DeepSeek (and other LLMs if you're curious) locally is really quite easy, and the performance is good too. Thanks to CultOfMac for the inspiration.

Personally, I'm a huge fan of the Mac client for ChatGPT, which I use frequently, but there's also Perplexity, and many other interesting clients and AI tools out there. That's aside from the dozens of web-based or app-based options too, including Bing with ChatGPT, Google Gemini, X's Grok, Facebook's Llama, and many others.

What do you think of running DeepSeek locally on your Mac? What do you think of LLM and AI tools in general? Do you have a preferred local LLM, or a preferred LLM client? Share your thoughts and experiences with DeepSeek, LLMs, and AI in general in the comments!


Posted by: Paul Horowitz in Mac OS, Tips & Tricks
