Docker Model Runner - Run AI Models Locally
Docker has announced Model Runner for Docker Desktop. The feature is currently in preview for Docker Early Adopters, and it could be a game-changer for AI developers.

The details
- Docker Model Runner is an experimental feature in Docker Desktop v4.40+. It allows you to run large language models (LLMs) locally on your computer.
- It supports native GPU acceleration on Apple Silicon and NVIDIA GPUs. Yes, you read that right: the models run directly on your machine, not inside a Docker container.
- It’s similar to Ollama, but it integrates with the Docker tooling you already know. With the Docker Model Runner CLI, you can pull, run, and remove models much like Docker containers, and AI models are distributed via Docker Hub.
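The workflow above can be sketched with a few commands. This is a minimal example assuming the preview CLI syntax and the `ai/` model namespace on Docker Hub (the model name `ai/smollm2` is illustrative; check `docker model --help` in your Docker Desktop version for the exact commands available):

```shell
# Pull a model from Docker Hub, much like pulling an image
docker model pull ai/smollm2

# List models available locally
docker model list

# Run the model with a one-off prompt
docker model run ai/smollm2 "Explain Docker in one sentence."

# Remove the model when you no longer need it
docker model rm ai/smollm2
```

Because the commands mirror the familiar `docker pull` / `docker run` / `docker rm` flow for containers, there is very little new to learn if you already use Docker day to day.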
Our thoughts
We use Docker regularly and are excited about this announcement. It’s great that models can run directly on the computer, enabling the use of GPU acceleration.
If you’ve ever wished you could manage AI models on your computer the way you manage Docker containers, this feature delivers exactly that. We believe it will make Docker even more popular, and we will definitely be using it.
More information: 🔗 Docker | Medium
Magic AI tool
Do you ever struggle to organize your meetings at work? If so, it’s time to try an intelligent calendar scheduling tool like Reclaim.
This tool automatically schedules meetings at the best times for your team, helping everyone stay focused on important work. In addition, Reclaim offers integrations with tools like Slack, Zoom, Raycast, HubSpot, and Jira.
If you struggle to manage your time effectively, Reclaim could be the solution. Best of all, there is a free forever plan!
Ready to transform your productivity?
Hand-picked articles
- Build an AI Finance Agent Team with phidata
- Dive into Investment Research - A Beginner’s Guide to the OpenBB Platform
- Portfolio Allocation - How to Analyze a Stock Portfolio Using Python
😀 Do you enjoy our content? If so, why not support us with a small financial contribution? This helps us fund our work to ensure we can stick around long-term.