Free for macOS

Local LLMs on Apple Silicon

Download, manage, and serve language models locally on your Mac — powered by Apple MLX. No Python. No cloud. No hassle.

Download for macOS · Learn More

Everything you need to run LLMs locally

🔍

Browse HuggingFace

Search and download MLX-optimized models directly from HuggingFace. One click to install.

💬

Built-in Chat

Chat with your models in a native macOS interface. Conversation history, configurable prompts, and streaming responses.

🔌

OpenAI-Compatible API

Built-in HTTP server with OpenAI-compatible endpoints. Drop-in replacement for any app that supports the OpenAI API.
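Because the server speaks the OpenAI wire format, any generic HTTP client can talk to it. A minimal sketch of building a chat-completions request — the port, base path, and model name below are assumptions; use whatever MLXHub's server panel reports on your machine:

```python
import json
import urllib.request

# Hypothetical values -- adjust to match your local MLXHub server settings.
BASE_URL = "http://localhost:8080/v1"
MODEL = "mlx-community/Llama-3-8B-Instruct-4bit"

def build_chat_request(prompt: str, base_url: str = BASE_URL) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions POST for the local server."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello!")
# With the server running, send it with:
#   resp = urllib.request.urlopen(req)
#   print(json.load(resp)["choices"][0]["message"]["content"])
```

The same request shape works with any OpenAI client library by pointing its base URL at the local server instead of api.openai.com.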

🍎

Native Apple Silicon

Powered by the Apple MLX framework. Runs entirely on your Mac's GPU with zero Python dependencies.

📦

Model Management

See installed models, GPU memory usage, and model sizes at a glance. Load and unload with one click.

🔒

Fully Private

Everything runs on your machine. No data leaves your Mac. No accounts, no telemetry, no cloud.

MLXHub — Models