

Use local LLMs

Overview

You can use Dyad with local LLMs through LM Studio (free for personal use) or Ollama (free and open source), tools that make it easy to run open-source language models directly on your device.

Running local LLMs can be resource-intensive, especially with larger or more advanced models. Smaller models may have trouble following Dyad's instructions and could produce less useful responses.

How to Use

Once LM Studio or Ollama is running on your device, open the model picker in Dyad, select Local Models, and you'll see the available models.

Troubleshooting

LM Studio

If no models appear in the LM Studio local models list, make sure LM Studio's server is running at http://localhost:1234 and that the models you want to use are loaded.
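
As a quick check from outside Dyad, you can query LM Studio's OpenAI-compatible REST API directly. This is a minimal sketch assuming the standard /v1/models listing endpoint; the `listModels` helper name is illustrative and not part of Dyad or LM Studio:

```ts
// Minimal reachability check for a local LM Studio server.
// Assumes the OpenAI-compatible /v1/models endpoint (Node 18+ for global fetch).
async function listModels(baseUrl: string = "http://localhost:1234"): Promise<string[]> {
  const res = await fetch(`${baseUrl}/v1/models`);
  if (!res.ok) {
    throw new Error(`LM Studio responded with HTTP ${res.status}`);
  }
  // The endpoint returns { data: [{ id: "model-name", ... }, ...] }.
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}

listModels()
  .then((models) => console.log("Loaded models:", models))
  .catch((err) => console.error("LM Studio is not reachable:", err));
```

If this prints an empty list, the server is up but no models are loaded; if it errors, the server itself is not reachable.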

If you want to connect to an LM Studio instance running on a port other than 1234, you can create a custom AI provider and models.
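
The same sketch above works against a non-default port; point it at whichever base URL your custom provider uses (port 1235 here is just a hypothetical example):

```ts
// Hypothetical custom port; match it to your LM Studio server settings.
listModels("http://localhost:1235")
  .then((models) => console.log("Loaded models:", models));
```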

Ollama

Dyad connects to Ollama at the host named by the OLLAMA_HOST environment variable, falling back to http://localhost:11434 when OLLAMA_HOST is not set.
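
In code, that fallback looks roughly like the sketch below; the `resolveOllamaHost` helper is illustrative, not Dyad's actual implementation. Note that OLLAMA_HOST is often set without a scheme (e.g. `0.0.0.0:11434`):

```ts
// Resolve the Ollama endpoint: prefer OLLAMA_HOST, else the default port.
// Illustrative sketch only; Dyad's real resolution may differ in details.
function resolveOllamaHost(): string {
  const host = process.env.OLLAMA_HOST ?? "http://localhost:11434";
  // Accept scheme-less values like "0.0.0.0:11434".
  return host.startsWith("http") ? host : `http://${host}`;
}
```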

If no models appear in the Ollama local models list, ensure that Ollama is running and that you have pulled at least one model (e.g. with `ollama pull`).
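
To confirm Ollama is up and has models available, you can hit its /api/tags endpoint, which lists locally pulled models. This sketch reuses the `resolveOllamaHost` helper from above:

```ts
// List models from a running Ollama instance via its /api/tags endpoint.
async function listOllamaModels(): Promise<string[]> {
  const res = await fetch(`${resolveOllamaHost()}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  // /api/tags returns { models: [{ name: "llama3:latest", ... }, ...] }.
  const body = (await res.json()) as { models: { name: string }[] };
  return body.models.map((m) => m.name);
}

listOllamaModels()
  .then((models) => console.log("Pulled models:", models))
  .catch((err) => console.error("Ollama is not reachable:", err));
```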
