How to run a local Copilot

With Ollama and Continue


Install Ollama

brew install ollama

This installs Ollama via Homebrew, which also makes upgrading easy: just run brew upgrade ollama.

Start the Ollama server

ollama serve

This starts the Ollama server in the foreground.

Alternatively, you can run Ollama as a background service that restarts at login:

brew services start ollama
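Before moving on, it can help to confirm the server is actually reachable. The sketch below is a minimal health check, assuming Ollama's default listen address of http://localhost:11434 (its standard port); the function name is illustrative, not part of any API.

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def server_is_running(base_url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if a server answers with HTTP 200 at base_url.

    A running Ollama server responds to GET / on its default port.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.getcode() == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: no server listening.
        return False

# Usage, once `ollama serve` (or the brew service) is up:
# print(server_is_running())  # expect True
```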

Download the codellama model weights

ollama pull codellama:latest

You can also try other models from the Ollama model library.
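Once the weights are pulled, you can sanity-check the model outside the editor by calling the Ollama server's HTTP API. This is a minimal sketch, assuming the default endpoint http://localhost:11434 and its /api/generate route; the helper names are illustrative only.

```python
import json
import urllib.request

def build_generate_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON response instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "codellama",
             base_url: str = "http://localhost:11434") -> str:
    """Send a one-shot completion request to a local Ollama server."""
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = urllib.request.Request(
        base_url + "/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming reply carries the text in a "response" field.
        return json.loads(resp.read())["response"]

# Usage (requires `ollama serve` running and the model pulled):
# print(generate("Write a Python function that reverses a string."))
```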

Download and install Continue VSCode extension

Visit https://continue.dev and follow the installation instructions.

Configure Continue

Edit ~/.continue/config.py to set the default model to codellama:

# ~/.continue/config.py
from continuedev.src.continuedev.libs.llm.ollama import Ollama

config = ContinueConfig(
    ...
    models=Models(
        default=Ollama(model="codellama"),
    ),
)

See also: https://continue.dev/docs/walkthroughs/codellama

Set up VSCode

Open VSCode, hit Cmd+Shift+P, and select "Continue: Focus on view".

  • You can now select code and comments, then type "/edit" in the Continue sidebar to start a completion.

Other setups

  • Cody: work in progress