llm.dev.recipes

✨ Highlights

- How to run CodeLlama using llama.cpp
- The easiest way to run an LLM on macOS
- How to run a local code Copilot
- Which models can be run on M2 Max?
- About quantization