AI/ML · Python · LLM
Creating a Local AI Copilot with Python
2024-01-20
Privacy is a major concern when using cloud-based AI coding assistants, since your source code is sent to a third-party server. Running an LLM locally keeps everything on your own machine.
The Stack
- Model: Llama 3 or Mistral (quantized).
- Inference: Ollama or llama.cpp (see the query sketch after this list).
- Integration: VS Code Extension API.
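As a minimal sketch of the inference layer, the snippet below queries a locally running Ollama server over its default REST endpoint (`http://localhost:11434/api/generate`). The model name, prompt, and helper function are illustrative assumptions, not a finished copilot.

```python
import requests

# Default endpoint exposed by a local `ollama serve` process.
OLLAMA_URL = "http://localhost:11434/api/generate"


def complete_code(prompt: str, model: str = "llama3") -> str:
    """Ask the local model for a completion and return the generated text."""
    payload = {
        "model": model,    # any model pulled locally, e.g. a quantized Llama 3 or Mistral
        "prompt": prompt,
        "stream": False,   # request a single JSON response instead of a token stream
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["response"]


if __name__ == "__main__":
    snippet = "def fibonacci(n):"
    print(complete_code(f"Complete this Python function:\n{snippet}"))
```

A VS Code extension would make a call like this over localhost for each completion request, so no code ever leaves the machine.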
...