OSS VS Code plugin — bring your own LLM
Open-source VS Code and JetBrains extension. Connect any LLM via Ollama, LiteLLM, or cloud APIs. Fully customizable.
Your primary coding environment — the IDE where you spend most of your time
AIchitect's Genome scanner detects Continue in your project via these signals:
- `@continuedev/config-types`
- `.continue/config.json`
- `.continue/config.ts`
- `.continue/`

Continue's config accepts Ollama's local API as a model provider — any model running in Ollama appears as a completion and chat option in Continue.
→ Full AI pair programming with zero API costs or data egress — local models power the editor experience.
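A minimal sketch of what that wiring can look like in `.continue/config.json` — the model names and the autocomplete entry here are illustrative assumptions, not a prescribed setup; check Continue's docs for the current schema:

```json
{
  "models": [
    {
      "title": "Llama 3 (local)",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://localhost:11434"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Local autocomplete (assumed model)",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

With a config like this, both chat and tab completion run against the Ollama daemon on its default port, so no tokens leave the machine.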
Continue points to LiteLLM's OpenAI-compatible proxy endpoint, routing all completions and chat through any provider LiteLLM supports.
→ Model flexibility in Continue without changing editor config — swap providers at the LiteLLM level.
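One way this might look, assuming a LiteLLM proxy running locally on its default port 4000: Continue treats the proxy as an ordinary OpenAI-compatible endpoint, and the model alias (`my-routed-model` here is a placeholder you would define in LiteLLM's own config) resolves to whatever backend LiteLLM maps it to:

```json
{
  "models": [
    {
      "title": "Via LiteLLM proxy",
      "provider": "openai",
      "model": "my-routed-model",
      "apiBase": "http://localhost:4000"
    }
  ]
}
```

Swapping the underlying provider then means editing LiteLLM's `model_list`, not the editor config.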
Continue supports MCP as a client — any MCP server registered in its config becomes a tool available during chat and agent sessions.
→ Continue gains the same MCP tool ecosystem as Cursor and Claude Code — context7, GitHub, browser, and custom servers.
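A sketch of registering an MCP server in Continue's YAML config — the `mcpServers` key shape is taken from Continue's config format as I understand it, and the filesystem server and path are illustrative assumptions:

```yaml
mcpServers:
  - name: Filesystem
    command: npx
    args:
      - "-y"
      - "@modelcontextprotocol/server-filesystem"
      - "/path/to/project"
```

Once registered, the server's tools show up alongside Continue's built-ins during chat and agent sessions, the same way they do in other MCP clients.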
[Explore the full AI landscape](https://aichitect.dev/tool/continue)
See how Continue fits into the bigger picture — browse all 207 tools and their relationships.