vscode code completion using continue extension

Continue is an AI code assistant that you can use in vscode and that supports different models. In this setup, we are going to set up Continue to make your vscode smarter and help you code faster.

To do that, ensure you have vscode installed, then install the VSCode extensions called Continue and Cline.

Apparently, the Continue extension has a dependency on the Llama 3.1 model.


Once you have that installed, you also need to install Ollama and keep it running. We also need to download a model called "qwen2.5-coder:1.5b-base", which will be used for the local agent. Normally, the Ollama executable gets installed here: "C:\Users\your-user-path\AppData\Local\Programs\Ollama". Once you have access to that, download the models by running the following commands:

ollama.exe pull llama3.1:8b

ollama.exe pull qwen2.5-coder:1.5b-base
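
Before moving on, it is worth doing a quick sanity check that Ollama is running and that both models were pulled (this step is optional and the output will vary on your machine):

ollama.exe list

Both llama3.1:8b and qwen2.5-coder:1.5b-base should appear in the output.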




Continue should be able to auto-detect that. As you can see here, I have Ollama running and my qwen coder model downloaded.


We can configure which models to use by editing the config.yaml file here (depending on your user account, the path may vary):

C:\Users\your-user-name\.continue\config.yaml
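
As a rough sketch, a minimal config.yaml pointing Continue at the two local Ollama models could look something like this (the model names match what we pulled above; the exact schema and role names are assumptions and may differ between Continue versions):

name: Local Assistant
version: 1.0.0
schema: v1
models:
  - name: Llama 3.1 8B
    provider: ollama
    model: llama3.1:8b
    roles:
      - chat
      - edit
  - name: Qwen2.5 Coder 1.5B
    provider: ollama
    model: qwen2.5-coder:1.5b-base
    roles:
      - autocomplete

After saving the file, reload vscode (or the Continue extension) so the new configuration gets picked up.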

So what you have right now is basic code completion. We can quickly test that out in the editor, as shown here.
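
For example (a hypothetical snippet, any language will do), start typing a function and pause for a moment; the qwen2.5-coder model should suggest the rest as greyed-out ghost text that you can accept with Tab:

# start typing this in a .py file and pause; Continue should suggest the body inline
def fibonacci(n):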


And you can also chat with it.








