llama stack - setting up using WSL Ubuntu 24 and downloading a model
Python 3.12 is installed by default in Ubuntu 24. You can install pip by running the following command: "sudo apt install python3-pip"
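If you want a quick sanity check before continuing (standard commands, nothing specific to llama), you can confirm both tools are available:
python3 --version
pip3 --version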
You should receive an email from Meta after requesting access to the models. Go to https://github.com/meta-llama/llama-models and follow the readme instructions.
Next, install llama stack: "pip install llama-stack".
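One note in case pip refuses: Ubuntu 24 marks the system Python as externally managed, so "pip install llama-stack" can fail with an "externally-managed-environment" error. If you hit that, one option is to install inside a virtual environment first, for example:
python3 -m venv ~/llama-env
source ~/llama-env/bin/activate
pip install llama-stack
(If you go that route, the llama executable lives in the venv's bin folder rather than ~/.local/bin.)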
Run "pip show llama-stack" to see where the package and its executable were installed. Ensure you have added the proper folder to your PATH. In my case it is
export PATH=$PATH:/home/jeremy/.local/bin
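That export only lasts for the current terminal session, so you may want to append it to your ~/.bashrc so it survives new shells, for example:
echo 'export PATH=$PATH:$HOME/.local/bin' >> ~/.bashrc
source ~/.bashrc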
Then run "llama model list" and you get the following outputs.
Depending on the Llama version that you've requested access to, that dictates which models you can download. For example, I requested Llama 3.3, but when I tried to download a Llama 3.2 model, it didn't allow me to do so. Llama 3.3 is really huge, so I think I should have requested Llama 3.2 instead.
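The readme in the llama-models repo walks you through the actual download; the command is roughly of the form below (the subcommand name and model ID here are my best guess and may differ between llama-stack versions, so check the readme or the CLI help for the exact syntax):
llama model download --source meta --model-id MODEL_ID
where MODEL_ID is one of the IDs shown by "llama model list". It should then prompt you for the signed URL from Meta's email.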