Trying to run vLLM without a GPU will typically land you in this issue. To resolve it, I had to configure both of the settings below; setting only one of them did not work for me:

1. Set the environment variable `CUDA_VISIBLE_DEVICES=""`.
2. In the command line, switch to `--device cpu` and remove `--tensor-parallel-size`. For example:

```
python3 -m vllm.entrypoints.openai.api_server --port 8080 --model deepseek-ai/DeepSeek-R1 --device cpu --trust-remote-code --max-model-len 4096
```

References:
https://docs.vllm.ai/en/latest/getting_started/troubleshooting.html#failed-to-infer-device-type
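The two steps above can also be combined in a small launcher script, so the environment variable is guaranteed to be set before the server starts. This is just a sketch of the same command from the answer (it assumes vLLM is installed; the actual launch line is commented out):

```python
import os
import subprocess

# Step 1: hide all GPUs so vLLM does not try to infer a CUDA device
os.environ["CUDA_VISIBLE_DEVICES"] = ""

# Step 2: the CPU launch command from the answer, as an argv list
cmd = [
    "python3", "-m", "vllm.entrypoints.openai.api_server",
    "--port", "8080",
    "--model", "deepseek-ai/DeepSeek-R1",
    "--device", "cpu",
    "--trust-remote-code",
    "--max-model-len", "4096",
]

# subprocess.run(cmd)  # uncomment to actually launch the server
```

Setting the variable via `os.environ` only affects this process and its children, so it will not leak into the rest of your shell session.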
This `org.jetbrains.kotlin.util.FileAnalysisException` with a `java.lang.IllegalArgumentException: source must not be null` is a known issue that can sometimes occur with the Kotlin compiler. It seems to be related to how the compiler analyzes certain source-code structures. When this happens, clean your project and rebuild it. In my case, it appeared while I was trying out the `dataBinding = true` feature.
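For reference, this is roughly how data binding is enabled in a module-level `build.gradle.kts`; a minimal sketch assuming a recent Android Gradle Plugin (the surrounding `android` block is from a typical Android project, not from the question):

```kotlin
// Module-level build.gradle.kts (assumption: AGP with buildFeatures support)
android {
    buildFeatures {
        dataBinding = true  // the setting that triggered the exception for me
    }
}
```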
I encountered this issue when running AutoRest (which uses Node.js). I was using Node v18.20.2; after reverting to Node v18.12.0, I was able to run my app without this error. Since I am working with Azure DevOps YAML, I used the following task to resolve the issue:

```yaml
- task: NodeTool@0
  inputs:
    versionSource: 'spec'
    versionSpec: '18.18.0'
```
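If you want to fail fast when the active Node version differs from the one the pipeline pins, a small check script can run before AutoRest. This is a hypothetical sketch (the `matchesPin` helper and the pinned version string are mine, not part of the NodeTool task):

```javascript
// Sketch: warn if the active Node version differs from the pinned one.
function matchesPin(actual, pinned) {
  // process.version looks like "v18.18.0"; strip the leading "v" and compare
  return actual.replace(/^v/, "") === pinned;
}

const pinned = "18.18.0"; // same version as in the NodeTool task above
if (!matchesPin(process.version, pinned)) {
  console.warn(`Warning: running Node ${process.version}, pipeline pins ${pinned}`);
}
```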