I set up a server with a GPU and deployed llama.cpp on it.
Now I want to connect to the service from another computer, so I modified the llama.vim configuration in my vimrc:
let g:llama_config.endpoint = "http://192.168.1.10:8012/infill"
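In case the exact syntax matters: I'm not sure whether `g:llama_config` needs to be defined as a dictionary before setting a key on it (if the plugin hasn't loaded yet, assigning `g:llama_config.endpoint` directly might fail). What I understand the full form to look like is something like this, though I may be wrong about the plugin's expectations:

```vim
" Sketch of what I believe the config should look like in vimrc,
" placed before the plugin loads. 'endpoint' is the key I'm trying to set;
" the IP/port are from my own setup.
let g:llama_config = { 'endpoint': 'http://192.168.1.10:8012/infill' }
```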
With the configuration above, Vim throws this error message:
"Job failed with exit code: 7"
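For context, in case it helps anyone answering: exit code 7 is curl's CURLE_COULDNT_CONNECT, so it looks like the request from Vim isn't reaching the server at all. A quick check I can run from the client machine (the IP/port are the ones from my setup; `/health` is the llama.cpp server's health endpoint):

```shell
# Probe the llama.cpp server from the client machine.
# An exit code of 7 here means the host/port is unreachable
# (wrong IP, server not listening on that interface, or a firewall).
curl -s -o /dev/null --connect-timeout 2 "http://192.168.1.10:8012/health"
echo "curl exit code: $?"
```

If this also fails with 7, the problem is network-level (e.g. the server only listening on 127.0.0.1 instead of 0.0.0.0) rather than the vimrc itself.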
I need your help.
Thanks!