Misc. bug: llama-server throws "Unsupported param: tools" #10920
Comments
Running into the same issue.
+1, facing the same issue with the latest build. Log: request: POST /v1/chat/completions 127.0.0.1 200
+1, same problem with the latest code, built with Vulkan support. It has never worked for me.
Same problem. When will the llama.cpp server support tools?
Did anyone try initializing llama-server using the --jinja param? |
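For anyone trying that: a sketch of how the suggestion could be exercised, assuming llama-server was started with something like `llama-server -m model.gguf --jinja` and is reachable on the default port 8080. The model path, port, and the get_weather tool below are illustrative, not taken from this issue.

```python
from openai import OpenAI

# Point the OpenAI client at the local llama-server (OpenAI-compatible endpoint).
# Base URL and API key placeholder are assumptions for illustration.
client = OpenAI(base_url="http://127.0.0.1:8080/v1", api_key="not-needed")

# Illustrative tool definition; not from the original report.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

resp = client.chat.completions.create(
    model="local-model",  # llama-server serves whatever model it was launched with
    messages=[{"role": "user", "content": "What is the weather in Paris?"}],
    tools=tools,
)
print(resp.choices[0].message.tool_calls)
```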
The answer (for llama3.2, but it applies generally) is: it sounds like the model can use the schema, but the response does not follow the OpenAI tool-calling API.
See this discussion: #6429. HTH
Name and Version
version: 4369 (21ae3b9)
built with Apple clang version 15.0.0 (clang-1500.3.9.4) for arm64-apple-darwin23.6.0
Operating systems
Mac
Which llama.cpp modules do you know to be affected?
llama-server
Problem description & steps to reproduce
llama-server throws:
{"code":500,"message":"Unsupported param: tools","type":"server_error"}
if the request contains the "tools" parameter.
To reproduce:
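The exact request is not reproduced here; a minimal sketch of one that triggers the error, assuming the server listens on the default http://127.0.0.1:8080 and using an illustrative get_weather tool, would be:

```python
import requests

# Any OpenAI-style chat completion request that carries a "tools" array.
# Endpoint, port, and the get_weather tool are assumptions for illustration.
payload = {
    "model": "local-model",  # illustrative; llama-server serves the model it was launched with
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

r = requests.post("http://127.0.0.1:8080/v1/chat/completions", json=payload, timeout=60)
print(r.status_code, r.text)
# On the affected build this returns:
# {"code":500,"message":"Unsupported param: tools","type":"server_error"}
```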
First Bad Commit
No response
Relevant log output