[BUG] ml_inference ingest processor incorrectly parsing input field #2904
Comments
Found that even if I do not use the full JSON path as input, I get the following error.
So is it not possible to have nested objects as output?
Hi @IanMenendez, what's your index setting? If you set up the knn field similar to here, the mapping will check whether the field is an integer type. That's not allowed by the ml_inference ingest processor, and it's not allowed by the mapping in this case. To further troubleshoot the issue, can you provide more information about the model? It seems you are using a local model? What does the predict request look like? That way I can help check the model_input and mapping for you.
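For reference, a k-NN index set up to receive embedding output generally looks like the sketch below. This is not the reporter's actual mapping; the index name, field names, and dimension are placeholders.

```json
PUT /my-nlp-index
{
  "settings": {
    "index.knn": true
  },
  "mappings": {
    "properties": {
      "text": {
        "type": "text"
      },
      "text_embedding": {
        "type": "knn_vector",
        "dimension": 384
      }
    }
  }
}
```

If the target field is instead mapped as `integer`, writing a vector into it is rejected at indexing time, which is the kind of mapping rejection discussed above.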
@mingshl I fixed that issue, but I have another one :) My index looks something like this:
Now my ml_inference processor looks like this:
I have several documents with special characters that break the ml_inference processor. For example:
This is because the `"` characters inside the document are not escaped. Is there a way to escape them from inside the ml_inference processor? We have no easy way to escape them before these docs are ingested into our index.
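To illustrate the failure mode with a hypothetical document (not the reporter's data): if a source field contains a literal double quote and its value is substituted verbatim into a JSON model_input template, the resulting request body is malformed, roughly like this:

```json
{ "text_docs": ["Display size: 15" measured diagonally"] }
```

The quote after `15` terminates the JSON string early, so the payload no longer parses; the value would need to be escaped as `15\"` before substitution. The `text_docs` field name here is only an assumed example of a model input field.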
Check out this PR. I introduced a method you can use in your pipeline config; try setting the model_input as
This worked, thanks!
What is the bug?
The ml_inference ingest processor does not correctly parse the input field when it is given as a full JSON path.
This was tested with a model hosted in OpenSearch.
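For context, below is a minimal sketch of an ml_inference ingest pipeline of the kind described, assuming a text-embedding use case. The model ID and field names are placeholders; the point of interest is the document field being referenced as a full JSON path (for example `$.chunks.text`) rather than as a plain field name (`chunks.text`).

```json
PUT /_ingest/pipeline/my_ml_inference_pipeline
{
  "processors": [
    {
      "ml_inference": {
        "model_id": "<model_id>",
        "input_map": [
          {
            "<model_input_field>": "$.chunks.text"
          }
        ],
        "output_map": [
          {
            "<new_document_field>": "<model_output_field>"
          }
        ]
      }
    }
  ]
}
```

The report is that the processor fails to parse the input field when it is written in this JSON-path form.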
How can one reproduce the bug?
returns
What is the expected behavior?
I expect the processor to yield text embeddings.