Enable pass query string to input_map in ml inference search response processor #2899
Conversation
Add more details/examples to the description?

Force-pushed from 299aefa to 350ab60.

CI failed

Force-pushed from 350ab60 to c1d112a.
Review threads on common/src/main/java/org/opensearch/ml/common/utils/StringUtils.java (outdated, resolved)
Signed-off-by: Mingshi Liu <[email protected]>
Force-pushed from cf59bb5 to c501cbf.

Signed-off-by: Mingshi Liu <[email protected]>

Force-pushed from c501cbf to 6b1d783.
After discussion, we decided to read the query string from input_map, similar to the search request processor. Requested review again @ylwu-amzn
@@ -155,6 +154,8 @@ public void processResponseAsync(
     try {
         SearchHit[] hits = response.getHits().getHits();
         // skip processing when there is no hit
+        String queryString = request.source().toString();

Does this also have size and sort, or just contain the query part?
Yes. Added test testProcessResponseSuccessReadRequestMetaFieldFromInputMap.
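For illustration (a hypothetical request body, not taken from the PR): `request.source().toString()` serializes the whole search source, so `size` and `sort` appear alongside `query` in the resulting string:

```json
{
  "size": 5,
  "query": { "term": { "text": { "value": "sunny" } } },
  "sort": [ { "_id": { "order": "asc" } } ]
}
```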
Signed-off-by: Mingshi Liu <[email protected]>
The backport to 2.x failed. To backport manually, run these commands in your terminal:

# Fetch latest updates from GitHub
git fetch
# Create a new working tree
git worktree add .worktrees/backport-2.x 2.x
# Navigate to the new working tree
cd .worktrees/backport-2.x
# Create a new branch
git switch --create backport/backport-2899-to-2.x
# Cherry-pick the merged commit of this pull request and resolve the conflicts
git cherry-pick -x --mainline 1 083abad726933629557028047cb27c482fd950ec
# Push it to GitHub
git push --set-upstream origin backport/backport-2899-to-2.x
# Go back to the original working tree
cd ../..
# Delete the working tree
git worktree remove .worktrees/backport-2.x

Then, create a pull request where the base branch is 2.x.
Backport failed because the previous backport to 2.x (#3127) is not merged yet.
Enable pass query string to input_map in ml inference search response processor (#2899)

* enable add query_text to model_config
* change javadoc
* add more tests
* use standard json path config
* add example in javadoc
* read query mapping from input_map
* recognize query mapping by prefix _request.

Signed-off-by: Mingshi Liu <[email protected]>
(cherry picked from commit 083abad; backported to 2.x in #3129)
Description
Enable passing the query string to input_map in the ML inference search response processor.
set cluster settings
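The PR description does not include the settings used; as a sketch, local-model testing typically needs settings like the following (values are assumptions, adjust for your cluster):

```json
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.only_run_on_ml_node": false,
    "plugins.ml_commons.native_memory_threshold": 99
  }
}
```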
register cross-encoder local model
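A sketch of the registration request (the model name and version are assumptions based on the pretrained cross-encoders OpenSearch supports, not taken from the PR):

```json
POST /_plugins/_ml/models/_register
{
  "name": "huggingface/cross-encoders/ms-marco-MiniLM-L-6-v2",
  "version": "1.0.2",
  "model_format": "TORCH_SCRIPT"
}
```

The response contains a task_id, which the task-status call below polls.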
get register task status
GET /_plugins/_ml/tasks/tQ5p1ZEB4iWlnHsIf2Xw
deploy cross-encoder local model
POST /_plugins/_ml/models/tg5p1ZEB4iWlnHsIh2U9/_deploy
get deploy task status
GET /_plugins/_ml/tasks/tw5q1ZEB4iWlnHsIo2WX
wait until completed
test model predict
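A sketch of a predict call for a text-similarity cross-encoder (the request body follows the documented query_text/text_docs interface; the sample strings are illustrative):

```json
POST /_plugins/_ml/models/tg5p1ZEB4iWlnHsIh2U9/_predict
{
  "query_text": "today is sunny",
  "text_docs": ["how are you", "today is sunny"]
}
```

The model returns one similarity score per entry in text_docs.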
upload index
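For example, a small bulk upload (index name, field name, and documents are hypothetical):

```json
POST /_bulk
{ "index": { "_index": "demo-index" } }
{ "description": "today is sunny" }
{ "index": { "_index": "demo-index" } }
{ "description": "how are you" }
```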
create search pipeline with query text passing in input_map
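A sketch of such a pipeline (pipeline name, model id, field names, and JSON paths are illustrative). The `_request.` prefix introduced by this PR lets input_map pull a value from the incoming search request instead of from the hits:

```json
PUT /_search/pipeline/rerank_pipeline
{
  "response_processors": [
    {
      "ml_inference": {
        "model_id": "tg5p1ZEB4iWlnHsIh2U9",
        "function_name": "text_similarity",
        "input_map": [
          {
            "text_docs": "description",
            "query_text": "_request.query.term.description.value"
          }
        ],
        "output_map": [
          {
            "rank_score": "$.inference_results[0].output[0].data[0]"
          }
        ]
      }
    }
  ]
}
```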
search with search pipeline, scores are added in the response
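For example, assuming the hypothetical index and pipeline names above:

```json
GET /demo-index/_search?search_pipeline=rerank_pipeline
{
  "query": { "term": { "description": { "value": "sunny" } } }
}
```

Each hit in the response then carries the field mapped in output_map (rank_score in this sketch).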
ToDo
Currently the ML inference processor only supports a single output tensor for local models; parsing multiple tensors needs to be supported as well.
Related Issues
#2897
#2878
Check List
Commits are signed per the DCO using --signoff.
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following the Developer Certificate of Origin and signing off your commits, please check here.