respond to vale warnings
epugh committed Sep 12, 2024
1 parent 103165e commit f2b6bba
Showing 4 changed files with 16 additions and 16 deletions.
2 changes: 1 addition & 1 deletion .github/vale/styles/Vocab/OpenSearch/Products/accept.txt
@@ -88,7 +88,7 @@ Query Workbench
RCF Summarize
RPM Package Manager
Ruby
Ranklib
RankLib
XGBoost
Simple Schema for Observability
Tableau
6 changes: 3 additions & 3 deletions _search-plugins/ltr/building-features.md
@@ -15,7 +15,7 @@ what this plugin does to help you use OpenSearch as a learning to
rank system.

This section covers the functionality built into the OpenSearch LTR
plugin to build & upload features with the plugin.
plugin to build and upload features with the plugin.

## What is a feature in OpenSearch LTR

@@ -91,7 +91,7 @@ This lets you inject various query or user-specific variables into the
search template. Perhaps information about the user for personalization?
Or the location of the searcher's phone?

For now, we'll simply focus on typical keyword searches.
For now, we'll focus on typical keyword searches.
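A templated feature of the kind described above is an OpenSearch query with template parameters. As a rough sketch (the `title` field and `keywords` parameter here are illustrative, not taken from this diff):

```json
{
  "query": {
    "match": {
      "title": "{{keywords}}"
    }
  }
}
```

At query time, the value supplied for the `keywords` parameter is substituted into the query before the feature is scored.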

## Uploading and Naming Features

@@ -108,7 +108,7 @@ Let's look at how to work with sets of features.
A *feature store* corresponds to an OpenSearch index used to store
metadata about the features and models. Typically, one feature store
corresponds to a major search site/implementation. For example,
[wikipedia](http://wikipedia.org) vs [wikitravel](http://wikitravel.org)
[wikipedia](http://wikipedia.org) compared to [wikitravel](http://wikitravel.org)

For most use cases, you can simply get by with the single, default
feature store and never think about feature stores ever again. This
2 changes: 1 addition & 1 deletion _search-plugins/ltr/feature-engineering.md
@@ -81,7 +81,7 @@ its average is `5.33`, and monkey has average `2.5` for
positions [1, 4]. So the returned average is `3.91`, computed
by `(5.33 + 2.5)/2`.
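The arithmetic in this example can be checked in a few lines of Python (the per-term averages are taken from the text; `3.91` is the truncated form of `3.915`):

```python
# Per-term averages from the example above:
# one term averages 5.33, "monkey" averages 2.5.
term_averages = [5.33, 2.5]

# The stat averages the per-term averages.
overall_average = sum(term_averages) / len(term_averages)
print(overall_average)  # 3.915, shown truncated as 3.91 in the text
```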

Finally a special stat exists for just counting the number of search
Finally a special stat exists for only counting the number of search
terms. That stat is `unique_terms_count`.

## Document-specific features
22 changes: 11 additions & 11 deletions _search-plugins/ltr/training-models.md
@@ -6,16 +6,16 @@ parent: LTR search
has_children: false
---

# Uploading A Trained Model
# Uploading a trained model

Training models occurs outside OpenSearch LTR. You use the plugin to
log features (as mentioned in [Logging Feature Scores]({{site.url}}{{site.baseurl}}/search-plugins/ltr/logging-features/)). Then with whichever technology you choose, you train a
ranking model. You upload a model to OpenSearch LTR in the available
serialization formats (RankLib, XGboost, and others). Let's first
serialization formats (RankLib, XGBoost, and others). Let's first
talk briefly about training in supported technologies (though not at all
an extensive overview) and then dig into uploading a model.

## Ranklib training
## RankLib training

We provide two demos for training a model. A fully-fledged [Ranklib
Demo](http://github.com/o19s/elasticsearch-learning-to-rank/tree/master/demo)
@@ -24,7 +24,7 @@ how features are
[logged](http://github.com/o19s/elasticsearch-learning-to-rank/tree/master/demo/collectFeatures.py)
and how models are
[trained](http://github.com/o19s/elasticsearch-learning-to-rank/tree/master/demo/train.py)
. In particular, you'll note that logging creates a ranklib consumable
. In particular, you'll note that logging creates a RankLib consumable
judgment file that looks like:

4 qid:1 1:9.8376875 2:12.318446 # 7555 rambo
@@ -68,12 +68,12 @@ score. You'll note features are referred to by ordinal, starting at
"1" with Ranklib (this corresponds to the 0th feature in your feature
set). Ranklib does not use feature names when training.
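The judgment-file line shown above has a fixed shape: a relevance grade, `qid:<query>`, 1-based `ordinal:value` feature pairs, then an optional `# comment`. A minimal Python sketch of a parser for that shape (not part of the plugin):

```python
def parse_judgment_line(line):
    """Parse one RankLib judgment line into (grade, qid, features, comment)."""
    body, _, comment = line.partition("#")
    tokens = body.split()
    grade = int(tokens[0])                 # relevance grade, e.g. 4
    qid = tokens[1].split(":", 1)[1]       # query id after "qid:"
    features = {}                          # 1-based feature ordinal -> value
    for token in tokens[2:]:
        ordinal, value = token.split(":", 1)
        features[int(ordinal)] = float(value)
    return grade, qid, features, comment.strip()

line = "4 qid:1 1:9.8376875 2:12.318446 # 7555 rambo"
print(parse_judgment_line(line))
# (4, '1', {1: 9.8376875, 2: 12.318446}, '7555 rambo')
```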

## XGBoost Example
## XGBoost example

There's also an example of how to train a model [using
XGBoost](http://github.com/o19s/elasticsearch-learning-to-rank/tree/master/demo/xgboost-demo).
Examining this demo, you'll see the difference in how Ranklib is
executed vs XGBoost. XGBoost will output a serialization format for
Examining this demo, you'll see the difference in how RankLib is
executed compared to XGBoost. XGBoost will output a serialization format for
gradient boosted decision tree that looks like:

```json
...
```
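The elided JSON above follows XGBoost's dump format: an array of trees, each a nest of split nodes with `yes`/`no` child ids and `leaf` values. As a rough illustration (the node layout follows XGBoost's JSON dump; the feature name and numbers are made up), scoring one such tree is a walk from the root:

```python
# A single hypothetical tree in XGBoost's JSON dump layout.
tree = {
    "nodeid": 0, "split": "title_bm25", "split_condition": 10.0,
    "yes": 1, "no": 2,
    "children": [
        {"nodeid": 1, "leaf": -0.4},
        {"nodeid": 2, "leaf": 0.7},
    ],
}

def score_tree(node, features):
    """Walk a dumped XGBoost tree for one document's feature values."""
    while "leaf" not in node:
        # Values below the split condition take the "yes" branch.
        branch = node["yes"] if features[node["split"]] < node["split_condition"] else node["no"]
        node = next(c for c in node["children"] if c["nodeid"] == branch)
    return node["leaf"]

print(score_tree(tree, {"title_bm25": 12.3}))  # 0.7
```

A full model sums these per-tree outputs across all trees in the array.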

## XGBoost Parameters
## XGBoost parameters

Additional parameters can optionally be passed for an XGBoost model.
This can be done by specifying the definition as an object, with the
decision trees as the 'splits' field. See the example below.
decision trees as the 'splits' field. See the following example.

Currently supported parameters:

@@ -103,7 +103,7 @@ Currently supported values: 'binary:logistic', 'binary:logitraw',

## Simple linear models

Many types of models simply output linear weights of each feature, such as linear SVM. The LTR model supports simple linear weights for each feature, such as those learned from an SVM model or linear regression:
Many types of models naively output linear weights of each feature, such as linear SVM. The LTR model supports simple linear weights for each feature, such as those learned from an SVM model or linear regression:

```json
{
  ...
```

@@ -290,7 +290,7 @@ sets.
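A linear model of the kind described scores a document as the weighted sum of its feature values. A minimal sketch (the feature names and weights here are illustrative):

```python
# Hypothetical learned weights, keyed by feature name.
weights = {"title_bm25": 0.6, "body_bm25": 0.3, "recency": 0.1}

def linear_score(weights, feature_values):
    """Dot product of learned weights and a document's feature values."""
    return sum(w * feature_values.get(name, 0.0) for name, w in weights.items())

print(linear_score(weights, {"title_bm25": 2.0, "body_bm25": 1.0}))  # 1.5
```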
The associated features are *copied into* the model. This is for your
safety: modifying the feature set or deleting the feature set after
model creation doesn't have an impact on a model in production. For
example, if we delete the feature set above:
example, if we delete the feature we previously created:

DELETE _ltr/_featureset/more_movie_features

