Merge branch 'main' into mapping-content-gap
vagimeli authored Jul 17, 2024
2 parents 00fc9c1 + e3ee238 commit ae99ef0
Showing 161 changed files with 4,019 additions and 460 deletions.
3 changes: 2 additions & 1 deletion .github/vale/styles/Vocab/OpenSearch/Plugins/accept.txt
@@ -26,4 +26,5 @@ Search Relevance plugin
Security plugin
Security Analytics plugin
SQL plugin
Trace Analytics plugin
Trace Analytics plugin
User Behavior Insights
1 change: 0 additions & 1 deletion .github/vale/styles/Vocab/OpenSearch/Products/accept.txt
@@ -76,7 +76,6 @@ Painless
Peer Forwarder
Performance Analyzer
Piped Processing Language
Point in Time
Powershell
Python
PyTorch
43 changes: 43 additions & 0 deletions .github/workflows/pr_checklist.yml
@@ -0,0 +1,43 @@
name: PR Checklist

on:
  pull_request_target:
    types: [opened]

permissions:
  pull-requests: write

jobs:
  add-checklist:
    runs-on: ubuntu-latest

    steps:
      - name: Comment PR with checklist
        uses: peter-evans/create-or-update-comment@v3
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.pull_request.number }}
          body: |
            Thank you for submitting your PR. The PR states are In progress (or Draft) -> Tech review -> Doc review -> Editorial review -> Merged.
            Before you submit your PR for doc review, make sure the content is technically accurate. If you need help finding a tech reviewer, tag a [maintainer](https://github.com/opensearch-project/documentation-website/blob/main/MAINTAINERS.md).
            **When you're ready for doc review, tag the assignee of this PR**. The doc reviewer may push edits to the PR directly or leave comments and editorial suggestions for you to address (let us know in a comment if you have a preference). The doc reviewer will arrange for an editorial review.
      - name: Auto assign PR to repo owner
        uses: actions/github-script@v6
        with:
          script: |
            let assignee = context.payload.pull_request.user.login;
            const prOwners = ['Naarcha-AWS', 'kolchfa-aws', 'vagimeli', 'natebower'];
            if (!prOwners.includes(assignee)) {
              assignee = 'hdhalter'
            }
            github.rest.issues.addAssignees({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              assignees: [assignee]
            });
1 change: 1 addition & 0 deletions .gitignore
@@ -4,4 +4,5 @@ _site
.DS_Store
Gemfile.lock
.idea
*.iml
.jekyll-cache
1 change: 1 addition & 0 deletions .ruby-version
@@ -0,0 +1 @@
3.3.2
6 changes: 3 additions & 3 deletions CONTRIBUTING.md
@@ -100,10 +100,10 @@ Follow these steps to set up your local copy of the repository:

#### Troubleshooting

If you encounter an error while trying to build the documentation website, find the error in the following troubleshooting list:
Try the following troubleshooting steps if you encounter an error when trying to build the documentation website:

- When running `rvm install 3.2` if you receive a `Error running '__rvm_make -j10'`, resolve this by running `rvm install 3.2.0 -C --with-openssl-dir=/opt/homebrew/opt/[email protected]` instead of `rvm install 3.2`.
- If receive a `bundle install`: `An error occurred while installing posix-spawn (0.3.15), and Bundler cannot continue.` error when trying to run `bundle install`, resolve this by running `gem install posix-spawn -v 0.3.15 -- --with-cflags=\"-Wno-incompatible-function-pointer-types\"`. Then, run `bundle install`.
- If you see the `Error running '__rvm_make -j10'` error when running `rvm install 3.2`, you can resolve it by running `rvm install 3.2.0 -C --with-openssl-dir=/opt/homebrew/opt/[email protected]` instead of `rvm install 3.2`.
- If you see the `bundle install`: `An error occurred while installing posix-spawn (0.3.15), and Bundler cannot continue.` error when trying to run `bundle install`, you can resolve it by running `gem install posix-spawn -v 0.3.15 -- --with-cflags=\"-Wno-incompatible-function-pointer-types\"` and then `bundle install`.



1 change: 1 addition & 0 deletions _about/version-history.md
@@ -30,6 +30,7 @@ OpenSearch version | Release highlights | Release date
[2.0.1](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-2.0.1.md) | Includes bug fixes and maintenance updates for Alerting and Anomaly Detection. | 16 June 2022
[2.0.0](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-2.0.0.md) | Includes document-level monitors for alerting, OpenSearch Notifications plugins, and Geo Map Tiles in OpenSearch Dashboards. Also adds support for Lucene 9 and bug fixes for all OpenSearch plugins. For a full list of release highlights, see the Release Notes. | 26 May 2022
[2.0.0-rc1](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-2.0.0-rc1.md) | The Release Candidate for 2.0.0. This version allows you to preview the upcoming 2.0.0 release before the GA release. The preview release adds document-level alerting, support for Lucene 9, and the ability to use term lookup queries in document level security. | 03 May 2022
[1.3.18](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-1.3.18.md) | Includes maintenance updates for OpenSearch security. | 16 July 2024
[1.3.17](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-1.3.17.md) | Includes maintenance updates for OpenSearch security and OpenSearch Dashboards security. | 06 June 2024
[1.3.16](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-1.3.16.md) | Includes bug fixes and maintenance updates for OpenSearch security, index management, performance analyzer, and reporting. | 23 April 2024
[1.3.15](https://github.com/opensearch-project/opensearch-build/blob/main/release-notes/opensearch-release-notes-1.3.15.md) | Includes bug fixes and maintenance updates for cross-cluster replication, SQL, OpenSearch Dashboards reporting, and alerting. | 05 March 2024
256 changes: 256 additions & 0 deletions _aggregations/metric/geocentroid.md
@@ -0,0 +1,256 @@
---
layout: default
title: Geocentroid
parent: Metric aggregations
grand_parent: Aggregations
nav_order: 45
---

# Geocentroid

The OpenSearch `geo_centroid` aggregation calculates the weighted geographic center, or focal point, of a set of spatial data points. This metric aggregation operates on `geo_point` fields and returns the centroid location as a latitude-longitude pair.

## Using the aggregation

Follow these steps to use the `geo_centroid` aggregation:

**1. Create an index with a `geo_point` field**

First, you need to create an index with a `geo_point` field type. This field stores the geographic coordinates you want to analyze. For example, to create an index called `restaurants` with a `location` field of type `geo_point`, use the following request:

```json
PUT /restaurants
{
"mappings": {
"properties": {
"name": {
"type": "text"
},
"location": {
"type": "geo_point"
}
}
}
}
```
{% include copy-curl.html %}

**2. Index documents with spatial data**

Next, index the documents containing the spatial data points you want to analyze. Make sure to include the `geo_point` field with the appropriate latitude-longitude coordinates. Each document in the following request also includes a `city` field, which is used for grouping in a later step:

```json
POST /restaurants/_bulk?refresh
{"index": {"_id": 1}}
{"name": "Cafe Delish", "city": "New York", "location": "40.7128, -74.0059"}
{"index": {"_id": 2}}
{"name": "Tasty Bites", "city": "London", "location": "51.5074, -0.1278"}
{"index": {"_id": 3}}
{"name": "Sushi Palace", "city": "Paris", "location": "48.8566, 2.3522"}
{"index": {"_id": 4}}
{"name": "Burger Joint", "city": "Los Angeles", "location": "34.0522, -118.2437"}
```
{% include copy-curl.html %}

**3. Run the `geo_centroid` aggregation**

To calculate the centroid location across all documents, run a search with the `geo_centroid` aggregation on the `geo_point` field. For example, use the following request:

```json
GET /restaurants/_search
{
  "size": 0,
  "aggs": {
    "centroid": {
      "geo_centroid": {
        "field": "location"
      }
    }
  }
}
```
{% include copy-curl.html %}

The response includes a `centroid` object with `lat` and `lon` properties representing the weighted centroid location of all indexed data points, as shown in the following example:

```json
"aggregations": {
  "centroid": {
    "location": {
      "lat": 43.78224998130463,
      "lon": -47.506300045643
    },
    "count": 4
  }
}
```
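
For this data set, the centroid is the arithmetic mean of the four indexed coordinates, which you can verify by hand: the mean latitude is (40.7128 + 51.5074 + 48.8566 + 34.0522) / 4 = 43.78225, and the mean longitude is (-74.0059 + -0.1278 + 2.3522 + -118.2437) / 4 = -47.5063. These match the returned values up to the precision at which `geo_point` values are stored.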

**4. Nest under other aggregations (optional)**

You can also nest the `geo_centroid` aggregation under other bucket aggregations, such as `terms`, to calculate the centroid for subsets of your data. For example, to find the centroid location for each city, use the following request:

```json
GET /restaurants/_search
{
  "size": 0,
  "aggs": {
    "cities": {
      "terms": {
        "field": "city.keyword"
      },
      "aggs": {
        "centroid": {
          "geo_centroid": {
            "field": "location"
          }
        }
      }
    }
  }
}
```
{% include copy-curl.html %}

This returns a centroid location for each city bucket, allowing you to analyze the geographic center of data points in different cities.
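
Because each sample restaurant is located in a different city, every bucket contains a single document, and that bucket's centroid is simply the restaurant's own location. The response is similar to the following (abbreviated; the returned coordinates may differ slightly from the indexed values because `geo_point` values are stored with limited precision):

```json
"aggregations": {
  "cities": {
    "buckets": [
      {
        "key": "London",
        "doc_count": 1,
        "centroid": {
          "location": {
            "lat": 51.5074,
            "lon": -0.1278
          },
          "count": 1
        }
      },
      ...
    ]
  }
}
```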

## Using `geo_centroid` with the `geohash_grid` aggregation

The `geohash_grid` aggregation partitions geospatial data into buckets based on geohash prefixes.

When a document contains multiple geopoint values in a field, the `geohash_grid` aggregation assigns the document to multiple buckets, even if one or more of its geopoints are outside the bucket boundaries. This behavior is different from how individual geopoints are treated, where only those within the bucket boundaries are considered.

When you nest the `geo_centroid` aggregation under the `geohash_grid` aggregation, each centroid is calculated using all geopoints in a bucket, including those that may be outside the bucket boundaries. This can result in centroid locations that fall outside the geographic area represented by the bucket.

#### Example

In this example, the `geohash_grid` aggregation with a `precision` of `3` creates buckets based on geohash prefixes of length `3`. Because each document contains multiple geopoints, it may be assigned to multiple buckets, even if some of its geopoints fall outside the bucket boundaries.

The `geo_centroid` subaggregation calculates the centroid for each bucket using all geopoints assigned to that bucket, including those outside the bucket boundaries. This means that the resulting centroid locations may not necessarily lie within the geographic area represented by the corresponding geohash bucket.

First, create an index and index documents containing multiple geopoints:

```json
PUT /locations
{
  "mappings": {
    "properties": {
      "name": {
        "type": "text"
      },
      "coordinates": {
        "type": "geo_point"
      }
    }
  }
}

POST /locations/_bulk?refresh
{"index": {"_id": 1}}
{"name": "Point A", "coordinates": ["40.7128, -74.0059", "51.5074, -0.1278"]}
{"index": {"_id": 2}}
{"name": "Point B", "coordinates": ["48.8566, 2.3522", "34.0522, -118.2437"]}
```

Then, run `geohash_grid` with the `geo_centroid` subaggregation:

```json
GET /locations/_search
{
  "size": 0,
  "aggs": {
    "grid": {
      "geohash_grid": {
        "field": "coordinates",
        "precision": 3
      },
      "aggs": {
        "centroid": {
          "geo_centroid": {
            "field": "coordinates"
          }
        }
      }
    }
  }
}
```
{% include copy-curl.html %}

<details markdown="block">
  <summary>
    Response
  </summary>
  {: .text-delta}

```json
{
  "took": 26,
  "timed_out": false,
  "_shards": {
    "total": 1,
    "successful": 1,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": {
      "value": 2,
      "relation": "eq"
    },
    "max_score": null,
    "hits": []
  },
  "aggregations": {
    "grid": {
      "buckets": [
        {
          "key": "u09",
          "doc_count": 1,
          "centroid": {
            "location": {
              "lat": 41.45439997315407,
              "lon": -57.945750039070845
            },
            "count": 2
          }
        },
        {
          "key": "gcp",
          "doc_count": 1,
          "centroid": {
            "location": {
              "lat": 46.11009998945519,
              "lon": -37.06685005221516
            },
            "count": 2
          }
        },
        {
          "key": "dr5",
          "doc_count": 1,
          "centroid": {
            "location": {
              "lat": 46.11009998945519,
              "lon": -37.06685005221516
            },
            "count": 2
          }
        },
        {
          "key": "9q5",
          "doc_count": 1,
          "centroid": {
            "location": {
              "lat": 41.45439997315407,
              "lon": -57.945750039070845
            },
            "count": 2
          }
        }
      ]
    }
  }
}
```

</details>
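
Note that each centroid in this response falls outside the geographic area of its bucket. For example, the `u09` cell covers the area around Paris, but its centroid is calculated from both of Point B's geopoints, Paris and Los Angeles: the latitude is (48.8566 + 34.0522) / 2 = 41.4544 and the longitude is (2.3522 + -118.2437) / 2 = -57.94575, placing the centroid in the North Atlantic Ocean, far outside the `u09` cell.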