Document new experimental ingestion streaming APIs #8123
Merged
---
layout: default
title: Streaming bulk
parent: Document APIs
nav_order: 25
redirect_from:
 - /opensearch/rest-api/document-apis/bulk/streaming/
---

# Streaming bulk
**Introduced 2.17.0**
{: .label .label-purple }
This is an experimental feature and is not recommended for use in a production environment. For updates on the progress of the feature or if you want to leave feedback, see the associated [GitHub issue](https://github.com/opensearch-project/OpenSearch/issues/9065).
{: .warning}
The streaming bulk operation lets you add, update, or delete multiple documents by streaming the request and receiving the results as a streaming response. Compared to the traditional [Bulk API]({{site.url}}{{site.baseurl}}/api-reference/document-apis/bulk/), streaming ingestion eliminates the need to estimate the batch size (which is affected by the cluster's operational state at any given time) and naturally applies backpressure between many clients and the cluster. Streaming works over HTTP/2 or HTTP/1.1 (using chunked transfer encoding), depending on the capabilities of the client and the cluster.

The default HTTP transport method does not support streaming. You must install the [`transport-reactor-netty4`]({{site.url}}{{site.baseurl}}/install-and-configure/configuring-opensearch/network-settings/#selecting-the-transport) HTTP transport plugin and use it as the default HTTP transport layer. Both the `transport-reactor-netty4` plugin and the Streaming Bulk API are experimental.
{: .note}
## Path and HTTP methods

```json
POST _bulk/stream
POST <index>/_bulk/stream
```

If you specify the index in the path, then you don't need to include it in the [request body chunks]({{site.url}}{{site.baseurl}}/api-reference/document-apis/bulk/#request-body).

OpenSearch also accepts PUT requests to the `_bulk/stream` path, but we highly recommend using POST. The accepted usage of PUT---adding or replacing a single resource on a given path---doesn't make sense for streaming bulk requests.
{: .note }
## Query parameters

The following table lists the available query parameters. All query parameters are optional.

Parameter | Data type | Description
:--- | :--- | :---
`pipeline` | String | The pipeline ID for preprocessing documents.
`refresh` | Enum | Whether to refresh the affected shards after performing the indexing operations. Default is `false`. `true` causes the changes to appear in search results immediately but degrades cluster performance. `wait_for` waits for a refresh. Requests take longer to return, but cluster performance isn't degraded.
`require_alias` | Boolean | Set to `true` to require that all actions target an index alias rather than an index. Default is `false`.
`routing` | String | Routes the request to the specified shard.
`timeout` | Time | How long to wait for the request to return. Default is `1m`.
`type` | String | (Deprecated) The default document type for documents that don't specify a type. Default is `_doc`. We highly recommend ignoring this parameter and using the `_doc` type for all indexes.
`wait_for_active_shards` | String | Specifies the number of active shards that must be available before OpenSearch processes the bulk request. Default is `1` (only the primary shard). Set to `all` or a positive integer. Values greater than 1 require replicas. For example, if you specify a value of 3, the index must have 2 replicas distributed across 2 additional nodes in order for the request to succeed.
`batch_interval` | Time | Specifies how long bulk operations should be accumulated into a batch before the batch is sent to data nodes.
`batch_size` | Integer | Specifies how many bulk operations should be accumulated into a batch before the batch is sent to data nodes. Default is `1`.
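To illustrate how the path and query parameters fit together, the following Python sketch builds a streaming bulk endpoint URL from a host, an optional index, and a few of the parameters listed above. The helper name and the host are hypothetical, not part of any OpenSearch client.

```python
from urllib.parse import urlencode

def streaming_bulk_url(host, index=None, **params):
    """Build a streaming bulk endpoint URL with optional query parameters."""
    path = f"/{index}/_bulk/stream" if index else "/_bulk/stream"
    query = urlencode(params)
    return f"{host}{path}" + (f"?{query}" if query else "")

url = streaming_bulk_url(
    "http://localhost:9200",
    index="movies",
    refresh="wait_for",
    batch_interval="5s",
    batch_size=10,
)
print(url)  # http://localhost:9200/movies/_bulk/stream?refresh=wait_for&batch_interval=5s&batch_size=10
```

Because the index is given in the path here, the request body chunks would not need to repeat it.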
## Request body

The Streaming Bulk API request body is fully compatible with the [Bulk API request body]({{site.url}}{{site.baseurl}}/api-reference/document-apis/bulk/#request-body), where each bulk operation (create/index/update/delete) is sent as a separate chunk.
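As a sketch of how a client might frame those chunks, the following Python generator (a hypothetical helper, not part of any OpenSearch client) emits each bulk operation as one NDJSON chunk, pairing each action line with its optional source document:

```python
import json

def bulk_chunks(operations):
    """Yield each bulk operation as a separate NDJSON chunk.

    `operations` is a list of (action, document) pairs; `document` is None
    for actions, such as delete, that carry no source.
    """
    for action, document in operations:
        chunk = json.dumps(action)
        if document is not None:
            chunk += "\n" + json.dumps(document)
        yield chunk + "\n"

ops = [
    ({"delete": {"_index": "movies", "_id": "tt2229499"}}, None),
    ({"index": {"_index": "movies", "_id": "tt1979320"}},
     {"title": "Rush", "year": 2013}),
]
chunks = list(bulk_chunks(ops))
```

Each yielded string would then be written to the connection as its own chunk, letting the server apply backpressure between operations.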
## Example request

```json
curl -X POST "http://localhost:9200/_bulk/stream" -H "Transfer-Encoding: chunked" -H "Content-Type: application/json" -d'
{ "delete": { "_index": "movies", "_id": "tt2229499" } }
{ "index": { "_index": "movies", "_id": "tt1979320" } }
{ "title": "Rush", "year": 2013 }
{ "create": { "_index": "movies", "_id": "tt1392214" } }
{ "title": "Prisoners", "year": 2013 }
{ "update": { "_index": "movies", "_id": "tt0816711" } }
{ "doc" : { "title": "World War Z" } }
'
```
{% include copy.html %}
## Example response

Depending on the batch settings, each streamed response chunk may report the results of one or more bulk operations (a batch). For example, for the preceding request with no batching (the default), the streaming response may appear as follows:

```json
{"took": 11, "errors": false, "items": [ { "index": {"_index": "movies", "_id": "tt1979320", "_version": 1, "result": "created", "_shards": { "total": 2, "successful": 1, "failed": 0 }, "_seq_no": 1, "_primary_term": 1, "status": 201 } } ] }
{"took": 2, "errors": true, "items": [ { "create": { "_index": "movies", "_id": "tt1392214", "status": 409, "error": { "type": "version_conflict_engine_exception", "reason": "[tt1392214]: version conflict, document already exists (current version [1])", "index": "movies", "shard": "0", "index_uuid": "yhizhusbSWmP0G7OJnmcLg" } } } ] }
{"took": 4, "errors": true, "items": [ { "update": { "_index": "movies", "_id": "tt0816711", "status": 404, "error": { "type": "document_missing_exception", "reason": "[_doc][tt0816711]: document missing", "index": "movies", "shard": "0", "index_uuid": "yhizhusbSWmP0G7OJnmcLg" } } } ] }
```
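Because each chunk is itself a standard bulk response, a client can process results incrementally instead of waiting for the whole request to finish. The following Python sketch (assuming the response is consumed line by line; the helper and sample data are illustrative only) collects the operation type and HTTP status from each streamed chunk:

```python
import json

def collect_statuses(response_lines):
    """Collect (operation, HTTP status) pairs from streamed bulk response chunks."""
    statuses = []
    for line in response_lines:
        batch = json.loads(line)
        # Each chunk has the standard bulk response shape: one entry per item,
        # keyed by the operation type (index, create, update, or delete).
        for item in batch.get("items", []):
            for op_type, result in item.items():
                statuses.append((op_type, result["status"]))
    return statuses

# Two sample chunks, shaped like the example response above (abridged).
sample = [
    '{"took": 11, "errors": false, "items": [{"index": {"_index": "movies", "_id": "tt1979320", "status": 201}}]}',
    '{"took": 2, "errors": true, "items": [{"create": {"_index": "movies", "_id": "tt1392214", "status": 409}}]}',
]
print(collect_statuses(sample))  # [('index', 201), ('create', 409)]
```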
> Should this and the following two rows be here?

They should (by API specs).