gigamon: Update package description and cleanup (#10879)
Update the package description and clean up the README.

The current package description says data is ingested from Filebeat.
Updated this to Elastic Agent in the package description.

Other cleanup:

- Fix comment positioning in the ingest pipeline.
- Fix README indentation.
- Fix dashboards containing an undefined index-pattern.
kcreddy authored Aug 29, 2024
1 parent 56a4c02 commit 3a8db73
Showing 17 changed files with 269 additions and 260 deletions.
26 changes: 13 additions & 13 deletions packages/gigamon/_dev/build/docs/README.md
@@ -60,30 +60,30 @@ To add AMX application:
2. Enter the Alias for the application. Enter a port number for the Cloud Tool Ingestor Port. Then, click the Add button for Cloud Tool Exports.
3. You can export your Application Metadata Intelligence output to cloud tools. Enter the following details for the Cloud tool export in the Application quick view:

-**Alias**:Enter the alias name for the cloud tool export.
- **Alias**: Enter the alias name for the cloud tool export.

-**Cloud Tool**:Select the Cloud tool from the drop-down menu.If it is not available click "others".
- **Cloud Tool**: Select the Cloud tool from the drop-down menu.If it is not available click "others".

-**Endpoint**:Give the URL of the cloud tool instance with the correct port number in which the port is listening.
- **Endpoint**: Give the URL of the cloud tool instance with the correct port number in which the port is listening.

-**Headers**:Enter the secret header and enable secure keys
- **Headers**: Enter the secret header and enable secure keys

-**Enable Export**:Enable the box to export the Application Metadata Intelligence output in JSON format.
- **Enable Export**: Enable the box to export the Application Metadata Intelligence output in JSON format.

-**Zip**:Enable the box to compress the output file.
- **Zip**: Enable the box to compress the output file.

-**Interval**:The time interval (in seconds) in which the data should be uploaded periodically. The recommended minimum time interval is 10 seconds and the maximum time interval is 30 minutes.
- **Interval**: The time interval (in seconds) in which the data should be uploaded periodically. The recommended minimum time interval is 10 seconds and the maximum time interval is 30 minutes.

-**Parallel Writer**:Specifies the number of simultaneous JSON exports done.
- **Parallel Writer**: Specifies the number of simultaneous JSON exports done.

-**Export Retries**:The number of times the application tries to export the entries to Cloud Tool. The recommended minimum value is 4 and the maximum is 10.
- **Export Retries**: The number of times the application tries to export the entries to Cloud Tool. The recommended minimum value is 4 and the maximum is 10.

-**Maximum Entries**:The number of JSON entries in a file. The maximum number of allowed entries is 5000 and the minimum is 10, however 1000 is the default value.
- **Maximum Entries**: The number of JSON entries in a file. The maximum number of allowed entries is 5000 and the minimum is 10, however 1000 is the default value.

-**Labels**:Click Add. Enter the following details:
- **Labels**: Click Add. Enter the following details:

o Enter the Key .
o Enter the Value.
- Enter the **Key**.
- Enter the **Value**.


4. Click Deploy to deploy the monitoring session. The Select nodes to deploy the Monitoring Session dialog box appears. Select the GigaVUE V Series Node for which you wish to deploy the monitoring session.
14 changes: 14 additions & 0 deletions packages/gigamon/changelog.yml
@@ -1,4 +1,18 @@
# newer versions go on top
- version: "0.2.0"
changes:
- description: Update package description.
type: enhancement
link: https://github.com/elastic/integrations/pull/10879
- description: Update README to fix indentation.
type: enhancement
link: https://github.com/elastic/integrations/pull/10879
- description: Fix dashboards containing undefined index-pattern.
type: bugfix
link: https://github.com/elastic/integrations/pull/10879
- description: Fix comments positioning in ingest pipeline.
type: bugfix
link: https://github.com/elastic/integrations/pull/10879
- version: "0.1.0"
changes:
- description: Initial release
@@ -266,12 +266,12 @@ processors:
- append:
field: error.message
value: 'Processor {{{_ingest.on_failure_processor_type}}} with tag {{{_ingest.on_failure_processor_tag}}} in pipeline {{{_ingest.pipeline}}} failed with message: {{{_ingest.on_failure_message}}}'
# convert dns_ fields to ip
- convert:
field: gigamon.ami.dns_arcount
if: ctx.gigamon?.ami?.dns_arcount != null
tag: convert_dns_arcount
type: long
# convert dns_ fields to ip
on_failure:
- remove:
field: gigamon.ami.dns_arcount
@@ -284,27 +284,27 @@ processors:
if: ctx.gigamon?.ami?.dns_reverse_addr != null
tag: convert_dns_reverse_addr
type: ip
# convert dns_ fields to double
on_failure:
- remove:
field: gigamon.ami.dns_reverse_addr
ignore_missing: true
- append:
field: error.message
value: 'Processor {{{_ingest.on_failure_processor_type}}} with tag {{{_ingest.on_failure_processor_tag}}} in pipeline {{{_ingest.pipeline}}} failed with message: {{{_ingest.on_failure_message}}}'
# convert dns_ fields to double
- convert:
field: gigamon.ami.dns_response_time
if: ctx.gigamon?.ami?.dns_response_time != null
tag: convert_dns_response_time
type: double
# convert http_ fields to long
on_failure:
- remove:
field: gigamon.ami.dns_response_time
ignore_missing: true
- append:
field: error.message
value: 'Processor {{{_ingest.on_failure_processor_type}}} with tag {{{_ingest.on_failure_processor_tag}}} in pipeline {{{_ingest.pipeline}}} failed with message: {{{_ingest.on_failure_message}}}'
# convert http_ fields to long
- convert:
field: gigamon.ami.http_code
if: ctx.gigamon?.ami?.http_code != null
@@ -831,11 +831,6 @@ processors:
- json
if: ctx.tags == null || !(ctx.tags.contains('preserve_duplicate_custom_fields'))
ignore_missing: true
- remove:
field: event.original
if: ctx.tags == null || !(ctx.tags.contains('preserve_original_event'))
ignore_failure: true
ignore_missing: true
- script:
lang: painless
description: Drops null/empty values recursively.
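The convert-with-`on_failure` pattern in the pipeline hunks above (try a type conversion; on failure, remove the field and append a processor-style error message) can be sketched in Python. This is a hypothetical stand-in for the ingest-node behavior, not code from the package; the function name and `pipeline` default are assumptions:

```python
def convert_field(doc, field, conv, tag, pipeline="ami-pipeline"):
    """Try to cast doc[field] with conv(); on failure, drop the field and
    append a processor-style message, mirroring the on_failure handlers."""
    if doc.get(field) is None:  # mirrors: if ctx.gigamon?.ami?.<field> != null
        return doc
    try:
        doc[field] = conv(doc[field])
    except (ValueError, TypeError) as exc:
        doc.pop(field, None)  # mirrors the `remove` processor in on_failure
        doc.setdefault("error.message", []).append(
            f"Processor convert with tag {tag} in pipeline {pipeline} "
            f"failed with message: {exc}"
        )
    return doc

doc = {"dns_arcount": "7", "dns_response_time": "fast"}
convert_field(doc, "dns_arcount", int, "convert_dns_arcount")
convert_field(doc, "dns_response_time", float, "convert_dns_response_time")
# dns_arcount converts cleanly; dns_response_time fails, is removed,
# and an error.message entry records the failure
```

This also shows why the misplaced comments mattered: each `# convert …` comment should sit directly above the `convert` processor it describes, not inside the preceding processor's `on_failure` block.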
26 changes: 13 additions & 13 deletions packages/gigamon/docs/README.md
@@ -60,30 +60,30 @@ To add AMX application:
2. Enter the Alias for the application. Enter a port number for the Cloud Tool Ingestor Port. Then, click the Add button for Cloud Tool Exports.
3. You can export your Application Metadata Intelligence output to cloud tools. Enter the following details for the Cloud tool export in the Application quick view:

-**Alias**:Enter the alias name for the cloud tool export.
- **Alias**: Enter the alias name for the cloud tool export.

-**Cloud Tool**:Select the Cloud tool from the drop-down menu.If it is not available click "others".
- **Cloud Tool**: Select the Cloud tool from the drop-down menu.If it is not available click "others".

-**Endpoint**:Give the URL of the cloud tool instance with the correct port number in which the port is listening.
- **Endpoint**: Give the URL of the cloud tool instance with the correct port number in which the port is listening.

-**Headers**:Enter the secret header and enable secure keys
- **Headers**: Enter the secret header and enable secure keys

-**Enable Export**:Enable the box to export the Application Metadata Intelligence output in JSON format.
- **Enable Export**: Enable the box to export the Application Metadata Intelligence output in JSON format.

-**Zip**:Enable the box to compress the output file.
- **Zip**: Enable the box to compress the output file.

-**Interval**:The time interval (in seconds) in which the data should be uploaded periodically. The recommended minimum time interval is 10 seconds and the maximum time interval is 30 minutes.
- **Interval**: The time interval (in seconds) in which the data should be uploaded periodically. The recommended minimum time interval is 10 seconds and the maximum time interval is 30 minutes.

-**Parallel Writer**:Specifies the number of simultaneous JSON exports done.
- **Parallel Writer**: Specifies the number of simultaneous JSON exports done.

-**Export Retries**:The number of times the application tries to export the entries to Cloud Tool. The recommended minimum value is 4 and the maximum is 10.
- **Export Retries**: The number of times the application tries to export the entries to Cloud Tool. The recommended minimum value is 4 and the maximum is 10.

-**Maximum Entries**:The number of JSON entries in a file. The maximum number of allowed entries is 5000 and the minimum is 10, however 1000 is the default value.
- **Maximum Entries**: The number of JSON entries in a file. The maximum number of allowed entries is 5000 and the minimum is 10, however 1000 is the default value.

-**Labels**:Click Add. Enter the following details:
- **Labels**: Click Add. Enter the following details:

o Enter the Key .
o Enter the Value.
- Enter the **Key**.
- Enter the **Value**.


4. Click Deploy to deploy the monitoring session. The Select nodes to deploy the Monitoring Session dialog box appears. Select the GigaVUE V Series Node for which you wish to deploy the monitoring session.
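The Maximum Entries setting described in the README hunk above (at most 5000 entries per JSON file, default 1000) can be illustrated with a minimal batching sketch. The function name and logic are assumptions inferred from the parameter description, not Gigamon code:

```python
import json

def build_export_files(entries, max_entries=1000):
    """Split metadata entries into JSON files of at most max_entries each,
    mirroring the Maximum Entries setting (default 1000)."""
    return [
        json.dumps(entries[i:i + max_entries])
        for i in range(0, len(entries), max_entries)
    ]

files = build_export_files([{"app": "dns"}] * 2500)
# 2500 entries at 1000 per file yields 3 files
```

A real exporter would additionally upload each file to the configured Endpoint on the Interval schedule, retrying up to Export Retries times.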