Memory spike while using suppress and throttling plugin #4808

Open
akshaysharama opened this issue Jan 29, 2025 · 5 comments

akshaysharama commented Jan 29, 2025

Describe the bug

Using these plugins to discard the duplicate logs from the fluentd.

Trying to remove duplicate logs from the k8s cluster microservices, reading logs from the /var/log/containers and removing duplicate log to improve readbility.

To Reproduce

<filter kubernetes.**>
          @type suppress
          log_suppress_interval 10      # Suppress duplicates for 10 seconds
          num 1                          # Allow 1 log, then suppress duplicates
          max_slot_num 100000           # Max unique logs to track
          attr_keys short_message         # Attributes to check for duplicates
          add_tag_prefix sp.            # Prefix for suppressed logs
</filter>
<filter sp.*>
          @type throttle
          threshold 1                   # Limit to 1 log per interval
          interval 300                   # Time interval in seconds
          add_tag_prefix th.            # Prefix for throttled logs
          enable_throttle true
          discard_throttled true
</filter>
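
For scale, a rough back-of-envelope estimate (the 10 KiB average short_message size below is an assumption, not a value from this report): if the suppress filter keys its slot table by the raw attr_keys values, then with max_slot_num 100000 the keys alone can occupy on the order of max_slot_num × average message size.

# Rough, illustrative estimate of memory held just by the suppression keys;
# the average message size and the per-entry overhead are assumptions.
max_slot_num      = 100_000
avg_message_bytes = 10 * 1024   # assumed 10 KiB average short_message
total_gib = (max_slot_num * avg_message_bytes) / (1024.0**3)
puts format("~%.2f GiB of key data", total_gib)   # => ~0.95 GiB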

Expected behavior

Memory usage should not spike.

Your Environment

- Fluentd version: 5.9.11 (Chart: https://charts.bitnami.com/bitnami)
- Package version:
- Operating system: Ubuntu 22.04 LTS
- Kernel version: 5.15.0-124-generic

Your Configuration

<filter kubernetes.**>
          @type suppress
          log_suppress_interval 10      # Suppress duplicates for 10 seconds
          num 1                          # Allow 1 log, then suppress duplicates
          max_slot_num 100000           # Max unique logs to track
          attr_keys short_message         # Attributes to check for duplicates
          add_tag_prefix sp.            # Prefix for suppressed logs
</filter>
<filter sp.*>
          @type throttle
          threshold 1                   # Limit to 1 log per interval
          interval 300                   # Time interval in seconds
          add_tag_prefix th.            # Prefix for throttled logs
          enable_throttle true
          discard_throttled true
</filter>

Your Error Log

No error logs; only a memory spike is observed.

Additional context

No response

akshaysharama (Author) commented:

@daipom can you please help here? Sorry to tag you directly; I went through the open issues and saw that you actively look into plugin issues. Sorry once again, but this memory spike is disrupting our test bed, so we need some help.

@daipom daipom moved this to Triage in Fluentd Kanban Jan 30, 2025
@Watson1978 Watson1978 self-assigned this Jan 30, 2025
Watson1978 (Contributor) commented:

I found that fluent-plugin-suppress increases memory usage when many long log messages are duplicated.
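
As an illustration of how that can happen (a minimal sketch, assuming slots are keyed by the raw value of the configured attribute; this is not fluent-plugin-suppress's actual code, and SuppressSketch is a made-up name):

# Sketch of a suppress-style filter that keys its slots by the raw attribute value.
class SuppressSketch
  def initialize(interval:, num:, max_slot_num:)
    @interval = interval          # like log_suppress_interval
    @num = num                    # occurrences allowed before suppression
    @max_slot_num = max_slot_num
    @slots = {}                   # raw attribute value => recent timestamps
  end

  # Returns true if the record should pass, false if it should be suppressed.
  def pass?(record, now = Time.now.to_i)
    key = record["short_message"].to_s           # the whole message becomes the key
    timestamps = (@slots[key] ||= [])
    timestamps.reject! { |t| now - t > @interval }
    @slots.shift if @slots.size > @max_slot_num  # crude eviction of the oldest slot
    return false if timestamps.size >= @num
    timestamps << now
    true
  end
end

Under this model, every incoming record allocates a key string as large as the message itself, and each distinct value stays in the table, so both allocation pressure and retained memory grow with message length.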

@Watson1978 Watson1978 moved this from Triage to Work-In-Progress in Fluentd Kanban Jan 30, 2025
akshaysharama (Author) commented:

@Watson1978, will a fix be available soon, or how should we proceed in the meantime?


Watson1978 commented Jan 31, 2025

The following modifications might have reduced memory usage slightly.

  • Log generate script

require "json"

# Continuously append the same ~1 MB JSON line so every record is a duplicate.
path = File.expand_path("~/tmp/access.log")
File.open(path, "w") do |f|
  log = { "message": "a" * 1_000_000 }.to_json
  loop do
    f.puts log
    sleep 0.005
  end
end

  • config

<source>
  @type tail
  path "#{File.expand_path '~/tmp/access*.log'}"
  pos_file "#{File.expand_path '~/tmp/fluentd/access.log.pos'}"
  tag log
  refresh_interval 5s
  <parse>
    @type none
  </parse>
</source>

<filter log.**>
    @type suppress
    log_suppress_interval 10
    num 1
    max_slot_num 10
    attr_keys short_message
</filter>

<match **>
  @type stdout
</match>
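
One way to watch the growth while the generator runs (an illustrative helper, not part of the original report; Linux-only, and watch_rss.rb is a made-up name) is to poll the fluentd worker's VmRSS:

# Usage: ruby watch_rss.rb <fluentd_worker_pid>
pid = ARGV.fetch(0)
loop do
  status = File.read("/proc/#{pid}/status")
  rss_kb = status[/^VmRSS:\s+(\d+)\s+kB/, 1]
  puts "#{Time.now} VmRSS: #{rss_kb} kB"
  sleep 5
end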


Watson1978 commented Jan 31, 2025

@akshaysharama I think your problem will be solved when fujiwara/fluent-plugin-suppress#17 is merged and released.
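
For context, one general way to bound this kind of growth is to digest the keyed attribute values so every slot key has a fixed size, regardless of message length. The sketch below is only an illustration of that idea, not a description of what the linked PR does:

require "digest"

# Digesting the attr_keys values caps the per-slot key at 64 hex characters.
def slot_key(record, attr_keys)
  raw = attr_keys.map { |k| record[k].to_s }.join("\0")
  Digest::SHA256.hexdigest(raw)
end

slot_key({ "short_message" => "x" * 1_000_000 }, ["short_message"]).size  # => 64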
