Hi! Thanks for the nice library that solves so many problems!
Describe the bug
In my GitLab runner, when I have 150+ images, the build fails with "too many open files" (on my local machine it works fine).
The problem seems to be that all the images are optimized simultaneously.
It would be awesome if you could add a setting for the maximum number of images to be processed at once.
To Reproduce
Use a 1-CPU, 8 GB RAM droplet with Docker to optimize more than 150 images.
Thanks!
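The requested setting amounts to capping how many optimizations run concurrently instead of launching one promise per image. A minimal sketch of that pattern in Node.js is below; `mapWithConcurrency` and the callback are hypothetical names for illustration, not this library's API:

```javascript
// Run `worker` over `items`, but never more than `limit` at a time.
// With a bounded pool, only `limit` files are open simultaneously,
// which avoids exhausting the process's file-descriptor limit.
async function mapWithConcurrency(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;

  // Each runner pulls the next unclaimed index until the list is drained.
  async function run() {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i]);
    }
  }

  // Start at most `limit` runners in parallel.
  const runners = Array.from(
    { length: Math.min(limit, items.length) },
    run
  );
  await Promise.all(runners);
  return results;
}

// Usage sketch: a stand-in worker instead of a real image optimizer.
mapWithConcurrency(['a.png', 'b.png', 'c.png'], 2, async (file) => {
  return `optimized:${file}`;
}).then((out) => console.log(out));
```

The same idea is available off the shelf via packages such as `p-limit`, if pulling in a dependency is acceptable.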
Looking at the error details, the error seems to occur while copying the configuration file, rather than during image optimization.
I'm also concerned that this step is being performed repeatedly.
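Whatever the exact call site, "too many open files" means the process hit the OS per-process file-descriptor limit, which is often lower in CI containers than on a local machine. As a diagnostic (a workaround, not a fix for the underlying concurrency), the limit can be inspected and raised in the runner's shell:

```shell
# Show the current soft limit on open file descriptors
ulimit -n

# Raise it for this shell session (bounded by the hard limit;
# the value 4096 is just an example)
ulimit -n 4096
```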