One of my USB hubs has a built-in memory card reader that makes the agent hog huge amounts of system resources, never report any data, and not raise any error messages unless you manually check hetrixtools_cron.log.
I don't know how I can assist in the troubleshooting, but here is that log (the device in question is /dev/sdc): hetrixtools_cron.log
It seems something else is wrong here as well; the awk problem remains even after disabling disk health checking.
Is there any way to get any useful info at all, short of dissecting the script statement by statement? hetrixtools_cron2.log
Couldn't resist doing a bit of digging. Once again (#38), you assume that everyone has swap configured, just like last time; that's the division by zero. The disk problem is probably a separate issue, but please, not everyone uses swap.
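For illustration, this is the classic failure mode. A minimal sketch, assuming the agent computes swap usage as used/total with awk (the exact line in the agent may differ):

```bash
#!/bin/bash
# Sketch only, not the agent's actual code: computing swap usage as
# used/total * 100 makes awk abort with "division by zero attempted"
# on systems where total swap is 0.

# Naive version -- fails on swapless systems:
# swap_usage=$(free | awk '/Swap/ { print ($3/$2) * 100 }')

# Guarded version -- report 0 when no swap is configured:
swap_usage=$(free | awk '/Swap/ { if ($2 > 0) printf "%.2f", ($3/$2)*100; else print 0 }')
echo "Swap usage: ${swap_usage}%"
```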
zejjnt changed the title from "Option to skip disk check for a certain node" to "Option to skip disk check for a certain node (also, no swap causes division by zero once again)" and added the bug label on Dec 4, 2023
The division by zero error has now been fixed in version 2.0.10 (commit c76788e).
For the disk-related issue, please open a support ticket on our platform, as we'll need more info in order to try to reproduce the issue in our testing environments.
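In the meantime, a defensive per-device timeout is one way to keep a stalled card-reader slot from hanging the whole cron run. This is only a sketch under the assumption that the health check probes block devices such as /dev/sdc directly; dd and smartctl here are illustrative stand-ins, not the agent's actual calls:

```bash
#!/bin/bash
# Sketch: skip block devices that do not respond within a few seconds,
# so an empty USB card-reader slot (e.g. /dev/sdc) cannot stall the run.
for dev in /dev/sd?; do
    # Probe the first sector with a hard time limit; skip unresponsive devices.
    if ! timeout 5 dd if="$dev" of=/dev/null bs=512 count=1 status=none 2>/dev/null; then
        echo "Skipping unresponsive device: $dev" >&2
        continue
    fi
    # Device responds; run the actual health query (smartctl shown as an example).
    timeout 10 smartctl -H "$dev" || true
done
```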