How to analyse data accumulated over a long period of time? #35
Hello! The system does not provide a way to view performance data averaged by week, month, etc.; it only provides reporting by 24-hour day. The system does not retain any performance data older than 7 days; a daily task prunes the database to remove older rows and compacts the database to conserve disk space: "Database rows older than 7 days are deleted, and the database is compacted to recover freed space" https://github.com/mr-canoehead/network_performance_monitor/wiki/Schedule-network-performance-tests

While it is certainly possible to increase the data retention time by modifying the prune_db.py script, I'm not sure what the side effects of doing this might be; as the database grows, query performance may degrade, which could impact operation of the system. There is also the implication of increased disk space usage, which I have not characterized (e.g. database size for 1 month of data, 2 months, etc.). There may also be limitations in SQLite with regard to table row counts, maximum database file size, etc.

The system does calculate daily averages for upload/download speeds and latency, and storing these values in a new database table should be a trivial exercise. That table could be omitted from the daily database pruning task so that many months of daily averages could be retained. Since only one row per day would be generated, the size of this table would not be a concern.

I could develop a script that exports all of the daily averages to a CSV (comma-separated values) file; the exported data could then be easily imported into a spreadsheet application (Excel, LibreOffice Calc, etc.) for further analysis (e.g. averaging by week, month, etc.). Please let me know if this CSV export approach would be useful; the effort required to implement it should be fairly minimal and is something I'd be able to tackle in a reasonable time frame.

Thanks,
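For illustration, here is a minimal sketch of what such an export script could look like. It assumes a hypothetical `daily_averages` table with `date`, `download_mbps`, `upload_mbps`, and `latency_ms` columns and a placeholder database path; the actual schema, column names, and file locations in network_performance_monitor would likely differ.

```python
#!/usr/bin/env python3
# Sketch only: export one-row-per-day average results to a CSV file for
# spreadsheet analysis. Table name, column names, and database path are
# assumptions, not the project's actual schema.
import csv
import sqlite3

DB_PATH = "/path/to/netperf.sqlite"   # placeholder path
OUTPUT_CSV = "daily_averages.csv"

def export_daily_averages(db_path=DB_PATH, output_csv=OUTPUT_CSV):
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT date, download_mbps, upload_mbps, latency_ms "
            "FROM daily_averages ORDER BY date"
        ).fetchall()
    finally:
        conn.close()

    with open(output_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["date", "download_mbps", "upload_mbps", "latency_ms"])
        writer.writerows(rows)

if __name__ == "__main__":
    export_daily_averages()
```

The resulting CSV could then be opened directly in Excel or LibreOffice Calc and aggregated by week or month (e.g. with a pivot table).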
Closing due to lack of feedback from submitter.
Hello, sorry for taking so long to reply.
Thanks for the feedback. I'm taking a look at the system to see what changes (db schema, etc.) are needed to support this enhancement.
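As a rough illustration of the kind of schema change involved, a summary table that the daily prune job simply skips might look like the sketch below. The table and column names are placeholders only; the real change would follow the project's existing schema conventions.

```python
# Hypothetical sketch: a one-row-per-day summary table that the daily prune
# task would leave untouched, so long-term daily averages accumulate.
import sqlite3

conn = sqlite3.connect("/path/to/netperf.sqlite")  # placeholder path
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS daily_averages (
        date          TEXT PRIMARY KEY,   -- e.g. '2023-04-01'
        download_mbps REAL,
        upload_mbps   REAL,
        latency_ms    REAL
    )
    """
)
conn.commit()
conn.close()
```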
Hello!
I have a suggestion.
I have been running this monitor since April 2023, so it will be almost a year now.
I was wondering if there is a way to show the average speeds over a week, a month, 2 months, a year or so, and give more insight into the ISP I am subscribed to, so that it can be useful for making decisions.