
Loading main page makes server CPU max out #902

Open · 1 task done
freakshock88 opened this issue Jan 18, 2025 · 9 comments

Comments

freakshock88 commented Jan 18, 2025

Describe the Bug

Hi!

As I described earlier in this issue, I seem to have a quite annoying bug: loading the main page drives the VM I run Hoarder on to 100% CPU usage and makes it unusable for 10-20 minutes.

My Hoarder data consists of 16 imported articles, but the URLs point to hosted versions of SingleFile HTML archives.
Since these are full-page archives, the articles may contain a lot of data, and that could be what triggers this bug.

I can reproduce this bug by simply opening the page again, so it's not a one-off issue.
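
For a sense of scale, the weight of one of these hosted archives can be checked directly, without Hoarder in the loop. A quick sketch (the URL is a placeholder for one of my hosted SingleFile pages):

```sh
# Size in bytes of one hosted SingleFile archive (placeholder URL):
curl -s https://files.example.com/some-archive.html | wc -c
```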

Steps to Reproduce

  1. Set up a new Hoarder instance.
  2. Store a page using the SingleFile extension.
  3. Host this page on some domain (see the sketch below).
  4. Import the page into Hoarder (I have done this for around 10 pages).
  5. Open the Hoarder main URL.
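
For steps 2-3, any static file host works; a minimal sketch of what I mean (the directory name is just an example):

```sh
# Serve the saved SingleFile archives from a local directory; any static
# host works, this is just the quickest way to get a URL for each page.
python3 -m http.server 8080 --directory ./singlefile-archives
# The pages are then reachable at http://localhost:8080/<name>.html
# and can be added to Hoarder as normal link bookmarks.
```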

Expected Behaviour

The Hoarder main page opens and shows my 16 articles, without any performance issues.

Screenshots or Additional Context

This is the log from around the time I loaded the page:

```
2025-01-18T06:00:00.509854854Z 2025-01-18T06:00:00.509Z info: [feed] Scheduling feed refreshing jobs ...
2025-01-18T07:00:00.539351197Z 2025-01-18T07:00:00.537Z info: [feed] Scheduling feed refreshing jobs ...
2025-01-18T07:55:35.020365794Z [next-auth][warn][NEXTAUTH_URL]
2025-01-18T07:55:35.020414003Z https://next-auth.js.org/warnings#nextauth_url
2025-01-18T07:55:35.537965486Z (node:202699) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
2025-01-18T07:55:35.538023097Z (Use `node --trace-deprecation ...` to show where the warning was created)
2025-01-18T08:00:27.515782960Z   ▲ Next.js 14.2.21
2025-01-18T08:00:28.080782753Z   - Local:        http://localhost:3000
2025-01-18T08:00:28.080814011Z   - Network:      http://0.0.0.0:3000
2025-01-18T08:00:28.080821170Z  ✓ Starting...
2025-01-18T08:00:28.080821170Z  ✓ Ready in 2.1s
2025-01-18T08:00:41.553641263Z [next-auth][warn][NEXTAUTH_URL]
2025-01-18T08:00:41.554382312Z https://next-auth.js.org/warnings#nextauth_url
2025-01-18T08:00:41.985948533Z (node:210717) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
2025-01-18T08:00:41.985991513Z (Use `node --trace-deprecation ...` to show where the warning was created)
2025-01-18T09:00:00.079025455Z 2025-01-18T09:00:00.076Z info: [feed] Scheduling feed refreshing jobs ...
2025-01-18T10:00:00.072558066Z 2025-01-18T10:00:00.071Z info: [feed] Scheduling feed refreshing jobs ...
```

Device Details

No response

Exact Hoarder Version

0.21.0

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem
MohamedBassem (Collaborator) commented

Can you go to the admin panel and share the total size of your assets?

Also, Hoarder now has official support for SingleFile (#172 (comment)), though it's still in the nightly release.
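
If it's easier from a shell, something along these lines against the data volume should give the same number. A sketch, assuming the default Docker setup with the data volume mounted at /data (the container name is a placeholder):

```sh
# Total size of stored assets:
docker exec hoarder-web du -sh /data/assets
# Largest individual assets, to spot outliers (sort runs on the host):
docker exec hoarder-web du -ah /data/assets | sort -rh | head -20
```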

freakshock88 (Author) commented Jan 18, 2025

When I look inside the Docker volumes, the assets folder contains just 126MB.

BTW the bug also seems to be triggered by opening the 'Broken links' page, if that's any help.

I read about the direct SingleFile integration, but I'm using a companion API I wrote myself to send articles to two services (Zipline and Hoarder), so that I can also share the link with others.
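
Roughly, the Hoarder side of that companion API boils down to something like this (endpoint shape as I understand it from Hoarder's REST API docs; host, key, and URLs are placeholders):

```sh
# Forward an archived page's public URL to Hoarder as a link bookmark.
# (The Zipline upload happens in a separate call; omitted here.)
curl -X POST "https://hoarder.example.com/api/v1/bookmarks" \
  -H "Authorization: Bearer $HOARDER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"type": "link", "url": "https://files.example.com/archive.html"}'
```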

MohamedBassem (Collaborator) commented

The fact that it gets triggered on the broken links page is very interesting, because that page is very lightweight in general (it doesn't even touch the content of the URLs themselves); it's basically just a single database query. And given that you have only 16 bookmarks, it should be extremely fast.

How big is the db.db file?
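
You can check from the host or inside the container, and if your sqlite3 build includes the dbstat virtual table, a per-table breakdown shows where the space actually goes (container name and path are placeholders):

```sh
# Size of the SQLite database file:
docker exec hoarder-web ls -lh /data/db.db
# Per-table space usage; requires a sqlite3 build with the dbstat virtual table:
sqlite3 /path/to/db.db \
  "SELECT name, SUM(pgsize)/1024 AS kib FROM dbstat GROUP BY name ORDER BY kib DESC;"
```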

freakshock88 (Author) commented Jan 18, 2025

The db.db file is 17MB.

freakshock88 (Author) commented

I tried the broken links page again, and this time it didn't happen, so that was perhaps not a trigger; sorry about that. Going back to the main bookmarks page for the third time triggered 100% CPU again, though.

MohamedBassem (Collaborator) commented

17MB for 16 bookmarks is a lot, but it's not crazy high. Is your database file local or on NFS/Samba?
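
A quick way to confirm (the path is wherever the Docker volume actually lives on your host):

```sh
# Prints the filesystem type backing the data directory;
# local disks show ext4/btrfs/xfs etc., network mounts show nfs/cifs.
df -T /var/lib/docker/volumes/hoarder_data/_data
```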

freakshock88 (Author) commented

It's local storage.

MohamedBassem (Collaborator) commented

Hmmm, it's going to be hard to debug this without a repro, unfortunately.
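
In the meantime, a generic way to see where a running Node process is burning CPU, nothing Hoarder-specific (container name and PID are placeholders):

```sh
# Find the Next.js server process (or use `docker top hoarder-web` from the host):
docker exec hoarder-web ps aux | grep -i node
# SIGUSR1 makes Node start its built-in inspector listener on port 9229:
docker exec hoarder-web kill -USR1 <pid>
# With port 9229 published/forwarded, attach Chrome DevTools via chrome://inspect
# and record a CPU profile while reproducing the spike.
```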

freakshock88 (Author) commented

Is there a way for me to provide more logs, like a debug mode or something?
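
In the meantime I can at least capture container-level numbers while reproducing:

```sh
# Live CPU/memory per container while loading the main page:
docker stats
# Full logs with timestamps from around the spike:
docker logs --timestamps --since 10m hoarder-web
```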
