
Inchworm process fails #81

Open
sergiogonmoll opened this issue May 19, 2023 · 4 comments

Comments

@sergiogonmoll

Hi Clem,

I'm trying to run dnaPipeTE on fastq.gz files with trimmed and post-QC reads and it seems to work fine until the Inchworm bit of the Trinity run, where it gives me this message:

If it indicates bad_alloc(), then Inchworm ran out of memory. You'll need to either reduce the size of your data set or run Trinity on a server with more memory available.

I think it's strange because I'm running it interactively on a cluster with massive storage space. The error keeps occurring even when decreasing the genome coverage to 0.01. What could be going wrong? Thank you in advance for your help!

@clemgoub
Owner

Hi Sergio!

The most likely cause is that your interactive allocation has less than 10 GB of RAM, which is the minimum required to run Inchworm (Jellyfish actually, to get the k-mers). This is definitely a RAM issue, not a storage problem. Are you using the container version?

Let me know if increasing the RAM works, otherwise, send me the full log and command you used, we'll find a fix!

Cheers,

Clément

@sergiogonmoll
Author

Hi Clément,

I tried increasing the RAM allocation, but the job would still crash. I resorted to lowering the coverage while increasing the number of samples (i.e. 5 samples of 0.1 coverage instead of 2 samples of 0.25 coverage). This stabilized the runs and I managed to have them finish. Perhaps there is a limitation on sample size when running Trinity? I would consider the issue solved. Thank you!

Cheers,

Sergio

@clemgoub
Owner

clemgoub commented Jun 5, 2023

Hello Sergio,

Thanks for the feedback, and I'm happy you found a solution that works. Before I close the thread, would you mind sharing the sample size that worked, in reads/bp (as well as the amount that didn't)? Though 0.25X is usually fine, for a big genome that can be a lot.

Still I'm surprised that Trinity would not handle it -- so I'd like to keep an eye on that! Thanks again for reporting! It helps a lot!

Cheers,

Clément

@sergiogonmoll
Author

Hi Clément,

Yes, the genome size is 1,091,184,475 bp. I thought so too. We just transitioned to a new cluster and it's turning out to be quite temperamental, so maybe it has something to do with that? Best of luck!
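For reference, the two sampling strategies discussed above can be compared in raw base pairs per sample. This is just a quick back-of-the-envelope sketch based on the genome size reported in this thread, on the assumption that each sample feeds roughly coverage × genome size base pairs into a Trinity run:

```python
# Compare the failing and working dnaPipeTE sampling strategies in raw bp.
GENOME_SIZE = 1_091_184_475  # bp, as reported in this thread

def sample_bp(coverage, genome_size=GENOME_SIZE):
    """Approximate base pairs drawn for one sample at the given genome coverage."""
    return coverage * genome_size

failing = sample_bp(0.25)  # 2 samples of 0.25X each (the configuration that crashed)
working = sample_bp(0.10)  # 5 samples of 0.10X each (the configuration that finished)

print(f"0.25X sample: {failing / 1e6:.1f} Mbp")  # ~272.8 Mbp per Trinity run
print(f"0.10X sample: {working / 1e6:.1f} Mbp")  # ~109.1 Mbp per Trinity run
```

So the workaround cut the input to each individual Trinity/Inchworm run by about 2.5×, which is consistent with the bad_alloc() being a per-run memory limit rather than a problem with the total amount of data sampled.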

Cheers,

Sergio
