I ran the ProteomicsLFQ workflow on some test data on a machine with 64 GB RAM. If I don't change the "--max_memory" setting (default: "128.GB"), the pipeline fails during the "proteomicslfq" step with the following message:
```
Error executing process > 'proteomicslfq (1)'

Caused by:
  Process requirement exceed available memory -- req: 64 GB; avail: 62.8 GB
```
However, if I set "--max_memory 48.GB", the error goes away and the step finishes. So this isn't a big problem in practice, but I don't understand why "proteomicslfq" requests a large amount of memory that it apparently doesn't need, yet not the maximum amount specified by the parameter. In addition, if "--max_memory" needs to be adapted to the available RAM, this should be documented and the parameter should not be hidden by default.
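In case it helps anyone hitting the same limit: the workaround can also be expressed as a small custom config instead of the command-line flag. The snippet below is only a sketch under the usual nf-core conventions (params.max_memory as the global resource cap, and a withName selector matching the process name taken from the error message above); the 48.GB value is just what worked here and should be adjusted to the machine's actual RAM.

```groovy
// custom.config -- a minimal sketch, not pipeline-official configuration.
// Lowers the global ceiling that nf-core pipelines use when capping per-process
// resource requests, and additionally pins the memory of the process named in
// the error message via a withName selector.
params {
    max_memory = '48.GB'   // keep below the ~62.8 GB reported as available
}

process {
    withName: 'proteomicslfq' {   // process name as shown in the error; exact selector assumed
        memory = 48.GB
    }
}
```

This would be passed on the command line with `-c custom.config` alongside the usual profile and input options, e.g. `nextflow run nf-core/proteomicslfq -c custom.config ...`.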