Help with an error message when processing huge VTU file #183
Hi,

I have a program written in Python which post-processes an output VTU file from a fluid dynamics framework. The program is massively parallel in the sense that it repeats the exact same calculations at different locations in space, distributed on a grid. I have used Charm4Py to parallelise the workload on a large, university-like cluster, in particular the pool functionality, as it seemed the most appropriate to me. Everything is working properly on "regular" VTU files, and I am obtaining results which compare really well to multiprocessing on a single node and to Ray in a multi-node environment. However, I am encountering an issue when I provide a "huge" VTU file: by huge I mean 3 GB, whereas the other files are on the order of a few hundred MB. I am pasting below the output generated by Charm4Py and the traceback for the first PE; the computer I have run this on has 64 GB of RAM. I would be really grateful if anyone could help me with this error and explain to me what it means so that I can attempt to fix it. I am more than happy to provide additional information about the program itself.

Thank you for any help.
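For reference, a minimal sketch of the kind of pool-based setup described above might look like the following. This is not the poster's actual program: the function name process_point, the list of grid locations, and the launch command are illustrative assumptions; only the use of charm.pool.map from Charm4Py is taken from the description.

```python
# Minimal, illustrative Charm4Py pool sketch (not the poster's actual code).
# Assumes charm4py is installed; typically launched with, e.g.:
#   python3 -m charmrun.start +p8 pool_sketch.py
from charm4py import charm


def process_point(point):
    # Placeholder for the per-location post-processing; the real program would
    # evaluate flow quantities from the VTU at this grid location instead.
    x, y, z = point
    return x * x + y * y + z * z


def main(args):
    # Build the list of grid locations to evaluate (illustrative only).
    points = [(i * 0.1, j * 0.1, 0.0) for i in range(100) for j in range(100)]
    # charm.pool.map distributes the identical per-point tasks over the PEs.
    results = charm.pool.map(process_point, points)
    print('processed', len(results), 'points')
    charm.exit()


charm.start(main)
```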
Comments

Hello,

Thanks for having a look @ZwFink. The error message provided results from a

Regarding data distribution, I am not fully sure, and it is highly possible I am not doing something ideal. Data from the VTU is read in the

I am using the
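On the data-distribution point above, one common pattern is to parse the VTU once on the driver process and hand each pool task only a small slice of the arrays, so that no task message approaches the size of the 3 GB file. The sketch below is a hypothetical illustration, not the poster's approach: it assumes the meshio package, and the file name 'solution.vtu' and the 'velocity' point-data field are made up.

```python
# Hypothetical illustration of reading a VTU once and preparing small chunks.
# Assumes meshio is installed; file and field names are invented for the example.
import numpy as np
import meshio

mesh = meshio.read('solution.vtu')                   # parse the (possibly large) VTU once
points = np.asarray(mesh.points)                     # (n_points, 3) node coordinates
velocity = np.asarray(mesh.point_data['velocity'])   # example point-data array


def make_chunks(n_points, chunk_size=50000):
    # Yield (start, stop) index ranges so each task receives only a slice of the
    # arrays rather than a copy of the full dataset.
    for start in range(0, n_points, chunk_size):
        yield start, min(start + chunk_size, n_points)


tasks = [(points[a:b], velocity[a:b]) for a, b in make_chunks(len(points))]
print('prepared', len(tasks), 'chunks for the pool')
```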