Error in performing example/04-parallel_mol_hf.py #22

Closed
jiangth1997 opened this issue Sep 2, 2022 · 0 comments
@jiangth1997
I am trying to install and use mpi4pyscf to accelerate an ROHF calculation on a molecular cluster (1784 orbitals, 1265 electrons). For now I have installed mpi4py and mpi4pyscf and am running a test UHF calculation on a water trimer, adapted from example/04-parallel_mol_hf.py.

#!/usr/bin/env python

import numpy
from pyscf import gto, scf
from mpi4pyscf import scf as mpi_scf

mol = gto.M(atom='''O -1.340565 0.901693 0.069532
H -1.199374 -0.057160 0.031951
H -1.953650 1.097896 -0.638937
O 1.436646 0.695966 -0.004172
H 0.571354 1.130319 -0.001360
H 2.096731 1.386248 -0.026654
O -0.108265 -1.612993 -0.063996
H 0.696797 -1.077090 -0.043847
H -0.017920 -2.236192 0.657113''',
basis='cc-pvtz')

mf = mpi_scf.UHF(mol)
mf.direct_scf_tol = 1e-9
mf.verbose = 6
mf.kernel()

The script is launched with the following command.

mpirun -np 2 python 04-parallel_mol_hf.py
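As an aside, a bare mpi4py broadcast test can confirm that the MPI stack itself works under the same launcher. This is a minimal sketch of my own; the file name mpi_check.py is made up.

#!/usr/bin/env python
# mpi_check.py -- minimal mpi4py sanity check, independent of mpi4pyscf
from mpi4py import MPI

comm = MPI.COMM_WORLD            # communicator spanning all launched ranks
rank = comm.Get_rank()           # this process's rank id
size = comm.Get_size()           # total number of ranks

# Broadcast a small object from rank 0 to every rank.
data = {'msg': 'hello'} if rank == 0 else None
data = comm.bcast(data, root=0)
print(f"rank {rank} of {size} received {data!r}")

It would be run the same way: mpirun -np 2 python mpi_check.py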

I get the following error.

(base) [1600011363@a7u09n08 mpi4pyscf_molecule]$ mpirun -np 2 python mol_mpi4pyscf.py > mol_mpirun.out
Traceback (most recent call last):
  File "/gpfs/share/home/1600011363/.local/lib/python3.7/site-packages/mpi4pyscf/__init__.py", line 31, in <module>
    mpi.pool.wait()
  File "/gpfs/share/home/1600011363/.local/lib/python3.7/site-packages/mpi4pyscf/tools/mpi_pool.py", line 112, in wait
    ans = self.function(*task)  # task = worker_args
  File "/gpfs/share/home/1600011363/.local/lib/python3.7/site-packages/mpi4pyscf/tools/mpi.py", line 577, in _distribute_call
    return fn(dev, *args, **kwargs)
  File "/gpfs/share/home/1600011363/.local/lib/python3.7/site-packages/mpi4pyscf/scf/hf.py", line 43, in get_jk
    vj, vk = _eval_jk(mf, dm, hermi, _jk_jobs_s8)
  File "/gpfs/share/home/1600011363/.local/lib/python3.7/site-packages/mpi4pyscf/scf/hf.py", line 115, in _eval_jk
    dm = numpy.asarray(dm).reshape(-1,nao,nao)
ValueError: cannot reshape array of size 1 into shape (174,174)
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 1
(base) [1600011363@a7u09n08 mpi4pyscf_molecule]$ *** Error in `/bin/srun': double free or corruption (fasttop): 0x00000000016f52b0 ***
======= Backtrace: =========
/lib64/libc.so.6(+0x81679)[0x7f37d5679679]
/usr/lib64/slurm/libslurmfull.so(slurm_xfree+0x25)[0x7f37d6274668]
/usr/lib64/slurm/libslurmfull.so(+0x5c482)[0x7f37d6159482]
/usr/lib64/slurm/libslurmfull.so(list_destroy+0xbc)[0x7f37d61ae649]
/usr/lib64/slurm/libslurmfull.so(client_io_handler_destroy+0xe3)[0x7f37d615b8a4]
/usr/lib64/slurm/libslurmfull.so(slurm_step_launch_wait_finish+0x4ba)[0x7f37d615fbb1]
/usr/lib64/slurm/launch_slurm.so(launch_p_step_wait+0x22)[0x7f37d456a681]
/bin/srun(launch_g_step_wait+0x2f)[0x40be1d]
/bin/srun[0x407c4d]
/bin/srun(srun+0x1527)[0x409278]
/bin/srun(main+0x9)[0x409730]
/lib64/libc.so.6(__libc_start_main+0xf5)[0x7f37d561a505]
/bin/srun[0x407079]
======= Memory map: ========
......

It seems mpi4pyscf was installed successfully, but an error occurs in mpi4pyscf.scf.hf. How can I fix this problem?
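If I am reading the traceback correctly, the reshape in _eval_jk expects the full (2, nao, nao) UHF density matrix, but the failing rank appears to receive a size-1 object instead. A minimal sketch of the mismatch follows (nao = 174 comes from the error message; using None as the placeholder is my assumption):

import numpy

nao = 174  # basis dimension reported in the error message

# What _eval_jk expects: the UHF density matrix, alpha and beta blocks.
dm_good = numpy.zeros((2, nao, nao))
print(dm_good.reshape(-1, nao, nao).shape)   # -> (2, 174, 174), reshape succeeds

# What the failing rank apparently gets: a size-1 object (e.g. None or a scalar).
dm_bad = numpy.asarray(None)
print(dm_bad.size)                           # -> 1
# dm_bad.reshape(-1, nao, nao) raises the same error as in the traceback:
#   ValueError: cannot reshape array of size 1 into shape (174,174)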

The versions of the dependencies are as follows:
conda = 4.10.3, python = 3.7.1, mpi4py = 3.1.3, mpi4pyscf = 0.3.0.
mpi4py was built against OpenMPI 3.0.0, compiled with Intel compilers (2018.0).
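In case it helps narrow things down, the same system can also be run through plain pyscf's serial UHF, bypassing mpi4pyscf entirely, to check whether the failure is confined to the MPI layer. This is only a baseline sketch; the geometry is identical to the script above.

#!/usr/bin/env python
# serial baseline: plain pyscf UHF on the same water trimer, no MPI involved
from pyscf import gto, scf

mol = gto.M(atom='''O -1.340565 0.901693 0.069532
H -1.199374 -0.057160 0.031951
H -1.953650 1.097896 -0.638937
O 1.436646 0.695966 -0.004172
H 0.571354 1.130319 -0.001360
H 2.096731 1.386248 -0.026654
O -0.108265 -1.612993 -0.063996
H 0.696797 -1.077090 -0.043847
H -0.017920 -2.236192 0.657113''',
basis='cc-pvtz')

mf = scf.UHF(mol)          # serial UHF driver from pyscf itself
mf.direct_scf_tol = 1e-9
mf.verbose = 6
mf.kernel()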
