[gpaw-users] Installing GPAW with parallel and scalapack support
Gaël Donval
gael.donval at cnrs-imn.fr
Tue Apr 30 17:41:41 CEST 2013
> >> ScaLAPACK still isn't working, even though OpenBLAS, PBLAS, BLACS and
> >> ScaLAPACK itself all passed their respective built-in tests.
> > Got it working by disabling the no_numpy_depreciated_api preprocessor flag.
> > However, the "TypeError: Not a proper NumPy array for MPI communication."
> > remains, as does the RuntimeError.
>
> What are the stack traces?
>
> Regards
> Ask
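(For context, "disabling the flag" above just means leaving the corresponding
entry out of define_macros in GPAW's customize.py and rebuilding the C
extension. A rough sketch of what I mean; the macro name and value below are
my guesses, not taken from anyone's actual customize.py:

    # Relevant lines of customize.py (GPAW's build configuration file).
    scalapack = True
    libraries += ['scalapack', 'openblas']   # these lists are predefined by the build script
    define_macros += [
        # ('NPY_NO_DEPRECATED_API', 'NPY_1_7_API_VERSION'),  # commented out = "disabled"
    ]
)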
Here we are:
parallel/ut_hsblacs.py 1.345 FAILED! (rank 0,1,2,3)
#############################################################################
RANK 0,1,2,3:
Traceback (most recent call last):
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/test/__init__.py", line 489, in run_one
    execfile(filename, loc)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/test/parallel/ut_hsblacs.py", line 389, in <module>
    raise SystemExit('Test failed. Check ut_hsblacs.log for details.')
SystemExit: Test failed. Check ut_hsblacs.log for details.
#############################################################################
fileio/parallel.py Traceback (most recent call last):
  File "/u/gdonval/opt_gcc/gpaw_python/bin/gpaw-test", line 150, in <module>
    nfailed = run()
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/test/test.py", line 125, in run
    failed = TestRunner(tests, jobs=opt.jobs, show_output=opt.show_output).run()
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/test/__init__.py", line 417, in run
    self.run_single()
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/test/__init__.py", line 439, in run_single
    self.run_one(test)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/test/__init__.py", line 505, in run_one
    mpi.ibarrier(timeout=60.0) # guard against parallel hangs
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python2.7/site-packages/gpaw/mpi/__init__.py", line 750, in ibarrier
    raise RuntimeError('MPI barrier timeout.')
RuntimeError: MPI barrier timeout.
GPAW CLEANUP (node 0): <type 'exceptions.RuntimeError'> occurred. Calling MPI_Abort!
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 42.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
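(Side note: the RuntimeError above looks like gpaw-test's hang guard firing
rather than a failure of its own -- it apparently gives the other ranks 60 s
to reach a barrier before bailing out. Roughly this pattern, sketched here
with mpi4py instead of GPAW's own gpaw.mpi wrapper:

    # Illustration only: turn a potential deadlock into a clean error by
    # polling a non-blocking barrier with a timeout.
    import time
    from mpi4py import MPI

    def ibarrier_with_timeout(comm, timeout=60.0, poll=0.1):
        request = comm.Ibarrier()        # non-blocking barrier (MPI-3)
        deadline = time.time() + timeout
        while not request.Test():        # True once every rank has arrived
            if time.time() > deadline:
                raise RuntimeError('MPI barrier timeout.')
            time.sleep(poll)

    ibarrier_with_timeout(MPI.COMM_WORLD)
)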
parallel/overlap.py 0.363 FAILED! (rank 0,1)
#############################################################################
RANK 0,1:
Traceback (most recent call last):
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/__init__.py", line 489, in run_one
    execfile(filename, loc)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/parallel/overlap.py", line 124, in <module>
    psit_mG = run(psit_mG)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/parallel/overlap.py", line 87, in run
    S_nn = overlap.calculate_matrix_elements(psit_mG, P_ani, S, dS)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/hs_operators.py", line 365, in calculate_matrix_elements
    sbuf_In, rbuf_In, cycle_P_ani)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/hs_operators.py", line 207, in _initialize_cycle
    self.req2.append(band_comm.send(sbuf_In, rankm, 31, False))
TypeError: Not a proper NumPy array for MPI communication.
#############################################################################
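(This TypeError, and the identical ones below, all come out of a
communicator send() call, so the "Not a proper NumPy array" check is
presumably GPAW's C-level communicator refusing the send buffer -- as far
as I can tell it wants a contiguous NumPy array of a supported dtype. A
tiny standalone snippet of what I mean by that; none of it is GPAW API:

    import numpy as np

    sbuf = np.zeros((8, 8), dtype=complex)
    view = sbuf[:, ::2]                    # strided, non-contiguous view

    print(sbuf.flags['C_CONTIGUOUS'])      # True:  fine as an MPI send buffer
    print(view.flags['C_CONTIGUOUS'])      # False: the kind of array a strict
                                           #        C-level check would reject
    safe = np.ascontiguousarray(view)      # contiguous copy that could be sent
    print(safe.flags['C_CONTIGUOUS'])      # True
)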
pw/slab.py 1.499 FAILED! (rank 0,1)
#############################################################################
RANK 0,1:
Traceback (most recent call last):
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/__init__.py", line 489, in run_one
    execfile(filename, loc)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/pw/slab.py", line 20, in <module>
    BFGS(slab).run(fmax=0.01)
  File "/u/gdonval/opt_gcc/ase/ase/optimize/optimize.py", line 114, in run
    f = self.atoms.get_forces()
  File "/u/gdonval/opt_gcc/ase/ase/atoms.py", line 669, in get_forces
    forces = self._calc.get_forces(self)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/aseinterface.py", line 69, in get_forces
    force_call_to_set_positions=force_call_to_set_positions)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/paw.py", line 269, in calculate
    self.occupations):
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/scf.py", line 46, in run
    wfs.eigensolver.iterate(hamiltonian, wfs)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/eigensolvers/eigensolver.py", line 63, in iterate
    wfs.overlap.orthonormalize(wfs, kpt)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/overlap.py", line 76, in orthonormalize
    S_nn = operator.calculate_matrix_elements(psit_nG, P_ani, S, dS)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/hs_operators.py", line 365, in calculate_matrix_elements
    sbuf_In, rbuf_In, cycle_P_ani)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/hs_operators.py", line 207, in _initialize_cycle
    self.req2.append(band_comm.send(sbuf_In, rankm, 31, False))
TypeError: Not a proper NumPy array for MPI communication.
#############################################################################
exx_acdf.py 5.825 FAILED! (rank 0,1)
#############################################################################
RANK 0,1:
Traceback (most recent call last):
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/__init__.py", line 489, in run_one
    execfile(filename, loc)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/test/exx_acdf.py", line 25, in <module>
    E_k = E + calc.get_xc_difference(exx)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/aseinterface.py", line 423, in get_xc_difference
    return self.hamiltonian.get_xc_difference(xc, self.density) * Hartree
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/hamiltonian.py", line 405, in get_xc_difference
    xc.calculate_exx()
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/xc/hybridk.py", line 273, in calculate_exx
    kpt2_q[0].start_sending(srank)
  File "/u/gdonval/opt_gcc/gpaw_python/lib/python/gpaw/xc/hybridk.py", line 77, in start_sending
    self.kd.comm.send(P_In, rank, block=False, tag=5)]
TypeError: Not a proper NumPy array for MPI communication.
#############################################################################