[gpaw-users] Failed to orthogonalize: %d' % info
Vladislav Ivanistsev
olunet at gmail.com
Mon Jan 10 16:04:45 CET 2011
Hi!
I have gotten the same error several times when running a fairly standard script:
from ase import optimize
from gpaw import GPAW

# slab is the Bi/H2O slab built earlier in the script
calc = GPAW(h=0.16, kpts=(4, 4, 1), txt='BiH2O.txt',
            parallel={'domain': 4}, xc='RPBE')
slab.set_calculator(calc)
slab.get_potential_energy()
qn = optimize.QuasiNewton(slab, trajectory='BiH2O.traj',
                          restart='BiH2O.pckl')
qn.run(fmax=0.05)
What could be the reason, and how can I avoid it?
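For reference, here is a minimal sketch of how I would try to resume the optimization from the last trajectory image after such a crash (assuming the same slab/calculator setup and that BiH2O.traj was written; the *_restart file names are just placeholders, and I have not verified this):

from ase.io import read
from ase import optimize
from gpaw import GPAW

# Read the last geometry written before the crash (assumes BiH2O.traj exists).
slab = read('BiH2O.traj', -1)

calc = GPAW(h=0.16, kpts=(4, 4, 1), txt='BiH2O_restart.txt',
            parallel={'domain': 4}, xc='RPBE')
slab.set_calculator(calc)

# The optimizer reloads the Hessian from the restart file if it exists.
qn = optimize.QuasiNewton(slab, trajectory='BiH2O_restart.traj',
                          restart='BiH2O.pckl')
qn.run(fmax=0.05)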
Greetings from Estonia,
Vlad
The output is the following:
mpirun -np 4 gpaw-python BiH2O.py
BFGS: 0 12:55:16 -42.675246 0.6915
BFGS: 1 13:47:46 -42.681960 0.0309
Traceback (most recent call last):
  File "BiH2O.py", line 32, in <module>
    qn.run(fmax=0.02)
  File "/usr/lib/python2.6/dist-packages/ase/optimize/optimize.py", line 114, in run
    f = self.atoms.get_forces()
  File "/usr/lib/python2.6/dist-packages/ase/atoms.py", line 536, in get_forces
    forces = self.calc.get_forces(self)
  File "/usr/lib/python2.6/dist-packages/gpaw/aseinterface.py", line 61, in get_forces
    force_call_to_set_positions=force_call_to_set_positions)
  File "/usr/lib/python2.6/dist-packages/gpaw/paw.py", line 265, in calculate
    self.occupations):
  File "/usr/lib/python2.6/dist-packages/gpaw/scf.py", line 46, in run
    wfs.eigensolver.iterate(hamiltonian, wfs)
  File "/usr/lib/python2.6/dist-packages/gpaw/eigensolvers/eigensolver.py", line 71, in iterate
    wfs.orthonormalize()
  File "/usr/lib/python2.6/dist-packages/gpaw/wavefunctions/fdpw.py", line 190, in orthonormalize
    self.overlap.orthonormalize(self, kpt)
  File "/usr/lib/python2.6/dist-packages/gpaw/overlap.py", line 76, in orthonormalize
    self.ksl.inverse_cholesky(S_nn)
  File "/usr/lib/python2.6/dist-packages/gpaw/blacs.py", line 620, in inverse_cholesky
    raise RuntimeError('Failed to orthogonalize: %d' % info)
RuntimeError: Failed to orthogonalize: 1
GPAW CLEANUP (node 0): <type 'exceptions.RuntimeError'> occurred. Calling MPI_Abort!
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 42.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 4368 on
node quantum-desktop exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
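For what it is worth, the traceback ends in self.ksl.inverse_cholesky(S_nn), so the "1" looks like the info code of a Cholesky factorization that found the overlap matrix not positive definite. A toy numpy sketch of that failure mode (my own illustration with a made-up 3x3 matrix, not GPAW code):

import numpy as np

# Symmetric toy "overlap" matrix that is not positive definite;
# the Cholesky factorization fails, analogous to a nonzero LAPACK info code.
S_nn = np.array([[1.0, 2.0, 0.0],
                 [2.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])

try:
    np.linalg.cholesky(S_nn)
except np.linalg.LinAlgError as err:
    print('Cholesky failed: %s' % err)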