[ase-users] RuntimeError: use fewer bands or more basis functions

Jussi Enkovaara jussi.enkovaara at csc.fi
Tue May 20 13:22:14 CEST 2014


Hi Heine,
you can also check the number of bands etc. with the --dry-run flag, e.g.

python input.py --dry-run=4

which will perform only the inexpensive initializations and show in the
text output something like:

----------------------
...
Total number of cores used: 4
Parallelization over spin
Parallelization over states: 2
...
Number of Atoms: 1
Number of Atomic Orbitals: 1
Adjusting Number of Bands by +1 to Match Parallelization
Number of Bands in Calculation:         2
...
----------------------
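
For completeness, a rough sketch of the two calculator setups Marcin
describes below could look like the following (the plane-wave cutoff,
spin-polarization flag and output name are placeholders standing in for
the values in Heine's relax_slab.py, and the custom mixer from the
original script is left out):

from gpaw import GPAW, PW, FermiDirac

pw = 450              # plane-wave cutoff in eV (placeholder)
spinpol = True        # placeholder
token = 'relax_slab'  # placeholder

# Option A: keep rmm-diis, but request the larger 'dzp' LCAO basis so the
# initial guess has enough basis functions for the band count that gets
# adjusted upwards by band parallelization.
calc = GPAW(mode=PW(pw),
            xc='BEEF-vdW',
            kpts=(6, 6, 1),
            eigensolver='rmm-diis',
            basis='dzp',
            occupations=FermiDirac(0.10),
            maxiter=400,
            spinpol=spinpol,
            txt=token + '.out')

# Option B: Davidson eigensolver with band parallelization disabled and
# imbalanced k-point parallelization allowed; the job then cannot use
# more cores than there are k-points (times spin).
calc = GPAW(mode=PW(pw),
            xc='BEEF-vdW',
            kpts=(6, 6, 1),
            eigensolver='dav',
            parallel={'band': 1},
            idiotproof=False,
            occupations=FermiDirac(0.10),
            maxiter=400,
            spinpol=spinpol,
            txt=token + '.out')
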

Best regards,
Jussi

On 2014-05-20 12:14, Marcin Dulak wrote:
> Hi,
>
> On 05/20/2014 10:43 AM, heineahansen wrote:
>> Hello mailing list,
>>
>> I have tried to run calculations in GPAW (v 0.10.0.11364) on a slab
>> with stoichiometry H1C8O1Pt10.
>> GPAW crashes during or after the initial “memory estimate” with the
>> following error:
>>
>> Traceback (most recent call last):
>>    File "relax_slab.py", line 55, in <module>
>>      dyn.run(fmax=0.02)
>>    File "/home/niflheim/xxx/usr/lib/svn/ase/ase/optimize/optimize.py",
>> line 114, in run
>>      f = self.atoms.get_forces()
>>    File "/home/niflheim/xxx/usr/lib/svn/ase/ase/atoms.py", line 711,
>> in get_forces
>>      forces = self._calc.get_forces(self)
>>    File
>> "/home/opt/el6/dl160g6/gpaw-0.10.0.11364-dl160g6-tm-gfortran-openmpi-1.6.3-acml-4.4.0-hdf5-1.8.10-sl-1/lib64/python2.6/site-packages/gpaw/aseinterface.py",
>> line 69, in get_forces
>>      force_call_to_set_positions=force_call_to_set_positions)
>>    File
>> "/home/opt/el6/dl160g6/gpaw-0.10.0.11364-dl160g6-tm-gfortran-openmpi-1.6.3-acml-4.4.0-hdf5-1.8.10-sl-1/lib64/python2.6/site-packages/gpaw/paw.py",
>> line 224, in calculate
>>      self.set_positions(atoms)
>>    File
>> "/home/opt/el6/dl160g6/gpaw-0.10.0.11364-dl160g6-tm-gfortran-openmpi-1.6.3-acml-4.4.0-hdf5-1.8.10-sl-1/lib64/python2.6/site-packages/gpaw/paw.py",
>> line 304, in set_positions
>>      self.wfs.initialize(self.density, self.hamiltonian, spos_ac)
>>    File
>> "/home/opt/el6/dl160g6/gpaw-0.10.0.11364-dl160g6-tm-gfortran-openmpi-1.6.3-acml-4.4.0-hdf5-1.8.10-sl-1/lib64/python2.6/site-packages/gpaw/wavefunctions/fdpw.py",
>> line 68, in initialize
>>      basis_functions, density, hamiltonian, spos_ac)
>>    File
>> "/home/opt/el6/dl160g6/gpaw-0.10.0.11364-dl160g6-tm-gfortran-openmpi-1.6.3-acml-4.4.0-hdf5-1.8.10-sl-1/lib64/python2.6/site-packages/gpaw/wavefunctions/fdpw.py",
>> line 75, in initialize_wave_functions_from_basis_functions
>>      raise RuntimeError('use fewer bands or more basis functions')
>> RuntimeError: use fewer bands or more basis functions
>> GPAW CLEANUP (node 5): <type 'exceptions.RuntimeError'> occurred.
>> Calling MPI_Abort!
>>
>> My calculator looks like this
>>
>> calc = GPAW(mode=PW(pw),
>>              xc='BEEF-vdW',
>>              kpts=(6,6,1),
>>              eigensolver='rmm-diis',
>>              #eigensolver='dav',
>>              mixer=mymixer(beta=0.05, nmaxold=5, weight=50),
>>              occupations=FermiDirac(0.10),
>>              maxiter=400,
>>              spinpol=spinpol,
>>              txt=token+'.out')
>>
>> Does anybody have suggestions on how to deal with this error? Because of
>> the very early crash, I have no idea how many bands or basis functions
>> GPAW tried to use before crashing.
> This is due to band parallelization being switched on, which makes the
> number of bands be adjusted automatically to match the number of cores.
> Add basis='dzp' - this should provide enough basis functions.
> You could also consider staying with eigensolver='dav', disabling band
> parallelization with parallel={'band':1} and
> allowing imbalance for k-point parallelization with idiotproof=False.
> With these settings you cannot run the job on more cores
> than you have k-points (+ spin in your case), but dav can still be
> faster than rmm-diis. You will need to test these two alternatives.
>
> Best regards,
>
> Marcin
>>
>> Best Regards,
>> Heine
>>
>>