[ase-users] parallel settings

Marcin Dulak Marcin.Dulak at fysik.dtu.dk
Thu Mar 12 10:56:11 CET 2015


Hi,

Please continue this thread on the gpaw-users mailing list.

On 03/12/2015 04:49 AM, Zhiyao Duan wrote:
> Hello everyone,
>
> I am just starting to use GPAW and have a problem setting the parallel 
> parameters. My system contains 96 atoms, 840 valence electrons and 720 
> orbitals, with 2 k-points in the IBZ and a 28*180*336 grid. I was 
> trying to run the calculation on 48 cores using the following script:
>
> import numpy as np
> from ase import Atoms
> from ase.io import read, write
> from gpaw import GPAW, PW, FermiDirac, MethfesselPaxton
> from ase.optimize import FIRE, BFGS
>
> np.seterr(under="ignore")
>
> model = read('au_tio2.vasp', format='vasp')
>
> calc = GPAW(xc='PBE',
>             mode='pw',
>             kpts=(4, 1, 1),
>             random=True,
>             eigensolver='rmm-diis',
>             txt='au_tio2_relax.txt')
Which version of GPAW are you using?
>
> model.set_calculator(calc)
> opt = BFGS(model)
> opt.run(fmax=0.05)
>
> write('final_relax.vasp', model, format='vasp', direct=True)
>
> The job runs normally, but much slower than VASP with similar settings.
Similar VASP potentials would be: setups={'Ti': '_sv', 'O': '_h'}.
The default VASP oxygen potential is rather inaccurate (which, by the 
way, makes many of the materialsproject.org results incorrect).
You can compare, for example, VASP 5.2.2 results (which has a choice of 
potentials similar to that of materialsproject.org) with the ones that 
use the '_h' potentials at 
https://molmod.ugent.be/deltacodesdft
When using O_h you need to go to an ENCUT of ~600 eV in VASP, and 
correspondingly to PW(600) in GPAW.
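For reference, a minimal sketch of such a setup through ASE's Vasp 
calculator (I am assuming the standard setups/encut keywords here):

from ase.calculators.vasp import Vasp

calc = Vasp(xc='PBE',
            encut=600,                        # ~600 eV needed with O_h
            setups={'Ti': '_sv', 'O': '_h'},  # harder Ti_sv and O_h potentials
            kpts=(4, 1, 1))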
> I thought the slowness was due to parallelization, so I added
> parallel={'band': 4} to divide the bands into 4 groups. This time the 
> job failed with the error:
For a large cell like this, consider using the grid mode instead.
>
> File "/usr/local/gpaw/lib/python/gpaw/mpi/__init__.py", line 939, in autofinalize
>     raise RuntimeError('All the CPUs must be used')
> RuntimeError: All the CPUs must be used
>
> Can anybody help me figure out why this happens?
>
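The 'All the CPUs must be used' error means that GPAW could not split 
the cores into k-point, band and domain groups: the product of the three 
group sizes must equal the total number of cores. With band=4 you can 
specify the full decomposition yourself, for example (a sketch for 48 
cores and 2 IBZ k-points; adjust the numbers to your run):

calc = GPAW(xc='PBE',
            mode='pw',
            kpts=(4, 1, 1),
            # 2 k-point groups * 4 band groups * 6 domain groups = 48 cores
            parallel={'kpt': 2, 'band': 4, 'domain': 6},
            txt='au_tio2_relax.txt')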
> Another question is how to compare the speed of GPAW and VASP?

> As I mentioned above, on my system VASP seems much faster. Maybe this 
> is due to the number of valence electrons included in the calculation? 
> In my case, a Ti atom has 12 valence electrons in GPAW compared to 4 
> in VASP. Or is the better performance of VASP due to some other 
> factor, like parallelization?
Comparing programs is very difficult, but my impression is that the most 
popular programs (VASP, ESPRESSO, ABINIT, GPAW, ...) have similar speed 
when used in the same way. This is expected, as in the end they all try 
to make use of math libraries as much as possible.
First, don't compare optimizations (that's a different story) - only 
single-point SCFs.
You also need to pay attention to many defaults; for example, VASP uses 
a 0.2 eV smearing width by default, and that makes the SCF converge 
faster.
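To make the comparison fair, set the same smearing explicitly in GPAW. 
A sketch (assuming you want to mimic VASP's default Methfessel-Paxton 
smearing with SIGMA=0.2):

from gpaw import GPAW, MethfesselPaxton

calc = GPAW(xc='PBE',
            occupations=MethfesselPaxton(0.2),  # 0.2 eV width, as in VASP
            txt='au_tio2.txt')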

My advice would be: don't try to compare the performance of different 
programs on a single example, and before comparing them, get familiar 
with (almost) all the available options.
For your GPAW case, please keep the defaults (no random=True or 
eigensolver settings).
Make sure you don't exceed the available memory on the compute nodes by 
running with the dry-run option first:
https://wiki.fysik.dtu.dk/gpaw/documentation/parallel_runs/parallel_runs.html
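Something like this (check the page above for the exact syntax in your 
GPAW version):

gpaw-python au_tio2_relax.py --dry-run=48

This only prints the memory estimate and the parallelization layout, 
without starting the actual calculation.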
As oxygen requires a somewhat denser grid (or a higher plane-wave 
cutoff), use h=0.18.

calc = GPAW(xc='PBE',            # default real-space grid mode
            kpts=(4, 1, 1),
            h=0.18,              # grid spacing in Angstrom
            txt='au_tio2_relax.txt')

Compare the performance of the above with:

calc = GPAW(xc='PBE',
            kpts=(4, 1, 1),
            mode=PW(550),            # plane-wave mode, 550 eV cutoff
            eigensolver='rmm-diis',
            random=True,             # random initial guess for empty states
            txt='au_tio2_relax.txt')
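In both cases, compare single-point energies only, e.g. (a minimal 
sketch):

from ase.io import read

model = read('au_tio2.vasp', format='vasp')
model.set_calculator(calc)
energy = model.get_potential_energy()  # a single SCF; the timer summary
                                       # is at the end of the txt file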

Let us know the result.

Best regards,

Marcin
>
> Thank you guys!
>
> Zhiyao
>
>

