[gpaw-users] Eigensolver cg parallel problem

Jussi Enkovaara jussi.enkovaara at csc.fi
Wed Apr 20 16:10:13 CEST 2011


On 2011-04-20 12:24, Jun Yan wrote:
> Dear Jussi,
>
>      Is it possible to include parallelization over bands for cg?  For my calculation I have to keep parallel={'domain':1} since at each step each cpu has to get kpt_u[k].psit_nG[n] for a certain n and k.  If the system doesn't have enough kpoints to parallelize over, for example a slab with 400 kpoints and 200 bands would need 800 cpus, then I have to switch on parallelization over bands.

Hi,
this is getting more into a development issue, so let's switch the discussion to the 
gpaw-developers list.

In principle, CG can also be parallelized over bands; it just requires some 
(non-trivial) coding. The main problem is that while a single band is being updated, it 
has to be orthonormalized to the other bands (with the same k-point/spin), see 
lines 91-114 in eigensolvers/cg.py
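
Conceptually, the per-band orthonormalization in that part does something like
the following (a simplified numpy sketch, not the actual cg.py code; it ignores
the PAW overlap corrections and the real GPAW wave-function layout):

import numpy as np

def orthonormalize_band(psit_nG, n):
    # Orthonormalize band n against bands 0..n-1 (Gram-Schmidt);
    # psit_nG is an illustrative (nbands, ngpts) array.
    psi = psit_nG[n]
    for m in range(n):
        # remove the component of band n along band m
        psi -= np.vdot(psit_nG[m], psi) * psit_nG[m]
    # renormalize the updated band
    psi /= np.sqrt(np.vdot(psi, psi).real)
    return psit_nG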

With band parallelization, quite extensive communication is required in that part. 
Overlapping communication and computation could be useful (when supported by the 
hardware), but someone has to do the implementation...
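
Just to sketch the idea (a toy mpi4py example, not GPAW code; the band blocks,
sizes and the ring pattern here are made up for illustration): the overlaps with
a band block that has already arrived can be computed while the next block is
still in flight, e.g.

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nbands_local, ngpts = 4, 1024              # made-up sizes, real arrays
my_psit = np.random.rand(nbands_local, ngpts)

work = my_psit.copy()                      # block currently being processed
recv = np.empty_like(my_psit)              # buffer for the incoming block
overlaps = []

for shift in range(size - 1):
    # post the transfers for the next block...
    req_r = comm.Irecv(recv, source=(rank - 1) % size)
    req_s = comm.Isend(work, dest=(rank + 1) % size)
    # ...and compute with the current block while the transfers proceed
    overlaps.append(np.dot(my_psit, work.T))
    MPI.Request.Waitall([req_s, req_r])
    work, recv = recv, work                # pass the received block onwards

overlaps.append(np.dot(my_psit, work.T))   # block from the last round

In the real code one would of course use the proper GPAW communicators and
include the PAW corrections, but the communication pattern would be similar.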

As a side note, already in medium-sized calculations with CG a large fraction of the 
time is spent in orthonormalization, and I am pretty confident that the implementation 
could be optimized at least to some extent.
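
For instance (again only a rough numpy sketch, real arrays and no PAW
corrections), instead of looping band by band one could build the whole overlap
matrix with a single matrix product and orthonormalize via its Cholesky factor:

import numpy as np

def block_orthonormalize(psit_nG):
    # overlap matrix S_nm = <psi_n|psi_m> from a single gemm-like product
    S = np.dot(psit_nG, psit_nG.T)
    # S = L L^T; transforming psi <- L^-1 psi makes the bands orthonormal
    L = np.linalg.cholesky(S)
    return np.linalg.solve(L, psit_nG)

Whether something along these lines actually pays off in cg.py I have not
checked, it is just the kind of optimization I had in mind.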

Best regards,
Jussi

> Jussi Enkovaara wrote:
>
> On 2011-04-20 10:40, Jun Yan wrote:
>
>
> Hi, developers,
>
>        I have run into a problem with the cg eigensolver when performing a ground-state
> calculation with parallel = {'domain':1, 'nband':N}.
>
>
>
> Dear Jun,
> currently, the cg eigensolver does not support parallelization over bands, but I
> think there should be a better error message when one tries to do that.
>
> Best regards,
> Jussi
> _______________________________________________
> gpaw-users mailing list
> gpaw-users at listserv.fysik.dtu.dk
> https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users