[gpaw-users] running different calculations in parallel
Jens Jørgen Mortensen
jjmo at dtu.dk
Tue Jun 2 09:56:01 CEST 2020
On 5/30/20 1:43 PM, Punit Kumar wrote:
> Hi Jens
>
> Thanks a lot for your response and for clarifying my doubts. The
> calculation runs correctly and I got four output files and four
> trajectory files, separately. But,
>
> On Wed, 27 May 2020 at 13:11, Jens Jørgen Mortensen <jjmo at dtu.dk> wrote:
>
> On 26.05.2020 at 12:06, Punit Kumar via gpaw-users wrote:
> > Dear gpaw-users
> >
> > I have made four different initial structures of H atom adsorption on
> > a Ni(111) surface and combined all these structures into a single
> > trajectory file (attached below). Now, I want to perform the structure
> > optimization for all the initial structures in parallel over 40 cores
> > (with each structure running on 10 cores). While doing this I don't
> > want my final outputs and trajectories to be mixed. I have searched for
> > this problem (or a tutorial) in the GPAW documentation
> > (https://wiki.fysik.dtu.dk/gpaw/documentation/parallel_runs/parallel_runs.html#parallel-runs)
> > and wrote a piece of code for it, which looks like this:
>
> You have this line:
>
> qn = BFGS(images[i], logfile='opt%d.log' % j)
>
> but i is always 3 there (the last value left over from the for-loop).
> Also, you don't need the for-loop. And you should add this:
>
> master=(comm.rank == 0)
>
> when you create the BFGS object.
>
> Here is my version:
>
> from gpaw import GPAW, PW, mpi
> from ase import Atoms
> from ase.io import Trajectory
> from ase.optimize import BFGS
>
> images = [Atoms('HH',
>                 [(0, 0, 0), (0, 0, d)],
>                 pbc=1,
>                 cell=(2, 2, 3))
>           for d in [0.7, 0.725, 0.75, 0.775]]
>
> n = mpi.size // 4
> j = 1 + mpi.rank // n
> assert 4 * n == mpi.size
> i = j - 1
> ranks = range(i * n, (i + 1) * n)
> print(i, ranks)
> comm = mpi.world.new_communicator(ranks)
> calc = GPAW(mode=PW(340),
>             txt='H_ads_%d.txt' % j,
>             communicator=comm)
> images[i].set_calculator(calc)
>
> qn = BFGS(images[i], logfile='opt%d.log' % j, master=(comm.rank == 0))
>
>
> One thing I have observed is that after the calculation is finished only
> one log file is generated. Since there are four structures in the images
> list, there should be four log files as well. But after the
> calculation is done I'm getting only one log file, i.e. opt1.log. Do you
> have any idea why it produces only one log file, in spite of the fact
> that "% j" is used in the above line of code in order to
> generate four different log files?
No, I don't understand that. I get four log-files.
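If it helps, you could first check that every rank computes a distinct
image number. A minimal check, assuming the same four-group setup as
above (launch it on the same number of cores as the real run):

from gpaw import mpi

n = mpi.size // 4                 # cores per image group
assert n > 0 and 4 * n == mpi.size
j = 1 + mpi.rank // n             # image number: should be 1, 2, 3 or 4
print(mpi.rank, mpi.size, j)

On 40 cores, ranks 0-9 should print j = 1, ranks 10-19 should print
j = 2, and so on. If the script is launched on fewer than four cores,
n becomes zero and the assert stops it before any files are written.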
Jens Jørgen
> traj = Trajectory('H_ads_%d.traj' % j, 'w', images[i],
>                   master=(comm.rank == 0))
>
> qn.attach(traj)
> qn.run(fmax=0.5)
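>
> For your real system, the same pattern with the parameters from your
> script should look something like this (untested; it assumes four
> images in initial.traj and a core count divisible by four):
>
> from ase.io import read, Trajectory
> from ase.optimize import BFGS
> from gpaw import GPAW, PW, Mixer, FermiDirac, mpi
>
> images = read('initial.traj@-4:')  # the four initial structures
>
> n = mpi.size // 4      # cores per image
> assert 4 * n == mpi.size
> j = 1 + mpi.rank // n  # my image number (1-4)
> i = j - 1              # index into images
>
> # one sub-communicator per group of n ranks; no for-loop needed
> comm = mpi.world.new_communicator(range(i * n, (i + 1) * n))
> calc = GPAW(mode=PW(340), xc='RPBE', spinpol=False, kpts=(6, 6, 1),
>             maxiter=1000, nbands=-60,
>             mixer=Mixer(beta=0.05, nmaxold=5, weight=50),
>             eigensolver='rmm-diis', symmetry={'point_group': False},
>             occupations=FermiDirac(0.1), txt='H_ads_%d.txt' % j,
>             communicator=comm)
> images[i].set_calculator(calc)
>
> # master=... so that only one rank per group writes each file
> qn = BFGS(images[i], logfile='opt%d.log' % j, master=(comm.rank == 0))
> traj = Trajectory('H_ads_%d.traj' % j, 'w', images[i],
>                   master=(comm.rank == 0))
> qn.attach(traj)
> qn.run(fmax=0.5)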
>
> Jens Jørgen
>
> > from ase.io import read, Trajectory
> > from ase.optimize import *
> > from ase.parallel import rank, size
> > from ase.visualize import view
> > from gpaw import GPAW, PW, Mixer, FermiDirac, Davidson, mpi
> >
> > bands = -60
> >
> > images = read('initial.traj@-4:')
> >
> > n = size // 4  # number of cpu's per image
> > j = 1 + rank // n  # my image number
> > assert 4 * n == size
> >
> > for i in range(len(images)):
> >     ranks = range(i * n, (i + 1) * n)
> >     print(i, ranks)
> >     comm = mpi.world.new_communicator(ranks)
> >     calc = GPAW(mode=PW(340), xc='RPBE', spinpol=False, kpts=(6, 6, 1),
> >                 maxiter=1000, nbands=bands,
> >                 mixer=Mixer(beta=0.05, nmaxold=5, weight=50),
> >                 eigensolver='rmm-diis',
> >                 symmetry={'point_group': False},
> >                 occupations=FermiDirac(0.1),
> >                 txt='H_ads_%d.txt' % j,
> >                 communicator=comm)
> >     images[i].set_calculator(calc)
> >
> > qn = BFGS(images[i], logfile='opt%d.log' % j)
> >
> > traj = Trajectory('H_ads_%d.traj' % j, 'w', images[j],
> >                   master=(rank % n == 0))
> >
> > qn.attach(traj)
> > qn.run(fmax=0.5)
> > # view(slab)
> >
> > After running the calculation I'm not able to get different outputs
> > and trajectories. Since this is the first time I'm trying something
> > like this, maybe I have made errors in my code. Can someone help me
> > resolve my problem? Any help or suggestion is highly appreciated.
> >
> > Thank you
> > Punit
> >
> > _______________________________________________
> > gpaw-users mailing list
> > gpaw-users at listserv.fysik.dtu.dk
> > https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users
>