[ase-users] ase-users Digest, Vol 142, Issue 9
Martin Hangaard Hansen
mh at hafniumlabs.com
Wed Apr 8 10:53:53 CEST 2020
Hi Louie and other ase NEB users,
Before putting too much effort into parallelizing NEB over images, I would
like to advertise the alternative of using an ML-NEB method such as:
https://github.com/SUNCAT-Center/CatLearn/tree/master/tutorials/11_NEB
(reference:
https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.122.156001)
It removes the need to parallelize over images. It also reduces the number
of electronic structure calculations you need by fitting a surrogate
potential energy surface to data from a few force calls. CatLearn/ML-NEB's
built-in active learning algorithm samples the expensive energy/structure
data needed to find the saddle point. The algorithm only does one force
call at a time and re-fits the surrogate PES in every iteration. The
overall calculation cost usually ends up around the same as for the two
minima optimizations you did to get the initial and final state structures,
no matter whether you run with 5 NEB images or 50.
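For reference, a minimal sketch along the lines of the linked tutorial
(the EMT calculator and the fmax value here are placeholders; substitute
your own DFT calculator and convergence criterion):

from ase.calculators.emt import EMT  # placeholder calculator
from catlearn.optimize.mlneb import MLNEB

# Initial and final states must already be relaxed, as for a normal NEB.
mlneb = MLNEB(start='initial.traj',
              end='final.traj',
              ase_calc=EMT(),
              n_images=11)
mlneb.run(fmax=0.05, trajectory='ML-NEB.traj')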
Best regards
Martin Hangaard Hansen, PhD
On Tue, Apr 7, 2020 at 7:26 PM <ase-users-request at listserv.fysik.dtu.dk>
wrote:
>
> Today's Topics:
>
> 1. Re: Parallel NEB with CASTEP calculator (Offermans Willem)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 7 Apr 2020 15:13:31 +0000
> From: Offermans Willem <willem.offermans at vito.be>
> To: ase users <ase-users at listserv.fysik.dtu.dk>
> Subject: Re: [ase-users] Parallel NEB with CASTEP calculator
>
> Dear Ask and ASE friends,
>
> OK, then I will put the user story here:
>
> I would like to run NEB calculations with ASE on a Hadoop cluster.
> I would like to run the NEB in parallel over images only.
>
> The way I imagine it will work:
>
> I will use PySpark.
>
> I will create a SparkContext object
>
> I will parallelise over NEB images (with or without optimisation of the
> initial and final images).
>
> The images will be relaxed separately on the worker nodes.
>
> The worker nodes will return the total energies and forces back to the
> ``driver``.
>
> The driver will evaluate the energies and forces and will generate new
> images to be calculated.
>
> Does the above make sense?
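>
> For concreteness, a hypothetical sketch of that loop with PySpark (the
> ``images`` list and the ``make_calculator`` factory are placeholders of
> mine, not an existing ASE API):
>
> from pyspark import SparkContext
>
> sc = SparkContext(appName='neb-over-images')
>
> def evaluate(item):
>     # One self-contained force call per Spark task on a worker node.
>     index, atoms = item
>     atoms.calc = make_calculator(index)  # hypothetical factory
>     return index, atoms.get_potential_energy(), atoms.get_forces()
>
> # Distribute the movable images; collect energies/forces on the driver.
> movable = list(enumerate(images[1:-1], start=1))
> results = sc.parallelize(movable, len(movable)).map(evaluate).collect()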
>
>
> In `ase-3.19.0/ase/neb.py`, I found the following:
>
> <snip>
> ...
>     if not self.parallel:
>         # Do all images - one at a time:
>         for i in range(1, self.nimages - 1):
>             energies[i] = images[i].get_potential_energy()
>             forces[i - 1] = images[i].get_forces()
>     elif self.world.size == 1:
>         def run(image, energies, forces):
>             energies[:] = image.get_potential_energy()
>             forces[:] = image.get_forces()
>
>         threads = [threading.Thread(target=run,
>                                     args=(images[i],
>                                           energies[i:i + 1],
>                                           forces[i - 1:i]))
>                    for i in range(1, self.nimages - 1)]
>         for thread in threads:
>             thread.start()
>         for thread in threads:
>             thread.join()
> ...
> </snip>
>
> Can I use ``elif self.world.size == 2:`` for my project?
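>
> For context, the MPI branch that follows in ``neb.py`` assigns images to
> groups of ranks roughly like this (a paraphrase, not a verbatim quote of
> the source):
>
>     else:
>         # Each rank computes the energy/forces of "its" image only:
>         i = self.world.rank * (self.nimages - 2) // self.world.size + 1
>         energies[i] = images[i].get_potential_energy()
>         forces[i - 1] = images[i].get_forces()
>         # The results are then broadcast from the rank that owns each
>         # image, so every process ends up with the full arrays.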
>
>
> Met vriendelijke groeten,
> Mit freundlichen Grüßen,
> With kind regards,
>
>
> Willem Offermans
> Researcher Electrocatalysis SCT
> VITO NV | Boeretang 200 | 2400 Mol
> Phone:+32(0)14335263 Mobile:+32(0)492182073
>
> Willem.Offermans at Vito.be
>
>
> On 7 Apr 2020, at 16:53, Ask Hjorth Larsen <asklarsen at gmail.com> wrote:
>
> Dear Willem,
>
> On Tue, 7 Apr 2020 at 15:16, Offermans Willem
> <willem.offermans at vito.be> wrote:
> Dear Ask and ASE friends,
>
> Gitlab?
>
> What is the function of gitlab? To collect user stories or even epics?
>
> ASE on gitlab is not even documented in
> https://wiki.fysik.dtu.dk/ase/tutorials/tutorials.html#further-reading
>
> Well, no, Gitlab is where the source code lives. It's also the bug tracker.
>
>
> Ahah, I found something by searching for gitlab on the wiki:
>
> https://wiki.fysik.dtu.dk/ase/development/contribute.html?highlight=gitlab
>
> It is quite interesting and related to my little project below.
> I'm running my calculations on a Hadoop cluster. I would like to
> do some coarse-grained parallelisation for NEB calculations.
> Maybe I can do it in the way suggested in the above link.
>
> How do developers normally communicate? Via this mailing list?
>
> We sometimes see questions about parallel NEB not working. Since this
> happens repeatedly, I assume there's some problem with how parallel NEB
> is implemented, or with how we document it. Most likely something needs
> to be fixed. It would be good to understand what.
>
> Best regards
> Ask
>
>
>
>
>
> Met vriendelijke groeten,
> Mit freundlichen Grüßen,
> With kind regards,
>
>
> Willem Offermans
> Researcher Electrocatalysis SCT
> VITO NV | Boeretang 200 | 2400 Mol
> Phone:+32(0)14335263 Mobile:+32(0)492182073
>
> Willem.Offermans at Vito.be
>
>
> On 7 Apr 2020, at 14:19, Ask Hjorth Larsen <asklarsen at gmail.com> wrote:
>
> Dear Willem and Louie,
>
> If something stops ASE's NEB from working in parallel alongside other
> calculators using MPI, then we should open an issue on Gitlab and fix it.
> If there is a way and it isn't obvious, then at least the documentation
> or the API should be improved.
>
> I didn't read through the whole discussion. If and when you come up with
> a final example script which should work but doesn't, or any other
> helpful material, could you consider opening an issue?
>
> Best regards
> Ask
>
> On Tue, 7 Apr 2020 at 14:02, Offermans Willem via ase-users
> <ase-users at listserv.fysik.dtu.dk> wrote:
> Dear Louie and ASE friends,
>
> Ahah, now we have more details.
>
> Parallelising both over and within images is more delicate.
>
> I was speaking about parallelisation over images only, not about what I
> called 2D parallelisation.
> I remember that I had a similar question related to this topic some time
> ago. Unfortunately
> I have forgotten the final response, but I do know that I had to give up
> on going in the direction of 2D parallelisation.
> I only realise now that the link you sent is a response to my original
> e-mail about this topic :)
> My calculator was ABINIT, but nowadays I use Quantum Espresso. I also
> remember that
> I was looking at mpi4py and related stuff.
>
> Anyway, I'm afraid I cannot help you out, since I have abandoned this
> route for now.
> I'm running my jobs on a Hadoop cluster and I'm trying to get a
> coarse-grained parallelisation running.
> So I would already be happy to run NEB in parallel over images.
>
> Thank you for sharing the Python script. It might be very helpful to read
> through it and understand your approach.
> Not only for me, but for the whole ASE community. It would be nice to
> have a collection of these scripts on the wiki.
>
>
>
> Met vriendelijke groeten,
> Mit freundlichen Grüßen,
> With kind regards,
>
>
> Willem Offermans
> Researcher Electrocatalysis SCT
> VITO NV | Boeretang 200 | 2400 Mol
> Phone:+32(0)14335263 Mobile:+32(0)492182073
>
> Willem.Offermans at Vito.be
>
>
> On 7 Apr 2020, at 13:21, Louie Slocombe <l.slocombe at surrey.ac.uk> wrote:
>
> Dear Willem and the ASE community,
>
> Thanks for your response. I also initially agreed with this idea: I
> thought that, provided you split up the MPI tasks correctly, it was
> possible to parallelize both over and within images. However, there was a
> post on the forum by Ivan in Nov 2018, which I quote:
> "I think that the ASE/NEB and the Calculator can run both in parallel only
> if the Calculator is GPAW. This is because the NEB and the process spawned
> by the Calculator have to share an MPI communicator (I am not sure about
> any existing tricks with application/machinefiles and mpirun options). Thus
> parallel NEB will work with VASP or Abinit in serial mode only. Also serial
> NEB will work with the parallel versions of VASP, Abinit etc."
> Here is a link to the thread
> https://listserv.fysik.dtu.dk/pipermail/ase-users/2018-November/004632.html
> I was wondering if there have been any recent developments in the code
> that avoid the issue mentioned by Ivan.
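>
> For reference, the GPAW route works because each image's calculator can
> be handed a sub-communicator. A minimal sketch, roughly following the
> ASE/GPAW parallel-NEB tutorial (``n_images`` counts movable images and
> ``images`` is the full band, as in my script below):
>
> from ase.parallel import world
> from gpaw import GPAW  # GPAW accepts a ``communicator`` argument
>
> n = world.size // n_images           # ranks per image
> j = world.rank // n                  # which movable image this rank owns
> ranks = list(range(j * n, (j + 1) * n))
> calc = GPAW(txt='neb%d.txt' % j, communicator=ranks)
> images[j + 1].calc = calc            # skip the fixed initial image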
>
> I made a first attempt; however, the parallel calculators failed to
> communicate with each other. See the attachment for a full example. It is
> also unclear to me how to submit the job. Assuming I have a total of 16
> tasks, I want the calculator to use 4 MPI tasks within each image,
> export CASTEP_COMMAND="mpirun -np 4 castep.mpi"
> With 4 parallel instances of NEB,
> mpirun -n 4 python3 ase_parallel_neb_example.py
> Would this be correct?
>
> Any suggestions or advice would be greatly appreciated.
>
> Many thanks,
> Louie
>
> Attachment also pasted here:
>
> from ase.build import fcc100, add_adsorbate
> from ase.calculators.castep import Castep
> from ase.constraints import FixAtoms
> from ase.io import read
> from ase.neb import NEB
> from ase.optimize import BFGS
> from mpi4py import MPI
> import ase.parallel
>
> f_parallel = True
> f_gen_in = True
> n_images = 4
> f_max = 0.5
>
> if f_gen_in:
>     # 2x2-Al(001) surface with 3 layers and an
>     # Au atom adsorbed in a hollow site:
>     slab = fcc100('Al', size=(2, 2, 3))
>     add_adsorbate(slab, 'Au', 1.7, 'hollow')
>     slab.center(axis=2, vacuum=4.0)
>
>     # Fix second and third layers:
>     mask = [atom.tag > 1 for atom in slab]
>
>     calc = Castep(keyword_tolerance=1)
>     calc._export_settings = True
>     calc._pedantic = True
>     calc.param.num_dump_cycles = 0
>     calc.param.reuse = True
>
>     # Initial state:
>     slab.set_constraint(FixAtoms(mask=mask))
>     slab.set_calculator(calc)
>     slab.calc.set_pspot('C19_LDA_OTF')
>     qn = BFGS(slab, trajectory='initial.traj', logfile='initial.log')
>     qn.run(fmax=f_max)
>
>     # Final state:
>     slab.set_constraint(FixAtoms(mask=mask))
>     slab.set_calculator(calc)
>     slab.calc.set_pspot('C19_LDA_OTF')
>     slab[-1].x += slab.get_cell()[0, 0] / 2
>     qn = BFGS(slab, trajectory='final.traj', logfile='final.log')
>     qn.run(fmax=f_max)
>
> initial = read('initial.traj')
> final = read('final.traj')
> constraint = FixAtoms(mask=[atom.tag > 1 for atom in initial])
>
> if f_parallel:
>     world = ase.parallel.MPI4PY(mpi4py_comm=MPI.COMM_WORLD)
>     rank = world.rank
>     size = world.size
>
>     n = size // n_images  # number of cpus per image
>     if rank == 0:
>         print('number of cpus per image:', n, flush=True)
>     j = 1 + rank // n  # my image number
>     assert size >= n_images, 'fail 1'
>     assert n_images * n == size, 'fail 2'
>
>     images = [initial]
>     for i in range(n_images):
>         ranks = range(i * n, (i + 1) * n)
>         image = initial.copy()
>         # seed = 'data_' + str(i + 1)
>         calc = Castep(keyword_tolerance=1)
>         calc._export_settings = True
>         calc._pedantic = True
>         calc.param.num_dump_cycles = 0
>         calc.param.reuse = True
>         # Set working directory:
>         # calc._seed = seed
>         calc._label = 'data'
>         calc._directory = 'data_' + str(i)
>
>         # Attach a calculator only on the ranks that own this image:
>         if rank in ranks:
>             image.set_calculator(calc)
>             image.calc.set_pspot('C19_LDA_OTF')
>         image.set_constraint(constraint)
>         images.append(image)
>
>     images.append(final)
>
>     neb = NEB(images, parallel=True)
>     neb.interpolate('idpp')
>     qn = BFGS(neb, trajectory='neb.traj', logfile='neb.log')
>     qn.run(fmax=f_max)
>
> else:
>     images = [initial]
>     for i in range(n_images):
>         image = initial.copy()
>         calc = Castep(keyword_tolerance=1)
>         calc._export_settings = True
>         calc._pedantic = True
>         calc.param.num_dump_cycles = 0
>         calc.param.reuse = True
>         # Set working directory:
>         calc._label = 'data'
>         calc._directory = 'data_' + str(i)
>         image.set_calculator(calc)
>         image.calc.set_pspot('C19_LDA_OTF')
>         image.set_constraint(constraint)
>         images.append(image)
>
>     images.append(final)
>
>     neb = NEB(images, parallel=f_parallel)
>     neb.interpolate('idpp')
>     qn = BFGS(neb, trajectory='neb.traj', logfile='neb.log')
>     qn.run(fmax=f_max)
>
> From: Offermans Willem <willem.offermans at vito.be>
> Sent: 03 April 2020 22:10
> To: Slocombe, Louie (PG/R - Sch of Biosci & Med) <l.slocombe at surrey.ac.uk>
> Cc: ase users <ase-users at listserv.fysik.dtu.dk>
> Subject: Re: [ase-users] Parallel NEB with CASTEP calculator
>
> Dear Louie and ASE friends,
>
> I don't see any objection to running a NEB calculation in parallel over
> images with CASTEP or any other suitable calculator, if you have the
> right computer infrastructure.
> ASE does support it, according to the code. So it should be possible in
> principle.
>
> What made you think that it isn't possible?
>
>
>
>
>
>
> Met vriendelijke groeten,
> Mit freundlichen Grüßen,
> With kind regards,
>
>
> Willem Offermans
> Researcher Electrocatalysis SCT
> VITO NV | Boeretang 200 | 2400 Mol
> Phone:+32(0)14335263 Mobile:+32(0)492182073
>
> Willem.Offermans at Vito.be
>
>
>
> On 2 Apr 2020, at 19:44, Louie Slocombe via ase-users
> <ase-users at listserv.fysik.dtu.dk> wrote:
>
> parallel NEB calculation
>
>
>
>
>
> ------------------------------
>
> End of ase-users Digest, Vol 142, Issue 9
> *****************************************
>