[gpaw-users] GPAW parallel installation on El Capitan

Ask Hjorth Larsen asklarsen at gmail.com
Fri Apr 8 12:42:21 CEST 2016


Dear Vardha

2016-04-08 8:53 GMT+02:00 Varadharajan Srinivasan via gpaw-users
<gpaw-users at listserv.fysik.dtu.dk>:
> Dear all,
>
> Thanks to help from this forum I have successfully installed GPAW on my
> MacPro (running El Capitan). After the serial installation and testing
> were successful, I decided to try the parallel installation. In
> preparation I installed scalapack 2.0.2 with Homebrew and modified the
> customize.py script to include the following lines:
>
> scalapack = True
> libraries += ['scalapack']
> libraries += ['openblas']
> library_dirs += ['/usr/local/opt/openblas/lib', '/usr/local/lib']
>
> (I compiled openblas and scalapack with openblas for this installation.)
>
> 'python setup.py install' worked without complaint, and running 'gpaw
> info' gives me:
> python-2.7.11   /Users/vardha/Virtualenvs/gpaw-trunk/bin/python
> gpaw-1.0.1b1
> /Users/vardha/Virtualenvs/gpaw-trunk/lib/python2.7/site-packages/gpaw/
> ase-3.10.0
> /Users/vardha/Virtualenvs/gpaw-trunk/lib/python2.7/site-packages/ase/
> numpy-1.11.0
> /Users/vardha/Virtualenvs/gpaw-trunk/lib/python2.7/site-packages/numpy/
> scipy-0.17.0
> /Users/vardha/Virtualenvs/gpaw-trunk/lib/python2.7/site-packages/scipy/
> _gpaw
> /Users/vardha/Virtualenvs/gpaw-trunk/lib/python2.7/site-packages/_gpaw.so
> parallel        /Users/vardha/Virtualenvs/gpaw-trunk/bin/gpaw-python
> FFTW            no
> scalapack       no
> libvdwxc        no
> PAW-datasets    /usr/local/share/gpaw-setups:
>                 /usr/share/gpaw-setups
>
> Although during compilation I could see that -lscalapack was used, I am
> confused, first, why it says 'scalapack no' above. Also, how do I
> include FFTW?

You run "gpaw info" in serial, so it has no scalapack.

Try "gpaw-python $(which gpaw) info".

In fact, someone should probably add that trick to the web page if it
isn't there already.
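
Regarding FFTW: as far as I remember, the customize.py template also has
an fftw flag.  A sketch, assuming FFTW 3 installed under /usr/local
(adjust the paths to your installation):

  fftw = True
  libraries += ['fftw3']
  library_dirs += ['/usr/local/lib']
  include_dirs += ['/usr/local/include']

Then rebuild with 'python setup.py install'.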

>
> I first assumed that this was not a problem and proceeded to run some
> tests. I took the following input file from the manual:
>
> from ase import Atoms
> from gpaw import GPAW
>
> d = 0.74
> a = 6.0
>
> atoms = Atoms('H2',
>               positions=[(0, 0, 0),
>                          (0, 0, d)],
>               cell=(a, a, a))
> atoms.center()
>
> calc = GPAW(nbands=2, txt='h2.txt')
> atoms.set_calculator(calc)
> print(atoms.get_forces())
>
> ...and ran 'mpirun -np 2 gpaw-python h2.py', which gave me the
> following error:
>
> Traceback (most recent call last):
>   File "h2.py", line 1, in <module>
>     from ase import Atoms
> ImportError: No module named ase
> Traceback (most recent call last):
>   File "h2.py", line 1, in <module>
>     from ase import Atoms
> ImportError: No module named ase
> -------------------------------------------------------
> Primary job  terminated normally, but 1 process returned
> a non-zero exit code.. Per user-direction, the job has been aborted.
> -------------------------------------------------------
> --------------------------------------------------------------------------
> mpirun detected that one or more processes exited with non-zero status, thus
> causing
> the job to be terminated. The first process to do so was:
>
>   Process name: [[50486,1],0]
>   Exit code:    1
> --------------------------------------------------------------------------
>
>
> It seems that each processor is trying to read from h2.py. If this is true
> then either my input script is incorrect or I don't have parallelisation
> working. Secondly, the missing module error only shows up in the parallel
> run. The serial run of the file executes without error.
>
> I searched in the archives of the forum and found a similar posting but I
> could not understand the resolution. I was hoping I could ask it again and
> rely on your kindness to re-explain. What do you think I have done
> incorrectly?
>

They are both supposed to read the file: that is how GPAW runs in
parallel.  Every MPI rank executes the same script, and GPAW divides the
work between the ranks internally.
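
A minimal sketch to see this, using GPAW's world communicator (the file
name is just an example):

  # ranks.py -- every MPI rank executes this whole file
  from gpaw.mpi import world

  print('rank %d of %d' % (world.rank, world.size))

Running 'mpirun -np 2 gpaw-python ranks.py' should print one line per
rank.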

The actual problem is that gpaw-python does not find ASE.  Please check
that you can import ase from gpaw-python, e.g.:

  gpaw-python -c "import ase"

Then try the same within mpirun.
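
For example, with the same two ranks as in your test:

  mpirun -np 2 gpaw-python -c "import ase"

If it imports in serial but fails under mpirun, compare the environment
(e.g. PYTHONPATH and the active virtualenv) seen by the MPI processes.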

Best regards
Ask

> Thanks,
> Vardha.
>
> _______________________________________________
> gpaw-users mailing list
> gpaw-users at listserv.fysik.dtu.dk
> https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users

