[gpaw-users] gpaw-users Digest, Vol 75, Issue 22

Toma Susi toma.susi at univie.ac.at
Sat Apr 23 13:34:51 CEST 2016


Hi,

Yes, but the issue might still be related. 

I'm not actually sure if I ever resolved this, but I don't really run anything heavy on my desktop anyway, and everything I have used has worked fine.

Dr. Toma Susi
Principal investigator
HeQuCoG FWF project
University of Vienna, Austria

http://mostlyphysics.net

"Love is the only emotion that enhances our intelligence." 
	-Humberto Maturana

> On 23 Apr 2016, at 12:14, Varadharajan Srinivasan <varadharajan.srinivasan at gmail.com> wrote:
> 
> Dear Toma,
> 
> Thanks for this. I checked, and I have more recent versions installed: numpy 1.11.0 and openmpi-1.10.2. If I understood correctly, these were the libraries giving errors in your case, right?
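> 
> For reference, I checked roughly like this (just a sketch; the point is to
> query the same interpreter and MPI that gpaw-python itself uses):
> 
>     gpaw-python -c "import numpy; print numpy.__version__"
>     mpirun --version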
> 
> Best,
> Vardha.
> 
>> On Wed, Apr 20, 2016 at 1:17 AM, Toma Susi via gpaw-users <gpaw-users at listserv.fysik.dtu.dk> wrote:
>> Hi,
>> 
>> I reported segmentation faults on El Capitan 10.11.1 in November 2015:
>> https://listserv.fysik.dtu.dk/pipermail/gpaw-developers/2015-November/006007.html
>> 
>> That thread contains some info on the mpirun crashes; I guess we decided it might be a problem with other libraries on my system.
>> 
>> -Toma
>> 
>> > On 19 Apr 2016, at 21:42, gpaw-users-request at listserv.fysik.dtu.dk wrote:
>> >
>> > Today's Topics:
>> >
>> >   1. Re: GPAW parallel installation on El Capitan (Tristan Maxson)
>> >
>> >
>> > ----------------------------------------------------------------------
>> >
>> > Message: 1
>> > Date: Tue, 19 Apr 2016 15:42:06 -0400
>> > From: Tristan Maxson <tgmaxson at gmail.com>
>> > To: Varadharajan Srinivasan <varadharajan.srinivasan at gmail.com>,
>> >       gpaw-users <gpaw-users at listserv.fysik.dtu.dk>
>> > Subject: Re: [gpaw-users] GPAW parallel installation on El Capitan
>> >
>> > On Apr 19, 2016 3:41 PM, tgmaxson at gmail.com wrote:
>> >
>> >> Can you possibly attach your build log for what you are using?  That
>> >> could shed some light on the issue.
>> >> On Apr 19, 2016 3:37 PM, "Varadharajan Srinivasan via gpaw-users" <
>> >> gpaw-users at listserv.fysik.dtu.dk> wrote:
>> >>
>> >>> Dear Ask,
>> >>>
>> >>> I did try both with 2 and 4 cores and indeed I get segmentation faults. I
>> >>> attach the log file for the tests run with 4 cores.
>> >>> So far in my trial runs (on the tutorials) I haven't faced any problems,
>> >>> but your insight into this issue would be very helpful.
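>> >>>
>> >>> For completeness, the parallel runs were launched along these lines
>> >>> (just a sketch; the exact test-runner invocation may differ between
>> >>> GPAW versions):
>> >>>
>> >>>     mpirun -np 4 gpaw-python $(which gpaw) test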
>> >>>
>> >>> Best,
>> >>> Vardha.
>> >>>
>> >>> On Mon, Apr 18, 2016 at 2:13 PM, Ask Hjorth Larsen <asklarsen at gmail.com>
>> >>> wrote:
>> >>>
>> >>>> The test suite is designed for 1, 2, 4, or 8 cores.  Definitely there
>> >>>> will be some tests that do not run on 12.  However, the segmentation
>> >>>> fault should never happen.  Can you reproduce that with 2, 4 or 8
>> >>>> cores?
>> >>>>
>> >>>> Best regards
>> >>>> Ask
>> >>>>
>> >>>> 2016-04-09 14:42 GMT+02:00 Varadharajan Srinivasan
>> >>>> <varadharajan.srinivasan at gmail.com>:
>> >>>>> Dear Jakob,
>> >>>>>
>> >>>>> I installed ASE and GPAW without using virtualenvs.
>> >>>>> gpaw-python $(which gpaw) info gives:
>> >>>>> python-2.7.11   /usr/local/bin/gpaw-python
>> >>>>> gpaw-1.0.1b1    /usr/local/opt/python/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/gpaw/
>> >>>>> ase-3.11.0b1    /Users/vardha/SWARE/ase/ase/
>> >>>>> numpy-1.11.0    /usr/local/opt/python/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/numpy/
>> >>>>> scipy-0.17.0    /usr/local/opt/python/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/scipy/
>> >>>>> _gpaw           built-in
>> >>>>> parallel        /usr/local/bin/gpaw-python
>> >>>>> FFTW            no
>> >>>>> scalapack       yes
>> >>>>> libvdwxc        no
>> >>>>> PAW-datasets    /usr/local/share/gpaw-setups:
>> >>>>>                 /usr/share/gpaw-setups
>> >>>>>
>> >>>>>
>> >>>>> Serial tests run fine. However, when running the parallel tests,
>> >>>>> several tests fail with errors like:
>> >>>>> (a) ValueError: Cannot construct interpolator.  Grid 12 x 12 x 12 may be too small
>> >>>>> (b) RuntimeError: Cannot distribute 80 bands to 12 processors
>> >>>>> (c) RuntimeError: use fewer bands or more basis functions
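>> >>>>>
>> >>>>> (Errors (b) and (c) look like simple divisibility constraints: splitting
>> >>>>> 80 bands over 12 cores gives 80/12 = 6.67 bands per core, which is not an
>> >>>>> integer, whereas 80/8 = 10 bands per core would divide evenly.)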
>> >>>>>
>> >>>>> I think most of these errors are because I got greedy and used 12 cores,
>> >>>>> which might be too many for the tests. But one error, from xc/atomize.py,
>> >>>>> was a segmentation fault:
>> >>>>> xc/atomize.py                            [helios:47447] *** Process received signal ***
>> >>>>> [helios:47447] Signal: Segmentation fault: 11 (11)
>> >>>>> [helios:47447] Signal code: Address not mapped (1)
>> >>>>> [helios:47447] Failing at address: 0x1230ea000
>> >>>>> [helios:47447] [ 0] 0   libsystem_platform.dylib   0x00007fff9371352a _sigtramp + 26
>> >>>>> [helios:47447] [ 1] 0   ???                        0x00000000000003e8 0x0 + 1000
>> >>>>> [helios:47447] [ 2] 0   gpaw-python                0x000000010b8ed035 calc_mgga + 581
>> >>>>> [helios:47447] [ 3] 0   gpaw-python                0x000000010b8ec613 XCFunctional_calculate + 435
>> >>>>> [helios:47447] [ 4] 0   Python                     0x000000011051dbcd PyEval_EvalFrameEx + 26858
>> >>>>> [helios:47447] [ 5] 0   Python                     0x00000001105170f1 PyEval_EvalCodeEx + 1583
>> >>>>> [helios:47447] [ 6] 0   Python                     0x000000011052169a fast_function + 117
>> >>>>> [helios:47447] [ 7] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [ 8] 0   Python                     0x000000011052172d fast_function + 264
>> >>>>> [helios:47447] [ 9] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [10] 0   Python                     0x000000011052172d fast_function + 264
>> >>>>> [helios:47447] [11] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [12] 0   Python                     0x00000001105170f1 PyEval_EvalCodeEx + 1583
>> >>>>> [helios:47447] [13] 0   Python                     0x000000011052169a fast_function + 117
>> >>>>> [helios:47447] [14] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [15] 0   Python                     0x000000011052172d fast_function + 264
>> >>>>> [helios:47447] [16] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [17] 0   Python                     0x000000011052172d fast_function + 264
>> >>>>> [helios:47447] [18] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [19] 0   Python                     0x00000001105170f1 PyEval_EvalCodeEx + 1583
>> >>>>> [helios:47447] [20] 0   Python                     0x00000001105174fc PyEval_EvalFrameEx + 537
>> >>>>> [helios:47447] [21] 0   Python                     0x00000001105170f1 PyEval_EvalCodeEx + 1583
>> >>>>> [helios:47447] [22] 0   Python                     0x000000011052169a fast_function + 117
>> >>>>> [helios:47447] [23] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [24] 0   Python                     0x000000011052172d fast_function + 264
>> >>>>> [helios:47447] [25] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [26] 0   Python                     0x000000011052172d fast_function + 264
>> >>>>> [helios:47447] [27] 0   Python                     0x000000011051daf3 PyEval_EvalFrameEx + 26640
>> >>>>> [helios:47447] [28] 0   Python                     0x00000001105170f1 PyEval_EvalCodeEx + 1583
>> >>>>> [helios:47447] [29] 0   Python                     0x00000001104bbfb1 function_call + 352
>> >>>>> [helios:47447] *** End of error message ***
>> >>>>>
>> >>>>>
>> >>>>> And that's where the tests stopped. I will try with fewer processors,
>> >>>>> but do you think this is a compilation/installation issue?
>> >>>>>
>> >>>>> Thanks,
>> >>>>> Vardha.
>> >>>>>
>> >>>>>
>> >>>>> On Sat, Apr 9, 2016 at 2:07 AM, Varadharajan Srinivasan
>> >>>>> <varadharajan.srinivasan at gmail.com> wrote:
>> >>>>>>
>> >>>>>> Dear Jakob,
>> >>>>>>
>> >>>>>> Thank you very much for your suggestion. I will try it and report back.
>> >>>>>>
>> >>>>>> Best,
>> >>>>>> Vardha.
>> >>>>>>
>> >>>>>> On Sat, Apr 9, 2016 at 1:38 AM, Jakob Schiøtz <schiotz at fysik.dtu.dk>
>> >>>>>> wrote:
>> >>>>>>>
>> >>>>>>> This is VirtualEnv doing its "magic".  I was never able to get gpaw
>> >>>>>>> or asap to work with virtualenv, because they use their own python
>> >>>>>>> interpreter (which breaks virtualenv).  But strangely, it always
>> >>>>>>> worked for Marcin, who wrote the instructions.  I think it was a
>> >>>>>>> difference between using the built-in python or the homebrew python.
>> >>>>>>>
>> >>>>>>> Try without using virtualenv.  But in that case it is essential to
>> >>>>>>> use the homebrew Python, so you can clean up if something goes
>> >>>>>>> horribly wrong.  Try for example to follow these instructions:
>> >>>>>>> https://wiki.fysik.dtu.dk/asap/Installing%20ASE%2C%20Asap%20and%20GPAW%20on%20a%20Mac
>> >>>>>>> Just skip the installation of Asap; you do not need it.
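>> >>>>>>>
>> >>>>>>> In outline the Homebrew route looks roughly like this (package names
>> >>>>>>> from memory; the wiki page above is the authoritative reference):
>> >>>>>>>
>> >>>>>>>     brew install python open-mpi
>> >>>>>>>     pip install numpy scipy
>> >>>>>>>     pip install ase            # or use a git checkout of ASE
>> >>>>>>>     # then build GPAW from a source checkout, so that gpaw-python
>> >>>>>>>     # is compiled against the Homebrew open-mpi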
>> >>>>>>>
>> >>>>>>> Best regards
>> >>>>>>>
>> >>>>>>> Jakob
>> >>>>>>>
>> >>>>>>>
>> >>>>>>>> On 08 Apr 2016, at 13:43, Varadharajan Srinivasan via gpaw-users
>> >>>>>>>> <gpaw-users at listserv.fysik.dtu.dk> wrote:
>> >>>>>>>>
>> >>>>>>>> Dear Ask,
>> >>>>>>>>
>> >>>>>>>> Thank you for the prompt reply.
>> >>>>>>>>
>> >>>>>>>> You run "gpaw info" in serial, so it has no scalapack.
>> >>>>>>>>
>> >>>>>>>> Try "gpaw-python $(which gpaw) info".
>> >>>>>>>>
>> >>>>>>>> I ran this and got an error:
>> >>>>>>>> gpaw-python $(which gpaw) info
>> >>>>>>>> Traceback (most recent call last):
>> >>>>>>>>  File "/Users/vardha/Virtualenvs/gpaw-trunk/bin/gpaw", line 2, in
>> >>>>>>>> <module>
>> >>>>>>>>    from gpaw.cli.main import main
>> >>>>>>>> ImportError: No module named gpaw.cli.main
>> >>>>>>>>
>> >>>>>>>> In fact, someone should probably add that trick to the web page if
>> >>>>>>>> it isn't there already.
>> >>>>>>>>
>> >>>>>>>> They don't find ASE installed.  Please check that you can import
>> >>>>>>>> ase from gpaw-python, e.g. gpaw-python -c "import ase"
>> >>>>>>>>
>> >>>>>>>> I then tried gpaw-python -c "import ase" and got:
>> >>>>>>>> Traceback (most recent call last):
>> >>>>>>>>  File "<string>", line 1, in <module>
>> >>>>>>>> ImportError: No module named ase
>> >>>>>>>>
>> >>>>>>>> Is there something wrong with my paths? I installed GPAW in
>> >>>>>>>> virtualenvs and used the .bash_profile below.
>> >>>>>>>>
>> >>>>>>>> Thanks,
>> >>>>>>>> Vardha.
>> >>>>>>>>
>> >>>>>>>> # Set architecture flags
>> >>>>>>>> export ARCHFLAGS="-arch x86_64"
>> >>>>>>>>
>> >>>>>>>> # personal installation of pip
>> >>>>>>>> export PATH=/Users/$USER/pip_latest:$PATH
>> >>>>>>>> export PYTHONPATH=/Users/$USER/pip_latest:$PYTHONPATH
>> >>>>>>>>
>> >>>>>>>> pyver=`python -c "from distutils import sysconfig; print sysconfig.get_python_version()"`
>> >>>>>>>>
>> >>>>>>>> # pip --user installations of packages
>> >>>>>>>> export PATH=/Users/$USER/Library/Python/${pyver}/bin:$PATH
>> >>>>>>>> export PYTHONPATH=/Users/$USER/Library/Python/${pyver}/lib/python/site-packages:$PYTHONPATH
>> >>>>>>>>
>> >>>>>>>> # homebrew
>> >>>>>>>> # Ensure user-installed binaries take precedence
>> >>>>>>>> export PATH=/usr/local/bin:$PATH
>> >>>>>>>> export PYTHONPATH=/usr/local/lib/python${pyver}/site-packages:$PYTHONPATH
>> >>>>>>>> # hack gtk-2.0
>> >>>>>>>> export PYTHONPATH=/usr/local/lib/python${pyver}/site-packages/gtk-2.0:$PYTHONPATH
>> >>>>>>>> # https://github.com/mxcl/homebrew/issues/16891
>> >>>>>>>> export PKG_CONFIG_PATH=`brew --prefix libffi`/lib/pkgconfig:$PKG_CONFIG_PATH
>> >>>>>>>> export PKG_CONFIG_PATH=/opt/X11/lib/pkgconfig:$PKG_CONFIG_PATH
>> >>>>>>>> export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH
>> >>>>>>>>
>> >>>>>>>> # virtualenv
>> >>>>>>>> # virtualenv should use Distribute instead of legacy setuptools
>> >>>>>>>> export VIRTUALENV_DISTRIBUTE=true
>> >>>>>>>> # Centralized location for new virtual environments
>> >>>>>>>> export PIP_VIRTUALENV_BASE=$HOME/Virtualenvs
>> >>>>>>>> # pip should only run if there is a virtualenv currently activated
>> >>>>>>>> export PIP_REQUIRE_VIRTUALENV=true
>> >>>>>>>> # cache pip-installed packages to avoid re-downloading
>> >>>>>>>> export PIP_DOWNLOAD_CACHE=$HOME/.pip/cache
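>> >>>>>>>>
>> >>>>>>>> One thing I could try: pointing gpaw-python at the virtualenv's
>> >>>>>>>> site-packages explicitly, in case its own interpreter is not picking
>> >>>>>>>> them up.  Something like this (the exact path is just a guess for my
>> >>>>>>>> setup):
>> >>>>>>>>
>> >>>>>>>>     export PYTHONPATH=$HOME/Virtualenvs/gpaw-trunk/lib/python2.7/site-packages:$PYTHONPATH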
>> >>>>>>>>
>> >>>>>>>> Then try within mpirun.
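>> >>>>>>>>
>> >>>>>>>> For example, something along these lines as a minimal sanity check
>> >>>>>>>> (not the full test suite):
>> >>>>>>>>
>> >>>>>>>>     mpirun -np 2 gpaw-python -c "import ase; print ase.__version__"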
>> >>>>>>>>
>> >>>>>>>> Best regards
>> >>>>>>>> Ask
>> >>>>>>>>
>> >>>>>>>>> Thanks,
>> >>>>>>>>> Vardha.
>> >>>>>>>>>
>> >>>>>>>
>> >>>>>>> --
>> >>>>>>> Jakob Schiøtz, professor, Ph.D.
>> >>>>>>> Department of Physics
>> >>>>>>> Technical University of Denmark
>> >>>>>>> DK-2800 Kongens Lyngby, Denmark
>> >>>>>>> http://www.fysik.dtu.dk/~schiotz/
>> >>>>>>>
>> >>>>>>>
>> >>>>>>>
>> >>>>>>
>> >>>>>
>> >>>>
>> >>>
>> >>>
>> >>>
>> >>
>> >
>> 
>> 
> 

