[gpaw-users] can't use gpaw-python
謝其軒
z955018 at gmail.com
Thu Oct 10 05:42:18 CEST 2013
Hi,
Marcin
This is the output when compiling gpaw-python:
--
mpicc -o build/bin.linux-x86_64-2.7//gpaw-python
build/temp.linux-x86_64-2.7/c/plane_wave.o
build/temp.linux-x86_64-2.7/c/point_charges.o
build/temp.linux-x86_64-2.7/c/wigner_seitz.o
build/temp.linux-x86_64-2.7/c/lapack.o
build/temp.linux-x86_64-2.7/c/spline.o build/temp.linux-x86_64-2.7/c/lfc2.o
build/temp.linux-x86_64-2.7/c/hdf5.o build/temp.linux-x86_64-2.7/c/mpi.o
build/temp.linux-x86_64-2.7/c/symmetry.o
build/temp.linux-x86_64-2.7/c/plt.o build/temp.linux-x86_64-2.7/c/fftw.o
build/temp.linux-x86_64-2.7/c/_gpaw.o
build/temp.linux-x86_64-2.7/c/transformers.o
build/temp.linux-x86_64-2.7/c/operators.o
build/temp.linux-x86_64-2.7/c/bc.o build/temp.linux-x86_64-2.7/c/cerf.o
build/temp.linux-x86_64-2.7/c/blacs.o
build/temp.linux-x86_64-2.7/c/localized_functions.o
build/temp.linux-x86_64-2.7/c/lfc.o build/temp.linux-x86_64-2.7/c/lcao.o
build/temp.linux-x86_64-2.7/c/mlsqr.o
build/temp.linux-x86_64-2.7/c/utilities.o
build/temp.linux-x86_64-2.7/c/blas.o
build/temp.linux-x86_64-2.7/c/bmgs/bmgs.o
build/temp.linux-x86_64-2.7/c/xc/pbe.o
build/temp.linux-x86_64-2.7/c/xc/m06l.o
build/temp.linux-x86_64-2.7/c/xc/vdw.o
build/temp.linux-x86_64-2.7/c/xc/xc.o
build/temp.linux-x86_64-2.7/c/xc/tpss.o
build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o
build/temp.linux-x86_64-2.7/c/xc/ensemble_gga.o
build/temp.linux-x86_64-2.7/c/xc/libxc.o
build/temp.linux-x86_64-2.7/c/xc/pw91.o
build/temp.linux-x86_64-2.7/c/xc/rpbe.o
build/temp.linux-x86_64-2.7/c/xc/xc_mgga.o
build/temp.linux-x86_64-2.7/c/xc/revtpss.o -L/opt/libxc/lib
-L/home/z955018/Canopy/appdata/canopy-1.0.3.1262.rh5-x86_64/lib/python2.7/config
-lxc -lblas -llapack -lscalapack -lpython2.7 -lpthread -ldl -lutil -lm
-Wl,-rpath,'$ORIGIN/../lib' -g -L/home/z955018/Canopy/app
ld: cannot find -lscalapack
-------
* Using standard lapack
* Compiling gpaw with mpicc
* Compiling with ScaLapack
* Architecture: linux-x86_64
* Building a custom interpreter
* linking FAILED! Only serial version of code will work.
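The `ld: cannot find -lscalapack` line is the actual failure: the linker was given `-lscalapack` but no `-L` directory that contains the ScaLAPACK library. In the customize.py that configures the GPAW build, one would typically either point library_dirs at ScaLAPACK or disable it. A sketch (the path below is a placeholder, not a real install location):

```python
# customize.py sketch -- scalapack, libraries and library_dirs are
# variables GPAW's setup script reads; adjust the path to wherever
# libscalapack.so actually lives on your cluster.
scalapack = True
library_dirs += ['/path/to/scalapack/lib']   # must contain libscalapack.so/.a
libraries += ['scalapack']

# ...or, if ScaLAPACK is simply not available on this machine:
# scalapack = False
```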
I can't find the "numpy/core/include" directory you mentioned. Where is it?
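For reference, numpy can report this header directory itself; run this with the same python you build GPAW against, and the `-I` flag printed during the build should point at the same path:

```python
# Print the directory containing numpy's C headers (numpy/core/include
# inside the numpy installation).
import numpy

print(numpy.get_include())
```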
chi-hsuan
2013/10/9 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
> Hi,
>
> please continue CC to the gpaw-users list.
>
>
> On 10/09/2013 05:08 AM, 謝其軒 wrote:
>
> Marcin,
> Yes, that's the OS I use.
> It may take me a few days to figure out what's going on.
> By the way, how do you relink numpy when installing gpaw if an old
> numpy version is already installed system-wide?
>
> One should run gpaw-python with the numpy version that is used during
> gpaw-python build - during the compilation you will see lines like
> -I.../numpy/core/include
> Mixing numpy versions will result in crashes or wrong results.
>
> I'm guessing your gpaw-python is built with python 2.4 and numpy with
> python 2.7.
> It's sufficient to have your Enthought environment fully set up before
> starting the GPAW compilation:
> both python and numpy must belong to Enthought (verify that on the command
> line).
> You will need a fresh build of gpaw-python, the best way to start is to
> build C-extensions only:
>
> https://wiki.fysik.dtu.dk/gpaw/devel/developer_installation.html#developer-installation
> Only after gpaw-test passes one can consider doing python setup.py install
>
> Best regards,
>
> Marcin
>
>
> sincerely,
> chi-hsuan
>
>
> 2013/10/8 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>
>> Please continue on the list - this will be useful for other users.
>>
>>
>> On 10/08/2013 03:21 AM, 謝其軒 wrote:
>>
>> Hi,
>> I'm on a CentOS cluster,
>> using openmpi 1.4.4, numpy 1.7.0, scipy 0.12.0.
>> The error appears when I simply start gpaw-python on the command line;
>> it is not related to any other modules.
>> The error:
>> ImportError: numpy.core.multiarray failed to import
>>
>>
>>
>> ----------------------------------
>> python -c "import numpy; print numpy.__config__.show(); print
>> numpy.__version__"
>>
>> lapack_opt_info:
>> libraries = ['mkl_lapack95_lp64', 'mkl_gf_lp64', 'mkl_intel_thread',
>> 'mkl_core', 'iomp5', 'pthread']
>> library_dirs = ['/home/vagrant/src/master-env/lib']
>> define_macros = [('SCIPY_MKL_H', None)]
>> include_dirs = ['/home/vagrant/src/master-env/include']
>> blas_opt_info:
>> libraries = ['mkl_gf_lp64', 'mkl_intel_thread', 'mkl_core', 'iomp5',
>> 'pthread']
>> library_dirs = ['/home/vagrant/src/master-env/lib']
>> define_macros = [('SCIPY_MKL_H', None)]
>> include_dirs = ['/home/vagrant/src/master-env/include']
>> lapack_mkl_info:
>> libraries = ['mkl_lapack95_lp64', 'mkl_gf_lp64', 'mkl_intel_thread',
>> 'mkl_core', 'iomp5', 'pthread']
>> library_dirs = ['/home/vagrant/src/master-env/lib']
>> define_macros = [('SCIPY_MKL_H', None)]
>> include_dirs = ['/home/vagrant/src/master-env/include']
>> blas_mkl_info:
>> libraries = ['mkl_gf_lp64', 'mkl_intel_thread', 'mkl_core', 'iomp5',
>> 'pthread']
>> library_dirs = ['/home/vagrant/src/master-env/lib']
>> define_macros = [('SCIPY_MKL_H', None)]
>> include_dirs = ['/home/vagrant/src/master-env/include']
>> mkl_info:
>> libraries = ['mkl_gf_lp64', 'mkl_intel_thread', 'mkl_core', 'iomp5',
>> 'pthread']
>> library_dirs = ['/home/vagrant/src/master-env/lib']
>> define_macros = [('SCIPY_MKL_H', None)]
>> include_dirs = ['/home/vagrant/src/master-env/include']
>> None
>> 1.7.1
>>
>>
>>
>> -------------------------------
>>
>>
>> ldd `python -c "from numpy.core import multiarray; print
>> multiarray.__file__"`
>> linux-vdso.so.1 => (0x00007ffffd5fd000)
>> libm.so.6 => /lib64/libm.so.6 (0x00002af51a2dc000)
>> libpython2.7.so.1.0 =>
>> /home/z955018/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/numpy/core/../../../../../lib/libpython2.7.so.1.0
>> (0x00002af51a55f000)
>>
>> this is the python used by numpy - you need to make sure that you build
>> gpaw-python using the same python version.
>>
>> libpthread.so.0 => /lib64/libpthread.so.0 (0x00002af51a90e000)
>> libc.so.6 => /lib64/libc.so.6 (0x00002af51ab2a000)
>> libdl.so.2 => /lib64/libdl.so.2 (0x00002af51ae81000)
>> libutil.so.1 => /lib64/libutil.so.1 (0x00002af51b086000)
>> /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
>>
>>
>> ----------------------------
>>
>> ldd `which gpaw-python`
>> linux-vdso.so.1 => (0x00007fff325d1000)
>> libxc.so.1 => /opt/etsf/lib/libxc.so.1 (0x00002ab9fb1e7000)
>> libblas.so.3 => /usr/lib64/atlas/libblas.so.3 (0x00002ab9fb495000)
>> liblapack.so.3 => /usr/lib64/atlas/liblapack.so.3 (0x00002ab9fbe65000)
>> libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003837800000)
>> libdl.so.2 => /lib64/libdl.so.2 (0x0000003837400000)
>> libutil.so.1 => /lib64/libutil.so.1 (0x0000003846600000)
>> libm.so.6 => /lib64/libm.so.6 (0x00000037c7000000)
>> librdmacm.so.1 => /usr/lib64/librdmacm.so.1 (0x0000003832200000)
>> libibverbs.so.1 => /usr/lib64/libibverbs.so.1 (0x0000003831e00000)
>> libtorque.so.2 => /opt/torque//lib/libtorque.so.2 (0x00002ab9fc57e000)
>> libnsl.so.1 => /lib64/libnsl.so.1 (0x000000383ee00000)
>> libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003848a00000)
>> libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
>> libgfortran.so.1 => /usr/lib64/libgfortran.so.1 (0x00002ab9fc881000)
>> /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
>> ------------------------------
>>
>> I don't see here libpython2.7.so.1.0 =>
>> /home/z955018/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/numpy/core/../../../../../lib/libpython2.7.so.1.0
>> (0x00002af51a55f000)
>> so I'm guessing gpaw-python is built against the default CentOS 5 python
>> 2.4 (is that the system you run?).
>>
>> A numpy built against MKL is a fast and good solution for standalone
>> numpy runs, but it will most likely fail when used with gpaw-python or
>> other custom python interpreters that link their own blas/lapack.
>> This can be serious: I have seen cases where it does not crash but gives
>> wrong numbers instead.
>> For the moment let's keep numpy/mkl, but if running gpaw-test fails, try
>> to build numpy with its internal blas/lapack as described here:
>> https://wiki.fysik.dtu.dk/gpaw/install/Linux/sun_chpc.html
>>
>> When you have gpaw-python starting,
>>
>> export OMP_NUM_THREADS=1
>>
>> and run the full tests in parallel (on 1, 2, 4, 8 cores):
>> mpiexec gpaw-python `which gpaw-test`
>>
>> Remember to use the version of setups that corresponds to the gpaw
>> version installed:
>> http://listserv.fysik.dtu.dk/pipermail/gpaw-users/2013-October/002398.html
>> gpaw-test may show problems with numpy/mkl,
>> but even if all tests pass you may still expect problems in some cases.
>>
>> Best regards,
>>
>> Marcin
>>
>> I use numpy and scipy from the Canopy package; the funny thing is that
>> when I switch the package to Anaconda, the error message changes.
>>
> --
> ***********************************
>
> Marcin Dulak
> Technical University of Denmark
> Department of Physics
> Building 307, Room 229
> DK-2800 Kongens Lyngby
> Denmark
> Tel.: (+45) 4525 3157
> Fax.: (+45) 4593 2399
> email: Marcin.Dulak at fysik.dtu.dk
>
> ***********************************
>
>