[gpaw-users] Fwd: can't using gpaw-python

謝其軒 z955018 at gmail.com
Wed Oct 23 13:27:38 CEST 2013


=================

 python2.7 -c "from numpy.linalg import lapack_lite; print
lapack_lite.__file__"
/home/z955018/Ultimatum/ALLINSTALLATIONS/numpy-1.7.0/lib/python2.7/site-packages/numpy/linalg/lapack_lite.so

================

 ldd `python2.7 -c "from numpy.linalg import lapack_lite; print lapack_lite.__file__"`
        linux-vdso.so.1 =>  (0x00007fffa2bfd000)
        liblapack.so.3 => /usr/lib64/atlas/liblapack.so.3
(0x00002b4d40420000)
        libblas.so.3 => /usr/lib64/atlas/libblas.so.3 (0x00002b4d40b37000)
        libgfortran.so.1 => /usr/lib64/libgfortran.so.1 (0x00002b4d41507000)
        libm.so.6 => /lib64/libm.so.6 (0x00002b4d4179f000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00002b4d41a22000)
        libc.so.6 => /lib64/libc.so.6 (0x00002b4d41c30000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)

===============

Does this imply that numpy has nothing to do with MKL?
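(A quick way to double-check - a sketch, assuming grep is available:
ldd `python2.7 -c "from numpy.linalg import lapack_lite; print lapack_lite.__file__"` | grep -i mkl
If that prints nothing, this numpy isn't pulling in MKL.)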

and:

==============


 ldd `which gpaw-python` | awk '{print $3}' | xargs -I f sh -c "echo f; ldd f"


sh: -c: line 0: syntax error near unexpected token `0x00007fff9a547000'
sh: -c: line 0: `echo (0x00007fff9a547000); ldd (0x00007fff9a547000)'
/opt/libxc/lib/libxc.so.1
        linux-vdso.so.1 =>  (0x00007fffc4dfd000)
        libgfortran.so.1 => /usr/lib64/libgfortran.so.1 (0x00002b5d205df000)
        libm.so.6 => /lib64/libm.so.6 (0x00002b5d20876000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00002b5d20af9000)
        libc.so.6 => /lib64/libc.so.6 (0x00002b5d20d08000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/usr/lib64/atlas/libblas.so.3
        linux-vdso.so.1 =>  (0x00007fff82282000)
        libgfortran.so.1 => /usr/lib64/libgfortran.so.1 (0x00002b13df9d4000)
        libc.so.6 => /lib64/libc.so.6 (0x00002b13dfc6b000)
        libm.so.6 => /lib64/libm.so.6 (0x00002b13dffc2000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/usr/lib64/atlas/liblapack.so.3
        linux-vdso.so.1 =>  (0x00007fff44ffd000)
        libblas.so.3 => /usr/lib64/atlas/libblas.so.3 (0x00002b9fb4c2f000)
        libgfortran.so.1 => /usr/lib64/libgfortran.so.1 (0x00002b9fb55ff000)
        libc.so.6 => /lib64/libc.so.6 (0x00002b9fb5896000)
        libm.so.6 => /lib64/libm.so.6 (0x00002b9fb5bee000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libpthread.so.0
        linux-vdso.so.1 =>  (0x00007fffd31fd000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libdl.so.2
        linux-vdso.so.1 =>  (0x00007fffe63fd000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libutil.so.1
        linux-vdso.so.1 =>  (0x00007fff88b3f000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libm.so.6
        linux-vdso.so.1 =>  (0x00007fff131fd000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/usr/lib64/librdmacm.so.1
        linux-vdso.so.1 =>  (0x00007fffa91bd000)
        libibverbs.so.1 => /usr/lib64/libibverbs.so.1 (0x0000003831e00000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003837800000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003837400000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/usr/lib64/libibverbs.so.1
        linux-vdso.so.1 =>  (0x00007fff6c5c1000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003837800000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003837400000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/opt/torque//lib/libtorque.so.2
        linux-vdso.so.1 =>  (0x00007fff57bfd000)
        libc.so.6 => /lib64/libc.so.6 (0x00002b6ba5ff8000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libnsl.so.1
        linux-vdso.so.1 =>  (0x00007fffc11fd000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libgcc_s.so.1
        linux-vdso.so.1 =>  (0x00007ffffed8a000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
/lib64/libc.so.6
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
        linux-vdso.so.1 =>  (0x00007fff0c3fd000)
/usr/lib64/libgfortran.so.1
        linux-vdso.so.1 =>  (0x00007fffae4cb000)
        libm.so.6 => /lib64/libm.so.6 (0x00002af62fbf1000)
        libc.so.6 => /lib64/libc.so.6 (0x00002af62fe74000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)

===================


It seems that I can't find any MKL libraries in there.
It's strange... well, there are no traces of MKL at all?
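(By the way, the two "syntax error" lines at the top come from the
linux-vdso.so.1 entry, which has no file path in the third ldd column; a
sketch of a filtered variant that skips it, assuming GNU awk:
ldd `which gpaw-python` | awk '/=> \//{print $3}' | xargs -I f sh -c "echo f; ldd f"
)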




BR,
chi-hsuan


2013/10/23 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>

>  On 10/23/2013 11:43 AM, 謝其軒 wrote:
>
> I can't import _dotblas; is there anything wrong with my numpy?
>
> no, it just means numpy does not use optimized blas.
> Actually that's a good thing for gpaw, because optimized blas in numpy
> often causes problems when run with gpaw-python,
> but we have another place to check in numpy:
> python -c "from numpy.linalg import lapack_lite; print lapack_lite.__file__"
> ldd `python -c "from numpy.linalg import lapack_lite; print lapack_lite.__file__"`
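> (to make any mkl dependency easy to spot, one can also filter - a sketch,
> assuming grep:
> ldd `python -c "from numpy.linalg import lapack_lite; print lapack_lite.__file__"` | grep -i mkl
> )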
>
>  and this is my configuration.log
>  ================================
> Current configuration
> libraries ['xc', 'blas', 'lapack']
> library_dirs ['/usr/lib64', '/opt/libxc/lib']
> include_dirs
> ['/home/z955018/Ultimatum/ALLINSTALLATIONS/numpy-1.7.0/lib/python2.7/site-packages/numpy/core/include',
> '/opt/libxc/include']
> define_macros [('NPY_NO_DEPRECATED_API', 7)]
> extra_link_args []
> extra_compile_args ['-Wall', '-std=c99']
> runtime_library_dirs []
> extra_objects []
>
>  Parallel configuration
> mpicompiler mpicc
> mpi_libraries []
> mpi_library_dirs []
> mpi_include_dirs []
> mpi_define_macros []
> mpi_runtime_library_dirs []
>  =====================================
>
>  I think the numpy I compiled is alright.
>
>  and:
>
>  which gpaw-python
>
>  ~/Ultimatum/ALLINSTALLATIONS/gpaw-0.9.1.10596/bin/gpaw-python
>
>  I think that's the right directory for my system.
>
>  should I "ignore numpy" when compiling gpaw?
>
> we need to find out what is linked to
> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so.
> If it's not numpy above, then I would go with ldd through all libraries
> gpaw-python links:
> ldd `which gpaw-python` | awk '{print $3}' | xargs -I f sh -c "echo f; ldd f"
> If it's numpy then you have two choices: correct the numpy build with mkl
> by linking the libiomp5.so library (which resides in the intel compiler
> directory) - this may be tricky:
> http://www.jjask.com/76860/installing-numpy-1-7-1-with-mkl-10-3-using-gcc-4-7-2-on-linux
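> As a quick diagnostic (an untested sketch - the exact libiomp5.so location
> is an assumption based on your composerxe install, so adjust the path),
> one can also preload the OpenMP runtime before running the tests:
> export LD_PRELOAD=/opt/intel/composerxe-2011.3.174/compiler/lib/intel64/libiomp5.so
> If the omp_get_num_procs errors then disappear, the missing threading
> runtime is confirmed as the problem.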
> or build numpy with its internal blas/lapack as suggested previously:
> http://listserv.fysik.dtu.dk/pipermail/gpaw-users/2013-October/002401.html
> Both cases will require manual patching of the numpy build system.
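> For the first choice, a minimal sketch of a numpy site.cfg (the [mkl]
> section is standard numpy build configuration, but the paths and library
> list below are assumptions based on your install - verify them):
>
> [mkl]
> library_dirs = /opt/intel/composerxe-2011.3.174/mkl/lib/intel64
> include_dirs = /opt/intel/composerxe-2011.3.174/mkl/include
> mkl_libs = mkl_intel_lp64, mkl_sequential, mkl_core
> lapack_libs =
>
> Using the sequential mkl libraries there avoids pulling in
> libmkl_intel_thread.so at all.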
>
> Let's continue on the list because we are probably getting closer to
> finding the problem.
>
> Marcin
>
>
>  BR,
> chi-hsuan
>
>
>
> 2013/10/23 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>
>>  On 10/23/2013 11:12 AM, 謝其軒 wrote:
>>
>> sorry for replying in such a rush,
>> this is my situation now:
>> ====================
>>   ldd `which gpaw-python`
>>         linux-vdso.so.1 =>  (0x00007fff14bfd000)
>>         libxc.so.1 => /opt/libxc/lib/libxc.so.1 (0x00002ae69ce83000)
>>         libblas.so.3 => /usr/lib64/atlas/libblas.so.3 (0x00002ae69d131000)
>>         liblapack.so.3 => /usr/lib64/atlas/liblapack.so.3
>> (0x00002ae69db01000)
>>         libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003837800000)
>>         libdl.so.2 => /lib64/libdl.so.2 (0x0000003837400000)
>>         libutil.so.1 => /lib64/libutil.so.1 (0x0000003846600000)
>>         libm.so.6 => /lib64/libm.so.6 (0x00000037c7000000)
>>         librdmacm.so.1 => /usr/lib64/librdmacm.so.1 (0x0000003832200000)
>>         libibverbs.so.1 => /usr/lib64/libibverbs.so.1 (0x0000003831e00000)
>>         libtorque.so.2 => /opt/torque//lib/libtorque.so.2
>> (0x00002ae69e21a000)
>>         libnsl.so.1 => /lib64/libnsl.so.1 (0x000000383ee00000)
>>         libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003848a00000)
>>         libc.so.6 => /lib64/libc.so.6 (0x0000003836800000)
>>         libgfortran.so.1 => /usr/lib64/libgfortran.so.1
>> (0x00002ae69e51d000)
>>         /lib64/ld-linux-x86-64.so.2 (0x0000003836400000)
>>
>>  I don't see any reference to mkl here - this means that even if
>> gpaw-python was built with your customize.py below, it is not the one
>> being used here. Check which one is used with:
>> which gpaw-python
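>> (a variant that lists every gpaw-python found on PATH, in case several
>> builds are installed - a sketch, assuming a bash shell:
>> type -a gpaw-python
>> )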
>>
>>  =================
>> and still
>>
>> gpaw-python `which gpaw-test` 2>&1 | tee test.log
>>
>>  python 2.7.5 GCC 4.1.2 20080704 (Red Hat 4.1.2-52) 64bit ELF on Linux
>> x86_64 redhat 5.8 Final
>> Running tests in /tmp/gpaw-test-xnaRic
>> Jobs: 1, Cores: 4, debug-mode: False
>>
>> =============================================================================
>> gemm_complex.py                         0.044  OK
>> mpicomm.py                              0.009  OK
>> ase3k_version.py                        0.007  OK
>> numpy_core_multiarray_dot.py            0.007  OK
>> eigh.py                                 0.009  OK
>> lapack.py                               0.009  OK
>> dot.py                                  0.010  OK
>> lxc_fxc.py                              0.009  OK
>> blas.py                                 0.010  OK
>> erf.py                                  0.008  OK
>> gp2.py
>>
>>  .................
>>
>>  er.py                              0.058  OK
>> pw/interpol.py                          0.011  OK
>> poisson.py                              0.062  OK
>> pw/lfc.py                          gpaw-python: symbol lookup error:
>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>> undefined symbol: omp_get_num_procs
>> gpaw-python: symbol lookup error:
>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>> undefined symbol: omp_get_num_procs
>> gpaw-python: symbol lookup error:
>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>> undefined symbol: omp_get_num_procs
>>
>>  something is referencing libmkl_intel_thread.so - maybe numpy, or maybe
>> this time a different gpaw-python is being run?
>> Check that with:
>> python -c "from numpy.core import _dotblas; print _dotblas.__file__"
>> ldd `python -c "from numpy.core import _dotblas; print _dotblas.__file__"`
>> Is it the numpy you intend to use?
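>> (a quick way to confirm which numpy gets picked up - a sketch;
>> numpy.__file__ and numpy.__version__ are standard attributes:
>> python -c "import numpy; print numpy.__file__; print numpy.__version__"
>> )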
>>
>>
>> --------------------------------------------------------------------------
>> mpirun has exited due to process rank 2 with PID 18057 on
>> node breadserver.physics.ntu exiting without calling "finalize". This may
>> have caused other processes in the application to be
>> terminated by signals sent by mpirun (as reported here).
>> --------------------------------------------------------------------------
>>  (things went back to the old problem)
>> ====================================================
>>
>>  and I set
>>
>> export MKL_THREADING_LAYER=MKL_THREADING_SEQUENTIAL
>>
>> export OMP_NUM_THREADS=1
>>
>>  in my ~/.bashrc
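>>  one note: I believe the documented values of that variable are spelled
>> differently, e.g.
>> export MKL_THREADING_LAYER=SEQUENTIAL
>> (not MKL_THREADING_SEQUENTIAL) - worth double-checking in the MKL manual
>> that the value you set actually takes effect.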
>>
>>  my customize.py is:
>>
>>  compiler = 'gcc'
>> libraries = ['mkl_intel_lp64' ,'mkl_sequential' ,'mkl_core',
>>              'mkl_lapack95_lp64',
>>              'mkl_scalapack_lp64', 'mkl_blacs_intelmpi_lp64',
>>              'pthread']
>>  define_macros += [('GPAW_NO_UNDERSCORE_CBLACS', '1')]
>>  define_macros += [('GPAW_NO_UNDERSCORE_CSCALAPACK', '1')]
>> define_macros += [("GPAW_ASYNC",1)]
>> define_macros += [("GPAW_MPI2",1)]
>> library_dirs += ['/opt/intel/composerxe-2011.3.174/mkl/lib/intel64']
>>
>>
>>  mpicompiler = 'mpicc'
>> mpilinker = mpicompiler
>>
>>  libraries += ['xc']
>>  include_dirs += ['/opt/libxc/include']
>> library_dirs += ['/opt/libxc/lib']
>>  ====================================================
>>
>>  the customize.py looks more or less OK. Let's keep it for now.
>>
>> Marcin
>>
>>
>>  which is nearly the same (at least I think so) as what you gave me:
>> https://wiki.fysik.dtu.dk/gpaw/install/Linux/SUNCAT/SUNCAT.html#mkl-10-2-notes
>>
>>  Is there anything I can do?
>>
>>  BR.
>> chi-hsuan
>>
>>
>>
>>
>>
>>
>>
>>  2013/10/23 謝其軒 <z955018 at gmail.com>
>>
>>> yes, and I relinked again, and here's the error message from compiling:
>>> =================
>>> ...............
>>>  /home/z955018/Ultimatum/ALLINSTALLATIONS/libxc-2.0.1/lib/libxc.a(libxc_):
>>> In function `xc_rho2dzeta':
>>> util.c:(.text+0x0): multiple definition of `xc_rho2dzeta'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o:/home/z955018/Ultimatu9.1.10596/c/xc/revtpss_c_pbe.c:43:
>>> first defined here
>>> /usr/bin/ld: Warning: size of symbol `xc_rho2dzeta' changed from 139
>>> inmp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o to 80 in
>>> /home/z955018/UltimatTALLATIONS/libxc-2.0.1/lib/libxc.a(libxc_la-util.o)
>>> collect2: ld returned 1 exit status
>>> error: command 'gcc' failed with exit status 1
>>>  ==================
>>>
>>>  did I miss something?
>>>
>>>
>>>  BR,
>>> chi-hsuan
>>>
>>>
>>>
>>> 2013/10/23 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>>
>>>>  Hi,
>>>>
>>>> yes, it looks like you are not linking to libxc; please see the
>>>> instructions here:
>>>>
>>>> https://wiki.fysik.dtu.dk/gpaw/install/installationguide.html#libxc-installation
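>>>>
>>>> For reference, the relevant customize.py lines would be something like
>>>> this (a sketch - adjust the /opt/libxc prefix to wherever libxc is
>>>> installed):
>>>> libraries += ['xc']
>>>> include_dirs += ['/opt/libxc/include']
>>>> library_dirs += ['/opt/libxc/lib']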
>>>>
>>>> Best regards,
>>>>
>>>> Marcin
>>>>
>>>>
>>>> On 10/23/2013 05:16 AM, 謝其軒 wrote:
>>>>
>>>>  Dear Marcin,
>>>> It seems that the problem results from LIBXC
>>>> the build log is here:
>>>> =====================================
>>>>  build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `c_m06l_para':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:329: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:339: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:373: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `m06l_end':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:734: undefined
>>>> reference to `xc_func_end'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:737: undefined
>>>> reference to `xc_func_end'
>>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `m06l_init':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:725: undefined
>>>> reference to `xc_func_init'
>>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `x_m06l_para':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:657: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `m06l_init':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/m06l.c:728: undefined
>>>> reference to `xc_func_init'
>>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `c_tpss_12':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:293: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:309: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:326: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:293: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:309: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o:/home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:326:
>>>> more undefined references to `xc_gga_exc_vxc' follow
>>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `x_tpss_para':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:126: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `tpss_end':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:545: undefined
>>>> reference to `xc_func_end'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:548: undefined
>>>> reference to `xc_func_end'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:549: undefined
>>>> reference to `xc_func_end'
>>>>  build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `tpss_init':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:535: undefined
>>>> reference to `xc_func_init'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:539: undefined
>>>> reference to `xc_func_init'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/tpss.c:540: undefined
>>>> reference to `xc_func_init'
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o: In function
>>>> `xc_perdew_params':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss_c_pbe.c:65:
>>>> undefined reference to `xc_lda_exc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss_c_pbe.c:68:
>>>> undefined reference to `xc_lda_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss_c_pbe.c:71:
>>>> undefined reference to `xc_lda'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `get_point':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:69: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:65: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function
>>>> `lxcXCFunctional_dealloc':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:194: undefined
>>>> reference to `xc_func_end'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:194: undefined
>>>> reference to `xc_func_end'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `lxcXCFuncNum':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:907: undefined
>>>> reference to `xc_functional_get_number'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function
>>>> `NewlxcXCFunctionalObject':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:852: undefined
>>>> reference to `xc_family_from_id'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:854: undefined
>>>> reference to `xc_func_init'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:866: undefined
>>>> reference to `xc_family_from_id'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:868: undefined
>>>> reference to `xc_func_init'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:861: undefined
>>>> reference to `xc_family_from_id'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:863: undefined
>>>> reference to `xc_func_init'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `get_fxc_fd_lda':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:38: undefined
>>>> reference to `xc_lda_fxc_fd'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function
>>>> `lxcXCFunctional_Calculate':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:642: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:647: undefined
>>>> reference to `xc_gga_exc_vxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:653: undefined
>>>> reference to `xc_mgga_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function
>>>> `lxcXCFunctional_CalculateFXC':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:758: undefined
>>>> reference to `xc_lda_fxc'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/libxc.c:763: undefined
>>>> reference to `xc_gga_fxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_end':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:554: undefined
>>>> reference to `xc_func_end'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:557: undefined
>>>> reference to `xc_func_end'
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_init':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:546: undefined
>>>> reference to `xc_func_init'
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:548: undefined
>>>> reference to `xc_func_init'
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function
>>>> `x_revtpss_para':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:450: undefined
>>>> reference to `xc_lda_exc_vxc'
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_end':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:558: undefined
>>>> reference to `xc_func_end'
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_init':
>>>> /home/z955018/Ultimatum/gpaw-0.9.1.10596/c/xc/revtpss.c:549: undefined
>>>> reference to `xc_func_init'
>>>> ld/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o
>>>> build/temp.linux-x86_64-2.7/c/xc/ensemble_gga.o
>>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o
>>>> build/temp.linux-x86_64-2.7/c/xc/pw91.o
>>>> build/temp.linux-x86_64-2.7/c/xc/rpbe.o
>>>> build/temp.linux-x86_64-2.7/c/xc/xc_mgga.o
>>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o
>>>>  -L/opt/intel/composerxe-2011.3.174/mkl/lib/intel64
>>>> -L/home/z955018/Ultimatum/ALLINSTALLATIONS/libxc-2.0.1/lib
>>>> -L/home/z955018/Ultimatum/ALLINSTALLATIONS/Python-2.7.5/lib/python2.7/config
>>>> -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lmkl_lapack95_lp64
>>>> -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lpthread -lpython2.7
>>>> -lpthread -ldl  -lutil -lm   -Xlinker -export-dynamic
>>>>
>>>>  * Using standard lapack
>>>> * Compiling gpaw with gcc
>>>> * Architecture: linux-x86_64
>>>> * Building a custom interpreter
>>>> * linking FAILED!  Only serial version of code will work.
>>>>  =======================================================
>>>>
>>>>  what is this?
>>>>
>>>>  BR,
>>>> chi-hsuan
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> 2013/10/22 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>>>
>>>>>   On 10/22/2013 01:35 PM, 謝其軒 wrote:
>>>>>
>>>>> Dear Marcin,
>>>>> Although I think I realize what the problem is, I still don't know how
>>>>> to solve it.
>>>>> Could you walk me through it step by step?
>>>>>
>>>>>  make sure it's this gpaw-python that is linked to
>>>>> libmkl_intel_thread.so:
>>>>> ldd `which gpaw-python`
>>>>> and in order to get rid of that library please try to build a new
>>>>> gpaw-python using the customize.py
>>>>>
>>>>> https://wiki.fysik.dtu.dk/gpaw/install/Linux/SUNCAT/SUNCAT.html#mkl-10-2-notes
>>>>> (or something very similar to it),
>>>>> without reference to threading libraries, as described at
>>>>> https://wiki.fysik.dtu.dk/gpaw/devel/developer_installation.html#developer-installation
>>>>> cd gpaw
>>>>> rm -rf build
>>>>> ....
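>>>>> where the remaining steps would be something like this (a sketch based
>>>>> on the developer-installation page; the log filename is just a
>>>>> suggestion):
>>>>> python setup.py build_ext 2>&1 | tee build_ext.log
>>>>> then re-run ldd `which gpaw-python` and check that no mkl_intel_thread
>>>>> reference remains.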
>>>>>
>>>>> Best regards,
>>>>>
>>>>> Marcin
>>>>>
>>>>>
>>>>>  BR,
>>>>>
>>>>>  chi-hsuan
>>>>>
>>>>>
>>>>> 2013/10/19 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>>>>
>>>>>>   Hi,
>>>>>>
>>>>>>
>>>>>> On 10/18/2013 11:28 AM, 謝其軒 wrote:
>>>>>>
>>>>>> Dear everyone,
>>>>>> I am one step closer to compiling the parallel version!
>>>>>> here's my question.
>>>>>> when I start the test :
>>>>>> ----
>>>>>> mpiexec -np 4 gpaw-python `which gpaw-test` 2>&1 | tee testgpaw.log
>>>>>> ----
>>>>>> and here's the file : testgpaw.log
>>>>>>
>>>>>>  you can find my suggestion here:
>>>>>> http://listserv.fysik.dtu.dk/pipermail/gpaw-users/2013-October/002408.html
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Marcin
>>>>>>
>>>>>>   ----
>>>>>>
>>>>>> --------------------------------------------------------------------------
>>>>>>  An MPI process has executed an operation involving a call to the
>>>>>> "fork()" system call to create a child process.  Open MPI is currently
>>>>>> operating in a condition that could result in memory corruption or
>>>>>> other system errors; your MPI job may hang, crash, or produce silent
>>>>>> data corruption.  The use of fork() (or system() or other calls that
>>>>>> create child processes) is strongly discouraged.
>>>>>>
>>>>>>  The process that invoked fork was:
>>>>>>
>>>>>>    Local host:          node04 (PID 26324)
>>>>>>   MPI_COMM_WORLD rank: 0
>>>>>>
>>>>>>  If you are *absolutely sure* that your application will successfully
>>>>>> and correctly survive a call to fork(), you may disable this warning
>>>>>> by setting the mpi_warn_on_fork MCA parameter to 0.
>>>>>>
>>>>>> --------------------------------------------------------------------------
>>>>>> python 2.7.5 GCC 4.1.2 20080704 (Red Hat 4.1.2-52) 64bit ELF on Linux
>>>>>> x86_64 redhat 5.7 Final
>>>>>> Running tests in /tmp/gpaw-test-4xzJUA
>>>>>> Jobs: 1, Cores: 4, debug-mode: False
>>>>>>
>>>>>> =============================================================================
>>>>>> gemm_complex.py                         0.011  OK
>>>>>> mpicomm.py                              0.011  OK
>>>>>> ase3k_version.py                        0.009  OK
>>>>>> numpy_core_multiarray_dot.py            0.008  OK
>>>>>> eigh.py                                 0.011  OK
>>>>>> lapack.py                               0.012  OK
>>>>>> dot.py                                  0.012  OK
>>>>>> lxc_fxc.py                              0.011  OK
>>>>>> blas.py                                 0.013  OK
>>>>>> erf.py                                  0.011  OK
>>>>>> gp2.py                                  0.014  OK
>>>>>> kptpar.py                               4.205  OK
>>>>>> non_periodic.py                         0.040  OK
>>>>>> parallel/blacsdist.py                   0.014  OK
>>>>>> gradient.py                             0.017  OK
>>>>>> cg2.py                                  0.022  OK
>>>>>> kpt.py                                  0.028  OK
>>>>>> lf.py                                   0.032  OK
>>>>>> gd.py                                   0.013  OK
>>>>>> parallel/compare.py                     0.046  OK
>>>>>> pbe_pw91.py                             0.012  OK
>>>>>> fsbt.py                                 0.014  OK
>>>>>> derivatives.py                          0.018  OK
>>>>>> Gauss.py                                0.024  OK
>>>>>> second_derivative.py                    0.022  OK
>>>>>> integral4.py                            0.038  OK
>>>>>> parallel/ut_parallel.py            [node04:26319] 3 more processes
>>>>>> have sent help message help-mpi-runtime.txt / mpi_init:warn-fork
>>>>>> [node04:26319] Set MCA parameter "orte_base_help_aggregate" to 0 to
>>>>>> see all help / error messages
>>>>>>      1.080  OK
>>>>>> transformations.py                      0.023  OK
>>>>>> parallel/parallel_eigh.py               0.010  OK
>>>>>> spectrum.py                             0.080  OK
>>>>>> xc.py                                   0.047  OK
>>>>>> zher.py                                 0.046  OK
>>>>>> pbc.py                                  0.041  OK
>>>>>> lebedev.py                              0.030  OK
>>>>>> parallel/ut_hsblacs.py                  0.065  OK
>>>>>> occupations.py                          0.070  OK
>>>>>> dump_chi0.py                            0.046  OK
>>>>>> cluster.py                              0.128  OK
>>>>>> pw/interpol.py                          0.013  OK
>>>>>> poisson.py                              0.054  OK
>>>>>> pw/lfc.py                          gpaw-python: symbol lookup error:
>>>>>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>>>>> undefined symbol: omp_get_num_procs
>>>>>> gpaw-python: symbol lookup error:
>>>>>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>>>>> undefined symbol: omp_get_num_procs
>>>>>> gpaw-python: symbol lookup error:
>>>>>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>>>>> undefined symbol: omp_get_num_procs
>>>>>> gpaw-python: symbol lookup error:
>>>>>> /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>>>>> undefined symbol: omp_get_num_procs
>>>>>>
>>>>>> --------------------------------------------------------------------------
>>>>>> mpiexec has exited due to process rank 0 with PID 26324 on
>>>>>> node node04 exiting without calling "finalize". This may
>>>>>> have caused other processes in the application to be
>>>>>> terminated by signals sent by mpiexec (as reported here).
>>>>>>
>>>>>> --------------------------------------------------------------------------
>>>>>>
>>>>>>  what happened?
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> 2013/10/11 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>>>>>
>>>>>>>    On 10/11/2013 12:13 PM, 謝其軒 wrote:
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ---------- Forwarded message ----------
>>>>>>> From: 謝 其軒 <z955018 at gmail.com>
>>>>>>> Date: 2013/10/11
>>>>>>> Subject: Re: [gpaw-users] can't using gpaw-python
>>>>>>> To: Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>>>>>>
>>>>>>>
>>>>>>>  Hi again,
>>>>>>> Is Mr. Marcin there?
>>>>>>> I've tried to compile all the packages myself and failed.
>>>>>>>  I restored the system to the original one, in which numpy, scipy and
>>>>>>> the others belong to Canopy.
>>>>>>> And of course, the error remains. I suspect that the numpy gpaw-python
>>>>>>> uses is not the one in Canopy but the one in root's directory.
>>>>>>> But how can I change that, given that the serial version works fine?
>>>>>>>
>>>>>>>    please attach the customize.py and all the steps you perform.
>>>>>>> After your last email it looked like we were close to getting a
>>>>>>> working gpaw-python;
>>>>>>> you just need to replace the intel threading libraries as described
>>>>>>> in the link I sent.
>>>>>>>
>>>>>>> Best regards,
>>>>>>>
>>>>>>> Marcin
>>>>>>>
>>>>>>>
>>>>>>
>>
>
>
> --
> ***********************************
>
> Marcin Dulak
> Technical University of Denmark
> Department of Physics
> Building 307, Room 229
> DK-2800 Kongens Lyngby
> Denmark
> Tel.: (+45) 4525 3157
> Fax.: (+45) 4593 2399
> email: Marcin.Dulak at fysik.dtu.dk
>
> ***********************************
>
>

