[gpaw-users] Fwd: can't using gpaw-python

Marcin Dulak Marcin.Dulak at fysik.dtu.dk
Tue Oct 22 14:46:36 CEST 2013


On 10/22/2013 01:35 PM, 謝其軒 wrote:
> Dear Marcin,
> Although I think I see what the problem is, I still don't know how to
> solve it.
> Could you demonstrate the solution step by step?
make sure it really is your gpaw-python that is linked to libmkl_intel_thread.so:
ldd `which gpaw-python`
In order to get rid of that library, please try to build a new gpaw-python
using the customize.py from
https://wiki.fysik.dtu.dk/gpaw/install/Linux/SUNCAT/SUNCAT.html#mkl-10-2-notes
(or something very similar to it), without any reference to the threading
libraries (see the sketch below), following the developer installation described at
https://wiki.fysik.dtu.dk/gpaw/devel/developer_installation.html#developer-installation
cd gpaw
rm -rf build
....
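
Something along these lines in customize.py should be enough - a minimal
sketch only, assuming the variable names of the stock customize.py and
taking the MKL path from the error message in your log; adjust it to your
installation:

----
# customize.py - MKL-related part only (sketch, not tested on your machine)
mkl_dir = '/opt/intel/composerxe-2011.3.174/mkl/lib/intel64'

# link the sequential (non-threaded) MKL layer instead of
# mkl_intel_thread, so that no OpenMP runtime (omp_get_num_procs)
# is needed at run time; MKL itself provides BLAS and LAPACK
libraries = ['mkl_intel_lp64', 'mkl_sequential', 'mkl_core']
library_dirs = [mkl_dir]
----

Keep that MKL directory in LD_LIBRARY_PATH when running, and after the
rebuild check again with ldd `which gpaw-python` that libmkl_sequential.so
shows up and libmkl_intel_thread.so is gone.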

Best regards,

Marcin
>
> BR,
>
> chi-hsuan
>
>
> 2013/10/19 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>
>     Hi,
>
>
>     On 10/18/2013 11:28 AM, 謝其軒 wrote:
>>     Dear everyone,
>>     I am one step closer to compiling the parallel version!
>>     Here is my question.
>>     When I start the test:
>>     ----
>>     mpiexec -np 4 gpaw-python `which gpaw-test` 2>&1 | tee testgpaw.log
>>     ----
>>     and here's the file: testgpaw.log
>     you can find my suggestion here:
>     http://listserv.fysik.dtu.dk/pipermail/gpaw-users/2013-October/002408.html
>
>     Best regards,
>
>     Marcin
>
>>     ----
>>     --------------------------------------------------------------------------
>>     An MPI process has executed an operation involving a call to the
>>     "fork()" system call to create a child process.  Open MPI is
>>     currently
>>     operating in a condition that could result in memory corruption or
>>     other system errors; your MPI job may hang, crash, or produce silent
>>     data corruption.  The use of fork() (or system() or other calls that
>>     create child processes) is strongly discouraged.
>>
>>     The process that invoked fork was:
>>
>>       Local host:          node04 (PID 26324)
>>       MPI_COMM_WORLD rank: 0
>>
>>     If you are *absolutely sure* that your application will successfully
>>     and correctly survive a call to fork(), you may disable this warning
>>     by setting the mpi_warn_on_fork MCA parameter to 0.
>>     --------------------------------------------------------------------------
>>     python 2.7.5 GCC 4.1.2 20080704 (Red Hat 4.1.2-52) 64bit ELF on
>>     Linux x86_64 redhat 5.7 Final
>>     Running tests in /tmp/gpaw-test-4xzJUA
>>     Jobs: 1, Cores: 4, debug-mode: False
>>     =============================================================================
>>     gemm_complex.py 0.011  OK
>>     mpicomm.py  0.011  OK
>>     ase3k_version.py  0.009  OK
>>     numpy_core_multiarray_dot.py  0.008  OK
>>     eigh.py 0.011  OK
>>     lapack.py 0.012  OK
>>     dot.py  0.012  OK
>>     lxc_fxc.py  0.011  OK
>>     blas.py 0.013  OK
>>     erf.py  0.011  OK
>>     gp2.py  0.014  OK
>>     kptpar.py 4.205  OK
>>     non_periodic.py 0.040  OK
>>     parallel/blacsdist.py 0.014  OK
>>     gradient.py 0.017  OK
>>     cg2.py  0.022  OK
>>     kpt.py  0.028  OK
>>     lf.py 0.032  OK
>>     gd.py 0.013  OK
>>     parallel/compare.py 0.046  OK
>>     pbe_pw91.py 0.012  OK
>>     fsbt.py 0.014  OK
>>     derivatives.py  0.018  OK
>>     Gauss.py  0.024  OK
>>     second_derivative.py  0.022  OK
>>     integral4.py  0.038  OK
>>     parallel/ut_parallel.py  [node04:26319] 3 more processes have
>>     sent help message help-mpi-runtime.txt / mpi_init:warn-fork
>>     [node04:26319] Set MCA parameter "orte_base_help_aggregate" to 0
>>     to see all help / error messages
>>          1.080  OK
>>     transformations.py  0.023  OK
>>     parallel/parallel_eigh.py 0.010  OK
>>     spectrum.py 0.080  OK
>>     xc.py 0.047  OK
>>     zher.py 0.046  OK
>>     pbc.py  0.041  OK
>>     lebedev.py  0.030  OK
>>     parallel/ut_hsblacs.py  0.065  OK
>>     occupations.py  0.070  OK
>>     dump_chi0.py  0.046  OK
>>     cluster.py  0.128  OK
>>     pw/interpol.py  0.013  OK
>>     poisson.py  0.054  OK
>>     pw/lfc.py  gpaw-python: symbol lookup error:
>>     /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>     undefined symbol: omp_get_num_procs
>>     gpaw-python: symbol lookup error:
>>     /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>     undefined symbol: omp_get_num_procs
>>     gpaw-python: symbol lookup error:
>>     /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>     undefined symbol: omp_get_num_procs
>>     gpaw-python: symbol lookup error:
>>     /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
>>     undefined symbol: omp_get_num_procs
>>     --------------------------------------------------------------------------
>>     mpiexec has exited due to process rank 0 with PID 26324 on
>>     node node04 exiting without calling "finalize". This may
>>     have caused other processes in the application to be
>>     terminated by signals sent by mpiexec (as reported here).
>>     --------------------------------------------------------------------------
>>
>>     what happened?
>>
>>
>>
>>
>>     2013/10/11 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>
>>         On 10/11/2013 12:13 PM, 謝其軒 wrote:
>>>
>>>
>>>         ---------- Forwarded message ----------
>>>         From: 謝其軒 <z955018 at gmail.com>
>>>         Date: 2013/10/11
>>>         Subject: Re: [gpaw-users] can't using gpaw-python
>>>         To: Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>>>
>>>
>>>         Hi again,
>>>         Is Mr. Marcin there?
>>>         I've tried to compile all the packages myself and failed.
>>>         I restored the system to its original state, in which numpy,
>>>         scipy and the other packages belong to Canopy.
>>>         And of course, the error remains. I suspect that the numpy
>>>         which gpaw-python uses is not the one from Canopy but the one
>>>         in root's directory.
>>>         But how can I change that? The serial version works fine.
>>         please attach your customize.py and all the steps you performed.
>>         After your last email it looked like we were close to getting a
>>         working gpaw-python; you just need to replace the Intel threading
>>         libraries as described in the link I sent.
>>
>>         Best regards,
>>
>>         Marcin
>>
>>
>
>
>     -- 
>     ***********************************
>       
>     Marcin Dulak
>     Technical University of Denmark
>     Department of Physics
>     Building 307, Room 229
>     DK-2800 Kongens Lyngby
>     Denmark
>     Tel.: (+45) 4525 3157
>     Fax.: (+45) 4593 2399
>     email: Marcin.Dulak at fysik.dtu.dk
>
>     ***********************************
>
>


-- 
***********************************
  
Marcin Dulak
Technical University of Denmark
Department of Physics
Building 307, Room 229
DK-2800 Kongens Lyngby
Denmark
Tel.: (+45) 4525 3157
Fax.: (+45) 4593 2399
email: Marcin.Dulak at fysik.dtu.dk

***********************************
