[gpaw-users] can't use gpaw-python

謝其軒 z955018 at gmail.com
Thu Oct 10 15:09:32 CEST 2013


Hi,
Marcin
I'm afraid I may have confused you.
I compiled numpy 1.7.0 myself, following the recipe in
install_sun_chpc_SUSE10.sh <https://wiki.fysik.dtu.dk/gpaw/install/Linux/sun_chpc.html>,
against Python 2.7 only.
Then I compiled GPAW against Python 2.7 as well, and ran the gpaw-python
test suite.
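Roughly, this is what I did (the $HOME/opt prefix is just an example here,
and the exact flags follow the recipe, so they may differ on other machines):

----------------
# build numpy and gpaw against python2.7 only, then run the test suite
cd numpy-1.7.0 && python2.7 setup.py install --prefix=$HOME/opt
cd ../gpaw-0.9.1.10596 && python2.7 setup.py install --prefix=$HOME/opt
mpiexec -np 4 gpaw-python `which gpaw-test`
----------------

The test run gave: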

------------
python 2.7.5 GCC 4.1.2 20080704 (Red Hat 4.1.2-52) 64bit ELF on Linux
x86_64 redhat 5.8 Final
Running tests in /tmp/gpaw-test-Htm4W4
Jobs: 1, Cores: 4, debug-mode: False
=============================================================================
gemm_complex.py                         0.049  OK
mpicomm.py                              0.009  OK
ase3k_version.py                        0.007  OK
numpy_core_multiarray_dot.py            0.007  OK
eigh.py                                 0.075  OK
lapack.py                               0.014  OK
......................
occupations.py                          0.057  OK
dump_chi0.py                            0.040  OK
cluster.py                              0.144  OK
pw/interpol.py                          0.011  OK
poisson.py                              0.051  OK
pw/lfc.py
gpaw-python: symbol lookup error:
/opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so:
undefined symbol: omp_get_num_procs
(the same symbol lookup error is printed by each of the four ranks)
--------------------------------------------------------------------------
mpiexec has exited due to process rank 2 with PID 26226 on
node breadserver.physics.ntu exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).
--------------------------------------------------------------------------
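
My guess is that MKL's threaded layer cannot find the Intel OpenMP runtime,
since omp_get_num_procs normally comes from libiomp5. Something like the
following should show whether gpaw-python is linked against it, and
preloading the library might work around the crash; the libiomp5.so path
below is from my Composer XE install and may differ elsewhere:

----------------
# is the OpenMP runtime linked in at all?
ldd `which gpaw-python` | grep -i iomp5

# possible workaround: preload libiomp5 before running the tests
export LD_PRELOAD=/opt/intel/composerxe-2011.3.174/compiler/lib/intel64/libiomp5.so
mpiexec -np 4 gpaw-python `which gpaw-test`
----------------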



---------------------------------------------------------------------------------------------------

Still, gpaw-python itself starts fine:
----------------
[z955018 at breadserver gpaw-0.9.1.10596]$ gpaw-python
Python 2.7.5 (default, Aug 13 2013, 16:24:20)
[GCC 4.1.2 20080704 (Red Hat 4.1.2-52)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
--------------

When I run a ground-state calculation, the result looks fine.
However, there is an MPI warning:


--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          node01 (PID 4577)
  MPI_COMM_WORLD rank: 0

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
[node01:04576] 7 more processes have sent help message help-mpi-runtime.txt
/ mpi_init:warn-fork
[node01:04576] Set MCA parameter "orte_base_help_aggregate" to 0 to see all
help / error messages


I can't figure out what this warning means, or whether it is safe to ignore.
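
If it is harmless, I suppose I can silence it with the MCA parameter the
message mentions; here input.py just stands for whatever GPAW script is
being run:

----------------
# disable the fork() warning, as suggested in the warning text itself
mpiexec --mca mpi_warn_on_fork 0 -np 4 gpaw-python input.py
----------------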

---------------------------------------------------------

By the way, how do I point the SciPy build at a BLAS/LAPACK library?
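
I assume SciPy picks up the same site.cfg mechanism that numpy uses; would
something like this, with my MKL paths, be right? (mkl_rt is the single
dynamic MKL library, and the prefix is again just an example.)

----------------
# run from the scipy source directory; paths are from my MKL install
cat > site.cfg <<'EOF'
[mkl]
library_dirs = /opt/intel/composerxe-2011.3.174/mkl/lib/intel64
include_dirs = /opt/intel/composerxe-2011.3.174/mkl/include
mkl_libs = mkl_rt
lapack_libs =
EOF
python2.7 setup.py install --prefix=$HOME/opt
----------------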




chi-hsuan