[gpaw-users] Fwd: can't use gpaw-python

謝其軒 z955018 at gmail.com
Fri Oct 18 11:28:29 CEST 2013


Dear everyone,
I am one step closer to compiling the parallel version!
Here's my question. When I start the test:
----
mpiexec -np 4 gpaw-python `which gpaw-test` 2>&1 | tee testgpaw.log
----
here is the resulting log, testgpaw.log:
----
--------------------------------------------------------------------------
An MPI process has executed an operation involving a call to the
"fork()" system call to create a child process.  Open MPI is currently
operating in a condition that could result in memory corruption or
other system errors; your MPI job may hang, crash, or produce silent
data corruption.  The use of fork() (or system() or other calls that
create child processes) is strongly discouraged.

The process that invoked fork was:

  Local host:          node04 (PID 26324)
  MPI_COMM_WORLD rank: 0

If you are *absolutely sure* that your application will successfully
and correctly survive a call to fork(), you may disable this warning
by setting the mpi_warn_on_fork MCA parameter to 0.
--------------------------------------------------------------------------
python 2.7.5 GCC 4.1.2 20080704 (Red Hat 4.1.2-52) 64bit ELF on Linux
x86_64 redhat 5.7 Final
Running tests in /tmp/gpaw-test-4xzJUA
Jobs: 1, Cores: 4, debug-mode: False
=============================================================================
gemm_complex.py                         0.011  OK
mpicomm.py                              0.011  OK
ase3k_version.py                        0.009  OK
numpy_core_multiarray_dot.py            0.008  OK
eigh.py                                 0.011  OK
lapack.py                               0.012  OK
dot.py                                  0.012  OK
lxc_fxc.py                              0.011  OK
blas.py                                 0.013  OK
erf.py                                  0.011  OK
gp2.py                                  0.014  OK
kptpar.py                               4.205  OK
non_periodic.py                         0.040  OK
parallel/blacsdist.py                   0.014  OK
gradient.py                             0.017  OK
cg2.py                                  0.022  OK
kpt.py                                  0.028  OK
lf.py                                   0.032  OK
gd.py                                   0.013  OK
parallel/compare.py                     0.046  OK
pbe_pw91.py                             0.012  OK
fsbt.py                                 0.014  OK
derivatives.py                          0.018  OK
Gauss.py                                0.024  OK
second_derivative.py                    0.022  OK
integral4.py                            0.038  OK
parallel/ut_parallel.py                 1.080  OK
[node04:26319] 3 more processes have sent help message help-mpi-runtime.txt / mpi_init:warn-fork
[node04:26319] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
transformations.py                      0.023  OK
parallel/parallel_eigh.py               0.010  OK
spectrum.py                             0.080  OK
xc.py                                   0.047  OK
zher.py                                 0.046  OK
pbc.py                                  0.041  OK
lebedev.py                              0.030  OK
parallel/ut_hsblacs.py                  0.065  OK
occupations.py                          0.070  OK
dump_chi0.py                            0.046  OK
cluster.py                              0.128  OK
pw/interpol.py                          0.013  OK
poisson.py                              0.054  OK
pw/lfc.py
gpaw-python: symbol lookup error: /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so: undefined symbol: omp_get_num_procs
gpaw-python: symbol lookup error: /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so: undefined symbol: omp_get_num_procs
gpaw-python: symbol lookup error: /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so: undefined symbol: omp_get_num_procs
gpaw-python: symbol lookup error: /opt/intel/composerxe-2011.3.174/mkl/lib/intel64/libmkl_intel_thread.so: undefined symbol: omp_get_num_procs
--------------------------------------------------------------------------
mpiexec has exited due to process rank 0 with PID 26324 on
node node04 exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpiexec (as reported here).
--------------------------------------------------------------------------

----

What happened?
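The "undefined symbol: omp_get_num_procs" lines suggest that libmkl_intel_thread.so was loaded without the Intel OpenMP runtime (libiomp5.so) being found first. A minimal workaround sketch, assuming libiomp5.so lives under compiler/lib/intel64 of the same composerxe-2011.3.174 tree shown in the log (that path is an assumption; verify it on the node):

----
# Assumed location of the Intel OpenMP runtime; verify on your system.
export LD_LIBRARY_PATH=/opt/intel/composerxe-2011.3.174/compiler/lib/intel64:$LD_LIBRARY_PATH

# The earlier fork() warning is advisory only; as the message itself says,
# it can be silenced by setting the mpi_warn_on_fork MCA parameter to 0.
mpiexec --mca mpi_warn_on_fork 0 -np 4 gpaw-python `which gpaw-test` 2>&1 | tee testgpaw.log
----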




2013/10/11 Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>

>  On 10/11/2013 12:13 PM, 謝其軒 wrote:
>
>
>
> ---------- Forwarded message ----------
> From: 謝其軒 <z955018 at gmail.com>
> Date: 2013/10/11
> Subject: Re: [gpaw-users] can't use gpaw-python
> To: Marcin Dulak <Marcin.Dulak at fysik.dtu.dk>
>
>
>  Hi again,
> Is Mr. Marcin there?
> I've tried to compile all the packages myself and failed.
>  I restored the system to the original one, in which numpy, scipy, and the
> others belong to Canopy.
> And of course, the error remains. I suspect that the numpy gpaw-python uses
> is not the one in Canopy but the one in root's directory.
> But how can I change that, given that the serial version works fine?
>
> Please attach your customize.py and all the steps you performed.
> After your last email it looked like we were close to getting a working
> gpaw-python; you just need to replace the Intel threading libraries as
> described in the link I sent.
>
> Best regards,
>
> Marcin
>
>
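For reference, "replacing the Intel threading libraries" in customize.py might look like the following minimal sketch, which links GPAW against the sequential MKL layer so that libmkl_intel_thread.so (the library failing above) is never loaded. The library names are the standard MKL link-line names and the path is taken from the log; both are assumptions, not the exact recipe from the link Marcin mentions:

----
# Hypothetical customize.py fragment: swap the threaded MKL layer for the
# sequential one, removing the dependency on the Intel OpenMP runtime.
library_dirs += ['/opt/intel/composerxe-2011.3.174/mkl/lib/intel64']
libraries += ['mkl_intel_lp64', 'mkl_sequential', 'mkl_core']
----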