[gpaw-users] Problem with compiling parallel GPAW

Marcin Dulak Marcin.Dulak at fysik.dtu.dk
Thu Jul 24 16:24:35 CEST 2014


Hi,

On 07/23/2014 08:06 PM, Michalsky Ronald wrote:
> This worked but running the tests:
> gpaw-python `which gpaw-test` 2>&1 | tee test.log
> stops at “bse_MoS2_cut.py”
stops with an error or hangs?
>
> I assume, since the parallel executable is not built,
> mpirun -np 2 gpaw-python -c "import gpaw.mpi as mpi; print mpi.rank"
> yields: -bash: mpirun: command not found
mpirun/mpiexec is not found - probably some environment variables are not set.
Please first get the Open MPI tests that correspond to your compiled
openmpi-1.8.1 running on the cluster, e.g.
http://www.student.dtu.dk/~mdul/accts/accts/mpi/gcc/examples.sh
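
For example, a minimal sanity check (a sketch; it assumes you still have
the openmpi-1.8.1 source tree, which ships an examples/ directory):

  export PATH=$HOME/openmpi-1.8.1.gfortran/bin:$PATH
  export LD_LIBRARY_PATH=$HOME/openmpi-1.8.1.gfortran/lib:$LD_LIBRARY_PATH
  cd $HOME/openmpi-1.8.1/examples
  make hello_c
  mpiexec -np 2 ./hello_c  # should print one "Hello, world" line per rank

If mpiexec is still not found after exporting PATH, the Open MPI
installation itself is the problem, not GPAW.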

> and: mpiexec -np 4 gpaw-python `which gpaw-test` 2>&1 | tee test4.log
> yields: -bash: mpiexec: command not found
>
> The content of test.log is:
>
> [brutus1.ethz.ch:07077] mca: base: component_find: unable to open /cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib/openmpi/mca_ess_lsf: libbat.so: cannot open shared object file: No such file or directory (ignored)
> --------------------------------------------------------------------------
> The OpenFabrics (openib) BTL failed to initialize while trying to
> allocate some locked memory.  This typically can indicate that the
> memlock limits are set too low.  For most HPC installations, the
> memlock limits should be set to "unlimited".  The failure occured
> here:
>
>    Local host:    brutus1
>    OMPI source:   btl_openib.c:203
>    Function:      ibv_create_cq()
>    Device:        mlx4_0
>    Memlock limit: 2097152
>
> You may need to consult with your system administrator to get this
> problem fixed.  This FAQ entry on the Open MPI web site may also be
> helpful:
I see some InfiniBand references - have you compiled openmpi with
InfiniBand and batch-system support?
Normally, compiling your own MPI on a cluster is not a good idea; it is
better to use the ones provided.
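
If InfiniBand support is the problem, a quick check (a workaround sketch,
not a fix) is to disable the openib BTL at run time:

  mpiexec --mca btl ^openib -np 4 gpaw-python `which gpaw-test` 2>&1 | tee test4.log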

Best regards,

Marcin
>
>      http://www.open-mpi.org/faq/?category=openfabrics#ib-locked-pages
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> An MPI process has executed an operation involving a call to the
> "fork()" system call to create a child process.  Open MPI is currently
> operating in a condition that could result in memory corruption or
> other system errors; your MPI job may hang, crash, or produce silent
> data corruption.  The use of fork() (or system() or other calls that
> create child processes) is strongly discouraged.
>
> The process that invoked fork was:
>
>    Local host:          brutus1 (PID 7077)
>    MPI_COMM_WORLD rank: 0
>
> If you are *absolutely sure* that your application will successfully
> and correctly survive a call to fork(), you may disable this warning
> by setting the mpi_warn_on_fork MCA parameter to 0.
> --------------------------------------------------------------------------
> python 2.7.2 GCC 4.4.6 20110731 (Red Hat 4.4.6-3) 64bit ELF on Linux x86_64 centos 6.5 Final
> Running tests in /tmp/gpaw-test-ux2iIc
> Jobs: 1, Cores: 1, debug-mode: False
> =============================================================================
> gemm_complex.py                  0.019  OK
> mpicomm.py                       0.022  OK
> ase3k_version.py                 0.030  OK
> numpy_core_multiarray_dot.py     0.017  OK
> eigh.py                          0.033  OK
> lapack.py                        0.020  OK
> dot.py                           0.013  OK
> lxc_fxc.py                       0.017  OK
> blas.py                          0.024  OK
> erf.py                           0.011  OK
> gp2.py                           0.014  OK
> kptpar.py                        0.010  OK
> non_periodic.py                  0.019  OK
> parallel/blacsdist.py            0.043  OK
> gradient.py                      0.022  OK
> cg2.py                           0.028  OK
> kpt.py                           0.020  OK
> lf.py                            0.018  OK
> gd.py                            0.011  OK
> parallel/compare.py              0.010  OK
> pbe_pw91.py                      0.012  OK
> fsbt.py                          0.017  OK
> derivatives.py                   0.034  OK
> Gauss.py                         0.052  OK
> second_derivative.py             0.040  OK
> integral4.py                     0.211  OK
> parallel/ut_parallel.py          0.085  OK
> transformations.py               0.049  OK
> parallel/parallel_eigh.py        0.024  OK
> spectrum.py                      0.294  OK
> xc.py                            0.116  OK
> zher.py                          0.057  OK
> pbc.py                           0.092  OK
> lebedev.py                       0.042  OK
> parallel/ut_hsblacs.py           0.355  OK
> occupations.py                   0.075  OK
> dump_chi0.py                     0.215  OK
> cluster.py                       0.351  OK
> pw/interpol.py                   0.137  OK
> poisson.py                       0.113  OK
> pw/lfc.py                        0.300  OK
> pw/reallfc.py                    0.413  OK
> XC2.py                           0.285  OK
> multipoletest.py                 0.476  OK
> nabla.py                         0.400  OK
> noncollinear/xccorr.py           0.682  OK
> gauss_wave.py                    0.621  OK
> harmonic.py                      0.429  OK
> atoms_too_close.py               0.557  OK
> screened_poisson.py              0.658  OK
> yukawa_radial.py                 0.017  OK
> noncollinear/xcgrid3d.py         0.655  OK
> vdwradii.py                      2.071  OK
> lcao_restart.py                  0.900  OK
> ase3k.py                         1.287  OK
> parallel/ut_kptops.py            2.112  OK
> fileio/idiotproof_setup.py       1.029  OK
> fileio/hdf5_simple.py            0.094  SKIPPED
> fileio/hdf5_noncontiguous.py     0.022  SKIPPED
> timing.py                        0.984  OK
> coulomb.py                       1.438  OK
> xcatom.py                        1.560  OK
> maxrss.py                        1.641  OK
> proton.py                        1.413  OK
> pw/moleculecg.py                 5.965  OK
> keep_htpsit.py                   7.162  OK
> pw/stresstest.py                 2.411  OK
> aeatom.py                        4.404  OK
> numpy_zdotc_graphite.py          2.449  OK
> lcao_density.py                  2.264  OK
> parallel/overlap.py              2.333  OK
> restart.py                       3.038  OK
> gemv.py                          3.314  OK
> ylexpand.py                      3.458  OK
> potential.py                     3.238  OK
> wfs_io.py                        4.943  OK
> fixocc.py                        5.507  OK
> nonselfconsistentLDA.py          4.462  OK
> gga_atom.py                      3.119  OK
> ds_beta.py                       6.206  OK
> gauss_func.py                    3.223  OK
> noncollinear/h.py                3.476  OK
> symmetry.py                      3.655  OK
> usesymm.py                       3.267  OK
> broydenmixer.py                  5.569  OK
> mixer.py                         5.567  OK
> pes.py                           6.219  OK
> wfs_auto.py                      4.546  OK
> ewald.py                         4.854  OK
> refine.py                        4.352  OK
> revPBE.py                        5.574  OK
> nonselfconsistent.py             5.969  OK
> hydrogen.py                      4.718  OK
> fileio/file_reference.py         4.291  OK
> fixdensity.py                    5.568  OK
> bee1.py                          5.994  OK
> spinFe3plus.py                   5.925  OK
> pw/h.py                          6.230  OK
> stdout.py                        9.607  OK
> parallel/lcao_complicated.py     7.766  OK
> pw/slab.py                      12.213  OK
> spinpol.py                       6.984  OK
> plt.py                           8.149  OK
> lcao_pair_and_coulomb.py         6.221  OK
> eed.py                           5.338  OK
> lrtddft2.py                      5.233  OK
> parallel/hamiltonian.py          6.910  OK
> ah.py                            5.731  OK
> laplace.py                       8.297  OK
> pw/mgo_hybrids.py               15.114  OK
> lcao_largecellforce.py           6.294  OK
> restart2.py                     11.502  OK
> Cl_minus.py                     11.902  OK
> fileio/restart_density.py       15.878  OK
> external_potential.py            6.671  OK
> pw/bulk.py                      10.417  OK
> pw/fftmixer.py                   1.736  OK
> mgga_restart.py                  9.814  OK
> vdw/quick.py                    12.291  OK
> partitioning.py                 15.677  OK
> bulk.py                         15.912  OK
> elf.py                          14.620  OK
> aluminum_EELS.py                 7.906  OK
> H_force.py                      10.790  OK
> parallel/lcao_hamiltonian.py     9.976  OK
> fermisplit.py                   11.719  OK
> parallel/ut_redist.py           16.637  OK
> lcao_h2o.py                      8.878  OK
> cmrtest/cmr_test2.py            13.187  OK
> h2o_xas.py                      12.424  OK
> ne_gllb.py                      16.306  OK
> exx_acdf.py                     10.807  OK
> asewannier.py                   15.842  OK
> exx_q.py                        12.657  OK
> ut_rsh.py                        8.009  OK
> ut_csh.py                        7.753  OK
> spin_contamination.py           12.575  OK
> davidson.py                     18.177  OK
> pw/davidson_pw.py                6.337  OK
> cg.py                           12.661  OK
> gllbatomic.py                   20.706  OK
> lcao_force.py                   15.549  OK
> neb.py                          19.622  OK
> fermilevel.py                   40.550  OK
> h2o_xas_recursion.py            19.107  OK
> diamond_eps.py                  17.053  OK
> excited_state.py               104.577  OK
> gemm.py                         18.569  OK
> rpa_energy_Ni.py                28.412  OK
> LDA_unstable.py                 42.628  OK
> si.py                           28.033  OK
> blocked_rmm_diis.py             12.470  OK
> lxc_xcatom.py                   22.621  OK
> gw_planewave.py                 26.636  OK
> degeneracy.py                   22.890  OK
> apmb.py                         25.843  OK
> vdw/potential.py                22.132  OK
> al_chain.py                     21.149  OK
> relax.py                        34.275  OK
> fixmom.py                       19.754  OK
> CH4.py                          26.450  OK
> diamond_absorption.py           23.281  OK
> simple_stm.py                   40.956  OK
> gw_method.py                    30.347  OK
> lcao_bulk.py                    22.350  OK
> constant_electric_field.py      39.661  OK
> parallel/ut_invops.py           30.231  OK
> wannier_ethylene.py             42.960  OK
> parallel/lcao_projections.py    26.394  OK
> guc_force.py                    35.227  OK
> test_ibzqpt.py                  28.109  OK
> aedensity.py                    33.635  OK
> fd2lcao_restart.py              48.761  OK
> lcao_bsse.py                    30.819  OK
> pplda.py                        73.708  OK
> revPBE_Li.py                    60.898  OK
> si_primitive.py                 33.757  OK
> complex.py                      34.953  OK
> Hubbard_U.py                    61.438  OK
> ldos.py                         37.616  OK
> parallel/ut_hsops.py            65.933  OK
> pw/hyb.py                       58.344  OK
> hgh_h2o.py                      42.048  OK
> vdw/quick_spin.py               71.347  OK
> scfsic_h2.py                    33.947  OK
> lrtddft.py                      73.679  OK
> dscf_lcao.py                    45.671  OK
> IP_oxygen.py                    69.511  OK
> Al2_lrtddft.py                  73.959  OK
> rpa_energy_Si.py                71.933  OK
> 2Al.py                          71.388  OK
> jstm.py                         48.191  OK
> tpss.py                         85.009  OK
> be_nltd_ip.py                   53.408  OK
> si_xas.py                       74.762  OK
> atomize.py                      81.544  OK
> chi0.py                        328.921  OK
> ralda_energy_H2.py              14.760  OK
> ralda_energy_N2.py              44.734  OK
> ralda_energy_Ni.py              40.005  OK
> Cu.py                           79.470  OK
> restart_band_structure.py       58.996  OK
> ne_disc.py                     112.526  OK
> exx_coarse.py                   66.586  OK
> exx_unocc.py                    17.899  OK
> Hubbard_U_Zn.py                 82.009  OK
> muffintinpot.py                144.672  OK
> diamond_gllb.py                155.858  OK
> h2o_dks.py                     131.237  OK
> aluminum_EELS_lcao.py           70.621  OK
> gw_ppa.py                       67.232  OK
> nscfsic.py                     151.915  OK
> gw_static.py                    29.595  OK
> exx.py                          88.592  OK
> pygga.py                       178.549  OK
> dipole.py                      132.991  OK
> nsc_MGGA.py                     97.486  OK
> mgga_sc.py                      64.015  OK
> MgO_exx_fd_vs_pw.py            129.831  OK
> lb94.py                        300.942  OK
> 8Si.py                         108.260  OK
> td_na2.py                      112.532  OK
> ehrenfest_nacl.py               62.301  OK
> rpa_energy_N2.py               129.659  OK
> beefvdw.py                     252.350  OK
> nonlocalset.py                 164.675  OK
> wannierk.py                    138.025  OK
> rpa_energy_Na.py               212.742  OK
> coreeig.py                     202.662  OK
> pw/si_stress.py                216.543  OK
> ut_tddft.py                    227.711  OK
> transport.py                   425.775  OK
> vdw/ar2.py                     380.194  OK
> bse_sym.py                     465.311  OK
> aluminum_testcell.py           436.707  OK
> au02_absorption.py             569.460  OK
> lrtddft3.py                    657.319  OK
> scfsic_n2.py                   307.946  OK
> bse_MoS2_cut.py
>
> ________________________________________
> From: Marcin Dulak [Marcin.Dulak at fysik.dtu.dk]
> Sent: Wednesday, July 23, 2014 10:08 AM
> To: Michalsky  Ronald; gpaw-users at listserv.fysik.dtu.dk
> Subject: Re: [gpaw-users] Problem with compiling parallel GPAW
>
> Hi,
>
> On 07/23/2014 09:33 AM, Michalsky Ronald wrote:
>> This did it:
>>
>> […]
>> * Using standard lapack
>> * Architecture: linux-x86_64
>> * Building a custom interpreter
>>
>> I’ve now compiled the setups, etc., but get stuck when running the tests:
> is GPAW_SETUP_PATH exported?
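> For example (a sketch - the directory name depends on which gpaw-setups
> tarball you unpacked):
>
> export GPAW_SETUP_PATH=${HOME}/gpaw-setups-0.9.11271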
>> gpaw-test -j 8
>>
>> yields: […] ImportError: libacml.so: cannot open shared object file: No such file or directory
>>
>> gpaw-python `which gpaw-test` 2>&1 | tee test.log
>>
>> yields: gpaw-python: error while loading shared libraries: libacml.so: cannot open shared object file: No such file or directory
>>
>> libacml.so is at $HOME/acm15.3.1/gfortran64/lib
>> but writing any of these does not help:
>> export PYTHONPATH=$HOME/acm15.3.1/gfortran64/lib:$PYTHONPATH
>> export PATH=$HOME/acm15.3.1/gfortran64/lib:$PATH
>> export LIBRARY_PATH=$HOME/acm15.3.1/gfortran64/lib:$LIBRARY_PATH
> you need:
>
> export LD_LIBRARY_PATH=$HOME/acm15.3.1/gfortran64/lib:$LD_LIBRARY_PATH
>
> Please remove the three exports above - they are not needed.
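> To make the export persist across logins, it can be appended to ~/.bashrc,
> e.g.:
>
> echo 'export LD_LIBRARY_PATH=$HOME/acm15.3.1/gfortran64/lib:$LD_LIBRARY_PATH' >> ~/.bashrc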
> Don't forget to run the tests in parallel:
> mpiexec -np 4 gpaw-python `which gpaw-test` 2>&1 | tee test4.log
> mpiexec -np 8 gpaw-python `which gpaw-test` 2>&1 | tee test8.log
>
>
>> echo $BLASLAPACK shows:
>> -L/cluster/apps/netcdf/4.1.3/x86_64/serial/gcc_4.7.2/lib -lnetcdff -lnetcdf -L/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib -lacml -L/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib -lacml -lgfortran
> this is not relevant for GPAW - it does not use this variable.
>
> Best regards,
>
> Marcin
>> Do you have an idea?
>> Thanks, Ronny
>>
>> ________________________________________
>> From: Marcin Dulak [Marcin.Dulak at fysik.dtu.dk]
>> Sent: Tuesday, July 22, 2014 1:39 PM
>> To: Michalsky  Ronald; gpaw-users at listserv.fysik.dtu.dk
>> Subject: Re: [gpaw-users] Problem with compiling parallel GPAW
>>
>> Hi,
>>
>> On 07/22/2014 11:58 AM, Michalsky Ronald wrote:
>>> Hi,
>>>
>>> I’ve compiled ASE but have difficulties with compiling GPAW for parallel computations (CentOS 6, gcc/4.7.2 compiler). Below I describe how I’ve compiled Libxc & GPAW. The initial error message (mpicc was not found) is fixed now. However, it appears the parallel executable is not compiled due to some -R option that is not recognized. I’d appreciate your advice.
>>>
>>> # Initial email at Campos mailing list:
>>>
>>>>>> Assuming Libxc is the only missing dependency:
>>> cd $HOME
>>> wget http://www.tddft.org/programs/octopus/down.php?file=libxc/libxc-2.0.2.tar.gz -O libxc-2.0.2.tar.gz
>>> tar xzf libxc-2.0.2.tar.gz
>>> cd libxc-2.0.2
>>> ./configure --enable-shared --prefix=$HOME/xc
>>> make
>>> make install
>>> export C_INCLUDE_PATH=~/xc/include
>>> export LIBRARY_PATH=~/xc/lib
>>> export LD_LIBRARY_PATH=~/xc/lib
>>>
>>> cd $HOME
>>> tar xzf gpaw-0.10.0.11364.tar.gz
>>>>>> Rename: $HOME/gpaw-0.10.0.11364 to $HOME/gpaw
>>> export GPAW_HOME=${HOME}/gpaw
>>> cd gpaw
>>>
>>>>>> Edit: $HOME/gpaw/customize.py:
>>> libraries = ['acml', 'gfortran']
>>> library_dirs = ['/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib']
>>> mpicompiler = 'mpicc'
>>> mpilinker = 'gfortran'
>>> mpi_libraries = ['opal_wrapper']
>>>
>>> # Reply from Marcin Dulak
>>>
>>> provide full path to mpicc. I see below from the compilation log mpicc
>>> is not in PATH.
>>>
>>> mpicompiler = '/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin/mpicc'
>>> mpilinker = mpicompiler
>>>
>>> mpi_libraries = ['opal_wrapper']
>>>
>>> no, this is supposed to be a library to be linked (so, for -lmpi it's
>>> just "mpi"]
>>>
>>> mpi_libraries = ['mpi']
>>>
>>>> mpi_library_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin']
>>> it should be '/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib' (the lib directory, not bin)
>>>
>>> # My changes & notes:
>>>
>>>>>> I’ve added to .bashrc:
>>> export PATH=/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin:$PATH
>>>>>> I’ve edited in customize.py:
>>> mpicompiler = '/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin/mpicc'
>>> mpilinker = mpicompiler
>>> mpi_libraries = ['mpi']
>>> mpi_library_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib']
>>>>>> this avoided the error: ‘sh: mpicc: command not found’
>>> # Continue “Initial email at Campos mailing list”:
>>>
>>> mpi_library_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin']
>>> mpi_include_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/include']
>>> mpi_runtime_library_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib']
>>> scalapack = False
>>>
>>>>>> Note, $HOME/openmpi-1.8.1.gfortran/bin contains an ‘mpicc’ symlink to the ‘opal_wrapper’ file in the same directory
>>> python setup.py build_ext 2>&1 | tee build_ext.log
>>>
>>>>>> yielding:
>>> /usr/bin/gcc -pthread -shared build/temp.linux-x86_64-2.7/c/wigner_seitz.o build/temp.linux-x86_64-2.7/c/mpi.o build/temp.linux-x86_64-2.7/c/spline.o build/temp.linux-x86_64-2.7/c/lcao.o build/temp.linux-x86_64-2.7/c/_gpaw.o build/temp.linux-x86_64-2.7/c/plane_wave.o build/temp.linux-x86_64-2.7/c/symmetry.o build/temp.linux-x86_64-2.7/c/lfc.o build/temp.linux-x86_64-2.7/c/operators.o build/temp.linux-x86_64-2.7/c/lapack.o build/temp.linux-x86_64-2.7/c/point_charges.o build/temp.linux-x86_64-2.7/c/blacs.o build/temp.linux-x86_64-2.7/c/bc.o build/temp.linux-x86_64-2.7/c/transformers.o build/temp.linux-x86_64-2.7/c/lfc2.o build/temp.linux-x86_64-2.7/c/localized_functions.o build/temp.linux-x86_64-2.7/c/mlsqr.o build/temp.linux-x86_64-2.7/c/blas.o build/temp.linux-x86_64-2.7/c/hdf5.o build/temp.linux-x86_64-2.7/c/utilities.o build/temp.linux-x86_64-2.7/c/cerf.o build/temp.linux-x86_64-2.7/c/plt.o build/temp.linux-x86_64-2.7/c/fftw.o build/temp.linux-x86_64-2.7/c/bmgs/bmgs.o build/temp.linux-x86_64-2.7/c/xc/pw91.o build/temp.linux-x86_64-2.7/c/xc/tpss.o build/temp.linux-x86_64-2.7/c/xc/revtpss.o build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o build/temp.linux-x86_64-2.7/c/xc/pbe.o build/temp.linux-x86_64-2.7/c/xc/libxc.o build/temp.linux-x86_64-2.7/c/xc/rpbe.o build/temp.linux-x86_64-2.7/c/xc/m06l.o build/temp.linux-x86_64-2.7/c/xc/xc_mgga.o build/temp.linux-x86_64-2.7/c/xc/vdw.o build/temp.linux-x86_64-2.7/c/xc/xc.o build/temp.linux-x86_64-2.7/c/xc/ensemble_gga.o -L/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib -L/cluster/apps/python/2.7.2/x86_64/lib64 -lacml -lgfortran -lpython2.7 -o build/lib.linux-x86_64-2.7/_gpaw.so
>>> sh: mpicc: command not found
>>> gfortran: error: unrecognized option '-R'
>>> mpicc -DNPY_NO_DEPRECATED_API=7 -D_GNU_SOURCE=1 -DPARALLEL=1 -DGPAW_INTERPRETER=1 -Wall -std=c99 -I/cluster/apps/python/2.7.2/x86_64/lib64/python2.7/site-packages/numpy/core/include -I/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/include -I/cluster/apps/python/2.7.2/x86_64/include/python2.7 -I/cluster/apps/python/2.7.2/x86_64/include/python2.7 -o build/temp.linux-x86_64-2.7/c/bc.o -c c/bc.c
>>> gfortran -o build/bin.linux-x86_64-2.7//gpaw-python build/temp.linux-x86_64-2.7/c/wigner_seitz.o build/temp.linux-x86_64-2.7/c/mpi.o build/temp.linux-x86_64-2.7/c/spline.o build/temp.linux-x86_64-2.7/c/lcao.o build/temp.linux-x86_64-2.7/c/_gpaw.o build/temp.linux-x86_64-2.7/c/plane_wave.o build/temp.linux-x86_64-2.7/c/symmetry.o build/temp.linux-x86_64-2.7/c/lfc.o build/temp.linux-x86_64-2.7/c/operators.o build/temp.linux-x86_64-2.7/c/lapack.o build/temp.linux-x86_64-2.7/c/point_charges.o build/temp.linux-x86_64-2.7/c/blacs.o build/temp.linux-x86_64-2.7/c/bc.o build/temp.linux-x86_64-2.7/c/transformers.o build/temp.linux-x86_64-2.7/c/lfc2.o build/temp.linux-x86_64-2.7/c/localized_functions.o build/temp.linux-x86_64-2.7/c/mlsqr.o build/temp.linux-x86_64-2.7/c/blas.o build/temp.linux-x86_64-2.7/c/hdf5.o build/temp.linux-x86_64-2.7/c/utilities.o build/temp.linux-x86_64-2.7/c/cerf.o build/temp.linux-x86_64-2.7/c/plt.o build/temp.linux-x86_64-2.7/c/fftw.o build/temp.linux-x86_64-2.7/c/bmgs/bmgs.o build/temp.linux-x86_64-2.7/c/xc/pw91.o build/temp.linux-x86_64-2.7/c/xc/tpss.o build/temp.linux-x86_64-2.7/c/xc/revtpss.o build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o build/temp.linux-x86_64-2.7/c/xc/pbe.o build/temp.linux-x86_64-2.7/c/xc/libxc.o build/temp.linux-x86_64-2.7/c/xc/rpbe.o build/temp.linux-x86_64-2.7/c/xc/m06l.o build/temp.linux-x86_64-2.7/c/xc/xc_mgga.o build/temp.linux-x86_64-2.7/c/xc/vdw.o build/temp.linux-x86_64-2.7/c/xc/xc.o build/temp.linux-x86_64-2.7/c/xc/ensemble_gga.o  -L/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib -L/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin -L/cluster/apps/python/2.7.2/x86_64/lib64/python2.7/config -lacml -lgfortran -lopal_wrapper -lpython2.7 -lpthread -ldl  -lutil -lm -R/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib  -Xlinker -export-dynamic
>>>
>>> * Using standard lapack
>>> * Architecture: linux-x86_64
>>> * Building a custom interpreter
>>> * linking FAILED!  Only serial version of code will work.
>>>
>>> # Continue “Reply from Marcin Dulak”:
>>>
>>>> sh: mpicc: command not found
>>> mpicc is missing in PATH.
>>>> gfortran: error: unrecognized option '-R'
>>> I don't see any -R option here; where does it come from?
>>> Please move the thread about building GPAW to the gpaw-users mailing
>>> list: https://wiki.fysik.dtu.dk/gpaw/mailinglists.html
>>>
>>> # My changes & notes:
>>>
>>>>>> I’ve written at ‘#Hack taken from distutils to determine option for runtime_library_dirs’ in $HOME/gpaw/config.py:
>>>        else:
>>>            runtime_lib_option = '-L' # changed from '-R' to '-L', assuming this includes linux-x86_64
>> I guess there is a bug in this part. Just don't use
>> mpi_runtime_library_dirs or similar for the moment.
>>>>>> This yields the error copied below. Note: the error message is the same when, instead of hacking $HOME/gpaw/config.py, this line in customize.py is commented out:
>>> # mpi_library_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib']
>>>
>>> […]
>>> /cluster/apps/python/2.7.2/x86_64/lib64/python2.7/config/libpython2.7.a(posixmodule.o): In function `posix_tmpnam':
>>> /tmp/Python-2.7.2/./Modules/posixmodule.c:7370: warning: the use of `tmpnam_r' is dangerous, better use `mkstemp'
>>> /cluster/apps/python/2.7.2/x86_64/lib64/python2.7/config/libpython2.7.a(posixmodule.o): In function `posix_tempnam':
>>> /tmp/Python-2.7.2/./Modules/posixmodule.c:7317: warning: the use of `tempnam' is dangerous, better use `mkstemp'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `lxcXCFunctional_dealloc':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:194: undefined reference to `xc_func_end'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:194: undefined reference to `xc_func_end'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `lxcXCFunctional_CalculateFXC':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:758: undefined reference to `xc_lda_fxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:763: undefined reference to `xc_gga_fxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:763: undefined reference to `xc_gga_fxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:758: undefined reference to `xc_lda_fxc'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `lxcXCFunctional_Calculate':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:642: undefined reference to `xc_lda_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:647: undefined reference to `xc_gga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:647: undefined reference to `xc_gga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:653: undefined reference to `xc_mgga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:653: undefined reference to `xc_mgga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:642: undefined reference to `xc_lda_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `get_fxc_fd_lda':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:38: undefined reference to `xc_lda_fxc_fd'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `get_point':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:65: undefined reference to `xc_lda_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:69: undefined reference to `xc_gga_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `NewlxcXCFunctionalObject':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:852: undefined reference to `xc_family_from_id'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:854: undefined reference to `xc_func_init'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:861: undefined reference to `xc_family_from_id'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:863: undefined reference to `xc_func_init'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:866: undefined reference to `xc_family_from_id'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:868: undefined reference to `xc_func_init'
>>> build/temp.linux-x86_64-2.7/c/xc/libxc.o: In function `lxcXCFuncNum':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/libxc.c:907: undefined reference to `xc_functional_get_number'
>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `c_tpss_12':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:292: undefined reference to `xc_gga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:308: undefined reference to `xc_gga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:325: undefined reference to `xc_gga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:292: undefined reference to `xc_gga_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:308: undefined reference to `xc_gga_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o:/cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:325: more undefined references to `xc_gga_exc_vxc' follow
>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `x_tpss_para':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:125: undefined reference to `xc_lda_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `tpss_end':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:544: undefined reference to `xc_func_end'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:547: undefined reference to `xc_func_end'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:548: undefined reference to `xc_func_end'
>>> build/temp.linux-x86_64-2.7/c/xc/tpss.o: In function `tpss_init':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:534: undefined reference to `xc_func_init'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:538: undefined reference to `xc_func_init'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/tpss.c:539: undefined reference to `xc_func_init'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_end':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:553: undefined reference to `xc_func_end'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:556: undefined reference to `xc_func_end'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_init':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:545: undefined reference to `xc_func_init'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:547: undefined reference to `xc_func_init'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `x_revtpss_para':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:449: undefined reference to `xc_lda_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_end':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:557: undefined reference to `xc_func_end'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss.o: In function `revtpss_init':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss.c:548: undefined reference to `xc_func_init'
>>> build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o: In function `xc_perdew_params':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss_c_pbe.c:64: undefined reference to `xc_lda_exc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss_c_pbe.c:70: undefined reference to `xc_lda'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/revtpss_c_pbe.c:67: undefined reference to `xc_lda_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `m06l_end':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:733: undefined reference to `xc_func_end'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:736: undefined reference to `xc_func_end'
>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `m06l_init':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:724: undefined reference to `xc_func_init'
>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `c_m06l_para':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:328: undefined reference to `xc_lda_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:338: undefined reference to `xc_lda_exc_vxc'
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:372: undefined reference to `xc_lda_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `x_m06l_para':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:656: undefined reference to `xc_gga_exc_vxc'
>>> build/temp.linux-x86_64-2.7/c/xc/m06l.o: In function `m06l_init':
>>> /cluster/home03/mavt/ronaldm/gpaw/c/xc/m06l.c:727: undefined reference to `xc_func_init'
>>> collect2: error: ld returned 1 exit status
>>> python2.7 -I/cluster/apps/python/2.7.2/x86_64/include/python2.7 -o build/temp.linux-x86_64-2.7/c/hdf5.o -c c/hdf5.c
>>> /cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin/mpicc -o build/bin.linux-x86_64-2.7//gpaw-python build/temp.linux-x86_64-2.7/c/mlsqr.o build/temp.linux-x86_64-2.7/c/symmetry.o build/temp.linux-x86_64-2.7/c/lapack.o build/temp.linux-x86_64-2.7/c/lcao.o build/temp.linux-x86_64-2.7/c/blacs.o build/temp.linux-x86_64-2.7/c/plt.o build/temp.linux-x86_64-2.7/c/plane_wave.o build/temp.linux-x86_64-2.7/c/wigner_seitz.o build/temp.linux-x86_64-2.7/c/utilities.o build/temp.linux-x86_64-2.7/c/spline.o build/temp.linux-x86_64-2.7/c/point_charges.o build/temp.linux-x86_64-2.7/c/_gpaw.o build/temp.linux-x86_64-2.7/c/mpi.o build/temp.linux-x86_64-2.7/c/lfc2.o build/temp.linux-x86_64-2.7/c/bc.o build/temp.linux-x86_64-2.7/c/hdf5.o build/temp.linux-x86_64-2.7/c/cerf.o build/temp.linux-x86_64-2.7/c/transformers.o build/temp.linux-x86_64-2.7/c/blas.o build/temp.linux-x86_64-2.7/c/lfc.o build/temp.linux-x86_64-2.7/c/localized_functions.o build/temp.linux-x86_64-2.7/c/operators.o build/temp.linux-x86_64-2.7/c/fftw.o build/temp.linux-x86_64-2.7/c/bmgs/bmgs.o build/temp.linux-x86_64-2.7/c/xc/libxc.o build/temp.linux-x86_64-2.7/c/xc/xc_mgga.o build/temp.linux-x86_64-2.7/c/xc/tpss.o build/temp.linux-x86_64-2.7/c/xc/pw91.o build/temp.linux-x86_64-2.7/c/xc/pbe.o build/temp.linux-x86_64-2.7/c/xc/revtpss.o build/temp.linux-x86_64-2.7/c/xc/xc.o build/temp.linux-x86_64-2.7/c/xc/revtpss_c_pbe.o build/temp.linux-x86_64-2.7/c/xc/ensemble_gga.o build/temp.linux-x86_64-2.7/c/xc/vdw.o build/temp.linux-x86_64-2.7/c/xc/rpbe.o build/temp.linux-x86_64-2.7/c/xc/m06l.o  -L/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib -L/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib -L/cluster/apps/python/2.7.2/x86_64/lib64/python2.7/config -lacml -lgfortran -lmpi -lpython2.7 -lpthread -ldl  -lutil -lm -L/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib  -Xlinker -export-dynamic
>>>
>>> * Using standard lapack
>>> * Architecture: linux-x86_64
>>> * Building a custom interpreter
>>> * linking FAILED!  Only serial version of code will work.
>> You are missing the libxc linking.
>> This is because of: libraries = ['acml', 'gfortran']
>> It should be:
>>
>> libraries += ['acml', 'gfortran']
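>>
>> For reference, a minimal sketch of the customize.py settings discussed in
>> this thread (paths as on your cluster; += appends to the defaults instead
>> of overwriting them, so the default libxc entry is kept):
>>
>> libraries += ['acml', 'gfortran']
>> library_dirs += ['/cluster/home03/mavt/ronaldm/acm15.3.1/gfortran64/lib']
>> mpicompiler = '/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/bin/mpicc'
>> mpilinker = mpicompiler
>> mpi_libraries = ['mpi']
>> mpi_library_dirs = ['/cluster/home03/mavt/ronaldm/openmpi-1.8.1.gfortran/lib']
>> scalapack = False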
>>
>> Best regards,
>>
>> Marcin
>>> Ronald Michalsky
>>> Postdoctoral Research Associate
>>> ETH Zürich
>>> Institute of Energy Technology
>>> ML K 23, Sonneggstr. 3
>>> 8092 Zürich, Switzerland
>>> Tel: +41-44-6338383
>


-- 
***********************************
  
Marcin Dulak
Technical University of Denmark
Department of Physics
Building 307, Room 229
DK-2800 Kongens Lyngby
Denmark
Tel.: (+45) 4525 3157
Fax.: (+45) 4593 2399
email: Marcin.Dulak at fysik.dtu.dk

***********************************




