[gpaw-users] Vibration analysis with GPAW

Jens Jørgen Mortensen jjmo at dtu.dk
Wed Aug 22 11:19:51 CEST 2018


On 08/22/2018 10:49 AM, Chang Liu wrote:
>
> Hi,
>
>
> Here is the result for 1.3.0:
>

It looks like all your test runs are timing out at some point, and I don't
know what the problem is.  Maybe start from scratch with version 1.4.1b1
and first check that the tests pass in serial.
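For reference, the test suite is driven through the gpaw command-line tool; roughly (a sketch based on the commands quoted elsewhere in this thread, so check `gpaw test --help` on your installation):

```shell
# Run the GPAW test suite in serial (one process):
gpaw test

# Run it on 4 cores, as suggested earlier in this thread:
gpaw -P 4 test

# Re-run with full tracebacks when a test fails or times out:
gpaw -T test
```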

Jens Jørgen

>
> --------------------------------------------------------------------------
> A process has executed an operation involving a call to the
> "fork()" system call to create a child process.  Open MPI is currently
> operating in a condition that could result in memory corruption or
> other system errors; your job may hang, crash, or produce silent
> data corruption.  The use of fork() (or system() or other calls that
> create child processes) is strongly discouraged.
>
> The process that invoked fork was:
>
>   Local host:          [[50631,1],0] (PID 698801)
>
> If you are *absolutely sure* that your application will successfully
> and correctly survive a call to fork(), you may disable this warning
> by setting the mpi_warn_on_fork MCA parameter to 0.
> --------------------------------------------------------------------------
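As the warning itself says, it can be silenced by setting the mpi_warn_on_fork MCA parameter to 0. With Open MPI that can be done either on the mpirun command line or through the MCA environment-variable convention (shown as a sketch; `script.py` is a placeholder for your own job script):

```shell
# Silence the fork() warning on the command line ...
mpirun --mca mpi_warn_on_fork 0 -np 4 gpaw-python script.py

# ... or via Open MPI's OMPI_MCA_* environment-variable convention:
export OMPI_MCA_mpi_warn_on_fork=0
```

Note that this only hides the warning; it does not address the MPI barrier timeouts below.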
> python-2.7.14   /hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/GPAW/1.3.0-Python-2.7.14/bin/gpaw-python
> gpaw-1.3.0      /hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/GPAW/1.3.0-Python-2.7.14/lib/python2.7/site-packages/gpaw/
> ase-3.15.0      /hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/ASE/3.15.0-Python-2.7.14/lib/python2.7/site-packages/ase-3.15.0-py2.7.egg/ase/
> numpy-1.13.1    /hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/numpy-1.13.1-py2.7-linux-x86_64.egg/numpy/
> scipy-0.19.1    /hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/scipy-0.19.1-py2.7-linux-x86_64.egg/scipy/
> _gpaw           built-in
> parallel        /hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/GPAW/1.3.0-Python-2.7.14/bin/gpaw-python
> FFTW            yes
> scalapack       yes
> libvdwxc        yes
> PAW-datasets    1: /hpc2n/eb/software/Core/GPAW-setups/0.9.20000
> Running tests in /tmp/gpaw-test-eDDCuQ
> Jobs: 1, Cores: 4, debug-mode: False
> =============================================================================
> linalg/gemm_complex.py                        0.012  OK
> ase_features/ase3k_version.py                 0.007  OK
> kpt.py                                        0.010  OK
> mpicomm.py                                    0.008  OK
> pathological/numpy_core_multiarray_dot.py     0.007  OK
> eigen/cg2.py                                  0.010  OK
> fd_ops/laplace.py                             0.000 SKIPPED
> linalg/lapack.py                              0.009  OK
> linalg/eigh.py                                0.009  OK
> parallel/submatrix_redist.py                  0.010  OK
> lfc/second_derivative.py                      0.013  OK
> parallel/parallel_eigh.py                     0.009  OK
> lfc/gp2.py                                    0.013  OK
> linalg/blas.py                                0.011  OK
> Gauss.py                                      0.016  OK
> symmetry/check.py                             0.351  OK
> fd_ops/nabla.py                               0.096  OK
> linalg/dot.py                                 0.008  OK
> linalg/mmm.py                                 0.008  OK
> xc/lxc_fxc.py                                 0.008  OK
> xc/pbe_pw91.py                                0.007  OK
> fd_ops/gradient.py                            0.009  OK
> maths/erf.py                                  0.007  OK
> lfc/lf.py                                     0.009  OK
> maths/fsbt.py                                 0.035  OK
> parallel/compare.py                           0.010  OK
> vdw/libvdwxc_functionals.py                   0.061  OK
> radial/integral4.py                           0.035  OK
> linalg/zher.py                                0.018  OK
> fd_ops/gd.py                                  0.009  OK
> pw/interpol.py                                0.008  OK
> poisson/screened_poisson.py                   0.113  OK
> xc/xc.py                                      0.024  OK
> xc/XC2.py                                     0.038  OK
> radial/yukawa_radial.py                       0.007  OK
> vdw/potential.py                              0.008  OK
> radial/lebedev.py                             0.014  OK
> occupations.py                                0.028  OK
> lfc/derivatives.py                            0.011  OK
> parallel/realspace_blacs.py                   0.051  OK
> pw/reallfc.py                                 0.129  OK
> parallel/pblas.py                        RuntimeError: MPI barrier timeout.
> To get a full traceback, use: gpaw -T test ...
>
> The output for 1.4.0:
>
>
> --------------------------------------------------------------------------
> A process has executed an operation involving a call to the
> "fork()" system call to create a child process.  Open MPI is currently
> operating in a condition that could result in memory corruption or
> other system errors; your job may hang, crash, or produce silent
> data corruption.  The use of fork() (or system() or other calls that
> create child processes) is strongly discouraged.
>
> The process that invoked fork was:
>
>   Local host:          [[55354,1],0] (PID 701537)
>
> If you are *absolutely sure* that your application will successfully
> and correctly survive a call to fork(), you may disable this warning
> by setting the mpi_warn_on_fork MCA parameter to 0.
> --------------------------------------------------------------------------
> python-2.7.14 /home/c/chliu/pfs/gpaw-ase/gpaw-1.4.0/build/bin/gpaw-python
> gpaw-1.4.0      /home/c/chliu/pfs/gpaw-ase/gpaw-1.4.0/build/lib/python2.7/site-packages/gpaw/
> ase-3.16.0      /home/c/chliu/pfs/gpaw-ase/ase-3.16.0/build/lib/python2.7/site-packages/ase-3.16.0-py2.7.egg/ase/
> numpy-1.13.1    /pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/numpy-1.13.1-py2.7-linux-x86_64.egg/numpy/
> scipy-0.19.1    /pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/scipy-0.19.1-py2.7-linux-x86_64.egg/scipy/
> _gpaw                    built-in
> parallel  /home/c/chliu/pfs/gpaw-ase/gpaw-1.4.0/build/bin//gpaw-python
> FFTW                     yes
> scalapack                no
> libvdwxc                 no
> PAW-datasets             1: /hpc2n/eb/software/Core/GPAW-setups/0.9.20000
> Running tests in /tmp/gpaw-test-a5FilR
> Jobs: 1, Cores: 4, debug-mode: False
> =============================================================================
> linalg/gemm_complex.py                        0.070  OK
> ase_features/ase3k_version.py                 0.030  OK
> kpt.py                                        0.029  OK
> mpicomm.py                                    0.026  OK
> pathological/numpy_core_multiarray_dot.py     0.024  OK
> eigen/cg2.py                                  0.077  OK
> fd_ops/laplace.py                             0.000 SKIPPED
> linalg/lapack.py                              0.032  OK
> linalg/eigh.py                                0.029  OK
> parallel/submatrix_redist.py                  0.000 SKIPPED
> lfc/second_derivative.py                      0.041  OK
> parallel/parallel_eigh.py                     0.029  OK
> lfc/gp2.py                                    0.028  OK
> linalg/blas.py                                0.016  OK
> Gauss.py                                      0.046  OK
> symmetry/check.py                             0.602  OK
> fd_ops/nabla.py                               0.136  OK
> linalg/dot.py                                 0.023  OK
> linalg/mmm.py                                 0.029  OK
> xc/lxc_fxc.py                                 0.031  OK
> xc/pbe_pw91.py                                0.031  OK
> fd_ops/gradient.py                            0.025  OK
> maths/erf.py                                  0.032  OK
> lfc/lf.py                                     0.043  OK
> maths/fsbt.py                                 0.061  OK
> parallel/compare.py                           0.033  OK
> vdw/libvdwxc_functionals.py                   0.000 SKIPPED
> radial/integral4.py                           0.655  OK
> linalg/zher.py                                0.043  OK
> fd_ops/gd.py                                  0.028  OK
> pw/interpol.py                                0.033  OK
> poisson/screened_poisson.py                   0.225  OK
> xc/xc.py                                      0.042 FAILED! (rank 0,1,2,3)
> #############################################################################
> RANK 0,1,2,3:
> Traceback (most recent call last):
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-1.4.0/build/lib/python2.7/site-packages/gpaw/test/__init__.py", line 643, in run_one
>     exec(compile(fd.read(), filename, 'exec'), loc)
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-1.4.0/build/lib/python2.7/site-packages/gpaw/test/xc/xc.py", line 83, in <module>
>     equal(error, 0, 6e-9)
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-1.4.0/build/lib/python2.7/site-packages/gpaw/test/__init__.py", line 29, in equal
>     raise AssertionError(msg)
> AssertionError: inf != 0 (error: |inf| > 6e-09)
> #############################################################################
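The failure above comes from GPAW's equal() test helper, which compares a computed value against a reference with an absolute tolerance. A minimal, hypothetical stand-in (the real implementation lives in gpaw/test/__init__.py) shows why an inf error aborts the test immediately:

```python
import math

def equal(x, y, tolerance):
    # Simplified stand-in for gpaw.test.equal:
    # raise AssertionError when |x - y| exceeds the tolerance.
    error = x - y
    if abs(error) > tolerance:
        raise AssertionError('%s != %s (error: |%s| > %s)'
                             % (x, y, error, tolerance))

equal(1.0 + 1e-10, 1.0, 6e-9)  # within tolerance: passes silently

try:
    equal(math.inf, 0, 6e-9)   # mirrors the failing check in xc/xc.py
except AssertionError as err:
    print(err)
```

An infinite error here means the quantity under test (an XC energy difference) was already NaN/Inf before the comparison, so the tolerance itself is not the problem.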
> xc/XC2.py                                     0.065  OK
> radial/yukawa_radial.py                       0.030  OK
> vdw/potential.py                              0.027  OK
> radial/lebedev.py                             0.037  OK
> occupations.py                                0.064  OK
> lfc/derivatives.py                            0.028  OK
> pw/reallfc.py                                 0.200  OK
> parallel/pblas.py                             0.044  OK
> fd_ops/non_periodic.py                        0.029  OK
> spectrum.py                                   0.102  OK
> pw/lfc.py                                     0.140  OK
> gauss_func.py                                 0.302  OK
> multipoletest.py                              0.153  OK
> cluster.py                                    1.722  OK
> poisson/poisson.py                            0.079  OK
> poisson/poisson_asym.py                       0.173  OK
> parallel/arraydict_redist.py                  0.028  OK
> parallel/scalapack.py                         0.019  OK
> gauss_wave.py                                 0.259  OK
> fd_ops/transformations.py                     0.042  OK
> parallel/blacsdist.py                         0.037  OK
> pbc.py                                        0.910  OK
> atoms_too_close.py                            0.636  OK
> ext_potential/harmonic.py                     0.559  OK
> atoms_mismatch.py                             0.031  OK
> setup_basis_spec.py                           0.054  OK
> overlap.py                                    1.602  OK
> pw/direct.py                                  0.020  OK
> vdw/libvdwxc_spin.py                          0.000 SKIPPED
> timing.py                                     0.485  OK
> parallel/ut_parallel.py                       1.059  OK
> lcao/density.py                               0.880  OK
> pw/stresstest.py                              0.620  OK
> pw/fftmixer.py                                0.909  OK
> symmetry/usesymm.py                           1.082  OK
> coulomb.py                                    0.288  OK
> xc/xcatom.py                                  0.685  OK
> force_as_stop.py                              0.656  OK
> vdwradii.py                                   0.669  OK
> ase_features/ase3k.py                         0.974  OK
> pathological/numpy_zdotc_graphite.py          0.749  OK
> utilities/eed.py                              1.440  OK
> [b-an01.hpc2n.umu.se:701531] 1 more process has sent help message help-opal-runtime.txt / opal_init:warn-fork
> [b-an01.hpc2n.umu.se:701531] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
> lcao/dos.py                                   1.132  OK
> solvation/nan_radius.py                       1.195  OK
> solvation/pbc_pos_repeat.py                   0.282  OK
> lcao/generate_ngto.py                         0.000 SKIPPED
> linalg/gemv.py                                1.368  OK
> fileio/idiotproof_setup.py                    2.065  OK
> [b-an01.hpc2n.umu.se:701531] 2 more processes have sent help message help-opal-runtime.txt / opal_init:warn-fork
> radial/ylexpand.py                            1.101  OK
> eigen/keep_htpsit.py                          1.298  OK
> xc/gga_atom.py                                0.908  OK
> generic/hydrogen.py                           1.756  OK
> aeatom.py                                     0.785  OK
> ase_features/plt.py                           1.619  OK
> ds_beta.py                                    2.141  OK
> multipoleH2O.py                               2.132  OK
> spinorbit_Kr.py                               0.000 SKIPPED
> stdout.py                                     1.659  OK
> lcao/largecellforce.py                        1.516  OK
> parallel/scalapack_diag_simple.py             0.027  OK
> fixdensity.py                                 2.102  OK
> pseudopotential/ah.py                    RuntimeError: MPI barrier timeout.
> To get a full traceback, use: gpaw -T test ...
>
> The output for 1.4.1b1:
>
>
> --------------------------------------------------------------------------
> A process has executed an operation involving a call to the
> "fork()" system call to create a child process.  Open MPI is currently
> operating in a condition that could result in memory corruption or
> other system errors; your job may hang, crash, or produce silent
> data corruption.  The use of fork() (or system() or other calls that
> create child processes) is strongly discouraged.
>
> The process that invoked fork was:
>
>   Local host:          [[54483,1],0] (PID 702648)
>
> If you are *absolutely sure* that your application will successfully
> and correctly survive a call to fork(), you may disable this warning
> by setting the mpi_warn_on_fork MCA parameter to 0.
> --------------------------------------------------------------------------
> python-2.7.14   /home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/bin/gpaw-python
> gpaw-1.4.1b1    /home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/
> ase-3.16.3b1    /home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase/
> numpy-1.13.1    /pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/numpy-1.13.1-py2.7-linux-x86_64.egg/numpy/
> scipy-0.19.1    /pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/scipy-0.19.1-py2.7-linux-x86_64.egg/scipy/
> _gpaw                    built-in
> parallel        /home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/bin//gpaw-python
> FFTW                     yes
> scalapack                no
> libvdwxc                 no
> PAW-datasets             1: /hpc2n/eb/software/Core/GPAW-setups/0.9.20000
> Running tests in /tmp/gpaw-test-zOKgfP
> Jobs: 1, Cores: 4, debug-mode: False
> =============================================================================
> linalg/gemm_complex.py                        0.029  OK
> ase_features/ase3k_version.py                 0.028  OK
> kpt.py                                        0.026  OK
> mpicomm.py                                    0.026  OK
> pathological/numpy_core_multiarray_dot.py     0.029  OK
> eigen/cg2.py                                  0.058  OK
> fd_ops/laplace.py                             0.000 SKIPPED
> linalg/lapack.py                              0.023  OK
> linalg/eigh.py                                0.025  OK
> parallel/submatrix_redist.py                  0.000 SKIPPED
> lfc/second_derivative.py                      0.035  OK
> parallel/parallel_eigh.py                     0.025  OK
> lfc/gp2.py                                    0.023  OK
> linalg/blas.py                                0.020  OK
> Gauss.py                                      0.021  OK
> symmetry/check.py                             0.570  OK
> fd_ops/nabla.py                               0.136  OK
> linalg/dot.py                                 0.021  OK
> linalg/mmm.py                                 0.026  OK
> xc/pbe_pw91.py                                0.015  OK
> fd_ops/gradient.py                            0.030  OK
> maths/erf.py                                  0.029  OK
> lfc/lf.py                                     0.019  OK
> maths/fsbt.py                                 0.062  OK
> parallel/compare.py                           0.027  OK
> vdw/libvdwxc_functionals.py                   0.000 SKIPPED
> radial/integral4.py                           0.371  OK
> linalg/zher.py                                0.034  OK
> fd_ops/gd.py                                  0.010  OK
> pw/interpol.py                                0.021  OK
> poisson/screened_poisson.py                   0.169  OK
> xc/xc.py                                      0.048 FAILED! (rank 0,1,2,3)
> #############################################################################
> RANK 0,1,2,3:
> Traceback (most recent call last):
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/test/__init__.py", line 656, in run_one
>     exec(compile(fd.read(), filename, 'exec'), loc)
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/test/xc/xc.py", line 83, in <module>
>     equal(error, 0, 6e-9)
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/test/__init__.py", line 29, in equal
>     raise AssertionError(msg)
> AssertionError: inf != 0 (error: |inf| > 6e-09)
> #############################################################################
> xc/XC2.py                                     0.077  OK
> radial/yukawa_radial.py                       0.032  OK
> vdw/potential.py                              0.010  OK
> radial/lebedev.py                             0.048  OK
> occupations.py                                0.044  OK
> lfc/derivatives.py                            0.032  OK
> pw/reallfc.py                                 0.142  OK
> parallel/pblas.py                             0.029  OK
> fd_ops/non_periodic.py                        0.023  OK
> spectrum.py                                   0.084  OK
> pw/lfc.py                                     0.128  OK
> gauss_func.py                                 0.262  OK
> multipoletest.py                              0.145  OK
> cluster.py                                    1.504  OK
> poisson/poisson.py                            0.069  OK
> poisson/poisson_asym.py                       0.207  OK
> parallel/arraydict_redist.py                  0.200  OK
> parallel/scalapack.py                         0.206  OK
> gauss_wave.py                                 0.267  OK
> fd_ops/transformations.py                     0.044  OK
> parallel/blacsdist.py                         0.030  OK
> pbc.py                                        0.815  OK
> atoms_too_close.py                            0.026  OK
> ext_potential/harmonic.py                     0.630  OK
> atoms_mismatch.py                             0.019  OK
> setup_basis_spec.py                           0.073  OK
> pw/direct.py                                  0.024  OK
> vdw/libvdwxc_spin.py                          0.000 SKIPPED
> timing.py                                     0.465  OK
> parallel/ut_parallel.py                       1.060  OK
> lcao/density.py                               0.797  OK
> pw/stresstest.py                              0.608  OK
> pw/fftmixer.py                                0.912  OK
> lcao/fftmixer.py                              0.960 FAILED! (rank 0,1,2,3)
> #############################################################################
> RANK 0,1,2,3:
> Traceback (most recent call last):
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/test/__init__.py", line 656, in run_one
>     exec(compile(fd.read(), filename, 'exec'), loc)
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/test/lcao/fftmixer.py", line 14, in <module>
>     equal(e, -1.710365540, 1.0e-6)
>   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/test/__init__.py", line 29, in equal
>     raise AssertionError(msg)
> AssertionError: -1.7103632097 != -1.71036554 (error: |2.33029705465e-06| > 1e-06)
> #############################################################################
> symmetry/usesymm.py                           1.032  OK
> coulomb.py                                    0.394  OK
> xc/xcatom.py                                  0.621  OK
> force_as_stop.py                              0.548  OK
> vdwradii.py                                   0.597  OK
> ase_features/ase3k.py                         0.997  OK
> pathological/numpy_zdotc_graphite.py
> usage: gpaw [-h] [--version] [-T] [-P N]
>             {help,run,info,dos,gpw,completion,test,atom,diag,python,sbatch,dataset,symmetry,rpa,install-data} ...
> gpaw: error: RuntimeError: MPI barrier timeout.
> To get a full traceback, use: gpaw -T test ...
>
>
>
> Best wishes,
> Chang Liu
> ----------------------------------------------------------
>                         Chang Liu
>                        PhD student
> Fysikum, Albanova, Stockholm University
>        S-106 91 Stockholm, Sweden
>    +46 767159891
> http://xsolasgroup.fysik.su.se/
> ------------------------------------------------------------------------
> *From:* Jens Jørgen Mortensen <jjmo at dtu.dk>
> *Sent:* Wednesday, August 22, 2018 8:58:03 AM
> *To:* Chang Liu; gpaw-users
> *Subject:* Re: [gpaw-users] Vibration analysis with GPAW
> On 08/21/2018 01:47 PM, Chang Liu via gpaw-users wrote:
>>
>> Hi,
>>
>>
>> I am trying to run vibration analysis on a few trajectories obtained 
>> from NEB calculations. Below is one of the input files:
>>
>>
>> from gpaw import *
>> from ase.vibrations import Vibrations
>> from ase.io import *
>> from ase.units import Pascal, m, Bohr
>> from gpaw.utilities import h2gpts
>> import numpy as np
>>
>> name = '2co'
>> state = 0
>> xc = 'BEEF-vdW'
>>
>> mixer = Mixer(0.1, 5, weight=100.0)
>> calc = GPAW(  # gpts=h2gpts(0.2, slab.get_cell(), idiv=8),
>>             poissonsolver={'dipolelayer': 'xy', 'eps': 1e-12},
>>             kpts={'density': 2},
>>             xc=xc,
>>             symmetry={'point_group': False},
>>             mixer=mixer,
>>             eigensolver=Davidson(3),
>>             spinpol=False,
>>             occupations=FermiDirac(0.1),
>>             maxiter=1000)
>>
>> neb_states = read('%s_neb.traj@:' % name)
>> slab = neb_states[state]
>> slab.center(vacuum=5., axis=2)
>> slab.cell[2][2] += 10.
>> slab.pbc = (1, 1, 0)
>>
>> calc.set(gpts=h2gpts(0.2, slab.get_cell(), idiv=8),
>>          txt='%s_%d_gpaw.out' % (name, state))
>> slab.set_calculator(calc)
>> e_f = slab.get_potential_energy()
>> rho = slab.calc.get_all_electron_density(gridrefinement=4)
>> write('%s_%d.cube' % (name, state), slab, data=rho * Bohr**3)
>> f = open('Energy_%s_%d.txt' % (name, state), 'w')
>> f.write('%.3f\n' % e_f)
>> f.close()
>>
>> vib = Vibrations(slab, range(45, 49), 'vib_%s_%d' % (name, state))
>> vib.run()
>> vib.summary(log='vib_%s_%d.log' % (name, state))
>> vib.write_mode()
>> vib_energies = vib.get_energies()
>>
>> With GPAW 1.3.0, the calculation finished without any problems.
>> However, with both 1.4.0 and 1.4.1b1, all the calculations crash with
>> the following error:
>>
>>
>>
>>   ___ ___ ___ _ _ _
>>  |   |   |_  | | | |
>>  | | | | | . | | | |
>>  |__ |  _|___|_____|  1.4.1b1
>>  |___|_|
>>
>> User: chliu at b-cn0521.hpc2n.umu.se
>> Date:   Mon Aug 20 21:55:05 2018
>> Arch:   x86_64
>> Pid:    68732
>> Python: 2.7.14
>> gpaw:   /home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw
>> _gpaw:  /home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/bin/gpaw-python
>> ase:    /home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase (version 3.16.3b1)
>> numpy:  /pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/numpy-1.13.1-py2.7-linux-x86_64.egg/numpy (version 1.13.1)
>> scipy:  /pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/scipy-0.19.1-py2.7-linux-x86_64.egg/scipy (version 0.19.1)
>> units:  Angstrom and eV
>> cores:  84
>>
>> Input parameters:
>>   eigensolver: {name: dav,
>>                 niter: 3}
>>   kpts: {density: 2}
>>   maxiter: 1000
>>   mixer: {backend: pulay,
>>           beta: 0.1,
>>           method: separate,
>>           nmaxold: 5,
>>           weight: 100.0}
>>   occupations: {name: fermi-dirac,
>>                 width: 0.1}
>>   poissonsolver: {dipolelayer: xy,
>>                   eps: 1e-12}
>>   spinpol: False
>>   symmetry: {point_group: False}
>>   xc: BEEF-vdW
>>
>> Writing vib_2co_0.eq.pckl
>> rank=00 L00: Traceback (most recent call last):
>> rank=00 L01:   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/__init__.py", line 200, in main
>> rank=00 L02:     runpy.run_path(gpaw_args.script, run_name='__main__')
>> rank=00 L03:   File "/hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/runpy.py", line 252, in run_path
>> rank=00 L04:     return _run_module_code(code, init_globals, run_name, path_name)
>> rank=00 L05:   File "/hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/runpy.py", line 82, in _run_module_code
>> rank=00 L06:     mod_name, mod_fname, mod_loader, pkg_name)
>> rank=00 L07:   File "/hpc2n/eb/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/runpy.py", line 72, in _run_code
>> rank=00 L08:     exec code in run_globals
>> rank=00 L09:   File "2co_0.py", line 40, in <module>
>> rank=00 L10:     vib.run()
>> rank=00 L11:   File "/home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase/vibrations/vibrations.py", line 131, in run
>> rank=00 L12:     self.calculate(atoms, filename, fd)
>> rank=00 L13:   File "/home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase/vibrations/vibrations.py", line 168, in calculate
>> rank=00 L14:     forces = self.calc.get_forces(atoms)
>> rank=00 L15:   File "/home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase/calculators/calculator.py", line 515, in get_forces
>> rank=00 L16:     return self.get_property('forces', atoms)
>> rank=00 L17:   File "/home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase/calculators/calculator.py", line 548, in get_property
>> rank=00 L18:     self.calculate(atoms, [name], system_changes)
>> rank=00 L19:   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/calculator.py", line 297, in calculate
>> rank=00 L20:     self.log, self.call_observers)
>> rank=00 L21:   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/scf.py", line 62, in run
>> rank=00 L22:     wfs.eigensolver.iterate(ham, wfs)
>> rank=00 L23:   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/eigensolvers/eigensolver.py", line 84, in iterate
>> rank=00 L24:     wfs.orthonormalize(kpt)
>> rank=00 L25:   File "/home/c/chliu/pfs/gpaw-ase/ase-master-180815/build/lib/python2.7/site-packages/ase-3.16.3b1-py2.7.egg/ase/utils/timing.py", line 173, in new_method
>> rank=00 L26:     x = method(slf, *args, **kwargs)
>> rank=00 L27:   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/wavefunctions/fdpw.py", line 396, in orthonormalize
>> rank=00 L28:     S.invcholesky()
>> rank=00 L29:   File "/home/c/chliu/pfs/gpaw-ase/gpaw-master-180815/build/lib/python2.7/site-packages/gpaw/matrix.py", line 248, in invcholesky
>> rank=00 L30:     check_finite=debug)
>> rank=00 L31:   File "/pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/scipy-0.19.1-py2.7-linux-x86_64.egg/scipy/linalg/decomp_cholesky.py", line 81, in cholesky
>> rank=00 L32:     check_finite=check_finite)
>> rank=00 L33:   File "/pfs/software/eb/amd64_ubuntu1604_bdw/software/MPI/GCC/6.4.0-2.28/OpenMPI/2.1.1/Python/2.7.14/lib/python2.7/site-packages/scipy-0.19.1-py2.7-linux-x86_64.egg/scipy/linalg/decomp_cholesky.py", line 30, in _cholesky
>> rank=00 L34:     raise LinAlgError("%d-th leading minor not positive definite" % info)
>> rank=00 L35: LinAlgError: 21-th leading minor not positive definite
>> GPAW CLEANUP (node 0): <class 'numpy.linalg.linalg.LinAlgError'> occurred.  Calling MPI_Abort!
>> --------------------------------------------------------------------------
>> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>> with errorcode 42.
>>
>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>> You may or may not see output from other processes, depending on
>> exactly when Open MPI kills them.
>> --------------------------------------------------------------------------
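The traceback above bottoms out in a Cholesky factorization of the overlap matrix during orthonormalization. Cholesky is only defined for positive-definite matrices, so if the SCF cycle has produced corrupted wavefunctions (NaN/Inf or otherwise), the factorization fails exactly this way. A small NumPy illustration of the same error class (not GPAW's code, just the underlying linear-algebra failure, using a deliberately indefinite matrix):

```python
import numpy as np

# A symmetric matrix with eigenvalues 3 and -1: not positive definite,
# so the Cholesky factorization must raise LinAlgError.
S = np.array([[1.0, 2.0],
              [2.0, 1.0]])

try:
    np.linalg.cholesky(S)
except np.linalg.LinAlgError as err:
    print('Cholesky failed:', err)
```

So the "21-th leading minor not positive definite" message is a symptom: something upstream (here, plausibly the SCF/eigensolver changes between 1.3.0 and 1.4.0 interacting with this setup) made the overlap matrix ill-conditioned or non-finite before the factorization was attempted.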
>>
>> The GPAW versions 1.3.0, 1.4.0 and 1.4.1b1 were all installed with
>> the same modules (OpenMPI, SciPy, NumPy, etc.). The problem seems to be
>> related to a change from 1.3.0 to 1.4.0... Could you please help
>> me? Thanks a lot!
>>
>
> Could you try to run the GPAW tests in parallel and show us the output?
>
>     $ gpaw -P 4 test
>
> Jens Jørgen
>
>>
>> Best wishes,
>> Chang Liu
>>
>>
>> _______________________________________________
>> gpaw-users mailing list
>> gpaw-users at listserv.fysik.dtu.dk
>> https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users
>
