[gpaw-users] MPI support without gpaw-python
Gaël Donval
G.Donval at bath.ac.uk
Wed Aug 8 11:16:14 CEST 2018
On Tue, 2018-08-07 at 16:25 +0000, Gaël Donval via gpaw-users wrote:
> On Tue, 2018-08-07 at 15:01 +0000, Gaël Donval via gpaw-users wrote:
> > On Tue, 2018-08-07 at 16:55 +0200, Jens Jørgen Mortensen wrote:
> > > On 08/07/2018 04:45 PM, Gaël Donval via gpaw-users wrote:
> > > > Hi,
> > > >
> > > > Is it still possible to get MPI support without compiling the
> > > > custom
> > > > interpreter?
> > >
> > > I don't know if it still works. Try to turn on this section:
> > >
> > > https://gitlab.com/gpaw/gpaw/blob/master/customize.py#L80
> > >
> > > compile and then set the environment variable $GPAW_MPI to point to
> > > your dynamic MPI library.
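
Once the extension is rebuilt with that section enabled, a quick sanity
check that MPI support is really picked up could look like the sketch
below (the libmpi.so path and the process count are only examples, so
point GPAW_MPI at wherever your MPI shared library actually lives):

    # check_mpi.py -- run with, for example:
    #   GPAW_MPI=/usr/lib/openmpi/libmpi.so mpiexec -n 4 python3 check_mpi.py
    # (the library path above is just an example location)
    from gpaw.mpi import world

    # With working MPI support every rank reports the same size > 1;
    # without it each process thinks it is a serial run (size == 1).
    print('rank %d of %d' % (world.rank, world.size))
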
> >
> > Hi Jens Jørgen,
> >
> > That's what I did (on a standard ArchLinux distribution, nothing fancy,
> > really). I get this:
> >
> > --------------------------------------------------------------------------
> > Open MPI has detected that this process has attempted to initialize
> > MPI (via MPI_INIT or MPI_INIT_THREAD) more than once.  This is
> > erroneous.
> > --------------------------------------------------------------------------
> > [hostname:25791] *** An error occurred in MPI_Init
> > [hostname:25791] *** reported by process [2821259265,0]
> > [hostname:25791] *** on a NULL communicator
> > [hostname:25791] *** Unknown error
> > [hostname:25791] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
> > [hostname:25791] *** and potentially your MPI job)
> >
> > Other MPI programs on that computer are working fine and a (very)
> > simple mpi4py test works. For the record, the OpenMPI version is
> > 3.1.0.
> >
> > I'll have a look at the mpi.c sources and activate whatever macro can
> > help.
> >
> > Gaël
> >
> > >
> > > Jens Jørgen
>
> Success (apparently) \O/
>
> The race condition exists with the standard python interpreter
> (multiple instances are spawned by mpiexec) but not with gpaw-python
> (since it initialises MPI right away). One solution is to use
> MPI_Initialized().
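
For reference, the guard is essentially this pattern, sketched here at
the mpi4py level (the actual change has to go into the mpi.c sources;
MPI.Is_initialized()/MPI.Init() below correspond to
MPI_Initialized()/MPI_Init() in C):

    import mpi4py
    mpi4py.rc.initialize = False   # keep mpi4py from calling MPI_Init on import
    from mpi4py import MPI

    # Initialise MPI only if nobody has done it already in this process.
    if not MPI.Is_initialized():
        MPI.Init()

    comm = MPI.COMM_WORLD
    print('rank', comm.rank, 'of', comm.size)

    # Clean up what we initialised ourselves.
    if not MPI.Is_finalized():
        MPI.Finalize()
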
>
> I'll submit a patch tomorrow, though I am not sure that this is enough.
>
> Gaël
>
I've created a new merge request:
https://gitlab.com/gpaw/gpaw/merge_requests/403
This works with OpenMPI, but I don't know about other vendors' MPI implementations.
Best regards,
Gaël