[gpaw-users] Questions about Parallel Runs
Hongliang Xin
hxin at umich.edu
Mon Apr 12 22:02:47 CEST 2010
Problem solved. The calculation now runs in parallel correctly. We have fixed
the link of the MPI library so that it points at the right OpenMPI installation, and it just works.
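For the archive, here is a sketch of the kind of check that confirms the fix
(the grep pattern is only illustrative):

    which mpirun mpicc
    ldd `which gpaw-python` | grep libmpi

Both should now point at the same OpenMPI installation that gpaw-python was
built against.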
Thanks for your help.
Hongliang
On Mon, Apr 12, 2010 at 10:32 AM, Hongliang Xin <hxin at umich.edu> wrote:
> I just tried the H.py example. The calculation finished correctly; below is
> part of the output. However, I do not see how this output indicates that
> the calculation is parallelized properly.
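>
> (As a quick sanity check - a sketch, assuming only the documented gpaw.mpi
> module - GPAW exposes the MPI world communicator as gpaw.mpi.world, so a
> two-line script run with "mpirun -np 2 gpaw-python check.py" should report
> size 2 from both ranks:)
>
>     # check.py - hypothetical name, not part of the GPAW examples
>     from gpaw.mpi import world
>     print 'rank %d of size %d' % (world.rank, world.size)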
>
> thanks,
>
>
> : mpirun -np 2 gpaw-python H.py
> mpirun: Info: plugin plugin002_007_001.so being skipped due to setting of
> mpirun: HPMPI_IGNORE_002_007_001 in environment
> Parsing application description...
> Identifying hosts...
> Spawning processes...
> --------------------------------------------------------------------------
> [0,0,0]: OpenIB on host nyx-login-amd3.engin.umich.edu was unable to find
> any HCAs.
> Another transport will be used instead, although this may result in
> lower performance.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> [0,0,0]: OpenIB on host nyx-login-amd3.engin.umich.edu was unable to find
> any HCAs.
> Another transport will be used instead, although this may result in
> lower performance.
> --------------------------------------------------------------------------
> MPI Application rank 0 exited before MPI_Init() with status 0
>
>
>
>
> On Mon, Apr 12, 2010 at 10:23 AM, Duy Le <ttduyle at gmail.com> wrote:
>
>> You should test with a smaller system first, for example the H.py script
>> in the examples directory.
>> Try running "mpirun -np 2 gpaw-python H.py" directly (not through the
>> queue system).
>> Also, make sure there is no mistake in the command:
>> "mpirun -np $NP -machinefile $PBS_NODEFILE"
>> --------------------------------------------------
>> Duy Le
>> PhD Student
>> Department of Physics
>> University of Central Florida.
>>
>> "Men don't need hand to do things"
>>
>>
>>
>> On Mon, Apr 12, 2010 at 10:03 AM, Hongliang Xin <hxin at umich.edu> wrote:
>> > openmpi-1.3.2/gcc/bin/mpicc
>> >
>> > This is the MPI that was available when I built GPAW.
>> >
>> > thanks,
>> >
>> > Hongliang
>> >
>> > On Mon, Apr 12, 2010 at 9:59 AM, Duy Le <ttduyle at gmail.com> wrote:
>> >>
>> >> Which MPI are you using?
>> >> --------------------------------------------------
>> >> Duy Le
>> >> PhD Student
>> >> Department of Physics
>> >> University of Central Florida.
>> >>
>> >> "Men don't need hand to do things"
>> >>
>> >>
>> >>
>> >> On Mon, Apr 12, 2010 at 8:49 AM, Hongliang Xin <hxin at umich.edu> wrote:
>> >> > Here is the output. GPAW has been built with MPI support, but the
>> >> > output below does not look right to me. We built GPAW with the system
>> >> > MPI, yet it shows below that the library is linked against a user's
>> >> > MPI (/home/morabitm). Is this the problem? I will take a look and
>> >> > rebuild GPAW.
>> >> >
>> >> > thanks,
>> >> >
>> >> > : ldd `which gpaw-python`
>> >> >         libgfortran.so.1 => /usr/lib64/libgfortran.so.1 (0x00002aeb72229000)
>> >> >         libpython2.4.so.1.0 => /usr/lib64/libpython2.4.so.1.0 (0x00002aeb724c0000)
>> >> >         libpthread.so.0 => /lib64/libpthread.so.0 (0x00002aeb727f1000)
>> >> >         libdl.so.2 => /lib64/libdl.so.2 (0x00002aeb72a0d000)
>> >> >         libutil.so.1 => /lib64/libutil.so.1 (0x00002aeb72c11000)
>> >> >         libm.so.6 => /lib64/libm.so.6 (0x00002aeb72e14000)
>> >> >         libmpi.so.0 => /home/morabitm/usr/lib/libmpi.so.0 (0x00002aeb73098000)
>> >> >         libopen-rte.so.0 => /home/morabitm/usr/lib/libopen-rte.so.0 (0x00002aeb7322c000)
>> >> >         libopen-pal.so.0 => /home/morabitm/usr/lib/libopen-pal.so.0 (0x00002aeb73385000)
>> >> >         libnsl.so.1 => /lib64/libnsl.so.1 (0x00002aeb734e5000)
>> >> >         libc.so.6 => /lib64/libc.so.6 (0x00002aeb736fd000)
>> >> >         /lib64/ld-linux-x86-64.so.2 (0x00002aeb7200c000)
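>> >> >
>> >> > (A sketch of the rebuild that should pick up the intended MPI - the
>> >> > openmpi path below is a placeholder, not the actual installation:)
>> >> >
>> >> >     export PATH=/path/to/openmpi-1.3.2/gcc/bin:$PATH
>> >> >     which mpicc                # must now report the intended compiler
>> >> >     python setup.py build_ext  # rebuilds gpaw-python against that MPI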
>> >> >
>> >> > On Mon, Apr 12, 2010 at 4:58 AM, Marcin Dulak <Marcin.Dulak at fysik.dtu.dk> wrote:
>> >> >>
>> >> >> Hi,
>> >> >>
>> >> >> Is your GPAW built with MPI support?
>> >> >> What does:
>> >> >> ldd `which gpaw-python`
>> >> >> say?
>> >> >> To build the parallel version, mpicc must be in PATH at build time,
>> >> >> or customize.py must include the relevant MPI information.
>> >> >> Please look at the following example:
>> >> >> https://trac.fysik.dtu.dk/projects/gpaw/browser/trunk/doc/install/Linux/Niflheim/customize-thul-acml.py
>> >> >> (The scalapack settings and the platform_id variable at the end of
>> >> >> that customize.py are not relevant to your case; the use of
>> >> >> platform_id is described at
>> >> >> https://wiki.fysik.dtu.dk/gpaw/install/Linux/Niflheim/Niflheim.html#niflheim).
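>> >> >>
>> >> >> (As a sketch - the paths are placeholders - the relevant customize.py
>> >> >> variables would look something like:)
>> >> >>
>> >> >>     # hypothetical customize.py fragment for an OpenMPI build
>> >> >>     mpicompiler = 'mpicc'
>> >> >>     mpi_libraries = ['mpi']
>> >> >>     mpi_library_dirs = ['/path/to/openmpi/lib']
>> >> >>     mpi_include_dirs = ['/path/to/openmpi/include']
>> >> >>     mpi_runtime_library_dirs = ['/path/to/openmpi/lib']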
>> >> >>
>> >> >> Best regards,
>> >> >>
>> >> >> Marcin
>> >> >>
>> >> >> Hongliang Xin wrote:
>> >> >>>
>> >> >>> Dear gpaw-users,
>> >> >>>
>> >> >>> We have recently been trying to figure out what is wrong with our
>> >> >>> parallel GPAW calculations.
>> >> >>> I know there is definitely a problem, based on the following
>> >> >>> observations:
>> >> >>> 1. Calculations on multiple processors show no speedup over a single
>> >> >>> serial calculation.
>> >> >>> 2. It seems to run several serial copies of the program without any
>> >> >>> communication between the nodes.
>> >> >>> 3. After a calculation finishes, there are always a .gpaw file and a
>> >> >>> .old.gpaw file of exactly the same size.
>> >> >>> 4. I see no indication that the job is parallelized.
>> >> >>>
>> >> >>> GPAW compiled without difficulty, with the configuration shown at
>> >> >>> the bottom. The job is submitted according to the GPAW documentation
>> >> >>> using:
>> >> >>> mpirun -np $NP -machinefile $PBS_NODEFILE gpaw-python Script.py
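>> >> >>>
>> >> >>> (For completeness, a sketch of the surrounding PBS script - the
>> >> >>> resource request is a placeholder, not our actual one:)
>> >> >>>
>> >> >>>     #!/bin/sh
>> >> >>>     #PBS -l nodes=2:ppn=4
>> >> >>>     cd $PBS_O_WORKDIR
>> >> >>>     NP=`wc -l < $PBS_NODEFILE`
>> >> >>>     mpirun -np $NP -machinefile $PBS_NODEFILE gpaw-python Script.py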
>> >> >>>
>> >> >>> Any suggestions would be very welcome.
>> >> >>>
>> >> >>> thanks,
>> >> >>>
>> >> >>> Hongliang
>> >> >>>
>> >> >>>
>> >> >>> * Using standard lapack
>> >> >>> * Architecture: linux-x86_64
>> >> >>> * Building a custom interpreter
>> >> >>> libraries ['acml', 'gfortran']
>> >> >>> library_dirs ['/home/software/rhel5/acml/4.0.1-gcc/gfortran64/lib']
>> >> >>> include_dirs ['/home/software/rhel5/python/lib64/python2.4/site-packages/numpy/core/include', 'c/libxc']
>> >> >>> define_macros []
>> >> >>> extra_link_args []
>> >> >>> extra_compile_args ['-Wall', '-std=c99']
>> >> >>> runtime_library_dirs []
>> >> >>> extra_objects []
>> >> >>>
>> >> >>> Parallel configuration
>> >> >>> mpicompiler mpicc
>> >> >>> mpi_libraries []
>> >> >>> mpi_library_dirs []
>> >> >>> mpi_include_dirs []
>> >> >>> mpi_define_macros []
>> >> >>> mpi_runtime_library_dirs []
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>> --
>> >> >>> Hongliang Xin
>> >> >>> Ph.D. Candidate
>> >> >>> Dept. of Chemical Engineering
>> >> >>> University of Michigan
>> >> >>> 3166 HH Dow
>> >> >>> 2300 Hayward
>> >> >>> Ann Arbor, MI 48109
>> >> >>> Phone: (734) 647-8051
>> >> >>> E-mail: hxin at umich.edu <mailto:hxin at umich.edu>
>> >> >>>
>> >> >>>
>> >> >>> ------------------------------------------------------------------------
>> >> >>>
>> >> >>> _______________________________________________
>> >> >>> gpaw-users mailing list
>> >> >>> gpaw-users at listserv.fysik.dtu.dk
>> >> >>> https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users
>> >> >>
>> >> >> --
>> >> >> ***********************************
>> >> >>
>> >> >> Marcin Dulak
>> >> >> Technical University of Denmark
>> >> >> Department of Physics
>> >> >> Building 307, Room 229
>> >> >> DK-2800 Kongens Lyngby
>> >> >> Denmark
>> >> >> Tel.: (+45) 4525 3157
>> >> >> Fax.: (+45) 4593 2399
>> >> >> email: Marcin.Dulak at fysik.dtu.dk
>> >> >>
>> >> >> ***********************************
>> >> >>
>> >> >
>> >> >
>> >> >
>> >> > --
>> >> > Hongliang Xin
>> >> > Ph.D. Candidate
>> >> > Dept. of Chemical Engineering
>> >> > University of Michigan
>> >> > 3166 HH Dow
>> >> > 2300 Hayward
>> >> > Ann Arbor, MI 48109
>> >> > Phone: (734) 647-8051
>> >> > E-mail: hxin at umich.edu
>> >> >
>> >> > _______________________________________________
>> >> > gpaw-users mailing list
>> >> > gpaw-users at listserv.fysik.dtu.dk
>> >> > https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users
>> >> >
>> >
>> >
>> >
>> > --
>> > Hongliang Xin
>> > Ph.D. Candidate
>> > Dept. of Chemical Engineering
>> > University of Michigan
>> > 3166 HH Dow
>> > 2300 Hayward
>> > Ann Arbor, MI 48109
>> > Phone: (734) 647-8051
>> > E-mail: hxin at umich.edu
>> >
>>
>
>
>
> --
> Hongliang Xin
> Ph.D. Candidate
> Dept. of Chemical Engineering
> University of Michigan
> 3166 HH Dow
> 2300 Hayward
> Ann Arbor, MI 48109
> Phone: (734) 647-8051
> E-mail: hxin at umich.edu
>
--
Hongliang Xin
Ph.D. Candidate
Dept. of Chemical Engineering
University of Michigan
3166 HH Dow
2300 Hayward
Ann Arbor, MI 48109
Phone: (734) 647-8051
E-mail: hxin at umich.edu