[gpaw-users] MPI-Scalapack library for Python
marc
marc.barbry at mailoo.org
Tue Sep 26 10:14:53 CEST 2017
Dear Hugo,
I had a look at this wrapper and tried to use it, but I didn't manage to
make it work.
Also, there have been no commits to this project for a while, so it is
probably no longer maintained, and my knowledge of Cython is too limited
to debug this code.
Did you manage to make it work?
Best,
marc
On 09/26/2017 10:06 AM, Hugo Strand wrote:
> Dear Marc,
>
> Maybe you would like to consider contributing to the existing wrapper
> project?
>
> https://github.com/jrs65/scalapy
>
> Did you have a look at it?
>
> Best, Hugo
>
> On Mon, Sep 25, 2017 at 6:15 PM, marc via gpaw-users
> <gpaw-users at listserv.fysik.dtu.dk> wrote:
>
> Hi!
>
> I started a repository
> <https://gitlab.com/mbarbry/python-scalapack> for this project.
> Since I don't know the Python/C API, I used ctypes for the first
> implemented function (blacs_pinfo).
> If some of you want to help me with the Python/C API, as used in
> GPAW, you are more than welcome; otherwise I will continue with
> ctypes.
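>
> For reference, a minimal sketch of what the ctypes approach can look
> like (the library name and the exact symbol are assumptions; both
> vary by platform, e.g. Cblacs_pinfo vs. blacs_pinfo_):
>
> import ctypes
>
> # Assumed library name; point this at your ScaLAPACK/BLACS build.
> lib = ctypes.CDLL("libscalapack.so")
>
> def blacs_pinfo():
>     # BLACS_PINFO(MYPNUM, NPROCS) fills in the rank of the calling
>     # process and the total number of processes.
>     mypnum = ctypes.c_int()
>     nprocs = ctypes.c_int()
>     lib.blacs_pinfo_(ctypes.byref(mypnum), ctypes.byref(nprocs))
>     return mypnum.value, nprocs.value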
>
> Best regards,
> marc
>
>
> On 09/21/2017 06:00 PM, marc via gpaw-users wrote:
>> I think that, to start, we should implement an external library for
>> Scalapack. But ultimately, I believe it would be nice for the
>> community if these functions were available directly through Scipy,
>> as the BLAS and LAPACK wrappers are already implemented there. And
>> since those are already implemented, I don't think it is useful to
>> package BLAS and LAPACK ourselves.
>>
>> I'm not a GPAW developer, but you could replace the calls to BLAS
>> and LAPACK in GPAW with the scipy wrappers instead of your own. We
>> do this in our own code, and the functions are as fast as the
>> C/Fortran implementations (almost all BLAS routines are
>> implemented; I am not so sure about LAPACK).
>>
>> Simple code like this can be used:
>>
>> import numpy as np
>> import scipy.linalg.blas as blas
>>
>> A = np.random.rand(200, 200)
>> B = np.random.rand(200, 200)
>> C = blas.dgemm(1.0, A, B)  # C = 1.0 * A @ B via the linked BLAS
>>
>> Concerning the MPI wrapper, as Ask already commented, we should
>> use the mpi4py library; then only the Scalapack routines need to
>> be implemented.
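>>
>> For example, a minimal mpi4py sketch (run with e.g.
>> mpirun -n 4 python script.py):
>>
>> from mpi4py import MPI
>>
>> comm = MPI.COMM_WORLD
>> rank = comm.Get_rank()
>>
>> # Sum the ranks on process 0, just to show communication works.
>> total = comm.reduce(rank, op=MPI.SUM, root=0)
>> if rank == 0:
>>     print("processes:", comm.Get_size(), "sum of ranks:", total)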
>>
>> Marc
>>
>> On 09/21/2017 05:42 PM, Kuisma, Mikael wrote:
>>> Hi!
>>>
>>> There is one question which needs to be figured out. Like Ask, I
>>> believe that
>>> "It would be very useful to have a well-designed stack of Python
>>> bindings for BLAS, LAPACK, MPI, and ScaLAPACK." It could even
>>> enable novel parallelization methods and algorithms as they
>>> become available (say, for efficient diagonalization). Such a
>>> library would in general be quite useful to the community even
>>> outside DFT.
>>>
>>> The question is, however, how hard would it be to restructure
>>> GPAW to use that library? Even though the general library would
>>> be nice in itself, writing it should make a useful contribution
>>> to the GPAW implementation too. What do other GPAW developers
>>> think about the tediousness versus the usefulness of packaging
>>> BLAS, LAPACK, MPI, and ScaLAPACK into one independent package?
>>>
>>> Regarding scipy: why scipy, and not just an external Python
>>> library with C bindings?
>>>
>>> Mikael
>>>
>>> ------------------------------------------------------------------------
>>> *From:* gpaw-users-bounces at listserv.fysik.dtu.dk
>>> [gpaw-users-bounces at listserv.fysik.dtu.dk] on behalf of
>>> Peter Koval via gpaw-users [gpaw-users at listserv.fysik.dtu.dk]
>>> *Sent:* 21 September 2017 17:51
>>> *To:* marc Barbry
>>> *Cc:* gpaw-users at listserv.fysik.dtu.dk
>>> *Subject:* Re: [gpaw-users] MPI-Scalapack library for Python
>>>
>>> I think scipy could be more tolerant of MPI calls than numpy.
>>>
>>> On 21 Sept 2017 4:28 PM, "marc" <marc.barbry at mailoo.org> wrote:
>>>
>>> In my experience, scipy parallelizes well through the BLAS
>>> routines if built against the right library (e.g. MKL). But this
>>> is only shared-memory parallelization and cannot be used with
>>> MPI. Scipy is therefore missing MPI capabilities that it would
>>> be nice to add.
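>>>
>>> To illustrate, today the two have to be combined by hand: scipy's
>>> threaded BLAS works inside each MPI rank, while any data exchange
>>> between ranks must be done explicitly, e.g. with mpi4py (a sketch):
>>>
>>> from mpi4py import MPI
>>> import numpy as np
>>> import scipy.linalg.blas as blas
>>>
>>> comm = MPI.COMM_WORLD
>>>
>>> # Each rank multiplies its own local block with the threaded BLAS...
>>> A = np.random.rand(100, 100)
>>> B = np.random.rand(100, 100)
>>> C_local = blas.dgemm(1.0, A, B)
>>>
>>> # ...while anything global has to be assembled through MPI.
>>> norms = comm.gather(np.linalg.norm(C_local), root=0)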
>>>
>>> On 09/21/2017 04:25 PM, Ask Hjorth Larsen wrote:
>>>
>>> It is worth noting that scipy calls a lot of BLAS
>>> functions, but I
>>> don't know how it works (maybe they actually have a complete
>>> interface?).
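>>>
>>> (One way to check is to list what scipy.linalg.blas exposes, e.g.:)
>>>
>>> import scipy.linalg.blas as blas
>>>
>>> # Names of the wrapped double-precision routines (dgemm, dsyrk, ...).
>>> print([name for name in dir(blas) if name.startswith("d")])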
>>>
>>> 2017-09-21 16:16 GMT+02:00 Peter Koval via gpaw-users
>>> <gpaw-users at listserv.fysik.dtu.dk>:
>>>
>>> Yes, I agree. Ultimately the lib or something similar
>>> should be a part of scipy.
>>>
>>> On 21 Sept 2017 2:50 PM, "marc"
>>> <marc.barbry at mailoo.org> wrote:
>>>
>>> Hi Mikael,
>>>
>>> nice to see that you are interested in writing a
>>> Scalapack wrapper. I think it is really missing in
>>> Python. I found one, named scalapy, but it is not
>>> maintained anymore and the installation failed. We
>>> could try to write something similar using the code
>>> from GPAW.
>>>
>>> Best regards,
>>> Marc
>>>
>>> On 09/21/2017 01:42 PM, Kuisma, Mikael wrote:
>>>
>>> Hi!
>>>
>>> I added some Scalapack/blacs routines while
>>> implementing the LCAO-TDDFT for GPAW and it was
>>> somewhat cumbersome. Also, the API we have in GPAW
>>> is somewhat tedious to use at the moment: we have a
>>> lot of code to handle ScaLAPACK and LAPACK calls as
>>> different cases. So I would vote for writing a
>>> general Python-Scalapack library (rather than you
>>> taking code from GPAW into your Python/C code). We
>>> could make this library transparent, so that it
>>> would also work without ScaLAPACK by performing
>>> serial linear algebra.
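>>>
>>> (A sketch of what that transparency could look like;
>>> the module name pyscalapack here is hypothetical:)
>>>
>>> import numpy as np
>>>
>>> try:
>>>     import pyscalapack as sl  # hypothetical ScaLAPACK binding
>>>     HAVE_SCALAPACK = True
>>> except ImportError:
>>>     HAVE_SCALAPACK = False
>>>
>>> def eigh(matrix):
>>>     # Distributed solver when ScaLAPACK is available,
>>>     # otherwise serial LAPACK via numpy.
>>>     if HAVE_SCALAPACK:
>>>         return sl.eigh(matrix)
>>>     return np.linalg.eigh(matrix)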
>>>
>>> ScaLAPACK does not parallelize as well as I would
>>> like it to. With a library wrapper, one could also
>>> try other libraries (for eigensolvers, there are at
>>> least solvers more parallel than ScaLAPACK).
>>>
>>> One more remark regarding GPAW: I always resist
>>> adding external libraries to GPAW, so if one writes
>>> a Python-with-C-bindings library, I think it would
>>> make sense to embed that Python and C code directly
>>> in the GPAW repository. End users do not like
>>> installing new dependencies every time the version
>>> number increases by 0.1. (I wish we would also
>>> provide basic working installations of libxc and
>>> libvdwxc in our repository, with the option to link
>>> to external libraries.)
>>>
>>> I do not know licensing issues that well, but as
>>> far as I can tell, it should be OK to take code from
>>> GPAW (GPL) if your code is also GPL and you make
>>> your code available.
>>>
>>> BR,
>>> Mikael Kuisma
>>>
>>> Academy of Finland Post Doc
>>>
>>> Nanoscience Center
>>> Department of Chemistry
>>> University of Jyväskylä
>>> P.O. Box 35
>>> FIN-40014 University of Jyväskylä
>>> Finland
>>>
>>> E-mail: mikael.j.kuisma (at) jyu.fi
>>> Phone: +358 40 077 4114
>>>
>>> https://scholar.google.se/citations?user=w0eP9esAAAAJ
>>>
>>> Visiting address: NSC, 3rd floor, YN320
>>>
>>> ________________________________________
From:
>>> gpaw-users-bounces at listserv.fysik.dtu.dk
>>> [gpaw-users-bounces at listserv.fysik.dtu.dk]
>>> on behalf of marc via
>>> gpaw-users [gpaw-users at listserv.fysik.dtu.dk]
>>> Sent: 21 September 2017 14:10
>>> To: gpaw-users at listserv.fysik.dtu.dk
>>> Cc: Peter Koval
>>> Subject: [gpaw-users] MPI-Scalapack library for Python
>>>
Dear GPAW developers,
>>>
>>> We are currently developing a Python code to perform
>>> efficient TDDFT calculations with LCAO. Our program
>>> can read DFT data from different codes such as
>>> Siesta, OpenMX, PySCF and GPAW (with some
>>> limitations for the moment).
We would like to parallelize our code using MPI and
>>> Scalapack, but we could not find any library in
>>> Python that wraps both MPI and Scalapack.
>>> Looking at the GPAW code, we have seen that you
>>> wrote your own wrapper for MPI-Scalapack. My
>>> question is the following: would it be difficult to
>>> modify your MPI wrapper to use it in our code? Is it
>>> feasible from a license point of view?
>>> Do you think it would be feasible to write a general
>>> Python wrapper for MPI and Scalapack in another
>>> library, using your implementation?
>>>
>>> Best regards,
>>> Marc Barbry