[gpaw-users] general comment on memory leaks.

Ask Hjorth Larsen asklarsen at gmail.com
Fri Jan 15 14:26:03 CET 2016


The files are appreciated, but the text output (stdout), which is the most
important one, is still missing.
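
As a generic first diagnostic for signal-11 crashes like the one quoted below, the standard-library faulthandler module (stdlib since Python 3.3; available as a backport package for Python 2) can print a Python-level traceback when the interpreter receives SIGSEGV. A minimal sketch, not GPAW-specific:

```python
# Enable before the heavy calculation starts, e.g. at the top of the
# input script.  On SIGSEGV/SIGFPE/SIGABRT/SIGBUS the interpreter dumps
# the Python tracebacks of all threads to stderr before dying, which
# narrows down where in the script the crash happened.
import faulthandler
import sys

faulthandler.enable(file=sys.stderr, all_threads=True)

# Optional: also dump tracebacks if the job hangs instead of crashing,
# here after 3600 seconds.
faulthandler.dump_traceback_later(3600, exit=True)

# ... run the calculation here ...

# Cancel the watchdog once the calculation finishes normally.
faulthandler.cancel_dump_traceback_later()
```

Under MPI each rank writes to its own stderr, so per-rank output redirection (where the queueing system allows it) helps attribute the crash to a specific process.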

Best regards
Ask

2016-01-15 14:09 GMT+01:00 abhishek khetan <askhetan at gmail.com>:
> Attached are the files. The CIF file is actually a GPAW-converged output
> that I extracted using ASE and then changed one atom in. It is quite large,
> though.
>
> Perhaps such errors are related to my installation, although I cannot
> verify that in any way.
>
> Thanks and Best,
>
>
> On Thu, Jan 14, 2016 at 5:47 PM, Ask Hjorth Larsen <asklarsen at gmail.com>
> wrote:
>>
>> Please attach both input script and text output.
>>
>> Best regards
>> Ask
>>
>> 2016-01-14 17:38 GMT+01:00 abhishek khetan <askhetan at gmail.com>:
>> > Dear gpaw developers,
>> >
>> > I have found that, in general, for large systems (> 150 atoms) or for
>> > memory-intensive methods like GW, there are always segfault errors of a
>> > similar kind. I have a ScaLAPACK-enabled working build of GPAW 0.12
>> > that passes all tests in the suite. For a small system, the various
>> > methods in GPAW run properly, but for larger systems of the desired
>> > size of the same kind, GPAW fails with exactly the same kind of error.
>> >
>> > gpaw-python:18622 terminated with signal 11 at PC=3d8d6acba8
>> > SP=7ffe9b9d47b0.  Backtrace:
>> >
>> > I have posted about this in the context of the GW method on the GPAW
>> > forums a couple of dozen times before, but I haven't seen anyone else
>> > report similar errors. Now I am encountering the same unsolved errors
>> > even in simple relaxation problems where the unit cell happens to be
>> > quite large. For slightly smaller cases where the systems do converge,
>> > I see that the memory requirements are actually very modest (1-2 GiB
>> > per core on 60 cores).
>> >
>> > Are there any ideas, methods, or procedures by which I, as a user, can
>> > resolve this error? Am I allowed to open a ticket on this, or request
>> > one, on the Trac?
>> >
>> > Thanks and Best,
>> >
>> > askhetan
>> >
>> > _______________________________________________
>> > gpaw-users mailing list
>> > gpaw-users at listserv.fysik.dtu.dk
>> > https://listserv.fysik.dtu.dk/mailman/listinfo/gpaw-users
>
>
>
>
> --
> || radhe radhe ||
>
> abhishek
>

