From: Jesse Brown
Date: July 31, 2000 19:53
Subject: Re: [ID 20000731.009] Does not free memory when deleting hash keys < 1M
Message ID: Pine.LNX.4.20.0007311943440.661-100000@lugnut.avlug.org
Actually, the 'problem' has to do with the way perl allocates memory for
hash storage.
Except for certain cases (like when the value is very large), when perl
destroys a hash, it keeps the memory allocated within the process and
makes it available to other hashes rather than returning it to the
system.
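A quick way to see it (a minimal sketch, assuming a ps that understands
-o rss=, as on Linux or the BSDs; counts and sizes are arbitrary):

#!/usr/bin/perl -w
# Sketch: deleted hash entries don't shrink the process, but a second
# hash can reuse the space.
use strict;

sub rss_kb { my $kb = `ps -o rss= -p $$`; chomp $kb; $kb }

print "start:       ", rss_kb(), " KB\n";

my %h;
$h{$_} = 'x' x 100_000 for 1 .. 500;   # roughly 50 MB of values
print "filled:      ", rss_kb(), " KB\n";

delete $h{$_} for keys %h;             # entries gone...
print "deleted:     ", rss_kb(), " KB\n";   # ...but RSS stays put

my %h2;
$h2{$_} = 'x' x 100_000 for 1 .. 500;  # reuses the retained memory
print "second fill: ", rss_kb(), " KB\n";

If the above is right, "deleted" should match "filled", and the second
fill should barely move the number.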
It sucks if you grow hashes, kill them, and then do memory-intensive
non-hash work, because that memory is never returned to the general
system memory pool (or even made available to other kinds of perl
variables) until the process exits.
Think of it like a Berkeley DBM. :)
Odd stuff, and completely undocumented.
It would be nice if there were some way to change that behaviour, or
perhaps a function to force an expensive cleanup and deallocation.
Oh well, too many features, not enough time. :)
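The closest existing knob I know of is the difference between emptying
a hash and undefining it (a sketch; neither one hands memory back to
the OS):

my %hash = (a => 1);
%hash = ();    # drop all entries; the allocated buckets stick around
undef %hash;   # drop the entries and the underlying storage too, so
               # perl's allocator can hand the space to something else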
On Mon, 31 Jul 2000, ___cliff rayman___ wrote:
> i looked at:
> perldoc -q free
>
> then,
> /memory
>
> looks like your answer - one that you may not like - is there.
>
> --
> ___cliff rayman___cliff@genwax.com___http://www.genwax.com/
> jbrown@dilbert.iz.com wrote:
>
> > This is a bug report for perl from bextreme@pobox.com,
> > generated with the help of perlbug 1.28 running under perl v5.6.0.
> >
> > -----------------------------------------------------------------
> > [Please enter your report here]
> >
> > When deleting entries from a hash, the memory they took is never freed
> > (not even on destruction of the hash), unless that entry is huge (like 1M+).
> > Attached is a script that demonstrates this problem (beware: it
> > stores 1000 100k entries, taking 100M when run).
> > When run with 1000 100k entries, it NEVER FREES THE MEMORY, even on
> > hash destruction, until the script ends. When run with 100 1M
> > entries, it visibly frees memory. (Just change count and size to
> > adjust memory consumption.)
> >
> > My method of viewing memory size is with
> > ps -axv
> >
> > As I am sure you can guess, this is a MAJOR problem. I have situations
> > where I load 200+ entries into a hash and delete them when I am done,
> > and the memory never gets freed. Any help at all would be appreciated.
> >
> > --- Duplication script. --
>
>
>
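Since the attached duplication script didn't survive the archive, here
is a rough stand-in under the report's assumptions (count and size
knobs, memory read via ps) -- not the original attachment:

#!/usr/bin/perl -w
# Hypothetical reproduction: fill a hash, let it be destroyed at end
# of scope, and watch the process size. Per the report, try
# count=1000/size=100_000 versus count=100/size=1_000_000.
use strict;

my ($count, $size) = (1000, 100_000);

sub rss_kb { my $kb = `ps -o rss= -p $$`; chomp $kb; $kb }

print "before fill: ", rss_kb(), " KB\n";
{
    my %h;
    $h{$_} = 'x' x $size for 1 .. $count;
    print "filled:      ", rss_kb(), " KB\n";
}   # %h goes out of scope and is destroyed here
print "destroyed:   ", rss_kb(), " KB\n";   # 100k values: no shrink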