perl.perl6.language | Postings from February 2001

Re: Garbage collection (was Re: JWZ on s/Java/Perl/)

February 14, 2001 08:10

[[ reply to this goes only to -internals ]]

Dan Sugalski wrote:
> *) People like it

Well, if people liking it is the only reason to add a feature to Perl
(either it's the only reason, or it appears 3 times in a 5-item list,
which is pretty much the same to me ;-), we'll probably end up much more
bloated than we are now, IMHO.

> *) Scarce external resources (files, DB handles, whatever) don't get
> unnecessarily used

Unless there's a way to do it predictably without impacting programs that
don't depend so much on quick freeing of external resources, I don't believe
it's worth it.

> *) Saves having to write explicit cleanup code yourself

You wouldn't have to; you'd only be able to, if you liked. If you're
writing an application that could possibly open too many files, you'd
probably want to destroy their handles ASAP. OTOH, if you're writing an
application that only opens one file and does a lot of processing on it,
you simply wouldn't care, and would let it be freed whenever the GC
collects its memory.

> At 10:12 AM 2/14/2001 -0300, Branden wrote:
> >If resource exhaustion is the problem, I think we can deal with that when
> >we try to allocate a resource and we get an error: then we call the GC
> >explicitly (one or more times if needed) to see if we can free some
> >resources with it. Resource exhaustion would be a rare situation (I think),
> >and doing some expensive treatment when it happens is OK for me.
> The point of DESTROY isn't resource exhaustion per se, at least not what
> the garbage collector will care about, since it only cares about memory.

Well, I thought DESTROY frees open files, database connections, OS locks,
etc. Aren't those what cause resource exhaustion?

> >Also, I think it would be valid for the programmer to explicitly say ``I
> >would like to DESTROY this object now'', and have the DESTROY method called
> >at that time, even if the memory would be reclaimed only later.
> So you undef your object reference. If the object doesn't go away, it means
> that something else probably still has a handle on it somewhere.

I thought that was the whole problem with ``not predictable stuff'': you
undef the variable, no other variable references the object, and it's still
there; it doesn't get destroyed.

> Plus there's nothing stopping you from having $obj->DESTROY in your own
> code, though it may be inadvisable.

It is (mainly) inadvisable because:
1. GC will call DESTROY when it collects the memory, so DESTROY would get
called twice, which is VERY BAD.
2. If I call DESTROY on an object, it would still be a (valid) object after
the call, so that if I call some other method, it will succeed. But that
shouldn't happen, since the object was DESTROYed, right?
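Problem 1 can be demonstrated in plain Perl 5 (the class and counter here
are mine, just to make the double call visible):

```perl
package Resource;

my $destroy_calls = 0;    # counts how many times DESTROY actually runs

sub new { bless { open => 1 }, shift }

sub DESTROY {
    my $self = shift;
    $destroy_calls++;          # real cleanup (close a file, etc.) would go here
    $self->{open} = 0;
}

sub calls { $destroy_calls }

package main;

{
    my $r = Resource->new;
    $r->DESTROY;               # explicit call: cleanup runs here...
}                              # ...and Perl calls DESTROY again at scope exit

print Resource::calls(), "\n"; # prints 2: the double-call problem
```

Any cleanup that isn't idempotent (closing an already-closed handle,
releasing a lock twice) breaks under this double call.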

That's exactly what I propose: having something that, when called with an
object as parameter, would call the object's DESTROY, and would flag the
object somehow so that the GC doesn't call DESTROY on it again when
collecting the memory, and so that every other attempt to call a method on
the object raises an exception that makes it clear what happened (i.e.
``Method call on already destroyed object''), so that debugging is possible
under these semantics.
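A sketch of those semantics in today's Perl 5, with the flag kept inside the
object (`destroy_now` is a hypothetical name, not an existing builtin):

```perl
use Carp;

package Guarded;

sub new { bless { destroyed => 0 }, shift }

sub cleanup { }    # real resource release (close handles, etc.) would go here

# Hypothetical explicit destructor: run cleanup once and flag the object.
sub destroy_now {
    my $self = shift;
    return if $self->{destroyed};
    $self->cleanup;
    $self->{destroyed} = 1;
}

# Any further method call on a destroyed object fails loudly.
sub do_work {
    my $self = shift;
    Carp::croak("Method call on already destroyed object")
        if $self->{destroyed};
    return "worked";
}

# The GC-driven DESTROY checks the flag, so cleanup never runs twice.
sub DESTROY {
    my $self = shift;
    $self->cleanup unless $self->{destroyed};
}

package main;

my $obj = Guarded->new;
$obj->do_work;                 # fine
$obj->destroy_now;             # explicit, predictable cleanup
eval { $obj->do_work };        # now raises the ``already destroyed'' error
print $@ =~ /already destroyed/ ? "caught\n" : "missed\n";   # prints "caught"
```

In a real implementation the flag check would live in the method dispatcher
rather than in each method; the per-method guard here is only to keep the
sketch self-contained.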

- Branden
