
use fatal err fail

From: Adam D. Lopresto
Date: September 28, 2005 17:04
Subject: use fatal err fail
Message ID: Pine.LNX.4.62.0509281135040.13649@hive.cec.wustl.edu
The recent thread on Expectuations brought back to mind something I've been
thinking about for a while.  In short, I propose that "use fatal" be on by default,
and that "err" be turned into syntactic sugar for a very small try/CATCH block.

Basically, my observation is that no one consistently checks the return values of
the built-in functions.  As Mama[1] can attest, lots and lots of code is posted to
Perl Monks calling open without checking the return value.  Even among those who
check the return value of open, just about none check close[2].  And have you
*ever* seen a program that actually checks the return value of print[3]?
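
For a sense of the burden, here's a rough sketch of what a fully-checked write
looks like today, with every call tested by hand ($file and $report are just
placeholder names):

my $fh = open $file    or die "can't open $file: $!";
$fh.print($report)     or die "can't write to $file: $!";
$fh.close              or die "can't close $file: $!";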

Exception handling seems to provide a good solution to this.  Instead of writing
"open ... or die ..." every time, we can just blindly assume the open worked, and
catch the error later.  If we don't catch it, it does exactly what the "die" would
have done anyway, so we're no worse off.  The exception propagates upwards until
it's handled, or it terminates the program.  At least the program doesn't continue
blindly, believing all to be well.  The problem with typical exception-handling
code is that it quickly becomes bulky and ugly.  Compare:

#ignore any files we don't have access to
for @files {
   my $fh = open $_ or next;
   load_config_from($fh);
}

with

#ignore any files we don't have access to
for @files {
     my $fh;
     try {
         $fh = open $_;

         CATCH {
             next;
         }
     }
     load_config_from($fh);
}

So returning a special value makes checking the return code simpler, but hurts us
if we forget.  Throwing an exception saves us from forgetting, but has the
nastiness of catching that exception.

I propose that the spelled-out version of // be renamed to something like "dor".
(I'm willing to accept other suggestions on that.)  "err" becomes a
single-expression-level try/CATCH block.  The left-hand side is executed, and if it
successfully returns anything (even undef), its value is used.  If it throws an
exception, however, that exception is caught and the right-hand side is evaluated
and used instead.  So that code becomes

for @files {
     my $fh = open $_ err next;
     load_config_from($fh);
}

which is just syntactic sugar for

for @files {
     my $fh = try {
         open $_;

         CATCH {
             next;
         }
     }
     load_config_from($fh);
}

"err" would bind tighter than assignment, so it can be used to provide a fall-back
value to use in case the normal flow of control fails.  Using "no fatal" is still
allowed, but you can achieve almost the same thing [4] by adding "err $! but undef"
after calls whose failure you want to handle that way.
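
To make that concrete, a couple of sketches in the proposed syntax (reusing $fh
and load_config_from from the examples above; $file and %defaults are made up):

# fallback value: "err" binds tighter than assignment, so %config gets
# %defaults whenever load_config_from throws
my %config = load_config_from($fh) err %defaults;

# emulating "no fatal" for a single call: on failure, $fh holds the
# error object, marked with "but undef" so it still tests as undefined
my $fh = open $file err $! but undef;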

I think on the whole this gives us a way to allow exceptions to be used everywhere,
while making it clean and simple enough that it doesn't bog us down too much.



[1] http://www.perl.com/pub/a/2005/09/22/onion.html?page=6
[2] I've actually seen data lost due to this.  When drive space is very limited
(due to, for instance, a user quota), it's often possible to open a new file (since
there's some space left), but the close fails because too much was written to it.
[3] Actually, I'm not sure this is fair.  It seems that, due to buffering and other
things, print returns true even when it doesn't actually succeed.  But why let
facts get in the way of rhetoric?
[4] The difference is that "no fatal" would only affect code that calls "fail"
itself, whereas "err" would also catch code that directly calls "die".
-- 
Adam Lopresto
http://cec.wustl.edu/~adam/

Eschew obfuscation!
