
[ID 20000519.006] Malloc coredump madness patch advise

From: Wilson, Doug
Date: August 23, 2001 16:26
Subject: [ID 20000519.006] Malloc coredump madness patch advise
Message ID: 35A280DF784CD411A06B0008C7B130ADB55117@sdex04.sd.intuit.com

Looking for advice and comments on patching this...

The problem is triggered by this program:
my @var1;
$#var1 = 2_147_483_647;

The problem is that the malloc routines are called
by way of the New() macro:
#define New(x,v,n,t)    (v = (t*)safemalloc((MEM_SIZE)((n)*sizeof(t))))

If n falls in a certain range of very large values, multiplying it by
sizeof(t) overflows MEM_SIZE and wraps around to a very small number.
So we get only a few bytes of memory when we think we have a very
large allocation, and a seg fault results when we initialize the very
large array above.
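To make the wrap-around concrete, here is a tiny standalone C example
(not Perl source) that uses a 32-bit unsigned type to stand in for a
32-bit MEM_SIZE; the element size of 4 is just an illustrative stand-in
for sizeof(t):

#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    /* $#var1 = 2_147_483_647 asks for 2**31 array entries */
    uint32_t n = 2147483648U;
    uint32_t elem_size = 4;           /* stand-in for sizeof(t) */
    uint32_t bytes = n * elem_size;   /* wraps modulo 2**32 */

    /* prints 0 bytes, not the 8589934592 we actually need */
    printf("%" PRIu32 " elements * %" PRIu32 " bytes -> %" PRIu32 " bytes requested\n",
           n, elem_size, bytes);
    return 0;
}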

What I'd like to ask for is comments on changing New() to the version below:
Are the types right?
Can/should we change sizeof(t) to Size_t_size?
Should I32_MAX/Size_t_size just be another defined constant, or
should we let the C optimizer fold it into a constant?
Is this the best place to patch it?
Should Newc(), Newz(), Renew(), and Renewc() be patched the same way?
Is this the best choice of error message?
What else am I missing???

Proposed new New()
############################
#define New(x,v,n,t)    \
    ( \
     ((n) > (I32_MAX/Size_t_size)) \
       ? (PerlIO_puts(Perl_error_log,PL_no_mem), my_exit(1)) \
       : (v = (t*)safemalloc((MEM_SIZE)((n)*sizeof(t)))) \
    )
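
For comparison, the same guard can be written as a small standalone C
function outside the Perl source tree. The names below are illustrative
stand-ins, not the actual perl internals: safemalloc_n() stands in for
safemalloc(), SIZE_MAX/size for the I32_MAX/Size_t_size limit, and the
fputs/exit pair for the PerlIO_puts/my_exit calls.

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Refuse the request when count * size would overflow size_t instead of
 * letting the multiplication wrap around to a small value. */
static void *safemalloc_n(size_t count, size_t size)
{
    if (size != 0 && count > SIZE_MAX / size) {
        fputs("Out of memory!\n", stderr);   /* stand-in for PL_no_mem */
        exit(1);                             /* stand-in for my_exit(1) */
    }
    return malloc(count * size);
}

int main(void)
{
    /* A sane request goes through as usual. */
    int *small = safemalloc_n(10, sizeof(int));
    free(small);

    /* This request overflows the size type on any platform, so the guard
     * fires and the program exits with an error instead of wrapping. */
    int *huge = safemalloc_n(SIZE_MAX / 2 + 1, sizeof(int));
    free(huge);
    return 0;
}

Dividing the maximum by the element size, rather than multiplying first
and checking afterwards, is what keeps the check itself from overflowing.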


