Looking for advice and comments on patching this. The problem is triggered by this program:

    my @var1;
    $#var1 = 2_147_483_647;

The malloc routines are called by way of the New() macro:

    #define New(x,v,n,t) (v = (t*)safemalloc((MEM_SIZE)((n)*sizeof(t))))

If n is in a certain range of very large numbers, multiplying it by sizeof(t) overflows and wraps around to a very small number. So we get only a few bytes of memory allocated when we think we have a very large buffer, and a seg fault results when we initialize the very large array above.

What I'd like to ask is for comments on changing New() to the version below:

- Are the types right?
- Can/should we change sizeof(t) to Size_t_size?
- Should I32_MAX/Size_t_size just be another defined constant, or should we let the C optimizer fold it into a constant?
- Is this the best place to patch it?
- Should Newc(), Newz(), Renew(), and Renewc() be patched the same way?
- Is this the best choice of error message?
- What else am I missing???

Proposed new New()
############################

    #define New(x,v,n,t) \
        ( \
            ((n) > (I32_MAX/Size_t_size)) \
            ? (PerlIO_puts(Perl_error_log,PL_no_mem), my_exit(1)) \
            : (v = (t*)safemalloc((MEM_SIZE)((n)*sizeof(t)))) \
        )