
Re: [perl #26785] mem leak in split

From: Dave Mitchell
Date: March 26, 2004 09:29
Subject: Re: [perl #26785] mem leak in split
Message ID: 20040326172854.GE11011@fdisolutions.com
On Thu, Feb 19, 2004 at 06:25:48AM -0600, Ohlman, Jeff wrote:
> Yes - for some strange reason the perl.org web site munged my plain text. Here is the original submission (hopefully it will not be clobbered).
> 
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> 
> This is a bug report for perl from jeff.ohlman@bellsouth.com,
> generated with the help of perlbug 1.33 running under perl v5.6.1.
>  
> I am trying to split a record twice - the 2nd split takes an element of the first split and splits it again.  Trying to assign the results of the 2nd split to another array eventually causes perl to abort with:
>  
> Out of memory during request for 1008 bytes, total sbrk() is 64858324 bytes!
>  
> I've reduced the program to the bare minimum needed to reproduce the error.  This is the offending code:
>  
> perl -ne 'chomp;@x=split(/<!@>/,$_);@r=split(/~/,$x[1]);' reallybigfile.dat
>  
> The work around is:
>  
> perl -ne 'chomp;@x=split(/<!@>/,$_);($a, $b, $c, $d, $e) =split(/~/,$x[1]);' reallybigfile.dat
>  
> Here are 2 typical records:
>  
> 00040000000~BUBBAS BEER~~~A~~~~~~N~<!@>BILLING~~~~~<!@>
> 00000443000~FOOBAR BROTHERS~~~A~~~~~~N~<!@>BILLING~~~~~<!@>TYPE1
>  
> 
> The data file is 900 MB, but the out-of-memory error occurs after about 300 MB.
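
(For reference, the reported pipeline boils down to something like the standalone sketch below. This is a hypothetical reconstruction that uses the two sample records above in place of reallybigfile.dat; the workaround differs only in assigning the second split to a fixed list of scalars instead of to @r.)

    #!/usr/bin/perl
    # Minimal sketch of the reported double split, assuming the two sample
    # records above stand in for lines of the real data file.
    use strict;
    use warnings;

    my @records = (
        '00040000000~BUBBAS BEER~~~A~~~~~~N~<!@>BILLING~~~~~<!@>',
        '00000443000~FOOBAR BROTHERS~~~A~~~~~~N~<!@>BILLING~~~~~<!@>TYPE1',
    );

    for my $line (@records) {
        chomp $line;                    # mirrors the -n loop's chomp (a no-op on these literals)
        my @x = split /<!\@>/, $line;   # first split: record into <!@>-delimited segments
        my @r = split /~/,     $x[1];   # second split: segment 1 into ~-delimited fields
        print "fields: @r\n";
    }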

Could you check that the file hasn't got any very long lines:

perl -ne 'die "length=".length if length > 10000;chomp;@x=split(/<!@>/,$_);@r=split(/~/,$x[1]);' reallybigfile.dat
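
Spelled out as a script, that diagnostic amounts to roughly the sketch below (the 10,000-character cutoff is the same arbitrary threshold as in the one-liner; run it with the real data file as the argument, e.g. a hypothetical check_lines.pl reallybigfile.dat):

    #!/usr/bin/perl
    # Sketch of the same check: die as soon as an unexpectedly long line
    # turns up, since a missing record separator would make a line-by-line
    # loop pull a huge chunk of the file into memory in one go.
    use strict;
    use warnings;

    while (my $line = <>) {
        die "length=" . length($line) if length($line) > 10_000;
        chomp $line;
        my @x = split /<!\@>/, $line;
        # Guard against records with no <!@> at all, which would leave $x[1] undefined.
        my @r = defined $x[1] ? split(/~/, $x[1]) : ();
    }

If it dies with a very large length= value, the memory growth is presumably coming from a few enormous lines being slurped whole rather than from split itself.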


-- 
The Enterprise is captured by a vastly superior alien intelligence which
does not put them on trial.
    -- Things That Never Happen in "Star Trek" #10


