perl.perl5.porters | Postings from September 2013

[perl #26785] mem leak in split

From: James E Keenan via RT
Date: September 18, 2013 23:10
Subject: [perl #26785] mem leak in split
Message ID: rt-3.6.HEAD-1873-1379545794-1651.26785-15-0@perl.org
On Sat May 26 19:18:29 2012, sprout wrote:
> On Sat May 26 19:08:36 2012, jkeenan wrote:
> > On Fri Mar 26 09:29:33 2004, davem@fdisolutions.com wrote:
> > > On Thu, Feb 19, 2004 at 06:25:48AM -0600, Ohlman, Jeff wrote:
> > > >
> > > > I am trying to split a record twice - the 2nd split takes an element
> > > of the first split and splits it again.  Trying to assign the results
> > > of the 2nd split to another array causes the OS to produce:
> > > >
> > > > Out of memory during request for 1008 bytes, total sbrk() is
> > > 64858324 bytes!
> > > >
> > > > I've reduced the program to the bare min in order to reproduce the
> > > error.  This is the offending code:
> > > >
> > > > perl -ne 'chomp;@x=split(/<!@>/,$_);@r=split(/~/,$x[1]);'
> > > reallybigfile.dat
> > > >
> > > > The work around is:
> > > >
> > > > perl -ne 'chomp;@x=split(/<!@>/,$_);($a, $b, $c, $d, $e)
> > > =split(/~/,$x[1]);' reallybigfile.dat
> > > >
> > > > Here are 2 typical records:
> > > >
> > > > 00040000000~BUBBAS BEER~~~A~~~~~~N~<!@>BILLING~~~~~<!@>
> > > > 00000443000~FOOBAR BROTHERS~~~A~~~~~~N~<!@>BILLING~~~~~<!@>TYPE1
> > > >
> > > >
> > > > The data file is 900 mb, but the out of mem occurs after about 300
> > > mb.
> > > 
> > > could you check that the file hasn't got any very long lines:
> > > 
> > > perl -ne 'die "length=".length if length >
> > > 10000;chomp;@x=split(/<!@>/,$_);@r=split(/~/,$x[1]);'
> > > reallybigfile.dat
> > > 
> > > 
> > 
> > Discussion in this ticket petered out eight years ago.  We didn't hear
> > back from the OP as to whether the file contained very long lines or
> > not.
> > 
> > I was able to run the OP's code without incident:
> > 
> > $ perl -ne 'chomp;@x=split(/<!@>/,$_);($a, $b, $c, $d, $e)
> > =split(/~/,$x[1]);print join("|"=>($a,$b,$c,$d,$e)), "\n";'
> > reallybigfile.dat
> > BILLING||||
> > BILLING||||
> 
> I can’t reproduce it either, in 5.6.2 or 5.10.1.
> 
> If I create a huge file with carriage returns instead of line feeds, it
> gobbles up memory, and presumably prints ‘Out of memory!’ (not there yet).
> 
> > How should we proceed with this RT?
> 
> If it’s gobbling up memory (the only cause I can think of), then it
> probably should be rejected.  If the original poster does not respond,
> then I think we can assume that was the cause.
> 

No response from the OP.  Marking this 9-year-old ticket as rejected.

Thank you very much.
Jim Keenan

---
via perlbug:  queue: perl5 status: open
https://rt.perl.org:443/rt3/Ticket/Display.html?id=26785

nntp.perl.org: Perl Programming lists via nntp and http.
Comments to Ask Bjørn Hansen at ask@perl.org