
Re: memory issues reading large files

From: Brett W. McCoy
Date: February 7, 2002 13:22
Subject: Re: memory issues reading large files
Message ID: Pine.LNX.4.43.0202071630460.30587-100000@chapelperilous.net
On Thu, 7 Feb 2002, Brian Hayes wrote:

> > You should be using something like
> >
> > open(FILE, $file) or die "$!\n";
> > while (<FILE>) {
> >     ## do something
> > }
> > close FILE;
> > __END__
>
> This is what I am doing, but before any of the file is processed, the
> whole text file is moved into memory.  The only solution I can think of
> is to break the text file apart and read through each smaller piece,
> but I would like to avoid this.  I was hoping someone knew how Perl
> interacts with memory and could suggest how to keep it from reading the
> whole file all at once.

Can you show the code you have?  The entire file shouldn't be loading into
memory before you start reading it line by line, should it?

-- Brett
                                          http://www.chapelperilous.net/
------------------------------------------------------------------------
Hors d'oeuvres -- a ham sandwich cut into forty pieces.
		-- Jack Benny



