
RE: memory issues reading large files

From: Nikola Janceski
Date: February 7, 2002 12:51
Subject: RE: memory issues reading large files
Message ID: 1449413DA482D311B67000508B5A12F50592DCEA@nyexchange01.summithq.com
You should be using something like:

open(FILE, $file) or die "$!\n";
while (<FILE>) {
    ## process the current line here; only one line is in memory at a time
}
close FILE;
__END__

if you use something like

local $/;                 # undef the input record separator
$contents = <FILE>;       # slurps the entire file into one scalar
__END__
then you are reading the entire file into memory at once, which is a mistake with a file that size.
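(If you ever really do need the whole file in one scalar, the usual trick is to confine local $/ to a small block so the change can't leak out. This is just a sketch, reusing the FILE and $file names from above:)

my $contents = do {
    local $/;                           # record separator is undef only inside this block
    open(FILE, $file) or die "$!\n";
    <FILE>;                             # slurps the whole file into one scalar
};
close FILE;
__END__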

My Perl scripts sometimes grow to almost a gig of memory (foolish, yes), but they're quick to write! ;)


-----Original Message-----
From: Brett W. McCoy [mailto:bmccoy@chapelperilous.net]
Sent: Thursday, February 07, 2002 3:49 PM
To: Brian Hayes
Cc: beginners@perl.org
Subject: Re: memory issues reading large files


On Thu, 7 Feb 2002, Brian Hayes wrote:

> Hello all.  I need to read through a large (150 MB) text file line by
> line.  Does anyone know how to do this without my process swelling to
> 300 megs?

As long as you aren't reading that file into an array (which would be a
foolish thing to do, IMHO), I don't see why the process would swell to 300
megs.
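Just to spell out what reading the file into an array looks like (a rough sketch, reusing the FILE and $file names from the snippet above):

open(FILE, $file) or die "$!\n";
my @lines = <FILE>;    # every line of the file becomes its own scalar in @lines
close FILE;
__END__

Every element of @lines carries Perl's per-scalar overhead on top of the text itself, so a 150 MB file read that way can easily cost roughly twice that much memory.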

-- Brett
                                          http://www.chapelperilous.net/
------------------------------------------------------------------------
-  long    f_ffree;    /* free file nodes in fs */
+  long    f_ffree;    /* freie Dateiknoten im Dateisystem */
	-- Seen in a translation


-- 
To unsubscribe, e-mail: beginners-unsubscribe@perl.org
For additional commands, e-mail: beginners-help@perl.org

------------------------------------------------------------------------------
The views and opinions expressed in this email message are the sender's
own, and do not necessarily represent the views and opinions of Summit
Systems Inc.



