develooper Front page | perl.beginners | Postings from February 2002

RE: memory issues reading large files

Nikola Janceski
February 7, 2002 12:51
RE: memory issues reading large files
You should be using something like

open(FILE, $file) or die "$!\n";
while (my $line = <FILE>) {
	## do something with $line
}
close FILE;

If instead you use something like

local $/;
$contents = <FILE>;

then you are slurping the whole file into one scalar, which is exactly what makes the process balloon.
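A minimal sketch of the difference, using the modern three-arg open and a lexical filehandle (the demo file and its name are stand-ins for the real 150 MB file): reading in a while loop holds one line at a time, while clearing $/ pulls the entire file into a single scalar.

```perl
use strict;
use warnings;

# Create a tiny demo file (stands in for the real large file).
my $file = 'demo.txt';
open(my $out, '>', $file) or die "$file: $!\n";
print $out "line $_\n" for 1 .. 3;
close $out;

# Line-by-line: only the current line is in memory at any moment.
open(my $fh, '<', $file) or die "$file: $!\n";
my $count = 0;
while (my $line = <$fh>) {
    $count++;    ## do something with $line here
}
close $fh;
print "read $count lines one at a time\n";

# Slurp mode: localizing $/ to undef makes <$fh> return the WHOLE file.
open($fh, '<', $file) or die "$file: $!\n";
my $contents = do { local $/; <$fh> };
close $fh;
print "slurped ", length($contents), " bytes in one read\n";

unlink $file;
```

The `do { local $/; ... }` idiom confines the change to $/ to that block, so the rest of the program still reads line by line.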

my perl scripts go up to almost a gig of mem sometimes (foolish, yes), but they're quick to write! ;)

-----Original Message-----
From: Brett W. McCoy []
Sent: Thursday, February 07, 2002 3:49 PM
To: Brian Hayes
Subject: Re: memory issues reading large files

On Thu, 7 Feb 2002, Brian Hayes wrote:

> Hello all.  I need to read through a large (150 MB) text file line by
> line.  Does anyone know how to do this without my process swelling to
> 300 megs?

As long as you aren't reading that file into an array (which would be a
foolish thing to do, IMHO), I don't see why the process would swell to 300
megs.

-- Brett
-  long    f_ffree;    /* free file nodes in fs */
+  long    f_ffree;    /* freie Dateiknoten im Dateisystem */
	-- Seen in a translation
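For illustration (demo.txt is again a stand-in), the array pattern Brett warns against looks like this: <$fh> in list context reads every line into the array at once, so a 150 MB file needs at least 150 MB of process memory, and Perl's per-scalar bookkeeping overhead is plausibly how that grows toward the 300 megs Brian saw.

```perl
use strict;
use warnings;

# Tiny demo file standing in for the real large file.
my $file = 'demo.txt';
open(my $out, '>', $file) or die "$file: $!\n";
print $out "line $_\n" for 1 .. 3;
close $out;

# Anti-pattern for big files: <$fh> in list context reads EVERY line
# into @lines before the loop even starts.
open(my $fh, '<', $file) or die "$file: $!\n";
my @lines = <$fh>;
close $fh;

for my $line (@lines) {
    ## do something with $line -- but all lines are already in memory
}
print scalar(@lines), " lines held in memory simultaneously\n";

unlink $file;
```

Replacing `my @lines = <$fh>;` plus the for loop with `while (my $line = <$fh>) { ... }` keeps memory flat regardless of file size.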


The views and opinions expressed in this email message are the sender's
own, and do not necessarily represent the views and opinions of Summit
Systems Inc.
