On Mon, Feb 12, 2001 at 12:08:12PM -0500, schwern@pobox.com wrote:
> On Sat, Feb 10, 2001 at 12:58:34AM +0100, Bart Lateur wrote:
> > * On a currently normal Pentium of 500MHz, 64Mb, ungzipping and
> > untarring a .tgz archive of 250k (the ungzipped file itself is roughly
> > 1.5Mb) takes roughly 1 second. (ONE second!)
>
> One second is too slow (for a Unix user it is, maybe not for a Windows
> I'm not planning on waiting for Perl 6 to start work on par, so Moore
> isn't with us.

I agree with the "too slow" opinion [mainly because I'm very impatient :-)]

When I last tried it (over a year ago), running the 5.005 regression tests
with the standard libraries coming out of a zip file took about the same
time as running the regression tests with the standard libraries on disk.
[x86 BSD unix, fairly big machine, SCSI disks - something I'd expect to be
good at IO]

The IO gain from having the libraries all in one smaller file, rather than
scattered about a bit, seemed to offset the CPU loss in having to actually
decompress files rather than just copy them.

Please don't take this as pro-zip, anti-tar.

1: I don't see why we need to decide on the actual format right now.
   Surely what we want to be able to do with it is more important?
2: Is this really still language? If not, where should we be discussing it?

Nicholas Clark
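
For concreteness, here is a minimal sketch of the kind of thing "libraries
coming out of a zip file" involves: an @INC hook that hands module source
to require() straight from an archive. It assumes Archive::Zip, a perl new
enough to open in-memory filehandles (5.8+), and a hypothetical archive
name "lib.zip"; it is not necessarily how par itself does it.

    use strict;
    use Archive::Zip;

    # "lib.zip" is a made-up example name for an archive of .pm files.
    my $zip = Archive::Zip->new('lib.zip')
        or die "can't read lib.zip";

    push @INC, sub {
        my (undef, $file) = @_;                # e.g. "File/Spec.pm"
        my $member = $zip->memberNamed($file)
            or return;                         # not in the archive, keep looking
        my $source = $member->contents;        # decompress into memory
        open my $fh, '<', \$source or return;  # in-memory filehandle
        return $fh;                            # require() reads the module from it
    };

Because the hook is pushed onto the end of @INC, an ordinary "use File::Spec;"
is only satisfied from lib.zip when the normal directory search fails;
unshift it instead if the archive should take priority over files on disk.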