The customary Perl approach for processing all the lines in a file is to do so one line at a time:
open (INPUT, $file) || die "can't open $file: $!";
while (<INPUT>) {
    chomp;
    # do something with $_
}
close(INPUT) || die "can't close $file: $!";

This is tremendously more efficient than reading the entire file into
memory as an array of lines and then processing it one element at a time,
which is often--if not almost always--the wrong approach. Whenever
you see someone do this:
@lines = <INPUT>;

you should think long and hard about why you need everything loaded at
once. It's just not a scalable solution. You might also find it more fun
to use the standard Tie::File module, or the DB_File module's $DB_RECNO
bindings, which allow you to tie an array to a file so that accessing an
element of the array actually accesses the corresponding line in the file.
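For instance, here's a minimal sketch using Tie::File (the DB_File
version is analogous, with $DB_RECNO); it assumes $file names a
writable text file:

use Tie::File;

tie(@lines, 'Tie::File', $file)
    or die "can't tie $file: $!";

print "last line is: $lines[-1]\n";   # fetches just that record
$lines[0] = "new first line";         # rewrites line 1 in place

untie @lines;

Elements are fetched from disk only as you touch them, so the array can
stand in for a file far larger than memory.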
You can also read the entire file contents into a scalar:
{
    local(*INPUT, $/);
    open (INPUT, $file) || die "can't open $file: $!";
    $var = <INPUT>;
}

That temporarily undefs your record separator, and will automatically
close the file at block exit. If the file is already open, just use this:
$var = do { local $/; <INPUT> };

For ordinary files, you can also use the read function.
read( INPUT, $var, -s INPUT );

The third argument is the -s file test, which gives the byte size of the
data on the INPUT filehandle; read then reads that many bytes into the
buffer $var.
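Because read returns the number of bytes actually read, 0 at end of
file, or undef on error, a more defensive version of that call might
check the result. A sketch, assuming INPUT is open on an ordinary file
as above:

$size = -s INPUT;                     # byte size of the open file
$got  = read(INPUT, $var, $size);
defined($got) || die "can't read from $file: $!";
$got == $size || warn "read $got of $size bytes\n";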