When working with VERY large files, PHP tends to fall over sideways and die.
Here is a neat way to pull chunks out of a file very fast that won't stop mid-line, but rather at the end of the last complete line. It pulled a 30+ million line, 900 MB file through in ~24 seconds.
NOTE:
$buf just holds the current chunk of data to work with. If you try "$buf .=" (note the 'dot' in front of the '=') to append each chunk onto $buf, the script will slow to a grinding crawl around 100 MB of data, so work with the current chunk and then move on! (See the sketch after the code.)
//File to be opened
$file = "huge.file";
//Open file for reading (DON'T USE 'a+': the pointer would start at the end of the file!)
$fp = fopen($file, 'r');
//Read in 16 MB chunks
$read = 16777216;
while (!feof($fp)) {
    $rbuf = fread($fp, $read);
    //The last chunk may be shorter than $read
    $len = strlen($rbuf);
    //If we are at the end of the file, just grab the rest, no need to seek back
    if (feof($fp)) {
        $buf = $rbuf;
    } else {
        //Scan backwards for the last "\n" so we never cut a line in half
        //(assumes every line is shorter than the chunk size)
        for ($i = $len - 1; $i > 0; $i--) {
            if ($rbuf[$i] == "\n") break;
        }
        //This is the buffer we want to do stuff with, maybe throw to a function?
        $buf = substr($rbuf, 0, $i + 1);
        //Point the file pointer back to just after the last "\n"
        $part = ftell($fp) - ($len - ($i + 1));
        fseek($fp, $part);
    }
    //...work with $buf here, then move on to the next chunk...
}
fclose($fp);
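
For illustration, here is one way to fill in the "work with $buf" spot above; handle_line() is just a hypothetical stand-in for whatever you actually do per line. Splitting on "\n" is safe here because $buf always ends on a line boundary:

//Drop-in for the "work with $buf" spot in the loop.
//handle_line() is a hypothetical callback, swap in your own per-line logic.
foreach (explode("\n", rtrim($buf, "\n")) as $line) {
    handle_line($line);
}

Since each chunk is processed and then overwritten on the next fread(), memory stays flat at roughly one chunk ($read bytes) no matter how big the file is.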