PHP Read Text File

Hey, read some tips/pointers on PHP.net but can't seem to solve this
problem. I have a php page that reads the contents of a file and then
displays the last XX lines of the file.  Problem is this...whenever
the file gets larger than ~5MB, the page just displays nothing, as
though a timeout has occurred but I get no error.  At 4.8MB (last
confirmed size)...the function still works.  Any ideas what the code
below is lacking?

<?php
$arrLog = array();
$int_lines = 50;   // how many lines to show (presumably set elsewhere in the real script)
$handle = fopen("/var/log/myfile", "r");
if ($handle) {
   while (!feof($handle)) {
       $arrLog[] = fgets($handle, 4096);
   }
   fclose($handle);
}

$int_number_of_lines = count($arrLog);
if ($int_number_of_lines == 0)
{
  echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
  $int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';


 echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
  for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
  {
    echo "<TR><TD>".$arrLog[$i]."</TD></TR>\n";
  }
 echo "</TABLE>\n";

?>
tlpell (7)
4/19/2008 9:09:39 PM

tlpell@gmail.com wrote:

> Hey, read some tips/pointers on PHP.net but can't seem to solve this
> problem. I have a php page that reads the contents of a file and then
> displays the last XX lines of the file.

<?php
passthru("tail -n $lines $file");
?>
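
If you need the lines in an array (e.g. to build your table rows),
exec() captures the output line by line instead of streaming it - a
rough sketch, with made-up values for $lines and $file:

<?php
// exec() fills $arrLog with one array element per output line.
$lines = 50;
$file  = '/var/log/myfile';
exec('tail -n ' . intval($lines) . ' ' . escapeshellarg($file), $arrLog);
foreach ($arrLog as $line) {
    echo htmlspecialchars($line) . "<br />\n";
}
?>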

Cheers,
-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Proudly running Debian Linux with 2.6.24-1-amd64 kernel, KDE 3.5.9, and PHP
5.2.5-3 generating this signature.
Uptime: 00:28:14 up 10 days, 15:27,  2 users,  load average: 0.89, 0.35,
0.18

ISO
4/19/2008 10:28:55 PM
tlpell@gmail.com wrote:

> Hey, read some tips/pointers on PHP.net but can't seem to solve this
> problem. I have a php page that reads the contents of a file and then
> displays the last XX lines of the file.  Problem is this...whenever
> the file gets larger than ~5MB, the page just displays nothing, as
> though a timeout has occurred but I get no error.  At 4.8MB (last
> confirmed size)...the function still works.  Any ideas what code below
> is lacking??
> 

$build="/var/log/myfile";
$contents=file_get_contents($build);
$demo=explode("\n",$contents);

Something like the above might be worth a try.


For the benefit of any newbies .....

The above reads in the file in one go, and explode() in effect turns 
$demo into $demo[0], $demo[1], $demo[2] and so on - the advantage being 
that this is much faster than reading in the text file one line at a 
time.
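
To show only the last few lines, array_slice() can trim that array - a
sketch, borrowing the path from the original post and making up the
line count (note it still loads the whole file into memory, so the
memory limit discussed later in this thread still applies):

<?php
// Split the file into lines, then keep only the last $int_lines entries.
$build     = "/var/log/myfile";
$int_lines = 50;

$demo = explode("\n", file_get_contents($build));
$tail = array_slice($demo, -$int_lines);

foreach ($tail as $line) {
    echo htmlspecialchars($line) . "<br />\n";
}
?>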


-- 
www.krustov.co.uk
me4 (19624)
4/19/2008 10:54:30 PM
On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
> Hey, read some tips/pointers on PHP.net but can't seem to solve this
> problem. I have a php page that reads the contents of a file and then
> displays the last XX lines of the file.  Problem is this...whenever
> the file gets larger than ~5MB, the page just displays nothing, as
> though a timeout has occurred but I get no error.  At 4.8MB (last
> confirmed size)...the function still works.  Any ideas what code below
> is lacking??
>
> [original code snipped]

The problem is probably due to the php.ini configuration of
max_execution_time.  The default is 30 seconds.
Try jacking the value up and see if it keeps executing.  Though any
script which takes more than 30 seconds is probably not the best
solution either.  Consider dumping the lines into a mysql table and
searching that way.
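
If you can't edit php.ini, the cap can usually be raised for a single
script instead (a sketch; 120 seconds is an arbitrary pick):

<?php
// Per-script override; has no effect when PHP runs in safe mode.
set_time_limit(120);          // or: ini_set('max_execution_time', '120');
?>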
4/19/2008 11:56:25 PM
tlpell@gmail.com wrote:
> Hey, read some tips/pointers on PHP.net but can't seem to solve this
> problem. I have a php page that reads the contents of a file and then
> displays the last XX lines of the file.  Problem is this...whenever
> the file gets larger than ~5MB, the page just displays nothing, as
> though a timeout has occurred but I get no error.  At 4.8MB (last
> confirmed size)...the function still works.  Any ideas what code below
> is lacking??
> 
> [original code snipped]
> 

Probably running out of memory in PHP...
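
A quick way to check (a sketch; memory_get_peak_usage() needs PHP 5.2+):

<?php
// Compare the script's actual usage against the configured ceiling.
echo 'current: ' . memory_get_usage() . " bytes\n";
echo 'peak:    ' . memory_get_peak_usage() . " bytes\n";
echo 'limit:   ' . ini_get('memory_limit') . "\n";
?>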

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/20/2008 12:54:10 AM
On Apr 19, 5:28 pm, Iván Sánchez Ortega <ivansanchez-...@rroba-
escomposlinux.-.punto.-.org> wrote:
> tlp...@gmail.com wrote:
> > Hey, read some tips/pointers on PHP.net but can't seem to solve this
> > problem. I have a php page that reads the contents of a file and then
> > displays the last XX lines of the file.
>
> <?php
> passthru("tail -n $lines $file");
> ?>

First, thank you for offering suggestions. I'll research using tail
but my initial problem with it is that it doesn't create an array
where I can display each line on a table row but I may be doing
something wrong...

$myfile = "/var/log/maillog";
$arrLog[]=passthru("tail -n $int_lines $myfile");

$int_number_of_lines = count($myfile);
if ($int_number_of_lines == 0)
{
  echo '<p><strong>No lines read.</strong></p>';
}
if ($int_number_of_lines < $int_lines)
{
  $int_lines = $int_number_of_lines;
}
$int_firstline = $int_number_of_lines - $int_lines;
echo 'Showing the last '.$int_lines.' lines out of '.
$int_number_of_lines.'<BR />';


 echo "<TABLE WIDTH=100% CLASS=\"mail\">\n";
  for ($i=$int_firstline; $i<$int_number_of_lines; $i++)
  {
    echo "<TR><TD>".$arrLog[$i]."</TD></TR>\n";
  }
 echo "</TABLE>\n";
tlpell (7)
4/20/2008 6:18:21 PM
On Apr 19, 6:56 pm, venti <timgreg...@shieldinvestmentgroup.com>
wrote:
> On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
>
> > [original post snipped]
>
> The problem is probably due to the php.ini configuration of
> max_execution_time.  The default is 30 seconds.
> Try jacking the value up and see if it keeps executing.  Though any
> script which takes more than 30 seconds is probably not the best
> solution either.  Consider dumping the lines into a mysql table and
> searching that way.

Thanks, but the page returns blank instantly...no time out.
tlpell (7)
4/20/2008 6:20:09 PM
tlpell@gmail.com wrote:
> On Apr 19, 6:56 pm, venti <timgreg...@shieldinvestmentgroup.com>
> wrote:
>> On Apr 19, 5:09 pm, tlp...@gmail.com wrote:
>>> [original post snipped]
>> The problem is probably due to the php.ini configuration of
>> max_execution_time.  The default is 30 seconds.
>> Try jacking the value up and see if it keeps executing.  Though any
>> script which takes more than 30 seconds is probably not the best
>> solution either.  Consider dumping the lines into a mysql table and
>> searching that way.
> 
> Thanks, but the page returns blank instantly...no time out.
> 

Enable errors and display them.  You'll see your problem.

In your php.ini file, put:

error_reporting = E_ALL
display_errors = on
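
Or, if you can't touch php.ini, the same switches at the top of the
script (a sketch - though a parse error in that same file will still
give a blank page, since the script never runs):

<?php
// Per-script equivalent of the php.ini settings above.
error_reporting(E_ALL);
ini_set('display_errors', '1');
?>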

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/20/2008 8:43:37 PM
On Apr 20, 3:43 pm, Jerry Stuckle <jstuck...@attglobal.net> wrote:
> tlp...@gmail.com wrote:
> > [earlier posts snipped]
>
> > Thanks, but the page returns blank instantly...no time out.
>
> Enable errors and display them.  You'll see your problem.
>
> In your php.ini file, put:
>
> error_reporting = E_ALL
> display_errors = on

Thanks...I see it's an error due to allocated memory but can anyone
explain why I could read the file just fine...it incremented by 45K
and then I couldn't read it anymore? And from what I'm reading in the
error, 4097 bytes exhausted 8388608?  Huh?

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
tlpell (7)
4/20/2008 11:22:46 PM
tlpell@gmail.com wrote:
> [earlier posts snipped]
>
> Thanks...I see it's an error due to allocated memory but can anyone
> explain why I could read the file just fine...it incremented by 45K
> and then I couldn't read it anymore? And from what I'm reading in the
> error, 4097 bytes exhausted 8388608?  Huh?
> 
> Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
> allocate 4097 bytes) in /var/www/html/maillog2.php on line 39
> 

Well, 8,388,608 bytes is 8MB - which just happens to be the default 
memory limit for PHP.  Your array had already used up the whole 8MB; 
the 4097 bytes in the message is just the next fgets() buffer PHP 
tried (and failed) to allocate on top of it.
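
You can raise it in php.ini (memory_limit = 16M) or, if the host allows
it, per script - a sketch, with an arbitrary value:

<?php
// Override the 8MB default for this script only.
ini_set('memory_limit', '16M');
?>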

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/20/2008 11:32:24 PM
On Apr 20, 6:22 pm, tlp...@gmail.com wrote:
> [earlier posts snipped]
>
> Thanks...I see it's an error due to allocated memory but can anyone
> explain why I could read the file just fine...it incremented by 45K
> and then I couldn't read it anymore? And from what I'm reading in the
> error, 4097 bytes exhausted 8388608?  Huh?
>
> Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to
> allocate 4097 bytes) in /var/www/html/maillog2.php on line 39

Solved!  I upped the max memory in php.ini from 8M to 16M.

Thanks for all the help.
tlpell (7)
4/20/2008 11:37:06 PM
tlpell@gmail.com wrote:

> Solved!  I upped the max memory in php.ini from 8M to 16M.

That doesn't solve it - it just postpones it.

Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
the line breaks. No wasted memory.

-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Proudly running Debian Linux with 2.6.24-1-amd64 kernel, KDE 3.5.9, and PHP
5.2.5-3 generating this signature.
Uptime: 01:53:55 up 11 days, 16:52,  2 users,  load average: 0.34, 0.37,
0.36

ISO
4/20/2008 11:56:21 PM
On Apr 20, 6:56 pm, Iván Sánchez Ortega <ivansanchez-...@rroba-
escomposlinux.-.punto.-.org> wrote:
> tlp...@gmail.com wrote:
> > Solved!  I upped the max memory in php.ini from 8M to 16M.
>
> That doesn't solve it - it just postpones it.
>
> Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
> the line breaks. No wasted memory.

Okay, the log is rotated nightly so it *may* not be a problem, but I'll
start reading up on those functions. Thanks for the assistance. But I
have a question. If I couldn't even fopen() the 5MB file before I
increased the php max size, how am I going to open it now and then do
an fseek() to find the end?
tlpell (7)
4/21/2008 12:54:41 AM
Iván Sánchez Ortega wrote:
> tlpell@gmail.com wrote:
> 
>> Solved!  I upped the max memory in php.ini from 8M to 16M.
> 
> That doesn't solve it - it just postpones it.
> 
> Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
> the line breaks. No wasted memory.
> 

Maybe, maybe not.  It depends on how large the file gets.

Reading 4K at a time and tacking things together across chunks also 
requires a lot more CPU.

I often give PHP 32-128MB, depending on the system and what's required 
for the site.  Memory is cheaper than CPU cycles.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/21/2008 1:09:09 AM
On 21 Apr, 02:09, Jerry Stuckle <jstuck...@attglobal.net> wrote:
> Iván Sánchez Ortega wrote:
> > tlp...@gmail.com wrote:
>
> >> Solved!  I upped the max memory in php.ini from 8M to 16M.
>
> > That doesn't solve it - it just postpones it.
>
> > Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
> > the line breaks. No wasted memory.
>
> Maybe, maybe not.  It depends on how large the file gets.
>
> Reading 4K at a time and trying to tack things together across chunks is
> also requires a much larger amount of CPU.
>
> I often give PHP 32-128MB, depending on the system and what's required
> for the site.  Memory is cheaper than CPU cycles.
>

Certainly, the right way to do this would be to use tail. The next most
correct way would be to read backwards from the end of the file - but
this is non-trivial, since you would need to do a lot of fseeking and
would probably end up using too much CPU. Another way to solve the
problem would be to use a rotating buffer:

$fp = fopen('/var/log/myfile', 'r');   // the file to tail
$lines = 0;
$keep = 200;                           // how many lines to retain
while (!feof($fp)) {
   $line = fgets($fp);
   if ($line === false) break;         // guard against the final EOF read
   $lines++;
   $buffer[$lines % $keep] = $line;    // newest line overwrites the oldest slot
}
fclose($fp);
// the oldest retained line sits in the slot just after the newest one
for ($x = ($lines % $keep) + 1; $x < $keep; $x++) {
   if (isset($buffer[$x])) print $buffer[$x];
}
for ($x = 0; $x <= ($lines % $keep); $x++) {
   if (isset($buffer[$x])) print $buffer[$x];
}
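
The nice property is that memory stays bounded at $keep lines no matter
how big the file gets - only the scan time grows with the file.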

C.
4/21/2008 12:47:22 PM
C. (http://symcbean.blogspot.com/) wrote:
> On 21 Apr, 02:09, Jerry Stuckle <jstuck...@attglobal.net> wrote:
>> Iván Sánchez Ortega wrote:
>>> tlp...@gmail.com wrote:
>>>> Solved!  I upped the max memory in php.ini from 8M to 16M.
>>> That doesn't solve it - it just postpones it.
>>> Open the file, fseek() to the end, read 4Kb chunks, then strrchr() to find
>>> the line breaks. No wasted memory.
>> Maybe, maybe not.  It depends on how large the file gets.
>>
>> Reading 4K at a time and trying to tack things together across chunks is
>> also requires a much larger amount of CPU.
>>
>> I often give PHP 32-128MB, depending on the system and what's required
>> for the site.  Memory is cheaper than CPU cycles.
>>
> 
> Certainly, the right way to this would be to use tail. The next more
> correct way would be to read backwards from the end of the file - but
> this is non-trivial since you would need to a lot of fseeking and
> would probably end up using too much CPU. Another way to solve the
> problem would be to use a rotating buffer:
>
> [code snipped]
> 
> C.
> 

Yes, tail is one way to do it.  But it also means you must have the 
privileges to exec tail.  Many (most?) shared hosts don't allow this.

And multiple seeks, etc. are again a cpu hog.  They can work if you have 
an idea what the size of your lines is, but if, like many log files, 
the lines vary considerably in length, it's much harder.  And the code 
is far more complicated.

I just find it much easier to get a decent amount of RAM allocated to 
PHP and read the entire file in.  It's not like you're talking 100MB or 
anything.

But if you are, then you need to take further steps to break it up.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/21/2008 4:30:44 PM
Jerry Stuckle wrote:

> Yes, tail is one way to do it.  But it also means you must have the
> privileges to exec tail.  Many (most?) shared hosts don't allow this.

Point taken. The same goes for windows hosts.

> And multiple seeks, etc. are again a cpu hog.

As compared to loading and looking through the entire file? I have to remind
you that the worst speed hog here is disk access. You *do* want to avoid
unnecessary disk access. And the only way to do that is by searching for
line breaks from the end of the file, using fseek().

C'mon, this is an old-timey C textbook exercise.

> They can work if you have an idea what the size of your lines is, but if, 
> like many log files, the lines vary considerably in length, it's much 
> harder.  And the code is far more complicated.

The method does work without any hassle... Really, try it out. And a 10-line
loop is not complicated.

<?php

$file = "/var/log/foobar";

define('CHUNK', 2048);

$lines_to_find = 10;
$lines = array();
$buffer = '';
$line_break = "\n";

// assumes the file is bigger than CHUNK and holds at least $lines_to_find lines
$fp = fopen($file, 'r');
fseek($fp, -CHUNK, SEEK_END);
$buffer = fread($fp, CHUNK);

while ($lines_to_find)
{
        // strrpos() can legitimately return 0, so test against false
        if ( ($pos = strrpos($buffer, $line_break)) !== false )
        {
                $lines[] = substr($buffer, $pos);
                $buffer  = substr($buffer, 0, $pos);
                $lines_to_find--;
        }
        else
        {
                // no line break in the buffer yet: step back and prepend another chunk
                fseek($fp, -2 * CHUNK, SEEK_CUR);
                $buffer = fread($fp, CHUNK) . $buffer;
        }
}

print_r($lines);

?>

(The lines will come in reverse order. I guess that you know how to (a)
reverse an array (b) print an array in reverse order)
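
For instance, replacing the print_r() above (a sketch):

foreach (array_reverse($lines) as $line) {
    print $line;    // oldest of the tailed lines first
}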

> I just find it much easier to get a decent amount of RAM allocated to
> PHP and read the entire file in.  It's not like you're talking 100MB or
> anything.

Please do excuse me here for being sarcastic, but if your first piece of
advice is "tweak php.ini to raise the memory limit", the original poster
*will* face 100 MB-long logfiles.


-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

MSN:i_eat_s_p_a_m_for_breakfast@hotmail.com
Jabber:ivansanchez@jabber.org ; ivansanchez@kdetalk.net
ISO
4/21/2008 11:51:51 PM
Iván Sánchez Ortega wrote:
> Jerry Stuckle wrote:
> 
>> Yes, tail is one way to do it.  But it also means you must have the
>> privileges to exec tail.  Many (most?) shared hosts don't allow this.
> 
> Point taken. The same goes for windows hosts.
> 
>> And multiple seeks, etc. are again a cpu hog.
> 
> As compared to loading and looking through the entire file? I have to remind
> you that the worst speed hog here is disk access. You *do* want to avoid
> unneccesary disk access. And the only way to do that is by searching line
> breaks from the end of the file, using fseek().
> 

Nowadays, disk transfer is done by hardware - especially in servers. 
While the disk is seeking, etc., the CPU can be handling other requests. 
Only during the relatively short reads is the bus tied up for I/O.  The 
joys of multiprocessing.

> C'mon, this is a old-timey C textbook exercise.
>

Yep, with the emphasis on OLD - DOS days, single process, software doing 
the data transfer...

>> They can work if you have an idea what the size of your lines are, but if, 
>> like many log files, the lines vary considerably in length, it's much 
>> harder.  And the code is far more complicated.
> 
> The method does work without any hassle... Really, try it out. And a 10-line
> loop is not complicated.
> 
> <?php
> 
> $file = "/var/log/foobar";
> 
> define('CHUNK',2048);
> 
> $lines_to_find = 10;
> $lines = array();
> $buffer = '';
> $line_break = "\n";
> 
> $fp = fopen($file,'r');
> fseek($fp, - CHUNK, SEEK_END);
> $buffer = fread($fp,CHUNK);
> 
> while($lines_to_find)
> {
>         if ( $pos = strrpos($buffer,$line_break) )
>         {
>                 $lines[] = substr($buffer,$pos);
>                 $buffer  = substr($buffer,0,$pos);
>                 $lines_to_find--;
>         }
>         else
>         {
>                 fseek($fp, - 2 * CHUNK, SEEK_CUR);
>                 $buffer = fread($fp,CHUNK) . $buffer;
>         }
> }
> 
> print_r($lines);
> 
> ?>
> 
> (The lines will come in reverse order. I guess that you know how to (a)
> reverse an array (b) print an array in reverse order)
> 

<?php
   $array = file('/var/log/foobar');
   $start = max(count($array) - 10, 0);
   for ($i = $start; $i < count($array); $i++)
     echo $array[$i] . "\n";
?>

Which is easier to read?  And BTW, the latter doesn't give you the extra 
stuff print_r() does.

>> I just find it much easier to get a decent amount of RAM allocated to
>> PHP and read the entire file in.  It's not like you're talking 100MB or
>> anything.
> 
> Please do excuse me here for being sarcastic, but if your first piece of
> advice is "tweak php.ini to raise the memory limit", the original poster
> *will* face 100 MB-long logfiles.
> 
> 

He didn't indicate that.  In fact, his log files are less than 5M.


-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/22/2008 1:05:27 AM
Jerry Stuckle wrote:

>> As compared to loading and looking through the entire file? I have to
>> remind you that the worst speed hog here is disk access. You *do* want to
>> avoid unneccesary disk access. And the only way to do that is by
>> searching line breaks from the end of the file, using fseek().
> 
> Nowadays, disk transfer is done by hardware - especially in servers.
> While the disk is seeking, etc., the CPU can be handling other requests.
> Only during the relatively short reads is the bus tied up for I/O.  The
> joys of multiprocessing.

Last time I checked, disk I/O wasn't "relatively short" - it was orders
of magnitude longer than memory I/O, as in milliseconds compared to
nanoseconds.

And yeah, OK, we got multiprocessing and DMA, but it doesn't mean that any
other processes don't need that I/O time as well.

Again, I don't think it's a good idea to load the entire file in memory.

Cheers,
-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Photographers do it with the lights off.
ISO
4/22/2008 7:41:35 AM
Iván Sánchez Ortega wrote:
> Jerry Stuckle wrote:
> 
>>> As compared to loading and looking through the entire file? I have to
>>> remind you that the worst speed hog here is disk access. You *do* want to
>>> avoid unneccesary disk access. And the only way to do that is by
>>> searching line breaks from the end of the file, using fseek().
>> Nowadays, disk transfer is done by hardware - especially in servers.
>> While the disk is seeking, etc., the CPU can be handling other requests.
>> Only during the relatively short reads is the bus tied up for I/O.  The
>> joys of multiprocessing.
> 
> Last time I checked, disk I/O wasn't "relatively short" - it was two orders
> of magnitude longer than memory I/O, as in milliseconds compared to
> nanoseconds.
>

Sure.  But the CPU is doing other things at the time.

> And yeah, OK, we got multiprocessing and DMA, but it doesn't mean that any
> other processes don't need that I/O time as well.
> 

Nope. But there's a lot going on which doesn't require disk I/O.  And in 
most active (and properly configured) web servers, a good portion of 
what is served comes from cache.

With your code it takes many more CPU cycles to accomplish the same 
thing, during which time nothing else requiring CPU cycles can be processed.

> Again, I don't think it's a good idea to load the entire file in memory.
> 
> Cheers,

It's fine for you to disagree.  I don't see a problem when you have a 
file which will be known not to grow to 100 MB.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/22/2008 11:06:24 AM
On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
> Iván Sánchez Ortega wrote:
>> Again, I don't think it's a good idea to load the entire file in memory.
>> 
>> Cheers,
>
> It's fine for you to disagree.  I don't see a problem when you have a 
> file which will be known not to grow to 100 MB.

.... Or some other arbitrary size that won't cause your system to go into
swap death. This is hardly a disagreement, btw.

-- 
"HTML's a cheap whore. Treating her with respect is possible, and even
preferable, because once upon a time she was a beautiful and virginal
format, but you shouldn't expect too much of her at this point." --M"K"H
hellsop (974)
4/22/2008 12:28:52 PM
Peter H. Coffin wrote:
> On Tue, 22 Apr 2008 07:06:24 -0400, Jerry Stuckle wrote:
>> Iván Sánchez Ortega wrote:
>>> Again, I don't think it's a good idea to load the entire file in memory.
>>>
>>> Cheers,
>> It's fine for you to disagree.  I don't see a problem when you have a 
>> file which will be known not to grow to 100 MB.
> 
> ... Or some other arbitrary size that won't cause your system to go into
> swap death. This is hardly a disagreement, btw.
> 

Sure, it's disagreement.  I say it's only a 5MB file, and not likely to 
grow to a huge size - go ahead and load it into memory.  Iván says read 
the file in chunks.  Sounds like a disagreement to me :-)

If we were talking hundreds of megabytes, I'd have a different answer.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/22/2008 1:42:05 PM
Jerry Stuckle wrote:

> With your code it takes many more CPU cycles to accomplish the same
> thing, during which time nothing else requiring CPU cycles can be
> processed.

Pardon me?

Complexity of my algorithm (the same as GNU "tail") is O(n), where n is the
number of bytes that make up the desired lines at the end of the file.
Efficiency here depends on strrpos(), which has a complexity of O(n).

Complexity of your algorithm is O(m^2 * log(m)), where m is the total size
of the file. file() must check *every* character read to see if it's a line
break - that takes O(m). Then, you're doing count() - as PHP arrays are
hash tables (well, ordered maps), it is well known that traversing it
takes O(m*log(m)).

What is your basis to say that parsing the entire file is more CPU efficient
than parsing the last lines starting from the end? Because the way I see
it, O(n) << O(m^2*log(m)).


Cheers,
-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

A computer is not a television set or a microwave oven; it is a complex
tool.
ISO
4/22/2008 8:50:22 PM
Iván Sánchez Ortega wrote:
> Jerry Stuckle wrote:
> 
>> With your code it takes many more CPU cycles to accomplish the same
>> thing, during which time nothing else requiring CPU cycles can be
>> processed.
> 
> Pardon me?
> 
> Complexity of my algorithm (the same as GNU "tail") is O(n), where n is the
> number of bytes that make up the desired lines at the end of the file.
> Efficiency here depends on strrpos(), which has a complexity of O(n).
> 
> Complexity of your algorithm is O(m^2 * log(m)), where m is the total size
> of the file. file() must check *every* character read to see if it's a line
> break - that takes O(m). Then, you're doing count() - as PHP arrays are
> hash tables (well, ordered maps), it is well known that traversing it
> takes O(m*log(m)).
> 
> What is your basis to say that parsing the entire file is more CPU efficient
> than parsing the last lines starting from the end? Because the way I see
> it, O(n) << O(m^2*log(m)).
> 
> 
> Cheers,

tail is a compiled program.  It is much more efficient than an 
interpreted one.

And the only searching the program has to do is for the new line 
character.  Even in an interpreted language, that can be optimized to be 
quite a fast operation.

As opposed to multiple calls to seek and read the file, doing your own 
searching... Much more code to go through and much more cpu intensive.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/22/2008 10:26:36 PM
Jerry Stuckle wrote:

> As opposed to multiple calls to seek and read the file, doing your own
> searching... Much more code to go through and much more cpu intensive.

Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
has a complexity of O(n), whereas the "file() - count() - for()" has
O(n^2*log(n)).

It just doesn't hold.

-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

MSN:i_eat_s_p_a_m_for_breakfast@hotmail.com
Jabber:ivansanchez@jabber.org ; ivansanchez@kdetalk.net
ISO
4/22/2008 11:09:22 PM
Iván Sánchez Ortega wrote:
> Jerry Stuckle wrote:
> 
>> As opposed to multiple calls to seek and read the file, doing your own
>> searching... Much more code to go through and much more cpu intensive.
> 
> Your argument doesn't hold here, Jerry. The longer "seek and read" algorithm
> has a complexity of O(n), whereas the "file() - count() - for()" has
> O(n^2*log(n)).
> 
> It just doesn't hold.
> 

You're assuming the path through the code is the same - or at least the 
same length in cpu cycles.  It isn't - not by a long shot.

Your argument is highly fallacious.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/23/2008 12:02:04 AM
Jerry Stuckle wrote:
> Iván Sánchez Ortega wrote:
>
>> The longer "seek and read" algorithm has a complexity of O(n), whereas 
>> the "file() - count() - for()" has O(n^2*log(n)).
> 
> Your argument is highly fallacious.

Would you please elaborate?

-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Q:      How does a hacker fix a function which
        doesn't work for all of the elements in its domain?
A:      He changes the domain.

ISO
4/23/2008 12:17:06 AM
Iván Sánchez Ortega wrote:
> Jerry Stuckle wrote:
>> Iván Sánchez Ortega wrote:
>>
>>> The longer "seek and read" algorithm has a complexity of O(n), whereas 
>>> the "file() - count() - for()" has O(n^2*log(n)).
>> Your argument is highly fallacious.
> 
> Would you please elaborate?
> 

You assume either way takes the same number of cpu cycles.  file() is a 
single call to fetch the entire file.  Searching for the new line 
characters is also very fast, in cpu cycles (it can be highly optimized 
in machine code). fopen(), then multiple fseek(), fread() and searching 
yourself for the newline characters, followed by fclose() is much more 
cpu intensive.  This is true in a compiled language also, but in an 
interpreted language the difference is even greater.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/23/2008 12:25:50 AM
Jerry Stuckle wrote:

[...]
> You assume either way takes the same number of cpu cycles.

No, I don't - I do assume that more efficient algorithms take less CPU
cycles, though.

> file() is a single call to fetch the entire file.  Searching for the new 
> line characters is also very fast, in cpu cycles (it can be highly 
> optimized in machine code). fopen(), then multiple fseek(), fread() and 
> searching yourself for the newline characters, followed by fclose() is 
> much more cpu intensive.

Are you telling me that the overhead of PHP's fseek() over C's fseek() (and
fopen() and fread(), etc.) is great enough to make up for one full order
of magnitude in the complexity of the algorithm and the extra use of
memory?

-- 
----------------------------------
Iván Sánchez Ortega -ivansanchez-algarroba-escomposlinux-punto-org-

Why fight to possess the mountains? When not even the memory of us
remains, and other generations keep killing each other to win them, they
will still be there, laughing at man. Apache proverb
ISO
4/23/2008 1:08:55 AM
Iván Sánchez Ortega wrote:
> Jerry Stuckle wrote:
> 
> [...]
>> You assume either way takes the same number of cpu cycles.
> 
> No, I don't - I do assume that more efficient algorithms take less CPU
> cycles, though.
> 

That's true if they're doing the same thing.  But in this case they aren't.

>> file() is a single call to fetch the entire file.  Searching for the new 
>> line characters is also very fast, in cpu cycles (it can be highly 
>> optimized in machine code). fopen(), then multiple fseek(), fread() and 
>> searching yourself for the newline characters, followed by fclose() is 
>> much more cpu intensive.
> 
> Are you telling me that the overhead of PHP's fseek() over C's fseek() (and
> fopen() and fread(), etc) is great enough to make it up for one full order
> of magnitude in the complexity of the algorithm and the extra use of
> memory?
> 

Interpreted languages always have more overhead, and the more time you 
spend in the interpreted code, the higher that overhead.  One call to 
file() has very little overhead because everything from that point on is 
compiled code.  But going back and forth between your code and the system 
functions has much more overhead.

Don't even try to compare performance in a compiled language vs. an 
interpreted one.  It's comparing apples and oranges.

-- 
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp.
jstucklex@attglobal.net
==================

jstucklex (14659)
4/23/2008 1:40:01 AM
On Tue, 22 Apr 2008 21:40:01 -0400, Jerry Stuckle wrote:

[putolin]

> Don't even try to compare performance in a compiled language vs. an
> interpreted one.  It's comparing apples and oranges.

Ever heard of perl?

-- 
Tayo'y Mga Pinoy
4/23/2008 8:45:11 PM
Baho Utot wrote:

> Ever heard of perl?
> 

Isn't she a singer?


-- 
www.krustov.co.uk
me4 (19624)
4/23/2008 9:08:22 PM