I am building a news site written in PHP. Pages on the site contain content that needs to be updated automatically, based on information stored on the server. My idea was to keep all of that information in a single .txt file, so I can read it with PHP and update the pages. When a user loads the site, a PHP script reads the whole file into a string, splits it twice, and then works with the resulting arrays. The problem is that this .txt file can get really big (around 10 MB), and with a lot of traffic it is probably a bad idea to read such a large file on every page load, so I need something faster and better. One possible solution is a MySQL database, but would it make a big difference? Also, are there any tricks to optimize website performance with PHP?
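For context, one way to avoid re-reading the whole file on every request is to parse it once and cache the parsed array, re-parsing only when the file changes. This is a minimal sketch, not the site's actual code; the function name, cache file name, and use of `serialize()` are my own assumptions:

```php
<?php
// Sketch: parse postovi.txt once, cache the parsed array on disk,
// and re-parse only when the source file is newer than the cache.
// (loadPosts and the cache file name are assumptions, not site code.)
function loadPosts(string $file, string $cacheFile): array
{
    if (is_file($cacheFile) && filemtime($cacheFile) >= filemtime($file)) {
        // Cache is up to date: skip parsing entirely.
        return unserialize(file_get_contents($cacheFile));
    }

    // Same parsing as before: split into posts, then into fields.
    $raw = file_get_contents($file);
    $posts = [];
    foreach (explode("<novipost>", $raw) as $post) {
        $posts[] = explode("|", $post);
    }

    // Store the parsed structure so the next request can reuse it.
    file_put_contents($cacheFile, serialize($posts), LOCK_EX);
    return $posts;
}
```

This trades one big read-and-split per request for a single `filemtime()` check in the common case; a shared-memory cache such as APCu would go further, but the idea is the same.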
NOTE: Here is the script that reads the file:
<?php
$myfile = fopen("postovi.txt", "r") or die("Unable to open file!");
$smejanje = fread($myfile, filesize("postovi.txt"));
$postovi = explode("<novipost>", $smejanje);
$i = 0;
$duzina = sizeof($postovi);
while ($i < $duzina) // duzina = number of posts (this loop loads them all)
{
    $postovi[$i] = explode("|", $postovi[$i]);
    $i++;
}
fclose($myfile);
?>
and here is the HTML element that is updated with information from that file:
<?php $id++; ?>
<div class="mali-post-h">
  <a href="<?php echo($postovi[$id][0]); ?>" id="<?php echo("mali-post-link-".$id); ?>">
    <div class="mali-post" style="background-image: url(<?php echo $postovi[$id][3]; ?>);" id="<?php echo("mali-post-".$id); ?>">
      <div class="mali-post-naslov">
        <p id="<?php echo("mali-post-naslov-".$id); ?>" class="customfont"><?php echo $postovi[$id][2]; ?></p>
        <p style="font-size:10px"> <span style="font-weight: 900; font-size:15px; color: #3366FF;"> <?php echo $znak ?> </span> <?php
        $time = strtotime($postovi[$id][4]);
        echo ' pre '.humanTiming($time); ?> </p>
      </div>
    </div>
  </a>
</div>
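The snippet above calls a `humanTiming()` helper that is not shown. For completeness, this is an assumed implementation of such a "time ago" formatter, not the site's actual function; the Serbian unit names are my guess to match the `' pre '` prefix used above:

```php
<?php
// Sketch of an assumed humanTiming() helper: turns a Unix timestamp
// into a rough "N units ago" string (unit names are assumptions).
function humanTiming(int $time): string
{
    $diff = time() - $time;
    $units = [
        31536000 => 'godina',  // years
        2592000  => 'meseci',  // months
        86400    => 'dana',    // days
        3600     => 'sati',    // hours
        60       => 'minuta',  // minutes
        1        => 'sekundi', // seconds
    ];
    foreach ($units as $secs => $name) {
        if ($diff >= $secs) {
            // Use the largest unit that fits the elapsed time.
            return floor($diff / $secs) . ' ' . $name;
        }
    }
    return '0 sekundi';
}
```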