
I am having a lot of problems while writing to a file within a foreach loop. It only ever writes one line from the array, either the last one or the first one.

For Example:

A file contains such elements,

page.php?id=1
page.php?id=3
page.php?id=4
investor.php?id=1&la=1
page.php?id=15
page.php?id=13
page.php?id=14

The code will open this file and then split each line with explode, using = as the delimiter. That will return these elements:

page.php?id
page.php?id
page.php?id
investor.php?id
page.php?id
page.php?id
page.php?id

Then it should pick out the unique elements using the array_unique function and save them to a file. This is the code I have. Please help me.

    $lines = file($fopen2);
    foreach ($lines as $line) {
        $rfi_links = explode("=", $line);
        echo $array = $rfi_links[0];
        $save1 = $rfi.$file.$txt;
        $fp = fopen("$save1", "w+");
        fwrite($fp, $array);
        fclose($fp);
    }
    $links_duplicate_removed = array_unique($array);
    print_r($links_duplicate_removed);
Nida Zubair
  • Beyond opening the file in the foreach loop and truncating it each time, you're not removing duplicates until after the foreach loop and the file has been written, so everything, including duplicates, would wind up in the file. – Jason Mar 17 '12 at 11:30

2 Answers


What does not quite make sense is that you always write the current url to that file while overwriting its previous content. In every step of the foreach loop, you reopen the file, erase its content, and write a single url to it. In the next step, you reopen exactly the same file and do that again. That's why you end up with only the last url in the file.

You will need to collect all urls in an array, throw out duplicates and then write the unique ones to the disc:

$lines = file($fopen2);
$urls = array();                          // <-- create empty array for the urls

foreach ($lines as $line) {
    $rfi_links = explode('=', $line, 2);  // <-- you need only two parts, right?
    $urls[] = $rfi_links[0];              // <-- push new URL to the array
}

// Remove duplicates from the array
$links_duplicate_removed = array_unique($urls);

// Write unique urls to the file:
file_put_contents($rfi.$file.$ext, implode(PHP_EOL, $links_duplicate_removed));
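A minimal, self-contained sketch of this first approach, using hypothetical sample lines mirroring the question's input file, shows how explode with a limit of 2 and array_unique interact:

```php
<?php
// Hypothetical sample input, standing in for file($fopen2)
$lines = ["page.php?id=1", "page.php?id=3", "investor.php?id=1&la=1", "page.php?id=15"];

$urls = array();
foreach ($lines as $line) {
    $parts = explode('=', $line, 2);  // limit 2: split only at the first '='
    $urls[] = $parts[0];
}

// array_unique() keeps the first occurrence of each value
// and preserves the original keys (here: 0 and 2)
$unique = array_unique($urls);

echo implode(PHP_EOL, $unique), PHP_EOL;
// page.php?id
// investor.php?id
```

Note that array_unique preserves keys rather than reindexing; implode does not care, but if you need a 0-based array afterwards, wrap the result in array_values.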

Another solution (much more inspired by your former method) is to open the file once, before starting to iterate over the lines:

$lines = file($fopen2);
$urls = array();

// Open file
$fp = fopen($rfi.$file.$ext, 'w');

foreach ($lines as $line) {
    $rfi_url = explode('=', $line, 2);

    // check if that url is new
    if (!in_array($rfi_url[0], $urls)) {
        // it is new, so add it to the array (=mark it as "already occured")
        $urls[] = $rfi_url[0];

        // Write new url to the file
        fputs($fp, $rfi_url[0] . PHP_EOL);
    }
}

// Close the file
fclose($fp);
Niko

"w+" would create a new file on each open, wiping out the old content.

"a+" solves the problem, but it's better to open the file for writing before the loop, and closing after it.

Karoly Horvath
  • 94,607
  • 11
  • 117
  • 176