I am using multiple CSV files to populate the tables of a database. Since I am on Symfony, I created a command that reads the files from a given directory in a defined order. One CSV file corresponds to one table in my DB. File sizes differ from file to file, and some files contain more than 65 thousand lines.
My script has been running locally for 3 days; it is progressing, but it is heavy.
On a staging server (recette), my script stops after just a few minutes with the following error:
CRITICAL: Error thrown while running command "symfony-command-name". Message: "Failed to store cell BB18118 in cache" {"exception":"[object] (PhpOffice\PhpSpreadsheet\Exception(code: 0): Failed to store cell BB18118 in cache at project/vendor/phpoffice/phpspreadsheet/src/PhpSpreadsheet/Collection/Cells.php:393)
I am using Symfony's FilesystemAdapter with the SimpleCacheBridge (1.1).
In my Symfony command I do:

use Cache\Bridge\SimpleCache\SimpleCacheBridge;
use PhpOffice\PhpSpreadsheet\Settings;
use Symfony\Component\Cache\Adapter\FilesystemAdapter;

$pool = new FilesystemAdapter();
$simpleCache = new SimpleCacheBridge($pool); // wraps the PSR-6 pool as a PSR-16 cache
Settings::setCache($simpleCache); // PhpOffice\PhpSpreadsheet\Settings
// loop over the directories and call a Symfony service (sketched below) ...
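To give more context, the directory loop looks roughly like this. It is a simplified sketch: the command name, the CsvImporter service, the 'directory' argument, and the Finder-based ordering are placeholders for my real code.

use App\Service\CsvImporter; // hypothetical import service
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputArgument;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Finder\Finder;

class ImportCsvCommand extends Command
{
    protected static $defaultName = 'app:import-csv'; // placeholder name

    private CsvImporter $importer;

    public function __construct(CsvImporter $importer)
    {
        parent::__construct();
        $this->importer = $importer;
    }

    protected function configure(): void
    {
        $this->addArgument('directory', InputArgument::REQUIRED, 'Directory containing the CSV files');
    }

    protected function execute(InputInterface $input, OutputInterface $output): int
    {
        // the PhpSpreadsheet cache setup shown above (FilesystemAdapter + SimpleCacheBridge) goes here

        // collect the CSV files in a defined order (here: alphabetical by file name)
        $finder = (new Finder())
            ->files()
            ->in($input->getArgument('directory'))
            ->name('*.csv')
            ->sortByName();

        foreach ($finder as $file) {
            $output->writeln('Importing ' . $file->getFilename());
            // load one CSV file into its corresponding table
            $this->importer->import($file->getRealPath());
        }

        return Command::SUCCESS;
    }
}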
In the Symfony service:

use PhpOffice\PhpSpreadsheet\IOFactory;

$spreadSheet = IOFactory::load($csvPath);
$sheet = $spreadSheet->getActiveSheet()->toArray();
// loop over the rows and run the database operations (sketched below) ...
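The row loop is roughly the following. This is a simplified stand-in for the real mapping: the 'my_table' name, the header-row handling, and the injected Doctrine DBAL connection are assumptions, not my exact code.

// continuing from $sheet above; $this->connection is a Doctrine\DBAL\Connection injected into the service (assumption)
$columns = array_shift($sheet); // first row holds the column names (assumption)

foreach ($sheet as $row) {
    // map each CSV row onto the table columns and insert it; 'my_table' is a placeholder
    $this->connection->insert('my_table', array_combine($columns, $row));
}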
Stack: PHP 7.4, Symfony 5.3.9, phpspreadsheet 1.18, SimpleCache 1.1.
Any help, please?