Hi all, I have a quick question about best practices, and I could use some help with queuing and image manipulation.
I'm currently working on a website that lets the user upload more than 10 files at a time. In my experience I've only really handled single uploads, or 2-3 at most; this site lets the user upload as many files as they like, and it then performs image manipulation to create 3 versions of each image at different sizes.
My thought process, and how I've implemented this, goes as follows.
The user goes to the upload form and selects multiple files. These are all uploaded inline, and when they have finished, the form auto-submits. The uploaded files go directly into a temporary folder in S3. I did it this way because the live environment has multiple servers behind a load balancer, so I was worried that if I uploaded the files to one server and then fired a queue job, the job might run on a different server and not find the files. It would be great if there were a nicer way of doing this.
When the form is submitted, it fires a notification to the queue on iron.io with the data from the form submit. That basically calls the server and starts the image processing; the code for this is below.
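For context, this is roughly the shape of the payload that gets pushed onto the queue. The field names match what the `fire()` handler below reads; the `buildBulkUploadPayload()` helper itself is hypothetical, just to illustrate the structure, and the commented-out push call is a sketch of the Laravel 4 `Queue::push` syntax rather than my exact controller code.

```php
<?php
// Hypothetical helper showing the payload shape the fire() handler expects.
// Every key here is read somewhere in the job code below.
function buildBulkUploadPayload(array $files, array $dates, $vehicleObjectId, $type, $privacy, $pid)
{
    return array(
        'file'             => $files,  // S3 keys already sitting in the temporary folder
        'date'             => $dates,  // one entry per file, or the string 'no-date'
        'vehicle_objectId' => $vehicleObjectId,
        'type'             => $type,
        'privacy'          => $privacy,
        'pid'              => $pid,    // Parse user id, used to email the user afterwards
    );
}

// With the iron queue driver in Laravel 4, the push itself looks like:
// \Queue::push('BulkUploadProcessor', buildBulkUploadPayload(/* ... */));
```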
public function fire($job, $data)
{
    set_time_limit(0);
    try {
        if (is_array($data)) {
            foreach ($data['file'] as $x => $file) { // loop through each uploaded file and save it
                if ($this->images->doesMediaExistInTemporaryFolder($file)) {
                    if ($new_file = $this->images->getMediaFromTemporaryS3Folder($file)) {
                        file_put_contents(app_path() . '/storage/bulk-upload/' . $file, (string) $new_file['Body']);
                        $record_date = false;
                        $exif_data = null; // reset so EXIF data from the previous file can't leak into this iteration
                        if ($data['date'][$x] != 'no-date') {
                            if ($new_file['ContentType'] == 'image/jpeg') {
                                $exif_data = @exif_read_data(app_path() . '/storage/bulk-upload/' . $file, 'FILE');
                            }
                            if (!empty($exif_data) && array_key_exists('DateTime', $exif_data)) {
                                $record_date = $exif_data['DateTime'];
                            } else {
                                $record_date = $data['date'][$x];
                            }
                        }
                        $created_file = new \Symfony\Component\HttpFoundation\File\UploadedFile(app_path() . '/storage/bulk-upload/' . $file, $file, $new_file['ContentType']);
                        $input = array('vehicle_objectId' => $data['vehicle_objectId'], 'type' => $data['type'], 'privacy' => $data['privacy'], 'date' => $record_date);
                        if (file_exists(app_path() . '/storage/bulk-upload/' . $file)) {
                            if ($record = $this->record_repository->save($input, $created_file)) {
                                unlink(app_path() . '/storage/bulk-upload/' . $file);
                                $this->images->deleteMediaFromTemporaryS3(array(array('Key' => $file)));
                            } else {
                                $data['filename'] = $file;
                                \Mail::send('emails.bulk-upload', $data, function ($message) {
                                    $message->to('email', 'Daniel Newns')->subject('Bulk upload save issue');
                                });
                            }
                        }
                    }
                }
            }
            $parse = new \ParseRestClient();
            $user = $parse->retrieveCurrentUser($data['pid']);
            if (isset($user->email)) {
                $vehicle_url = \URL::route('vehicles.show', $data['vehicle_objectId']);
                $body = "<p>Hi " . $user->username . "</p><p>Your records have all been created. View them all as part of your vehicle record <a href='" . $vehicle_url . "'>here</a></p>";
                $message = array(
                    'to' => array(array('email' => $user->email)),
                    'from_email' => 'xxxxx',
                    'from_name' => 'xxxxx'
                );
                $template_content = array(array("name" => "share", "content" => $body));
                $response = \Email::messages()->sendTemplate('Bulk_Upload', $template_content, $message);
            }
        }
    } catch (\Exception $e) {
        $message = array(
            'to' => array(array('email' => 'email')),
            'from_email' => 'email',
            'from_name' => 'xxxxxx'
        );
        $content = '<p>' . $e->getMessage() . '</p>';
        $content .= '<p>' . $e->getTraceAsString() . '</p>';
        $template_content = array(array("name" => "share", "content" => $content));
        $response = \Email::messages()->sendTemplate('Content_Share', $template_content, $message);
    }
}
As you can see, it loops through the data returned from the queue and iterates over the files. For each one it pulls the image from S3 and stores it locally, then checks whether a date was set and works out the created date from either that or the EXIF data. It then creates the file and saves the record; the save function performs all the resizing required.
My question is really whether anyone has suggestions on how I can improve this, as I'm occasionally getting emails from the exception handler saying it can't find a certain image, as if it was never created locally. Is my method of creating the image locally using file_put_contents the one I should be using, or is there a better way for me to pull the data from S3 and work with it? I've put a number of if statements in to stop things falling through the gaps.
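For reference, one alternative I've been looking at is having the SDK write the object straight to disk via the `SaveAs` parameter of `getObject` (AWS SDK for PHP v2), rather than stringifying the body myself, and then failing loudly if the file didn't materialise. The `fetchFromS3ToPath()` wrapper here is hypothetical, just a sketch of the idea:

```php
<?php
// Hypothetical helper: download an S3 object straight to a local path using
// the SDK's SaveAs option, then confirm it actually landed on disk before
// the resizing code ever touches it.
function fetchFromS3ToPath($s3, $bucket, $key, $localPath)
{
    $s3->getObject(array(
        'Bucket' => $bucket,
        'Key'    => $key,
        'SaveAs' => $localPath, // the SDK streams the body into this file
    ));

    // Guard against the "image not created locally" errors: fail here,
    // with a clear message, instead of later in the resize step.
    clearstatcache(true, $localPath);
    if (!file_exists($localPath) || filesize($localPath) === 0) {
        throw new \RuntimeException('S3 download did not materialise: ' . $localPath);
    }

    return $localPath;
}
```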
It would be great to hear other people's thoughts on where I've gone wrong here and what I could do to improve this. Perhaps I could store an array of the files that don't exist on the first loop and then try them again afterwards, since I was thinking it might be a case of the code executing before the image exists; could that be what's happening?
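Along those lines, the retry idea I'm describing could look something like this. It's a minimal sketch: `$fetchFile` stands in for my getMediaFromTemporaryS3Folder call and is any callable returning true on success, false if the file isn't available yet.

```php
<?php
// Minimal sketch of the "collect the misses, try again" idea: run one pass,
// remember the files that weren't available yet, and re-attempt just those
// a few times with an optional pause, in case S3 is still settling.
function processWithRetry(array $files, $fetchFile, $maxRetries = 3, $delaySeconds = 0)
{
    $pending = $files;
    for ($attempt = 0; $attempt <= $maxRetries && count($pending) > 0; $attempt++) {
        if ($attempt > 0 && $delaySeconds > 0) {
            sleep($delaySeconds); // give S3 a moment before the next pass
        }
        $stillMissing = array();
        foreach ($pending as $file) {
            if (!call_user_func($fetchFile, $file)) {
                $stillMissing[] = $file; // queue it for the next pass
            }
        }
        $pending = $stillMissing;
    }
    return $pending; // anything left here genuinely failed, so email about these only
}
```

That way the exception email would only go out for files that still don't exist after several attempts, rather than on the first miss.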
Any help would be much appreciated.
Thanks