
I am working on integrating the jQuery File Upload plugin with AWS. I have completed the upload section successfully, but now I am looking to integrate the image resize feature.
I am using this plugin's code. I have set up an example using minimal code, which is below.

index.html

    <!DOCTYPE HTML>
    <html>
    <head>
    <meta charset="utf-8">
    <title>jQuery File Upload Example</title>
    </head>
    <body>
    <input id="fileupload" type="file" name="files[]" data-url="aws/" multiple>
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
    <script src="js/vendor/jquery.ui.widget.js"></script>
    <script src="js/jquery.iframe-transport.js"></script>
    <script src="js/jquery.fileupload.js"></script>
    <script>
    $(function () {
        $('#fileupload').fileupload({
            dataType: 'json',
            done: function (e, data) {
                $.each(data.result.files, function (index, file) {
                    $('<p/>').text(file.name).appendTo(document.body);
                });
            }
        });
    });
    </script>
    </body>
    </html>

awssdk.php: this is the file I call after an image is selected.

    <?php
    $bucket = "my bucket name";
    $subFolder = "";  // leave blank to upload into the bucket directly
    if (!class_exists('S3')) require_once('S3.php');

    // AWS access info
    if (!defined('awsAccessKey')) define('awsAccessKey', 'my key');
    if (!defined('awsSecretKey')) define('awsSecretKey', 'my secret key');

    $options = array(
        'image_versions' => array(
            'small' => array(
                'max_width' => 1920,
                'max_height' => 1200,
                'jpeg_quality' => 95
            ),
            'medium' => array(
                'max_width' => 800,
                'max_height' => 600,
                'jpeg_quality' => 80
            ),
            'thumbnail' => array(
                'max_width' => 80,
                'max_height' => 80
            )
        )
    );

    // Instantiate the class
    $s3 = new S3(awsAccessKey, awsSecretKey);

    function getFileInfo($bucket, $fileName) {
        global $s3;
        $fileArray = array();
        // getObjectInfo() returns metadata for a single object;
        // getBucket() would list the whole bucket, not this file's size
        $objectInfo = $s3->getObjectInfo($bucket, $fileName);
        $furl = "http://" . $bucket . ".s3.amazonaws.com/" . $fileName;
        $fileArray['name'] = $fileName;
        $fileArray['size'] = $objectInfo ? $objectInfo['size'] : 0;
        $fileArray['url'] = $furl;
        $fileArray['thumbnail'] = $furl;
        $fileArray['delete_url'] = "server/php/index.php?file=" . $fileName;
        $fileArray['delete_type'] = "DELETE";
        return $fileArray;
    }

    function uploadFiles($bucket, $prefix = "") {
        global $s3;
        if (isset($_REQUEST['_method']) && $_REQUEST['_method'] === 'DELETE') {
            return "";
        }
        $upload = isset($_FILES['files']) ? $_FILES['files'] : null;
        $info = array();
        if ($upload && is_array($upload['tmp_name'])) {
            foreach ($upload['tmp_name'] as $index => $value) {
                $fileTempName = $upload['tmp_name'][$index];
                $fileName = isset($_SERVER['HTTP_X_FILE_NAME']) ? $_SERVER['HTTP_X_FILE_NAME'] : $upload['name'][$index];
                $fileName = $prefix . str_replace(" ", "_", $fileName);
                // putObjectFile() returns a boolean
                $response = $s3->putObjectFile($fileTempName, $bucket, 'images/' . $fileName, S3::ACL_PUBLIC_READ);
                if ($response === true) {
                    // pass the full key so the URL and size lookup match the uploaded object
                    $info[] = getFileInfo($bucket, 'images/' . $fileName);
                } else {
                    echo "<strong>Something went wrong while uploading your file... sorry.</strong>";
                }
            }
        } else {
            if ($upload || isset($_SERVER['HTTP_X_FILE_NAME'])) {
                $fileTempName = $upload['tmp_name'];
                $fileName = isset($_SERVER['HTTP_X_FILE_NAME']) ? $_SERVER['HTTP_X_FILE_NAME'] : $upload['name'];
                $fileName = $prefix . str_replace(" ", "_", $fileName);
                // putObjectFile() returns a boolean, not a response object,
                // so isOK() cannot be called on it
                $response = $s3->putObjectFile($fileTempName, $bucket, $fileName, S3::ACL_PUBLIC_READ);
                if ($response === true) {
                    $info[] = getFileInfo($bucket, $fileName);
                } else {
                    echo "<strong>Something went wrong while uploading your file... sorry.</strong>";
                }
            }
        }
        header('Vary: Accept');
        $json = json_encode($info);
        $redirect = isset($_REQUEST['redirect']) ? stripslashes($_REQUEST['redirect']) : null;
        if ($redirect) {
            header('Location: ' . sprintf($redirect, rawurlencode($json)));
            return;
        }
        if (isset($_SERVER['HTTP_ACCEPT']) && (strpos($_SERVER['HTTP_ACCEPT'], 'application/json') !== false)) {
            header('Content-type: application/json');
        } else {
            header('Content-type: text/plain');
        }
        return $info;
    }
    ?>

Here is the S3 class I am using.

The jQuery File Upload plugin ships with a server-side PHP class to upload images, which is great. But as I am using AWS, I have to use their API to upload images, and the plugin's server-side code won't work as-is. As I mentioned above, I have implemented the upload portion, but I need help creating image thumbnails and differently sized images before I upload, i.e. I want images uploaded in 3 variations, e.g. original, 1024x768, 100x100.
UploadHandler.php has a few functions for creating scaled images, for example `protected function create_scaled_image($file_name, $version, $options)` for scaling, among others. I am stuck at integrating these functions, as I am new to OO PHP and AWS.
If anyone has done something similar, your input would be helpful.
Thank you
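In outline, what I think needs to happen is: create the scaled copies locally (e.g. with GD) and push each one to S3 with the same `putObjectFile()` call. A minimal sketch of the resize step; the helper name, version sizes, and S3 keys below are illustrative, not taken from the plugin:

```php
<?php
// Sketch only: create a resized JPEG copy of an image with GD.
// Helper name and version sizes are illustrative, not from the plugin.
function createScaledVersion($srcPath, $dstPath, $maxWidth, $maxHeight, $quality = 85) {
    list($width, $height) = getimagesize($srcPath);
    $scale = min($maxWidth / $width, $maxHeight / $height, 1); // never upscale
    $newW = (int) round($width * $scale);
    $newH = (int) round($height * $scale);

    $src = imagecreatefromjpeg($srcPath); // assumes JPEG input
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $width, $height);
    imagejpeg($dst, $dstPath, $quality);
    imagedestroy($src);
    imagedestroy($dst);
    return $dstPath;
}

// Then, per uploaded file (variable names assumed from the code above):
// $s3->putObjectFile($fileTempName, $bucket, 'images/'.$fileName, S3::ACL_PUBLIC_READ); // original
// foreach (array('medium' => array(800, 600), 'thumbnail' => array(100, 100)) as $v => $dims) {
//     $tmp = tempnam(sys_get_temp_dir(), 'img');
//     createScaledVersion($fileTempName, $tmp, $dims[0], $dims[1]);
//     $s3->putObjectFile($tmp, $bucket, 'images/'.$v.'/'.$fileName, S3::ACL_PUBLIC_READ);
//     unlink($tmp);
// }
```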

KillABug
  • You haven't mentioned where exactly you are stuck. If you already have the upload working, what is the problem with adding some more code to create thumbnails? – rineez Jul 05 '13 at 13:46
  • Yes, but I want to use the same functions as in UploadHandler.php (I have added a link to it). It has some awesome functions that work great, but it involves some dependencies that are troubling me. I want help using those functions, for example `protected function create_scaled_image($file_name, $version, $options)` for scaling, and others – KillABug Jul 05 '13 at 13:51

2 Answers


It seems you are trying to use the methods of the UploadHandler class inside your code in awssdk.php.

I think the right way to go is to customize the UploadHandler class itself, more specifically the handle_file_upload function. This will probably be more beneficial for you, as you keep access to all the good features of the UploadHandler class this way.

Then you can put just the following lines in your awssdk.php:

require('UploadHandler.php');
$upload_handler = new UploadHandler();

You can see that currently the code in this function stores the uploaded files in the path set in the "upload_dir" option. You just need to make an object of the S3 class inside that function and change the code to store the uploaded file on S3.

I think the line you will have to change in UploadHandler is probably line 703:

move_uploaded_file($uploaded_file, $file_path);

Should become:

$s3 = new S3(awsAccessKey, awsSecretKey);
// putObjectFile() in this S3 class returns a boolean, not a response object
$response = $s3->putObjectFile($uploaded_file, $bucket, $file->name, S3::ACL_PUBLIC_READ);
if ($response) {
    $info[] = getFileInfo($bucket, $file->name);
} else {
    $file->error = "<strong>Something went wrong while uploading your file... sorry.</strong>";
}

You may also need to bring related code, for example the getFileInfo function, into the UploadHandler class.
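One way to sketch this without editing the library file directly is a subclass. The method signature below follows blueimp's UploadHandler; the bucket name and S3 details are assumptions, untested:

```php
<?php
// Sketch: subclass UploadHandler so the parent saves and scales the
// file locally first, then push the finished file to S3.
// Bucket name and S3 class usage are assumptions, not tested.
require('UploadHandler.php');
require_once('S3.php');

class S3UploadHandler extends UploadHandler {
    protected function handle_file_upload($uploaded_file, $name, $size, $type, $error,
            $index = null, $content_range = null) {
        // Let the parent do the local save and the image_versions scaling
        $file = parent::handle_file_upload($uploaded_file, $name, $size, $type, $error,
            $index, $content_range);
        if (empty($file->error)) {
            $s3 = new S3(awsAccessKey, awsSecretKey);
            $localPath = $this->get_upload_path($file->name);
            if (!$s3->putObjectFile($localPath, 'my-bucket', 'images/' . $file->name, S3::ACL_PUBLIC_READ)) {
                $file->error = 'S3 upload failed';
            }
        }
        return $file;
    }
}

$upload_handler = new S3UploadHandler();
```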

Lines 707-711 appear to handle file uploads through the PUT method. To handle this case you will have to keep those lines so the file is first uploaded to your server, then transfer the file to S3, and then you can unlink the file on your server. But it is also safe to just comment out those lines if you allow the POST method only.

Lines 697-701 appear to handle chunked uploads (I'm not sure). You will have to change those as well if you wish to handle that case.
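The "save locally, transfer, then clean up" idea can be sketched like this; the helper name and key prefix are mine, not from the plugin:

```php
<?php
// Sketch: after the file has been written locally (POST or PUT),
// push it to S3 and remove the local copy. Bucket and key are placeholders.
function transferToS3($s3, $bucket, $filePath, $key) {
    // putObjectFile() in the standalone S3 class returns a boolean
    $ok = $s3->putObjectFile($filePath, $bucket, $key, S3::ACL_PUBLIC_READ);
    if ($ok) {
        unlink($filePath); // local copy no longer needed
    }
    return $ok;
}

// e.g. inside handle_file_upload, once $file_path is complete:
// transferToS3($s3, $bucket, $file_path, 'images/' . $file->name);
```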

rineez
  • Yes, but the path set in upload_dir has to be the path of the S3 directory. But S3 directories are all over `http://` and it won't allow me to have a path that way. I have tried this: `'upload_dir' => dirname('https://s3.amazonaws.com/elasticbeanstalk-us-east-1-966938761981/images/'), 'upload_url' => 'https://s3.amazonaws.com/elasticbeanstalk-us-east-1-966938761981/images/',` – KillABug Jul 05 '13 at 15:45
  • I get this error for `move_uploaded_file` if I change the paths as in the above comment: `failed to open stream: HTTP wrapper does not support writeable connections` – KillABug Jul 05 '13 at 15:52
  • 1
    I appreciate your response but I am kind of stuck badly,trying to help myself,I will need your help.`To handle this case you will have to keep those lines to let the file be uploaded to your server first and then transfer the file to S3,` I did not get your these lines as AWS itself supports uploads via PUT – KillABug Jul 05 '13 at 16:04
  • 1
    Sorry I didn't know AWS supports PUT upload. That's why I said so. If S3 class already support uploads from a PUT request, you can definitely replace those lines(707-711) with AWS code for PUT upload. Basically the three cases for handling a file upload will be all you need to change there. But You cannot just replace Amazon API urls in place of upload_dir. I think you can actually get rid of all the code that is related to upload_dir. – rineez Jul 06 '13 at 07:47
  • I was actually suggesting you should take the file upload code in awssdk.php and put it inside UploadHandler.php – rineez Jul 06 '13 at 07:52
  • I am trying it out! Will check and let you know. Thank you very much for your feedback – KillABug Jul 06 '13 at 10:17
  • I have succeeded in uploading the image as you suggested, but as I said earlier, I want to upload the image in different versions (sizes) like `original, 1024x768, thumbnail (100x100)` etc. I am trying to use the `image_versions` array to create images of different sizes and then upload them. But I am not sure how to first create and then upload the thumbnails and other sizes – KillABug Jul 07 '13 at 19:45
  • Sorry, I didn't notice the scaling happens only after line 719 inside `handle_image_file` method. So I think there has to be a change in approach to do this properly. You may need to revert back those previously changed lines and let the uploaded files be saved in your server first. And put the code for uploading to S3 somewhere after line 719. Clearly understand the logic of `handle_file_upload` method and figure out the correct place to put S3 uploading. May be it's ok to upload to S3 just before `return $file;`. I'll also update my answer after I have a closer look at that code. – rineez Jul 08 '13 at 04:50
  • OK rineez! Are you saying that I need to save the image first on my server and then move it to S3? I am not sure if that's good, as it may take a long time to upload large images. I am checking it from my side. Also, thank you for the help; you have been a great support for me on this, and I hope I get past it with your help! – KillABug Jul 08 '13 at 05:46
  • Hi rineez, I worked on it and made a few changes, but as I said, it takes time when I upload first to my server and then to S3. It takes around 5 secs for a 500kb image, which to me is a bit too much. Did you find a better way of doing it? – KillABug Jul 08 '13 at 12:16
  • @KillABug Glad I could help :) . In any case the normal PHP image upload through POST method will always temporarily save the file to the web server's temp folder. So it is difficult to believe time spent for storing in image in php server can be a bottle neck here. How much time was taken when you were uploading direct to S3? And also please check how much time is taken when the code for uploading to S3 is commented out(ie; time for uploading to web server plus resizing alone). – rineez Jul 08 '13 at 17:59
  • I will go through the steps you mentioned and update if the results vary. Meanwhile, I would like one final bit of help with the return URL and a progress bar I need to integrate. I get the progress bar for my local upload (i.e. my server) but not for the entire upload cycle, and a return URL for the delete that the default plugin returns – KillABug Jul 08 '13 at 18:18
  • I have accepted the answer, rineez, but the latency is an issue. I checked: with a direct POST upload the image uploads to S3 in about 14 secs (size: 1.8MB). Then I tried to upload the same image using the plugin to S3, and it takes around 22 secs. The same plugin on my current server takes about 15 secs with the cropping to different sizes. Not sure how I can fix it! :( – KillABug Jul 09 '13 at 08:46
  • 22 secs to S3 - with image conversion or without image conversion? (I think it would be better to ask this question about latency as a separate question. As it may help you get attention of people who have more experience in that area.) – rineez Jul 09 '13 at 17:30
  • With image conversion.Actually I have added a [question](http://stackoverflow.com/questions/17544628/image-upload-performance-issue-with-amazon-s3-and-jqueryfileupload-plugin) already but no response!! – KillABug Jul 10 '13 at 01:40
  • nice. Then I shall up vote it to get more attention and also try to put my suggestions as answer there. – rineez Jul 10 '13 at 09:24
  • One thing to remember is that you wanted to reuse the code available in blueimp's UploadHandler for resizing the images. The images being stored on your server is a requirement for this image resizing code in UploadHandler. So optimizing the performance will most probably involve dropping the usage of resizing feature provided by blueimp's UploadHandler. – rineez Jul 10 '13 at 09:38
  • Yes, right rineez, because I believe it is very well written and optimized, and apart from changing the entire logic, I don't find any reason not to use it! I believe you would agree ;) – KillABug Jul 12 '13 at 06:56

Maybe what you are looking for is the Stream Wrapper: http://docs.aws.amazon.com/aws-sdk-php/guide/latest/service-s3.html#amazon-s3-stream-wrapper

"The Amazon S3 stream wrapper allows you to store and retrieve data from Amazon S3 using built-in PHP functions like file_get_contents, fopen, copy, rename, unlink, mkdir, rmdir, etc."

I'm looking for the same solution too. I found this https://gist.github.com/tim-peterson/8172999; maybe it can help. I'm still waiting for AWS to approve my account, so I couldn't test any solution.

Cassiano