FAQs

Can I save files to AWS?


A user had a website where there was the potential for a large number of files to be uploaded. Rather than save them on the web server, they wanted to save the files to the Amazon Web Services Simple Storage Service, or 'AWS S3' for short. This FAQ describes a simple test form that uploads a single file and, after the form is submitted, copies the file to AWS and deletes the local copy from the web server.

The form itself can be any ChronoForm including a File Upload element in the Form HTML and a matching Upload Files action in the form ON Submit Event. The upload to AWS is done using a Custom Code action after the Upload Files action.

Note: it isn't possible to use a basic ChronoForm to upload a user file directly from the browser to AWS; the file has to be uploaded to the web server first and then re-uploaded to AWS. You can upload files directly to AWS from the browser using an application like Uploadify, but this would need to be separate from the main ChronoForm.

You need an API library to handle the AWS upload. The official AWS API suite is enormous and complex, so I found and used a simpler library from undesigned.org.za. You can get the library from this GitHub page: click the Download ZIP button at the bottom right of the page. When you have the library, open the ZIP and find the S3.php file, then use FTP to upload it to your website. I used the /components/com_chronoforms/extras/amazon-s3/ folder.
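Before wiring the library into your form, it's worth checking that the S3.php file loads from the folder you chose. This is just a quick sanity-check sketch you could drop into a temporary Custom Code action; the path below is the folder I used, so adjust it if you uploaded S3.php somewhere else:

```php
<?php
// Quick sanity check: confirm the S3 class can be loaded
// from the folder you uploaded S3.php to.
// (This path matches the folder used in this FAQ - change it if yours differs.)
require_once JPATH_SITE.'/components/com_chronoforms/extras/amazon-s3/S3.php';

if ( class_exists('S3') ) {
  echo '<div>S3 library loaded OK</div>';
} else {
  echo '<div>S3 library not found - check the file path</div>';
}
?>
```

If you see the "not found" message, re-check the FTP upload path before going any further.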

Here's the code I used in the Custom Code action; it is based on part of the example.php file from the GitHub package, modified to work with ChronoForms.

<?php
// modify the next three lines to match your settings
// add your AWS access keys below
$file_input_name = 'input_file_1';
$bucket_name = 'rdj-s3test';
$region_code = 's3-eu-west-1';

$debug = false;

if ( !isset($form->data[$file_input_name]) || !$form->data[$file_input_name] ) {
  //no file upload found - stop processing
  return false;
}

$upload_file = $form->data['_PLUGINS_']['upload_files'][$file_input_name]['path'];

if ( !class_exists('S3')  ) {
  require_once JPATH_SITE.'/components/com_chronoforms/extras/amazon-s3/S3.php';
}

// AWS access info
if ( !defined('awsAccessKey') ) {
  define('awsAccessKey', 'change-this');
}
if ( !defined('awsSecretKey') ) {
  define('awsSecretKey', 'change-this');
}

// Check if our upload file exists
if ( !file_exists($upload_file) || !is_file($upload_file) ) {
  exit("\nERROR: No such file: $upload_file\n\n");
}

// Check for CURL
if ( !extension_loaded('curl') && !@dl(PHP_SHLIB_SUFFIX == 'so' ? 'curl.so' : 'php_curl.dll') ) {
  exit("\nERROR: CURL extension not loaded\n\n");
}

// Pointless without your keys!
if (  awsAccessKey == 'change-this' || awsSecretKey == 'change-this' ) {
  exit("\nERROR: AWS access information required\n\nPlease edit the following lines in this file:\n\n".
  "define('awsAccessKey', 'change-me');\ndefine('awsSecretKey', 'change-me');\n\n");
}

// Instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);

// List your buckets:
$form->data['aws']['buckets'] = $s3->listBuckets();
if ( $debug ) echo '<div>S3::listBuckets(): '.print_r($form->data['aws']['buckets'], true).'</div>';

$form->data['aws']['bucket'] = $bucket_name;

// Create a bucket with public read access
if ( !in_array($bucket_name, $form->data['aws']['buckets']) ) {
  if ( $s3->putBucket($bucket_name, S3::ACL_PUBLIC_READ) ) {
    if ( $debug ) echo "<div>Created bucket {$bucket_name}".'</div>';
  } else {
    if ( $debug ) echo "<div>S3::putBucket(): Unable to create bucket (it may already exist and/or be owned by someone else)</div>";
    return false;
  }
}
// Put our file (also with public read access)
$form->data['aws']['upload'] = $s3->putObjectFile($upload_file, $bucket_name, basename($upload_file), S3::ACL_PUBLIC_READ);
if ( $form->data['aws']['upload'] ) {
  if ( $debug ) echo "<div>S3::putObjectFile(): File copied to {$bucket_name}/".basename($upload_file).'</div>';

  // Get object info
  $form->data['aws']['info'] = $s3->getObjectInfo($bucket_name, basename($upload_file));
  if ( $debug ) echo "<div>S3::getObjectInfo(): Info for {$bucket_name}/".basename($upload_file).': '.print_r($form->data['aws']['info'], 1).'</div>';

  // delete the local copy
  jimport('joomla.filesystem.file');
  JFile::delete($upload_file);
} else {
  if ( $debug ) echo "<div>S3::putObjectFile(): Failed to copy file</div>";
}
// build the object URL
$form->data['aws']['url'] = "https://{$region_code}.amazonaws.com/{$bucket_name}/{$form->data[$file_input_name]}";
?>

Note that there are lines in this code that you must edit to match your own AWS and form settings!

When this is in place you can test your form to make sure that it is uploading files correctly to AWS. If you add a Debugger action to the On Submit event of your form, you should see output similar to this example:


    [aws] => Array
        (
            [buckets] => Array
                (
                    [0] => rdj-s3test
                    [1] => rdj-s3test52af1535a8d75
                )
            [bucket] => rdj-s3test
            [upload] => 1
            [info] => Array
                (
                    [date] => 1387212751
                    [time] => 1387212749
                    [hash] => a91d8e23680db231beacbb82f3c45bac
                    [type] => application/pdf
                    [size] => 14590
                )
            [url] => https://s3-eu-west-1.amazonaws.com/rdj-s3test/20131216085228_test_doc.pdf
        )
This shows you the AWS buckets you have set up; the bucket used for this upload; the upload success or failure (you can use this to send a message back to the user); the object info returned from AWS and the new object URL.
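Because these results are saved in $form->data, a later action in the same On Submit event can use them. For example, here is a sketch of a follow-up Custom Code action that reports the result back to the user; the message wording is my own, and the 'aws' entries are the ones set by the upload code above:

```php
<?php
// Example follow-up Custom Code action: tell the user whether the
// upload to AWS succeeded, and give them the new file URL if it did.
// The $form->data['aws'] entries are created by the upload code in this FAQ.
if ( !empty($form->data['aws']['upload']) ) {
  $url = $form->data['aws']['url'];
  echo "<div>Your file was saved. You can download it from <a href='{$url}'>{$url}</a></div>";
} else {
  echo '<div>Sorry, there was a problem saving your file. Please try again.</div>';
}
?>
```

You could equally use {aws.url} in an Email action template to send the link to the user.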