Sep 5, 2018

Boto - Uploading file to a specific location on Amazon S3

When I upload a file from my local system to S3
  • Say the local file is media/downloads/logo.png; by default it gets written to <bucket>/media/downloads/logo.png
  • If I want to write it to <bucket>/logo.png instead (dropping the media/downloads prefix), the script below does that

import mimetypes
import os

import boto
from boto.s3.connection import S3Connection
from boto.s3.key import Key

S3_BUCKET = ''
S3_KEY = ''
S3_SECRET = ''

conn = S3Connection(S3_KEY, S3_SECRET, calling_format=boto.s3.connection.OrdinaryCallingFormat())
bucket = conn.get_bucket(S3_BUCKET)
print(bucket)

# the local file lives under media/downloads/, but the S3 key is just 'logo.png',
# so the object is written to <bucket>/logo.png rather than <bucket>/media/downloads/logo.png
key_name = 'logo.png'
path = 'media/downloads/'
full_key_name = os.path.join(path, key_name)

new_key = Key(bucket)
new_key.key = key_name
ctype = mimetypes.guess_type(full_key_name)[0] or "application/octet-stream"
new_key.set_metadata('Content-Type', ctype)
new_key.set_contents_from_filename(full_key_name)

if new_key.exists():
    bucket.set_acl("public-read", new_key.key)
    # expires_in=0 with query_auth=False returns a plain (non-signed) URL
    url = new_key.generate_url(0, 'GET', None, False)
    print(url)
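
For newer projects the same upload can be done with boto3; the following is only a minimal sketch under the same assumptions (the same bucket and credentials are configured, and the object should be public-read), not the script above:

import mimetypes

import boto3  # assumes boto3 is installed and AWS credentials are configured

S3_BUCKET = ''                             # same bucket as above
local_path = 'media/downloads/logo.png'    # local file
target_key = 'logo.png'                    # writes to <bucket>/logo.png

s3 = boto3.client('s3')
ctype = mimetypes.guess_type(local_path)[0] or 'application/octet-stream'
s3.upload_file(local_path, S3_BUCKET, target_key,
               ExtraArgs={'ContentType': ctype, 'ACL': 'public-read'})
print('https://%s.s3.amazonaws.com/%s' % (S3_BUCKET, target_key))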



Thanks for reading.


Sep 2, 2018

How can you terminate a custom/external HTTPS SSL certificate in AWS ELB and EC2

You can terminate it at two levels:
    1) At the ELB level
        Use AWS Certificate Manager (ACM): request a new certificate or import your existing one
        In the ELB listener rules, configure HTTPS (port 443) and attach the above certificate
        Limitation: you can attach only one certificate per ELB listener
    2) At the EC2 level
        Suppose you have multiple sites on one EC2 instance (multi-tenant) and want to terminate HTTPS for all of them, e.g.
            https://test1.com
            https://test2.com
            https://test3.com
        Having an ELB for each site would be costly, so use a TCP pass-through setup instead
        In the ELB listener rules, configure TCP (port 443) pass-through
        With TCP load balancing the backend cannot see the client's IP address, so enable the proxy protocol
        Proxy protocol can only be enabled on the ELB through the CLI/API (not available in the AWS console); it carries the client address so Nginx can still set X-Forwarded-For (see the boto sketch after the Nginx config below)
        The TLS termination then happens at your EC2 web server (Nginx/Apache)
       
        Nginx:
            server {
              # accept the proxy-protocol-wrapped TLS traffic forwarded by the ELB TCP listener
              listen *:443 ssl proxy_protocol;
              server_name *.site.com;

              # take the real client address from the proxy protocol header
              set_real_ip_from 0.0.0.0/0;
              real_ip_header proxy_protocol;

              # terminate TLS here with the site's own certificate
              ssl_certificate /opt/site/conf/ssl_keys/nginx_site.crt;
              ssl_certificate_key /opt/site/conf/ssl_keys/site.pem;

              location / {
                proxy_pass            http://127.0.0.1:80;
                proxy_read_timeout    90;
                proxy_connect_timeout 90;
                proxy_redirect        off;

                # pass the real client address and scheme on to the backend application
                proxy_set_header      X-Real-IP $proxy_protocol_addr;
                proxy_set_header      X-Forwarded-For $proxy_protocol_addr;
                proxy_set_header      X-Forwarded-Proto https;
                proxy_set_header      X-Forwarded-Port 443;
                proxy_set_header      Host $host;
                proxy_set_header      X-Custom-Header nginx;
              }
            }
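
        Enabling the proxy protocol with boto (sketch):
            The snippet below is a minimal sketch, assuming boto 2's ELB API (create_lb_policy and
            set_lb_policies_of_backend_server) and a placeholder region/ELB name; the equivalent AWS CLI
            commands are "aws elb create-load-balancer-policy" and "aws elb set-load-balancer-policies-for-backend-server".

            import boto.ec2.elb

            # placeholders for illustration: region and load balancer name
            conn = boto.ec2.elb.connect_to_region('us-east-1')
            lb_name = 'my-tcp-elb'

            # create a ProxyProtocol policy and attach it to the backend port
            # that receives the TCP 443 pass-through traffic
            conn.create_lb_policy(lb_name, 'EnableProxyProtocol',
                                  'ProxyProtocolPolicyType', {'ProxyProtocol': True})
            conn.set_lb_policies_of_backend_server(lb_name, 443, ['EnableProxyProtocol'])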

Sep 1, 2018

S3 Direct Upload Python Django

Using the Python Boto API, we can interact with Amazon S3 (GET, PUT, POST, etc.)
We can also upload files to Amazon S3 directly from the client browser, using browser-based uploads via HTML POST forms.

Ref:
https://aws.amazon.com/articles/Java/1434

Prerequisites:
S3_BUCKET
S3_KEY
S3_SECRET
S3_URL

Note:
As per the Amazon API, we need to base64-encode the policy document and then sign it with HMAC-SHA1 using the secret key
Both the policy and the signature need to be POSTed along with the file
expiration: you can define when the policy expires
acl: public-read/private
$key: the prefix the upload path must start with (uploads/ here)
Check the S3 POST URL used in the HTML form (https://s3.amazonaws.com/<bucket>/)
Since the upload is a cross-origin XHR, the bucket's CORS configuration must allow POST from your page's origin for the browser to read the response
Once the document is successfully uploaded to S3, its S3 URL is shown on the page

Django Python View:

import base64
import hmac, hashlib
import re

from django.shortcuts import render_to_response
from django.template import RequestContext

def direct_s3_upload(request):
  dic1 = {}
  try:
    # the policy document restricts the bucket, ACL, key prefix and file size of the upload
    policy_document = '{"expiration": "2020-01-01T00:00:00Z", \
                       "conditions": [ \
                         {"bucket": "%s"}, \
                         {"acl": "public-read"}, \
                         ["starts-with", "$key", "uploads/"], \
                         ["content-length-range", 0, 524288000] \
                       ] \
                    }' % (S3_BUCKET)

    # strip whitespace, base64-encode the policy and sign it with HMAC-SHA1 using the secret key
    whitespace = re.compile(r'\s+')
    policy_document = whitespace.sub('', policy_document)
    policy_base_64 = base64.b64encode(policy_document)
    signature = base64.b64encode(hmac.new(S3_SECRET, policy_base_64, hashlib.sha1).digest())
    dic1 = {
           'MY_BUCKET_NAME': S3_BUCKET,
           'MY_AWS_KEY_ID': S3_KEY,
           'MY_POLICY': policy_base_64,
           'MY_SIGNATURE': signature,
        }
  except Exception:
    write_exception("direct_s3_upload")
  return render_to_response('test_s3_upload.html', context_instance=RequestContext(request, dic1))
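
Note: the view above is Python 2 style. Under Python 3, base64 and hmac operate on bytes, so the signing step
would look roughly like the sketch below (reusing policy_document and S3_SECRET from the view above):

import base64
import hashlib
import hmac

# Python 3: encode to bytes before signing, decode back to str for the template context
policy_base_64 = base64.b64encode(policy_document.encode('utf-8'))
signature = base64.b64encode(
    hmac.new(S3_SECRET.encode('utf-8'), policy_base_64, hashlib.sha1).digest()
)
policy_str = policy_base_64.decode('ascii')
signature_str = signature.decode('ascii')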


HTML:  (test_s3_upload.html)
<html> 
  <head>
    <title>S3 POST Form</title> 
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
    <script>
      var bucketName = '{{MY_BUCKET_NAME}}';
      var AWSKeyId   = '{{MY_AWS_KEY_ID}}';
      var policy     = '{{MY_POLICY}}';
      var signature  = '{{MY_SIGNATURE}}';

      function S3ToolsClass() {
        var _handle_progress = null;
        var _handle_success  = null;
        var _handle_error    = null;
        var _file_name       = null;

        this.uploadFile = function(file, progress, success, error) {
          _handle_progress = progress;
          _handle_success  = success;
          _handle_error    = error;
          _file_name       = file.name;

          console.log(file.name)
          var fd = new FormData();
          fd.append('key', "uploads/" + file.name);
          fd.append('AWSAccessKeyId', AWSKeyId);
          fd.append('acl', 'public-read');
          fd.append('policy', policy);
          fd.append('signature', signature);
          fd.append("file",file);

          var xhr = new XMLHttpRequest({mozSystem: true});
          xhr.upload.addEventListener("progress", uploadProgress, false);
          xhr.addEventListener("load", uploadComplete, false);
          xhr.addEventListener("error", uploadFailed, false);
          xhr.addEventListener("abort", uploadCanceled, false);
          xhr.open('POST', 'https://s3.amazonaws.com/' + bucketName + '/');

          xhr.send(fd);
        }

        function uploadProgress(evt) {
          if (evt.lengthComputable) {
            var percentComplete = Math.round(evt.loaded * 100 / evt.total);
            _handle_progress(percentComplete);
          }
        }

        function uploadComplete(evt) {
          // S3 returns 204 No Content (empty body) on a successful POST upload
          if (evt.target.responseText == "") {
            console.log("Upload complete - success")
            _handle_success(_file_name);
          } else {
            console.log("Upload complete - not success")
            _handle_error(evt.target.responseText);
          }
        }

        function uploadFailed(evt) {
          console.log("upload Failed")
          _handle_error("There was an error attempting to upload the file." + evt);
        }

        function uploadCanceled(evt) {
          console.log("upload cancelled")
          _handle_error("The upload has been canceled by the user or the browser dropped the connection.");
        }
      }
      var S3Tools = new S3ToolsClass();
      
      function uploadFile() {
        var file = document.getElementById('file').files[0];
        S3Tools.uploadFile(file, handleProgress, handleSuccess, handleError);
      }

      function handleProgress(percentComplete) {
        document.getElementById('progressNumber').innerHTML = percentComplete.toString() + '%';
      }

      function handleSuccess(fileName) {
        document.getElementById('progressNumber').innerHTML = 'Done!';
        document.getElementById('resultant_s3_url').innerHTML = 'https://s3.amazonaws.com/' + bucketName + '/uploads/' + fileName;
      }

      function handleError(message) {
        document.getElementById('progressNumber').innerHTML = 'Error: ' + message;
      }
    </script>
  </head>
  <body>
    <form id="form" enctype="multipart/form-data" method="post">
      <div class="row">
        1. Select a File<br>
        <input type="file" name="file" id="file"/>
      </div>
      <br>
      <div class="row">
        2. Upload File<br>
        <input type="button" onclick="uploadFile()" value="Upload" />
        <br/>
        <span id="progressNumber"></span>
      </div>
    </form>
    <br>
    <div class="row">
      3. Result S3 Url
      <br>
      <div id="resultant_s3_url"> </div>

    </div>
  </body>
</html>
Thank you for reading this article.