Simplify AWS S3 operations with Boto3

Posted by: Mohit Bansal | 27th May 2020


BOTO 3

 

Boto3 is the Amazon Web Services (AWS) software development kit (SDK) for Python. It provides Pythonic interfaces to AWS services. Released versions can be found at https://pypi.org/project/boto3/#history. It supports a wide range of AWS services, including database, compute, and storage services. This blog is focused on S3 (AWS Simple Storage Service).

 


 

INSTALLATION

 

Boto 3 can be installed using pip with the following command:

pip install boto3

 

CREDENTIALS SETUP

 

Using boto3 requires an aws_access_key_id and aws_secret_access_key, which can be obtained from the AWS account. On Ubuntu (and other Linux platforms), these can be stored in ~/.aws/credentials:

[default]

aws_access_key_id = YOUR_KEY

aws_secret_access_key = YOUR_SECRET

Boto3 also requires the region to be defined. On Ubuntu, the region can be stored in ~/.aws/config:

[default]

region = YOUR_REGION

 

OPERATIONS

 

To use boto3, we must import it in the code file using:

import boto3

 

UPLOAD

The following code snippet shows an example of uploading a file. Note that the MIME type of the file belongs in ContentType; ContentEncoding describes compression such as gzip and is not the right parameter here.

s3_resource = boto3.resource('s3')

s3_resource.Bucket(S3_BUCKET).put_object(Key=s3_file_path, Body=file, ContentType='text/csv')

A resource represents an object-oriented interface to Amazon Web Services (AWS). To upload a file, we need to provide the S3 bucket, the target key (path), the file body, and its content type.

 

DOWNLOAD

The following code snippet shows an example of downloading an object from S3.

s3_client = boto3.client('s3')

obj = s3_client.get_object(Bucket=S3_BUCKET, Key=file_path)

Unlike a resource, a client provides a low-level interface that maps closely to the underlying AWS API operations. To fetch an object, we need to provide the S3 bucket and the key (path) of the object.
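The Body field of the get_object response is a file-like streaming object (botocore's StreamingBody), so its contents are read and decoded like any file. A minimal sketch, using an in-memory io.BytesIO as a stand-in for the real stream since a live call needs AWS credentials:

```python
import io

# Stand-in for obj["Body"] -- the real value returned by get_object is a
# botocore StreamingBody, which exposes the same read() interface.
body = io.BytesIO(b"name,score\nalice,10\n")

# Read the whole object into memory and decode it to text.
content = body.read().decode("utf-8")
first_line = content.splitlines()[0]
```

For large objects, reading in chunks (or using download_file) avoids holding everything in memory at once.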

 

COPY

We may come across a requirement to copy an existing S3 object to another location on S3. The following code snippet shows an example of copying an object in S3.

s3_client.copy_object(Bucket=S3_TARGET_BUCKET,
                      Key=str(model_id) + "_" + str(raw_file_id) + "_downloadable.pdf",
                      CopySource={'Bucket': S3_SOURCE_BUCKET, 'Key': s3_file_path},
                      MetadataDirective='REPLACE')

We need to provide the source S3 bucket, the source key, the target S3 bucket, and the target key to copy an object on S3.
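Since the CopySource dictionary and the target key are plain Python values, it can help to build them in a separate helper and inspect them before the call. A sketch (the helper name and its parameters are assumptions for illustration):

```python
def build_copy_args(source_bucket, source_key, model_id, raw_file_id):
    """Assemble the keyword arguments for copy_object (illustrative helper)."""
    return {
        "CopySource": {"Bucket": source_bucket, "Key": source_key},
        "Key": f"{model_id}_{raw_file_id}_downloadable.pdf",
        "MetadataDirective": "REPLACE",
    }

# With credentials configured, the copy itself would then be:
# s3_client.copy_object(Bucket=S3_TARGET_BUCKET, **build_copy_args(
#     S3_SOURCE_BUCKET, s3_file_path, model_id, raw_file_id))
```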

 

Some other optional parameters are common to various operations, such as:

 

ACL

Access control lists (ACLs) are one of the resource-based access policy options you can use to manage access to your buckets and objects. ACLs can be used to grant basic read/write permissions to AWS accounts. For example, to grant public read permission, we can set ACL as: 

ACL='public-read'

 

CONTENT-DISPOSITION

Content-Disposition describes what the recipient should do with the content. Should it be displayed inline in the browser, or downloaded as an attachment and saved as a file? For example, to make a file downloadable in pdf format, we can set the content-disposition as:

ContentDisposition='attachment', ContentType='application/pdf'
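Both optional parameters can be combined in a single upload call. A minimal sketch (the bucket and key are placeholders):

```python
extra_args = {
    "ACL": "public-read",                # grant public read access
    "ContentDisposition": "attachment",  # ask the browser to download the file
    "ContentType": "application/pdf",    # serve it as a PDF
}

# With credentials configured, this would upload a publicly readable,
# downloadable PDF:
# s3_resource.Bucket("my-bucket").put_object(
#     Key="report.pdf", Body=pdf_bytes, **extra_args)
```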

 

Similarly, there are other operations and parameters, about which information can be found at https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html

 

Boto3 greatly simplifies S3 operations when developing in Python, and it supports almost all AWS services and operations.



About Author

Mohit Bansal

He is a tech enthusiast and always ready to learn new things. He is skilled in the Python language.
