Introduction

Amazon Simple Storage Service (S3) is one of the most widely used object storage services, thanks to its scalability, security, performance, and data availability. Customers of any size and in any industry can use it to store any volume of data for use cases such as websites, mobile apps, enterprise applications, and IoT devices.

Amazon S3 provides easy-to-use management features, so you can organize your data to fulfill your business requirements.

Many of us use AWS S3 buckets on a daily basis, and one of the most common challenges of working with cloud storage is syncing or uploading multiple objects at once. Yes, we can drag and drop or upload files directly on the bucket page, as in the image below.

Upload Files

But the problem with this approach is that if you’re uploading large objects over an unstable network and a network error occurs, you have to restart the upload from the beginning.

Restart Upload

Suppose you are uploading 2000+ files, the upload has been running for an hour, and then you find out it has failed; re-uploading becomes a time-consuming process. To overcome this problem, we have two solutions, which we will discuss in the next sections.

Prerequisites

  • AWS Account
  • AWS CLI installed

Upload Objects Using Multipart Upload API

Multipart upload lets you upload a single object as a set of parts, and these parts can be uploaded independently and in any order.

If the transmission of any part fails, you can retransmit just that part without affecting the others. So, for large objects it’s a good practice to use multipart upload instead of uploading the object in a single operation.

Advantages of using multipart upload:

  • Improved throughput – parts upload in parallel, which improves upload speed
  • Fast recovery from network issues – only the failed part needs to be re-uploaded, not the whole object
  • Ability to pause and resume object uploads
  • Ability to begin an upload before you know the final object size – you can upload an object as you are creating it

We can use the multipart upload API through the AWS SDKs, the REST API, or the AWS CLI’s low-level s3api commands; for more details, visit the AWS multipart upload documentation.
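
To make this concrete, below is a minimal sketch of a multipart upload done with the AWS CLI’s low-level s3api commands. The file name, bucket placeholder, and part size are illustrative, and the <UploadId> and <ETag> values must be taken from the responses of the earlier commands; note that the high-level aws s3 cp command used later in this tutorial performs multipart uploads automatically for large files.

# Split a large file into 100 MB chunks named part-aa, part-ab, ...
split -b 100m big-file.bin part-

# 1. Initiate the multipart upload; the response contains an UploadId
aws s3api create-multipart-upload --bucket <your bucket name> --key big-file.bin

# 2. Upload the parts independently and in any order; if one part fails,
#    only that part needs to be retried. Each response contains an ETag.
aws s3api upload-part --bucket <your bucket name> --key big-file.bin \
    --part-number 1 --body part-aa --upload-id <UploadId>
aws s3api upload-part --bucket <your bucket name> --key big-file.bin \
    --part-number 2 --body part-ab --upload-id <UploadId>

# 3. Complete the upload by listing every part number with its ETag
aws s3api complete-multipart-upload --bucket <your bucket name> --key big-file.bin \
    --upload-id <UploadId> \
    --multipart-upload 'Parts=[{ETag=<ETag1>,PartNumber=1},{ETag=<ETag2>,PartNumber=2}]'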


Copy Files to AWS S3 Bucket using AWS S3 CLI

Install AWS CLI

First, we need to install the AWS CLI; with it, we can perform the S3 copy operation. If you don’t know how to install the CLI, follow this guide: Install AWS CLI.
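
Once the installation is done, you can verify that the CLI is available; the exact version string will differ on your machine.

aws --version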

Configure AWS Profile

Now, it’s time to configure the AWS profile. For that, use the aws configure command, which will prompt you for your AWS credentials; you can find them under the IAM -> Users -> Security credentials tab in the AWS console.
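
A typical configure session looks like this, where the values are placeholders for your own access key, secret key, and preferred region:

aws configure
AWS Access Key ID [None]: <your access key id>
AWS Secret Access Key [None]: <your secret access key>
Default region name [None]: us-east-1
Default output format [None]: json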

We are done configuring the AWS profile. Now you can see your S3 bucket, named “bacancy-s3-blog”, by listing the buckets with the command below.

List All the Existing Buckets in S3

Use the below command to list all the existing buckets.

aws s3 ls
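
Each line of the output shows a bucket’s creation date and name; with the bucket used in this tutorial, it would look something like this (the timestamp is illustrative):

2021-05-12 10:15:03 bacancy-s3-blog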

Copy Single File to AWS S3 Bucket

Use the below command to copy a single file to the S3 bucket.

aws s3 cp file.txt s3://<your bucket name>
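
On success, the CLI prints a summary line for the transfer; assuming the bucket from this tutorial, it looks like this:

upload: ./file.txt to s3://bacancy-s3-blog/file.txt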

AWS S3 Copy Multiple Files

Use the below command to copy multiple files from a local directory to the S3 bucket.

aws s3 cp <your directory path> s3://<your bucket name> --recursive

Note: The --recursive flag tells aws s3 cp to copy all files in the directory, including subdirectories, recursively.
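
For example, assuming a hypothetical local folder named my-files and the bucket from this tutorial:

aws s3 cp ./my-files s3://bacancy-s3-blog --recursive

If the transfer is interrupted, re-running the same cp command copies everything again; aws s3 sync ./my-files s3://bacancy-s3-blog uploads only the files that are missing or have changed, which makes retries cheaper.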

Copy Multiple Files

As you can see in the above video, even if our network connection is lost, the process continues after reconnecting without losing any files.

Conclusion

So, this was about how to copy multiple files from your local machine to an AWS S3 bucket using the AWS CLI. For more such tutorials, visit the Cloud tutorials page and start learning! Feel free to contact us if you have any suggestions or feedback.
