Amazon Simple Storage Service (S3) is one of the most widely used object storage services, thanks to its scalability, security, performance, and data availability. Customers of any size and industry can use it to store any volume of data for use cases such as websites, mobile apps, enterprise applications, and IoT devices.
Amazon S3 provides easy-to-use management features so you can appropriately organize your data to fulfill your business requirements.
Many of us use AWS S3 buckets on a daily basis, and one of the most common challenges when working with cloud storage is syncing or uploading multiple objects at once. Yes, we can drag and drop or upload directly on the bucket page, as shown in the image below.
But the problem with this approach is that if you’re uploading large objects over an unstable network and a network error occurs, you must restart the upload from the beginning.
Suppose you are uploading 2,000+ files, the upload has been running for an hour, and then it fails; re-uploading everything becomes a time-consuming process. To overcome this problem, we have two solutions, which we will discuss in the next sections.
Multipart upload lets you upload a single object as a set of parts, and those parts can be uploaded independently and in any order.
If the transmission of any part fails, you can retransmit just that part without affecting the others. So, it’s good practice to use multipart uploads instead of uploading the object in a single operation.
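As a sketch of how this looks with the AWS CLI: `aws s3 cp` performs multipart uploads automatically once a file exceeds a configurable threshold, and the part size can be tuned. The bucket name and file path below are examples, not values from this tutorial:

```shell
# Raise the size at which the CLI switches to multipart upload (default is 8MB)
aws configure set default.s3.multipart_threshold 64MB

# Size of each individual part; a failed part is retried on its own,
# so the whole upload does not restart from scratch
aws configure set default.s3.multipart_chunksize 16MB

# Copy a large file; the CLI splits it into parts automatically
aws s3 cp ./backup.tar.gz s3://bacancy-s3-blog/backups/
```

These settings live in `~/.aws/config`, so they apply to all subsequent `aws s3` commands for the default profile.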
Advantages of using multipart upload:
We can use the multipart upload API with different technologies, via an SDK or the REST API; for more details, refer to the AWS multipart upload documentation.
First, we need to install the AWS CLI, which lets us perform S3 copy operations. If you don’t know how to install the CLI, follow this guide: Install AWS CLI.
Now, it’s time to configure the AWS profile. For that, use the `aws configure` command. You can find your credentials on the AWS console under IAM -> Users -> Security credentials tab.
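The configuration step is an interactive prompt; a sketch of the session is below, with placeholder values standing in for your own access key, secret, and preferred region:

```shell
$ aws configure
AWS Access Key ID [None]: AKIAxxxxxxxxxxxxxxxx
AWS Secret Access Key [None]: ****************************************
Default region name [None]: us-east-1
Default output format [None]: json
```

The CLI stores these values in `~/.aws/credentials` and `~/.aws/config`, so you only need to run this once per machine (or per profile).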
We are done configuring the AWS profile. Now, you can access your S3 bucket, named “bacancy-s3-blog” in this tutorial, using the list-buckets command below.
Use the command below to list all the existing buckets.
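A minimal sketch of the listing command (output shown is illustrative, assuming the example bucket from this tutorial exists in your account):

```shell
aws s3 ls
# Illustrative output, e.g.:
# 2023-01-15 10:42:00 bacancy-s3-blog
```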
Use the command below to copy a single file to the S3 bucket.
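A sketch of the single-file copy, using a hypothetical local file name with the tutorial’s example bucket:

```shell
# Copy one local file into the bucket root
aws s3 cp ./report.pdf s3://bacancy-s3-blog/
```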
Use the command below to copy multiple files from a local directory to an S3 bucket.
Note: the `--recursive` flag of `aws s3 cp` indicates that all files under the directory must be copied recursively.
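A sketch of the recursive copy, with a hypothetical local directory name and the tutorial’s example bucket; `aws s3 sync` is shown as a related alternative for re-runs:

```shell
# Copy every file under local-folder, preserving the directory structure
aws s3 cp ./local-folder s3://bacancy-s3-blog/uploads/ --recursive

# Alternatively, sync only transfers files that are new or changed,
# which is handy when re-running after an interrupted upload
aws s3 sync ./local-folder s3://bacancy-s3-blog/uploads/
```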
As you can see in the video above, even if the network connection is lost and later restored, the process continues after reconnecting without losing any files.
So, this was how to copy multiple files from a local machine to an AWS S3 bucket using the AWS CLI. For more such tutorials, feel free to visit the Cloud tutorials page and start learning! Feel free to contact us if you have any suggestions or feedback.