How to Upload Files to AWS S3 Using the Command Line?
This article was originally published at my blog, askvikram.
AWS S3 (Simple Storage Service) is an object storage service with high availability, security, and performance. All files are stored as objects inside containers called buckets.
In this tutorial, you'll create an S3 bucket, create subfolders, and upload files to the bucket using the AWS CLI.
- Ensure you have installed and configured the AWS CLI using the guide How to Install and Configure AWS CLI on Ubuntu.
In this section, you'll create an S3 bucket which will logically group your files.
The s3 mb command in the AWS CLI is used to make a bucket. Use the command below to create a new bucket in S3.
aws s3 mb s3://newbucketname --region "ap-south-1"
- aws – command to invoke the AWS CLI
- s3 – denotes the service on which the operation is performed
- mb – make bucket command to denote the make bucket operation
- s3://newbucketname – S3 URI with the desired name of the bucket to be created
- --region – option to specify the region in which the bucket is created
- ap-south-1 – the region name
You've created a new bucket in your desired region. Now, you'll create a subfolder inside it.
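Bucket creation fails if the name breaks S3's naming rules (3–63 characters; lowercase letters, digits, hyphens, and dots; must begin and end with a letter or digit). Here is a minimal local check along those lines, as a sketch only; it doesn't cover every rule (for example, it doesn't reject IP-address-like names):

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Rough check of common S3 bucket-naming rules:
    3-63 characters; lowercase letters, digits, hyphens, dots;
    must start and end with a letter or digit."""
    if not 3 <= len(name) <= 63:
        return False
    return re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name) is not None

print(is_valid_bucket_name("newbucketname"))  # True
print(is_valid_bucket_name("New_Bucket"))     # False
```

Running this before `aws s3 mb` saves a round trip when the name is obviously invalid.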
In this section, you'll create a subfolder inside your existing S3 bucket.
There are no actual folders in an S3 bucket. You'll just create an object whose key ends with a slash inside your existing bucket; it logically acts as a subfolder.
Use the s3api subcommand to create a new subdirectory inside your S3 bucket as given below.
aws s3api put-object --bucket existing_bucket_name --key new_sub_directory_name/ --region "ap-south-1"
- aws – command to invoke the AWS CLI
- s3api – denotes the service on which the operation is performed
- put-object – command to put a new object into an existing bucket
- --bucket – option for the bucket name
- existing_bucket_name – name of the existing bucket where you want to create the sub object
- --key – option to specify the new key name
- new_sub_directory_name/ – your desired new object name; the trailing / is mandatory
- --region – option to specify the region in which the bucket exists
- ap-south-1 – the region name
A new subdirectory is created in your existing bucket. Now, you'll upload files to the created bucket.
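Because folders are only key prefixes, tools such as the S3 console infer them from the object keys. A small Python sketch of that inference (no AWS calls; the keys are illustrative):

```python
def folders_from_keys(keys):
    """Derive top-level 'folder' names from S3 object keys.
    S3 has no real folders; clients infer them from the part
    of the key before the first '/'."""
    folders = set()
    for key in keys:
        if "/" in key:
            folders.add(key.split("/", 1)[0] + "/")
    return sorted(folders)

keys = ["new_sub_directory_name/", "new_sub_directory_name/file.txt", "toplevel.txt"]
print(folders_from_keys(keys))  # ['new_sub_directory_name/']
```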
In this section, you'll upload a single file to an S3 bucket in two ways.
- Uploading a file to existing bucket
- Create a subdirectory in the existing bucket and upload a file into it.
You can use the cp command to upload a file into your existing bucket as shown below.
aws s3 cp file_to_upload.txt s3://existing_bucket_name/ --region "ap-south-1"
- aws – command to invoke the AWS CLI
- s3 – denotes the service on which the operation is performed
- cp – copy command to copy the file to the bucket
- file_to_upload.txt – the file to be uploaded
- s3://existing_bucket_name – existing bucket to which the file is uploaded
- --region – option to specify the region
- ap-south-1 – the region to which the file is uploaded
You've copied a single file to an existing bucket.
You can use the s3api put-object command to add an object to your bucket. In this context, you'll create a subfolder in the existing bucket and upload a file into it by using the --key parameter in the command.
aws s3api put-object --bucket existing_bucket_name --key new_sub_directory_name/file_to_be_uploaded.txt --body file_to_be_uploaded.txt
- aws – command to invoke the AWS CLI
- s3api – denotes the service on which the operation is performed
- put-object – command to put a new object into an existing bucket
- --bucket – option for the bucket name
- existing_bucket_name – name of the existing bucket where you want to create the sub object
- --key – option to specify the new key name
- new_sub_directory_name/file_to_be_uploaded.txt – full key of the object to be created; the / creates the sub object in the existing bucket, and the file name is the path at which the file is stored
- --body file_to_be_uploaded.txt – the local file whose contents are uploaded
You've created a new subdirectory in the existing bucket and uploaded a file into it.
In this section, you'll upload all files from a directory to an S3 bucket in two ways.
- Using copy recursive
- Using Sync
For demonstration purposes, consider three files (firstfile.txt, secondfile.txt, thirdfile.txt) in your local directory. Now you'll see how copy recursive and sync work with these three files.
You can use the --dryrun option with both the copy recursive and sync commands to check which files will be copied or synced without actually uploading them.
Copy recursive is a command used to copy files recursively to the destination directory.
Recursive means it copies the contents of the directory, and if the source directory has subdirectories, they are copied too.
Use the below command to copy the files recursively to your s3 bucket.
aws s3 cp --recursive your_local_directory s3://full_s3_bucket_name/ --region "ap-southeast-2"
You'll see the below output, which means the three files were uploaded to your S3 bucket.
Output
upload: ./**firstfile.txt** to s3://maindirectory/subdirectory/**firstfile.txt**
upload: ./**secondfile.txt** to s3://maindirectory/subdirectory/**secondfile.txt**
upload: ./**thirdfile.txt** to s3://maindirectory/subdirectory/**thirdfile.txt**
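Conceptually, a recursive copy walks the local directory tree and maps each file to a destination key. A minimal Python sketch of that mapping (local only, no AWS calls; the bucket name is illustrative):

```python
import os
import tempfile

def plan_recursive_upload(local_dir, bucket):
    """List (local_path, s3_uri) pairs the way a recursive copy would,
    walking subdirectories too. Nothing is actually uploaded."""
    plan = []
    for root, _dirs, files in os.walk(local_dir):
        for name in sorted(files):
            local_path = os.path.join(root, name)
            rel = os.path.relpath(local_path, local_dir).replace(os.sep, "/")
            plan.append((local_path, f"s3://{bucket}/{rel}"))
    return plan

# Demo with a throwaway local directory holding the three files:
demo = tempfile.mkdtemp()
for name in ("firstfile.txt", "secondfile.txt", "thirdfile.txt"):
    with open(os.path.join(demo, name), "w") as f:
        f.write("demo")

for _local, uri in plan_recursive_upload(demo, "full_s3_bucket_name"):
    print(uri)
```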
You've copied files recursively to your s3 bucket. Now, you'll see how to sync your local directory to your S3 bucket.
Sync is a command used to synchronize source and target directories. Sync is by default recursive which means all the files and subdirectories in the source will be copied to target recursively.
Use the below command to Sync your local directory to your S3 bucket.
aws s3 sync your_local_directory s3://full_s3_bucket_name/ --region "ap-southeast-2"
You'll see the below output.
Output
upload: ./**firstfile.txt** to s3://maindirectory/subdirectory/**firstfile.txt**
upload: ./**secondfile.txt** to s3://maindirectory/subdirectory/**secondfile.txt**
upload: ./**thirdfile.txt** to s3://maindirectory/subdirectory/**thirdfile.txt**
Since there are no files in your target bucket, all three files will be copied. If two of the files already exist in the bucket, only the remaining one will be copied.
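The sync behavior above can be sketched as a set difference between local files and existing bucket keys. This is a simplification: the real sync command also re-copies files whose size or modification time differ. The file names are illustrative:

```python
def files_to_sync(local_files, remote_keys):
    """Return the local files missing from the bucket, mimicking
    sync's 'copy only what is not already there' behavior.
    (The real command also re-copies files whose size or
    modification time differ.)"""
    remote = set(remote_keys)
    return sorted(f for f in local_files if f not in remote)

local = ["firstfile.txt", "secondfile.txt", "thirdfile.txt"]
remote = ["firstfile.txt", "secondfile.txt"]
print(files_to_sync(local, remote))  # ['thirdfile.txt']
```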
You've copied files using the cp and sync commands. Now, you'll see how to copy specific files using a wildcard character.
In this section, you'll see how to copy a group of files to your S3 bucket using the cp Wildcard upload function.
A wildcard lets you copy files whose names match a specific pattern.
Use the below command to copy only the files whose names start with first.
aws s3 cp --recursive your_local_directory s3://full_s3_bucket_name/ --exclude "*" --include "first*" --region "ap-southeast-2"
- aws – command to invoke the AWS CLI
- s3 – denotes the service on which the operation is performed
- cp – copy command to copy the files
- your_local_directory – source directory from which the files are copied
- full_s3_bucket_name – target S3 bucket to which the files are copied
- --exclude "*" – exclude all files
- --include "first*" – include files with names starting with first
- --region – option to specify the region
- ap-southeast-2 – the region to which the files are uploaded
Ensure you specify the --exclude filter first and the --include filter second so the wildcard copy works as intended, since later filters take precedence.
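The ordering matters because the filters are applied in sequence and the last matching filter decides. A small Python sketch of that rule using fnmatch (the file names are illustrative):

```python
from fnmatch import fnmatch

def is_included(name, filters):
    """Apply exclude/include filters in order; the last filter that
    matches the name decides. Files are included by default."""
    decision = True
    for kind, pattern in filters:
        if fnmatch(name, pattern):
            decision = (kind == "include")
    return decision

filters = [("exclude", "*"), ("include", "first*")]
names = ["firstfile.txt", "secondfile.txt", "thirdfile.txt"]
print([n for n in names if is_included(n, filters)])  # ['firstfile.txt']
```

Reversing the two filters would exclude everything, because the final `--exclude "*"` would be the last match for every file.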
You'll see the below output, which means the file whose name starts with first (firstfile.txt) was copied to your S3 bucket.
Output
upload: ./**firstfile.txt** to s3://maindirectory/subdirectory/**firstfile.txt**
You've copied files to your S3 bucket using a wildcard copy.
You've created directories and subdirectories in your S3 bucket and copied files to them using the cp and sync commands, which answers the question of how to upload files to an AWS S3 bucket.
You can host a static website using the files copied to your S3 buckets. Refer to the guide How to host a static website on AWS S3.
Check whether you have access to the S3 bucket, and also check that you are using the correct region in the commands.
cp --recursive is the command used to copy files recursively to an S3 bucket. You can also use the sync command, which is recursive by default.
You can use any of the commands discussed in this article to transfer files from EC2 to an S3 bucket.