Create a WeTransfer clone with AWS S3 😎
In this post, we look at how to create a clone of WeTransfer so that we can upload and share files with others.
To keep things simple, we are not going to build an actual user interface. Instead, we leverage AWS S3 and the AWS CLI to upload and share our files.
Create an S3 bucket
First, we need to create a new S3 bucket. To avoid storing files indefinitely or cleaning up the bucket manually, we can configure an S3 Lifecycle rule to automatically clean up files after 7 days.
Here is how you can do the above using the AWS CLI. Make sure to provide a unique name for your S3 bucket. Note that us-east-1 is the default region and does not accept a LocationConstraint; if you create the bucket in another region, add --create-bucket-configuration LocationConstraint=<your-region>.
aws s3api create-bucket \
--bucket wetransfer-clone \
--region us-east-1
aws s3api put-bucket-lifecycle-configuration \
--bucket wetransfer-clone \
--lifecycle-configuration file://lifecycle.json
Here is the S3 Lifecycle configuration (lifecycle.json) used in the command above.
{
  "Rules": [
    {
      "ID": "Delete files after 7 days",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": {
        "Days": 7
      }
    }
  ]
}
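To confirm that the rule is in place, you can read the lifecycle configuration back. This check is optional and not part of the original setup:
aws s3api get-bucket-lifecycle-configuration \
--bucket wetransfer-clone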
Upload and share files
Next, we can upload our files and create pre-signed S3 URLs. This allows us to share a file without making it public.
It is a good idea to have the URL expire before the file itself is deleted. In the following example, the URL expires in two days (172800 seconds).
If necessary, we can create a new pre-signed URL later without re-uploading the file. As mentioned above, the file is automatically deleted after 7 days.
aws s3 cp your-large-file.zip s3://wetransfer-clone
aws s3 presign s3://wetransfer-clone/your-large-file.zip \
--expires-in 172800
The last command will provide the URL that you can share.
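If you do this often, the two commands can be wrapped in a small shell script. The following is just a sketch; the share.sh name and the hard-coded bucket are my own assumptions, not part of the original setup.
#!/usr/bin/env bash
# share.sh - upload a file and print a pre-signed URL (hypothetical helper)
# Usage: ./share.sh path/to/your-large-file.zip
set -euo pipefail

BUCKET="wetransfer-clone"   # assumes the bucket created earlier
FILE="$1"
KEY="$(basename "$FILE")"

# Upload the file; the CLI switches to a multipart upload for large files automatically
aws s3 cp "$FILE" "s3://$BUCKET/$KEY"

# Print a pre-signed URL that expires in two days (172800 seconds)
aws s3 presign "s3://$BUCKET/$KEY" --expires-in 172800
Running ./share.sh your-large-file.zip uploads the file and prints the shareable link in one step.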
Large files
Thanks to the AWS CLI, we don't need to worry about large files.
When running aws s3 cp, Amazon S3 automatically performs a multipart upload for large objects. In a multipart upload, a large file is split into multiple parts and uploaded separately to Amazon S3.
After all the parts are uploaded, AWS S3 combines the parts into a single file. A multipart upload can result in faster uploads and lower chances of failure with large files.
More details about S3 large file uploads
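If you want to control when the CLI switches to multipart uploads, the threshold and chunk size can be tuned through the AWS CLI's S3 transfer settings. The values below are only examples, not recommendations from the original post:
aws configure set default.s3.multipart_threshold 64MB
aws configure set default.s3.multipart_chunksize 16MB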
Conclusion
I hope you enjoyed the article and that it's something you can use in real life.
Make sure to follow me on dev.to or Twitter to read more about web development and how you can automate more stuff!
Photo by Volodymyr Hryshchenko on Unsplash.