Photo by Jerry Zhang on Unsplash
☁️ How to Host Your Portfolio Website on AWS: A Comprehensive Guide!
Series Introduction
Everything in this series:
Series Introduction: ☁️ How to Host Your Portfolio Website on AWS: A Comprehensive Guide!
Part 1: React App on S3 with Static Hosting + Route 53 + CloudFront
Hi there!
I'm Evelyn, a freshly hatched computer whiz, just out of the collegiate nest. 🐣💻 In this series of blog posts, I want to share how I host my portfolio website on AWS.
Before you start reading, there are a few prerequisites you need to complete; they will not be covered in this series:
Your static website code is done.
An AWS account.
Basic knowledge of AWS, such as what an S3 bucket is and what Route 53 is used for. A good way to quickly get started with AWS is by earning the AWS Cloud Practitioner certification.
Are you ready? Let's soar to the cloud!
Architecture
From the architecture diagram above, you can get a rough picture of the entire system's infrastructure.
Route 53: It's typically the entry point for client requests. It resolves domain names to the appropriate destinations, which can be various AWS resources; here, that destination is our CloudFront distribution.
CloudFront: When Route 53 directs a user's request to CloudFront, CloudFront serves the content from its cache if it is available and up to date. If not, CloudFront retrieves the content from the origin, which in our case is the S3 bucket. It delivers our content securely to users around the world with low latency and high transfer speeds. The caching part can be tricky, though; I ran into an issue with it and will explain it in detail later in the series.
S3 Bucket: Our website code is stored in an S3 bucket, and S3 offers the functionality for static website hosting.
Lambda: It runs our code in response to events and automatically manages the underlying compute resources for us. In our case, we will use Lambda together with DynamoDB to count how many times our website has been viewed (a rough sketch of that function follows this list).
DynamoDB: It's a NoSQL database; it stores the view count that the Lambda function reads and writes.
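To make the view counter more concrete, here is a minimal sketch of what the Lambda function could look like in Python. The table name, partition key, and attribute name are placeholders I made up for illustration; the real names are defined when we build this feature later in the series.

```python
import boto3

# Placeholder table name and key schema for illustration only;
# the actual resources are created later in the series with Terraform.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("portfolio-view-counter")

def lambda_handler(event, context):
    # Atomically increment the view counter and return the new value.
    response = table.update_item(
        Key={"id": "views"},
        UpdateExpression="ADD #count :incr",
        ExpressionAttributeNames={"#count": "count"},
        ExpressionAttributeValues={":incr": 1},
        ReturnValues="UPDATED_NEW",
    )
    views = int(response["Attributes"]["count"])
    return {"statusCode": 200, "body": str(views)}
```

Using an atomic ADD update lets DynamoDB do the increment itself, so we avoid the read-modify-write race we would get by fetching the count, adding one, and writing it back.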
We will also set up a CI/CD pipeline with GitHub Actions that automatically updates the S3 bucket whenever the code changes in the GitHub repo, which saves us from manually uploading files to the S3 bucket every time we update the website content.
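As a preview, here is a rough sketch of what such a workflow could look like. The file path, bucket name, build output directory, and secret names below are placeholders, not the actual configuration; the real pipeline is covered in a later part of this series.

```yaml
# .github/workflows/deploy.yml — illustrative sketch only
name: Deploy to S3
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      # Build the React app (output directory depends on your tooling).
      - run: npm ci && npm run build
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      # Sync the build output to the (placeholder) bucket.
      - run: aws s3 sync ./build s3://my-portfolio-bucket --delete
```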
Finally, we will use Terraform to implement the resume-viewing feature and to automate the provisioning and management of our infrastructure through code rather than by clicking around in the AWS console. This pays off in the long run: the infrastructure becomes versioned, reviewable, and easy to reproduce or tear down.
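To give a flavor of what "infrastructure through code" means, here is a tiny hypothetical Terraform snippet declaring the DynamoDB table behind the view counter; the resource and attribute names are illustrative only, and the real configuration appears later in the series.

```hcl
# Illustrative sketch: a pay-per-request DynamoDB table for the view counter.
resource "aws_dynamodb_table" "view_counter" {
  name         = "portfolio-view-counter"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}
```

Running terraform apply creates the table, and the same file then doubles as documentation and as a way to recreate or destroy the environment on demand.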
What's Next
Thank you for reading up to this point! This is the introductory post of the series, and I hope you now have a sense of the awesome things we are about to build!
Next, we will upload the website content to an S3 bucket for static hosting. See you in the next part!