Building and Deploying a React CV on AWS S3 with TypeScript: A Deep Dive into Headless Chromium with Puppeteer, AWS CDK, Custom Lambda Docker Runtimes and GitHub Actions - Part 2
This is the second part in the series covering my beautifully engineered CV, where we will cover the DevOps aspects of the project, namely the creation of the AWS infrastructure and the CI/CD setup that binds it all together seamlessly.
In the first part, which you can read here, we covered the bootstrapping of the Nx monorepo, the creation of the React CV website and of the Lambda Exporter that leverages headless Chromium orchestrated by Puppeteer to save a PDF of the CV.
We will now be using AWS CDK (AWS Cloud Development Kit, a framework with TypeScript/JavaScript, Python, Java, Go and .NET bindings that gets synthesized to CloudFormation JSON templates and applied as CloudFormation stacks) for the IaC aspects, as it allows us to benefit from having the front-end, back-end and IaC in one language, sharing configuration and symbols without having to deal with duplicate code.
1. Infrastructure
While having it all run locally is great, unfortunately it is not visible to the rest of the world, so we need to deploy it somewhere the world can access it, which for us is AWS S3 behind a CloudFront CDN (Content Delivery Network).
There are 3 main component stacks for this setup:
ECR
A simple and straightforward stack that creates the Elastic Container Registry repository that will hold our Lambda Docker runtime image and that the Lambda service will pull from.
As I am not expecting any advanced or complicated use-case scenarios, I am keeping it simple, with only a latest tag and a lifecycle rule that keeps a single image, thus reducing costs.
Lambda Exporter function
A lightweight stack centered around a DockerImageFunction, which is the L2 AWS CDK construct (a curated, higher-level CDK object providing more error checking, sensible defaults and resource interconnection) equivalent of a raw CloudFormation AWS::Lambda::Function, with code pointing to the above ECR image repository, with versioning and aliasing, alongside a role with a minimal-permissions configuration.
React Web
This is more involved compared to the other two, as it also deals with ACM certificates, an S3 bucket with its policies, the CDN, WAF, and the S3 deployment of the website bundle and of the PDF via a CloudFormation Custom Resource.
On each run of the WebStack deployment, the web bundle is deployed via the BucketDeployment construct, and the CustomResource is triggered, which runs the Lambda to export the latest deployed website bundle as a PDF, readily available for download by anybody browsing the deployed website.
These can be run directly with the cdk CLI (Command Line Interface), with a few extra parameters required by the monorepo context, resulting in the below mouthful:
cdk synth -q -a 'npx ts-node --prefer-ts-exts -P apps/infra/tsconfig.app.json -r tsconfig-paths/register -r dotenv/config apps/infra/src/bin/<ECR|Exporter|Web>.ts'
or by creating utility Nx scripts that bring it down to the more memorable, easier-to-type nx deploy infra.
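Such a wrapper target could look roughly like this in the infra project's project.json; the executor and option names below are an assumption based on the command above, not the project's actual configuration:

```json
{
  "targets": {
    "deploy": {
      "executor": "nx:run-commands",
      "options": {
        "command": "cdk deploy -q -a 'npx ts-node --prefer-ts-exts -P apps/infra/tsconfig.app.json -r tsconfig-paths/register -r dotenv/config apps/infra/src/bin/Web.ts'"
      }
    }
  }
}
```

With a target like that in place, nx deploy infra resolves to the full cdk invocation.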
This of course has the precursor dependency of obtaining a domain and having a hosted zone for it in AWS Route 53. The domain can either be purchased there directly, or you can create a custom Hosted Zone and point the NS records at it in your registrar of choice.
2. GitHub Actions CI/CD
The fourth and final step in the story, which enables us to edit our CV's JSON files, run git commit and git push, and have the linting, testing, building, deploying, semantic versioning and releasing done automatically, is setting up our GitHub Actions pipelines. We will have two pipelines: one triggered by a pull request, which only evaluates the validity of our commit, and another for merges to main, which does the full suite of evaluation, building and deploying.
2.1 Pull Requests
The step names are self-explanatory and all adhere to the single responsibility principle: each does only one thing, be it setting up a dependency or running a process.
This verify job ensures any new pull request has passed the linting, testing and building of the web app, the exporter, the exporter Docker image and all of the AWS CDK stacks, by chaining the builds from the repository root's yarn build down to nx run-many --all --target=build.
The same principle applies to lint and test.
Note
Although we do not deploy anything in this step, we still need the 🔑 AWS Credentials step here, as the CDK build synthesizes the CloudFormation stacks, which requires authenticated access to the target account.
The 🧰 Setup QEMU and 🧰 Setup Docker BuildX steps are needed to be able to run docker build via the docker/build-push-action@v5 action.
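A trimmed sketch of what the verify job's setup could look like; the action versions, secret names and script names here are illustrative assumptions, not the project's actual workflow:

```yaml
jobs:
  verify:
    runs-on: ubuntu-latest
    steps:
      - name: ⬇️ Checkout
        uses: actions/checkout@v4
      - name: 🧰 Setup Node
        uses: actions/setup-node@v4
        with:
          node-version: 20
          cache: yarn
      - name: 🔑 AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-region: ${{ secrets.AWS_REGION }}
          role-to-assume: ${{ secrets.AWS_ROLE_ARN }}
      - name: 🧰 Setup QEMU
        uses: docker/setup-qemu-action@v3
      - name: 🧰 Setup Docker BuildX
        uses: docker/setup-buildx-action@v3
      - name: 📦 Install
        run: yarn install --frozen-lockfile
      - name: ✅ Verify
        run: yarn lint && yarn test && yarn build
```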
2.2 Merging to main
The merge-to-main pipeline has the same verify job (you can get code onto a branch through means other than a pull request :) ), with an added deploy job, which has the same setup steps as the verify job plus the actual deployment steps.
As I ensured all the steps could be run locally before spending any time setting up a pipeline, it makes sense that most of the pipeline definition simply delegates the work to the appropriate Yarn and Nx scripts that trigger the AWS CDK CloudFormation synthesis, with the addition of injecting the appropriate environment variables.
The cherry on top of the cake is the 🚀 Release step, which runs semantic-release; it reads all of the commits from the last tag to HEAD and, based on the Conventional Commits message format, increments the MAJOR.MINOR.PATCH version appropriately.
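The bump logic can be illustrated with a simplified TypeScript sketch; this mirrors, in reduced form, semantic-release's default commit-analyzer rules rather than reproducing them:

```typescript
type Bump = 'major' | 'minor' | 'patch' | null;

// Decide the next semver bump from a list of commit messages:
// a breaking change wins outright, a feat outranks a fix.
function nextBump(commits: string[]): Bump {
  let bump: Bump = null;
  for (const msg of commits) {
    const [subject, ...body] = msg.split('\n');
    // "BREAKING CHANGE" footer or a "!" after the type marks a major release.
    if (body.some((l) => l.startsWith('BREAKING CHANGE')) || /^\w+(\(.+\))?!:/.test(subject)) {
      return 'major';
    }
    if (/^feat(\(.+\))?:/.test(subject)) bump = 'minor';
    else if (/^fix(\(.+\))?:/.test(subject) && bump === null) bump = 'patch';
  }
  return bump;
}

// Apply the bump to a MAJOR.MINOR.PATCH version string.
function applyBump(version: string, bump: Bump): string {
  const [major, minor, patch] = version.split('.').map(Number);
  if (bump === 'major') return `${major + 1}.0.0`;
  if (bump === 'minor') return `${major}.${minor + 1}.0`;
  if (bump === 'patch') return `${major}.${minor}.${patch + 1}`;
  return version;
}

console.log(applyBump('1.4.2', nextBump(['feat(web): add dark mode', 'fix(infra): bucket policy'])));
// feat outranks fix, so this is a minor bump: 1.5.0
```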
To ensure the commit messages always follow a pattern that semantic-release can parse, we use husky to set up a git commit-msg hook configured with the cz-conventional-changelog plugin. In this manner we can be sure that we never push a commit that does not adhere to the standard.
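A minimal sketch of the kind of check such a hook enforces; the regex below is an illustration of the Conventional Commits first-line format, not the actual cz-conventional-changelog implementation:

```typescript
// First line must match "type(scope)?: subject" with a known commit type.
const CONVENTIONAL_COMMIT =
  /^(build|chore|ci|docs|feat|fix|perf|refactor|revert|style|test)(\([\w-]+\))?!?: .+/;

function isValidCommitMessage(message: string): boolean {
  // Only the subject line is validated; the body is free-form.
  return CONVENTIONAL_COMMIT.test(message.split('\n')[0]);
}

console.log(isValidCommitMessage('feat(exporter): render PDF via Puppeteer')); // true
console.log(isValidCommitMessage('updated stuff')); // false
```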
Conclusion
Over the course of this two-part series, we've embarked on a comprehensive journey through the development and deployment of a modern, dynamic CV. From the initial bootstrapping of the Nx monorepo and crafting a React-based CV in part one, to the intricate DevOps orchestration involving AWS infrastructure and CI/CD pipelines in part two, this project has been a testament to the power of combining cutting-edge web development with robust DevOps practices.
Key Takeaways
Streamlined Development: By leveraging tools like AWS CDK and GitHub Actions, we've demonstrated how complex infrastructure and deployment processes can be streamlined, making them more efficient and less prone to error.
Rapid Deployment and Scalability: The use of Docker, CloudFront CDN, and serverless technologies ensures that our applications are not only rapidly deployable but also scalable to meet varying demands.
Automated Workflows: The integration of CI/CD pipelines exemplifies the automation of testing, building, and deployment processes, significantly reducing manual effort and enhancing productivity.
The methodologies and technologies employed in this project are not confined to personal CV development. They can be seamlessly adapted to a wide range of applications, from e-commerce websites to complex enterprise-level solutions. The principles of automation, scalability, and efficient deployment are universally applicable, offering a blueprint for future projects that demand high availability, rapid scalability, and continuous integration/delivery.
I'm curious to hear about your experiences. Have you undertaken a project that combines web development and back-end event-driven processes with DevOps in a similar manner? What challenges did you face, and how did you overcome them? Additionally, I welcome any suggestions or insights you might have. Perhaps there's an aspect of DevOps or a particular technology you've found especially useful or challenging? Let's start a conversation in the comments below and learn from each other's experiences in this ever-evolving field of web development, back-end development and DevOps.
You can find the entire code for this series here and the final product here.