## Prereq

- repo on GitHub
- Next.js static website
- want to use a custom domain, but don't want to use AWS Route 53 for NS/DNS
## Steps

### 1. Set up AWS
#### 1.1 Request a certificate in ACM

- request a public certificate for `example.com` (note: CloudFront requires the certificate to be in `us-east-1`)
- choose DNS validation → add the CNAME record ACM shows you to your DNS, then wait 5+ min
- check whether your DNS provider added an extra trailing dot to the CNAME value
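If you prefer the CLI, here's a minimal sketch of the same request (the domain and the validation record name are placeholders):

```sh
# request a public certificate with DNS validation
# (CloudFront only accepts ACM certificates from us-east-1)
aws acm request-certificate \
  --domain-name example.com \
  --validation-method DNS \
  --region us-east-1

# after adding the validation CNAME to your DNS, confirm it resolves
# (_abc123.example.com is a placeholder for the record name ACM gives you)
dig +short CNAME _abc123.example.com
```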
#### 1.2 Create S3 Bucket

- name it whatever
- click on the bucket
- `Properties > Static website hosting` → disable (since files will be served via CF, not the S3 website endpoint)
- `Permissions > Block public access` → enable (since only CF will have access)
- `Permissions > Bucket policy` → edit to:

```json
{
  "Version": "2008-10-17",
  "Id": "PolicyForCloudFrontPrivateContent",
  "Statement": [
    {
      "Sid": "AllowCloudFrontServicePrincipal",
      "Effect": "Allow",
      "Principal": {
        "Service": "cloudfront.amazonaws.com"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::AWS_BUCKET_NAME/*",
      "Condition": {
        "StringEquals": {
          "AWS:SourceArn": "arn:aws:cloudfront::AWS_ACCOUNT_ID:distribution/AWS_CLOUDFRONT_DEPLOYMENT_ID"
        }
      }
    }
  ]
}
```

- `AWS_BUCKET_NAME` → your bucket name
- `AWS_ACCOUNT_ID` → 12-digit number; click your account in the top-right corner
- `AWS_CLOUDFRONT_DEPLOYMENT_ID` → 14-digit alphanumeric, check CF (you can fill this in after creating the distribution in 1.3)
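The same setup, sketched with the AWS CLI; assumes region `us-east-2` (matching the workflow in 3.4) and the policy JSON above saved as `bucket-policy.json`:

```sh
# create the bucket (name and region are placeholders)
aws s3api create-bucket \
  --bucket AWS_BUCKET_NAME \
  --region us-east-2 \
  --create-bucket-configuration LocationConstraint=us-east-2

# block all public access; only CloudFront will read from the bucket
aws s3api put-public-access-block \
  --bucket AWS_BUCKET_NAME \
  --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# attach the CloudFront-only bucket policy saved locally
aws s3api put-bucket-policy \
  --bucket AWS_BUCKET_NAME \
  --policy file://bucket-policy.json
```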
#### 1.3 Create CloudFront Distribution

- create a distribution
  - `Origin > Origin domain` → your S3 bucket (do NOT use the `s3-website` domain)
  - `Origin > Origin access` → `Origin access control settings`
  - click `Create new OAC`
    - `Signing behavior` → `Sign requests`
    - `Origin type` → `S3`
- click on the distribution, then edit
  - `Settings > Alternate domain name (CNAME)` → `example.com`
  - `Settings > Custom SSL certificate` → the certificate you made earlier
  - `Settings > Default root object` → the entry point to your app (probably `index.html`)
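Two pieces of this are scriptable if you prefer; a sketch with placeholder names (the lookup is handy for the `AWS_CLOUDFRONT_DEPLOYMENT_ID` in the bucket policy from 1.2):

```sh
# create an OAC that signs S3 requests (SigV4, always sign)
aws cloudfront create-origin-access-control \
  --origin-access-control-config \
    Name=my-site-oac,SigningProtocol=sigv4,SigningBehavior=always,OriginAccessControlOriginType=s3

# look up the distribution ID and its *.cloudfront.net domain
aws cloudfront list-distributions \
  --query "DistributionList.Items[].{Id:Id,Domain:DomainName}"
```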
### 2. Test Next.js export

#### 2.1 Add static export to Next.js config
```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  output: "export",
  distDir: "out",
  images: {
    unoptimized: true,
  },
};

export default nextConfig;
```

- the `images` config controls static image exports; see https://nextjs.org/docs/app/api-reference/next-config-js/images#aws-cloudfront to enable image optimization on CF
- `<Image>` optimization doesn't work on static exports, hence `unoptimized: true`
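A quick local sanity check of the export, assuming your `package.json` maps `build` to `next build`:

```sh
# with output: "export", the build writes the static site to out/
pnpm build

# should list index.html plus the _next/ assets
ls out
```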
#### 2.2 Upload static files to S3

- upload the contents of the `out` folder to the root directory of the S3 bucket; use the CLI (sketch below) or just the web console
- go to the CF distribution domain name (`[14-digit alphanumeric].cloudfront.net`) and check whether the website is accessible
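The CLI route is a one-liner; the bucket name and distribution domain below are placeholders:

```sh
# upload the exported site to the bucket root
aws s3 cp --recursive ./out s3://AWS_BUCKET_NAME/

# check the site through CloudFront
curl -I https://d111111abcdef8.cloudfront.net/
```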
### 3. Set up GitHub Actions

#### 3.1 Create IAM Policies for GitHub Actions
- click `Create Policy`
- create policy `AmazonS3PutOnlyAccess` with the following JSON (allows write access only, but to all buckets):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "*"
    }
  ]
}
```

- create policy `AmazonS3LimitedBucketAccess-AWS_BUCKET_NAME` with the following JSON (allows all S3 actions, but only on the specified bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor1",
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::AWS_BUCKET_NAME/*"
    }
  ]
}
```
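Or create both from the CLI, assuming the two JSON documents above are saved locally:

```sh
# policy names match the console steps; file names are placeholders
aws iam create-policy \
  --policy-name AmazonS3PutOnlyAccess \
  --policy-document file://put-only.json

aws iam create-policy \
  --policy-name AmazonS3LimitedBucketAccess-AWS_BUCKET_NAME \
  --policy-document file://limited-bucket.json
```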
#### 3.2 Create IAM User for GitHub Actions

- click `Create User` and name it `github-actions`
- `Permissions > Add permissions > Attach policies directly` → choose `AmazonS3PutOnlyAccess`
- `Permissions > Permissions boundary > Add boundary` → choose `AmazonS3LimitedBucketAccess-AWS_BUCKET_NAME`
- the effective permissions are the intersection of policy and boundary, so this user can only `s3:PutObject` into your bucket; technically the vice versa assignment should work too, but do whatever makes the most sense
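The CLI equivalent (replace `AWS_ACCOUNT_ID` with your 12-digit account ID):

```sh
# create the deploy user
aws iam create-user --user-name github-actions

# attach the write-only policy as its permissions
aws iam attach-user-policy \
  --user-name github-actions \
  --policy-arn arn:aws:iam::AWS_ACCOUNT_ID:policy/AmazonS3PutOnlyAccess

# set the bucket-scoped policy as the permissions boundary
aws iam put-user-permissions-boundary \
  --user-name github-actions \
  --permissions-boundary arn:aws:iam::AWS_ACCOUNT_ID:policy/AmazonS3LimitedBucketAccess-AWS_BUCKET_NAME
```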
#### 3.3 Add AWS access keys to GitHub repository

- click on the created user
- `Security credentials > Access keys > Create access key` → choose `Applications running outside AWS`
- copy the keys and add them under `GITHUB_REPOSITORY > Settings > Secrets` as repository secrets (not environment secrets)
- name them `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`
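If you have the GitHub CLI authenticated against the repo, this avoids pasting keys into the browser; each command prompts for the value, so the keys stay out of shell history:

```sh
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY
```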
#### 3.4 Add GitHub Actions workflow

add the following code to `./.github/workflows/deploy.yml`:

```yaml
# This workflow will do a clean installation of node dependencies,
# cache/restore them, build, and upload the /out folder to Amazon S3
name: aprilsecond

on:
  push:
    branches:
      - main

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Install pnpm
        uses: pnpm/action-setup@v4
        with:
          version: 9
          run_install: false

      - uses: actions/setup-node@v4
        with:
          node-version: 21
          cache: 'pnpm'
          cache-dependency-path: pnpm-lock.yaml

      - name: Install dependencies
        run: pnpm install

      - name: Build
        run: pnpm build

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-region: us-east-2
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

      - name: Copy files to S3 with the AWS CLI
        run: |
          aws s3 cp --recursive --no-progress ./out s3://AWS_BUCKET_NAME/
```

- the `aws s3 cp` command copies and overwrites existing content in the bucket (the `--recursive` flag is required; `--no-progress` is not, but it improves readability of the GitHub Actions log)
- push this file and see how it works
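One caveat the workflow doesn't handle: CloudFront caches objects at the edge, so a fresh upload may not show up immediately. You can force a refresh with an invalidation (the distribution ID is a placeholder):

```sh
# invalidate every cached path after a deploy
aws cloudfront create-invalidation \
  --distribution-id AWS_CLOUDFRONT_DEPLOYMENT_ID \
  --paths "/*"
```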
### 4. Connect to domain

- add an `ALIAS` record (or your DNS provider's equivalent, e.g. `ANAME` or CNAME flattening for apex domains) pointing your domain to the CF distribution domain name
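Once DNS propagates, a quick check that both the record and the certificate work:

```sh
# should resolve to the CloudFront distribution
dig example.com +short

# should return 200 over HTTPS
curl -I https://example.com
```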
## Notes

- check the S3 free tier limits; updating large websites can quickly use up the 20K GET requests and 2K PUT requests
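Each uploaded file is one PUT request, so you can estimate a deploy's usage by counting the files in the export:

```sh
# number of files the deploy uploads (roughly == PUT requests per deploy)
find out -type f | wc -l
```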