Deploying a Hugo site to AWS S3 using the AWS CLI

After configuring Hugo and generating some content (posts), I wanted an easier way to get it all into the AWS S3 bucket for my website.

The manual way was to log into https://console.aws.amazon.com, go to the S3 page, select the bucket for the website, and upload the content.
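
Even before touching Hugo's deploy feature, the AWS CLI (installed later in this post) can replace that console workflow on its own; a one-liner like the following, assuming the site has already been built into public/ and using my bucket name, syncs the generated files up and removes remote files that no longer exist locally:

aws s3 sync public/ s3://nodinrogers.com --delete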

Thankfully, Hugo has built-in deployment support that can automatically upload to and update AWS S3 buckets.

From the Hugo Deploy section of the documentation, I copied this sample configuration into my config.toml file:

[deployment]
# By default, files are uploaded in an arbitrary order.
# Files that match the regular expressions in the "Order" list
# will be uploaded first, in the listed order.
order = [".jpg$", ".gif$"]


[[deployment.targets]]
# An arbitrary name for this target.
name = "mydeployment"
# The Go Cloud Development Kit URL to deploy to. Examples:
# GCS; see https://gocloud.dev/howto/blob/#gcs
# URL = "gs://<Bucket Name>"

# S3; see https://gocloud.dev/howto/blob/#s3
# For S3-compatible endpoints, see https://gocloud.dev/howto/blob/#s3-compatible
# URL = "s3://<Bucket Name>?region=<AWS region>"
URL = "s3://nodinrogers.com?region=us-west-2"

# Azure Blob Storage; see https://gocloud.dev/howto/blob/#azure
# URL = "azblob://$web"

# You can use a "prefix=" query parameter to target a subfolder of the bucket:
# URL = "gs://<Bucket Name>?prefix=a/subfolder/"

# If you are using a CloudFront CDN, deploy will invalidate the cache as needed.
#cloudFrontDistributionID = <ID>
cloudFrontDistributionID = "UFS6N57VZSAUSF"

# Optionally, you can include or exclude specific files.
# See https://godoc.org/github.com/gobwas/glob#Glob for the glob pattern syntax.
# If non-empty, the pattern is matched against the local path.
# All paths are matched against in their filepath.ToSlash form.
# If exclude is non-empty, and a local or remote file's path matches it, that file is not synced.
# If include is non-empty, and a local or remote file's path does not match it, that file is not synced.
# As a result, local files that don't pass the include/exclude filters are not uploaded to remote,
# and remote files that don't pass the include/exclude filters are not deleted.
# include = "**.html" # would only include files with ".html" suffix
# exclude = "**.{jpg, png}" # would exclude files with ".jpg" or ".png" suffix


# [[deployment.matchers]] configure behavior for files that match the Pattern.
# See https://golang.org/pkg/regexp/syntax/ for pattern syntax.
# Pattern searching is stopped on first match.

# Samples:

[[deployment.matchers]]
# Cache static assets for 1 year.
pattern = "^.+\\.(js|css|svg|ttf)$"
cacheControl = "max-age=31536000, no-transform, public"
gzip = true

[[deployment.matchers]]
pattern = "^.+\\.(png|jpg)$"
cacheControl = "max-age=31536000, no-transform, public"
gzip = false

[[deployment.matchers]]
# Set custom content type for /sitemap.xml
pattern = "^sitemap\\.xml$"
contentType = "application/xml"
gzip = true

[[deployment.matchers]]
pattern = "^.+\\.(html|xml|json)$"
gzip = true

All I had to do was uncomment and edit the name, URL, and cloudFrontDistributionID lines to match my AWS/CloudFront settings.

I replaced my actual cloudFrontDistributionID with a random string.
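
With the comments and unused samples stripped out, the lines that actually matter for an S3 deployment boil down to this (values are from my setup, with the distribution ID randomized as noted above):

[deployment]
order = [".jpg$", ".gif$"]

[[deployment.targets]]
name = "nodinrogers"
URL = "s3://nodinrogers.com?region=us-west-2"
cloudFrontDistributionID = "UFS6N57VZSAUSF"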

I needed to make sure I had generated the most recent version of my site, including any content or posts added since the last build:

hugo
Start building sites
hugo v0.88.0-ACC5EB5B+extended linux/amd64 BuildDate=2021-09-02T11:51:42Z VendorInfo=hugoguru
. . .
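
By default, hugo renders the site into the public/ directory, and hugo deploy uploads from there; a quick listing confirms the freshly built files are in place (the exact contents will vary by site):

ls public/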

Now that Hugo is configured to push my website to the AWS S3 bucket, I needed to install the AWS CLI on my Ubuntu box:

sudo apt install awscli -y
aws --version
aws-cli/1.18.69 Python/3.8.10 Linux/5.4.0-80-generic botocore/1.16.19
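
As the version output shows, the Ubuntu apt package installs AWS CLI version 1, which works fine for this purpose. If you would rather have version 2, Amazon distributes it as a zip installer (the steps below assume an x86_64 machine):

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install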

Once the AWS CLI is installed, configure it to use the IAM user credentials created for deploying the Hugo site:

aws configure
AWS Access Key ID [None]: r2uykg63d79X4WGwB8po
AWS Secret Access Key [None]: ?icE/bf?q;8_7M.EDdK2t(Fd_cvui6NKm*w?jAME
Default region name [None]: us-west-2
Default output format [None]:

I've replaced both the AWS Access Key ID and the AWS Secret Access Key with random strings.
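
aws configure stores these values in plain text under the home directory, split across two files (shown here with the same randomized credentials). Hugo's deploy feature uses the standard AWS credential chain, so it picks these up automatically with no extra Hugo-side configuration:

cat ~/.aws/credentials
[default]
aws_access_key_id = r2uykg63d79X4WGwB8po
aws_secret_access_key = ?icE/bf?q;8_7M.EDdK2t(Fd_cvui6NKm*w?jAME

cat ~/.aws/config
[default]
region = us-west-2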

Before actually pushing anything, I did a dry run to make sure my login credentials and Hugo's AWS configuration were OK:

hugo deploy --dryRun
Deploying to target "nodinrogers" (s3://nodinrogers.com?region=us-west-2)
Identified 307 file(s) to upload, totaling 1.7 MB, and 28 file(s) to delete.
. . .
WARN 2021/07/29 15:20:25 Skipping 28 deletes because it is more than --maxDeletes (0). If this is expected, set --maxDeletes to a larger number, or -1 to disable this check.

hugo deploy deletes remote files that no longer exist in the local build; to allow those deletions and avoid the "Skipping xx deletes . . ." warning, add the --maxDeletes -1 switch to the hugo deploy command.

To actually deploy the Hugo website to the AWS S3 bucket:

hugo deploy --maxDeletes -1
Deploying to target "nodinrogers" (s3://nodinrogers.com?region=us-west-2)
Identified 307 file(s) to upload, totaling 1.7 MB, and 28 file(s) to delete.
Success!
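
To double-check from the CLI side that the files actually landed, list the bucket contents; the object count and total size should line up with what hugo deploy reported:

aws s3 ls s3://nodinrogers.com --recursive --summarize

And because cloudFrontDistributionID is set in config.toml, hugo deploy also invalidates the CloudFront cache as needed, so the updated site is served without waiting for cached copies to expire.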

References

Hugo Deploy https://gohugo.io/hosting-and-deployment/hugo-deploy/

How to Install AWS CLI on Ubuntu 20.04 https://linoxide.com/how-to-install-aws-cli-on-ubuntu-20-04/