We have all heard about DNS catastrophes. I just read a horror story on Reddit the other day, where an Azure root DNS zone was accidentally deleted with no backup. I experienced a similar disaster a few years ago: a single DNS change managed to knock out internal DNS for an entire domain containing hundreds of records. Reading the post hit close to home and dredged up some of my own past anxiety, so I began poking around for solutions. I quickly noticed that backing up DNS records is usually skipped over as part of the backup process; for whatever reason, folks just tend to never do it.
I did discover, though, that backing up DNS is easy. So I decided to fix the problem.
I wrote a simple shell script that dumps all Route53 zones for a given AWS account to JSON files and uploads them to an S3 bucket. The whole thing is a handful of lines, which is perfect because it doesn't take much effort to potentially save your bacon.
If you don't host DNS in AWS, the script can be modified to work with other DNS providers (assuming they have public APIs).
Here’s the script:
#!/usr/bin/env bash
set -e

# Dump Route 53 zones to JSON files and upload them to S3.

BACKUP_DIR=/home/<user>/dns-backup
BACKUP_BUCKET=<bucket>
# Use full paths for cron
CLIPATH="/usr/local/bin"

# Dump all zones to files and upload them to S3
function backup_all_zones () {
    local zones
    # Enumerate all zones
    zones=$($CLIPATH/aws route53 list-hosted-zones | jq -r '.HostedZones[].Id' | sed "s/\/hostedzone\///")
    for zone in $zones; do
        echo "Backing up zone $zone"
        $CLIPATH/aws route53 list-resource-record-sets --hosted-zone-id "$zone" > "$BACKUP_DIR/$zone.json"
    done

    # Upload backups to S3 (server-side encrypted)
    $CLIPATH/aws s3 cp "$BACKUP_DIR" "s3://$BACKUP_BUCKET" --recursive --sse
}

# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"

# Back up all the things
time backup_all_zones
Be sure to update the <user> and <bucket> placeholders in the script to match your own environment. This script can be run manually, but it is much more useful if run automatically: just add it to a cron job and schedule it to dump DNS periodically (see the crontab entry below). Dumping the DNS records to JSON is also nice because it allows for a more programmatic way of working with the data.
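For example, here is a one-liner that pulls every record name and type out of one of the zone dumps (the file name below is just an example of what the script produces):

# List each record's name and type from a dumped zone file
jq -r '.ResourceRecordSets[] | "\(.Name) \(.Type)"' /home/<user>/dns-backup/Z1234567890ABC.json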
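For the cron piece, an entry along these lines in the backup user's crontab would run the script nightly. The script path and log location here are assumptions, so adjust them to wherever yours live:

# Dump and upload all Route53 zones every night at 2:00 AM
0 2 * * * /home/<user>/bin/backup-route53.sh >> /home/<user>/dns-backup/backup.log 2>&1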
For this script to work, the aws cli and jq need to be installed. The installation is skipped in this post, but it is trivial; refer to the aws cli and jq documentation for instructions.
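For reference, on a Debian-based system the installation might look something like this (package names and install methods vary by platform):

# Install jq from the distro package repos (Debian/Ubuntu shown)
sudo apt-get install -y jq

# Install the AWS CLI with pip (one of several supported methods)
pip install --user awscli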
The aws cli needs to be configured with an API key that has read access to Route53 and write access to S3. Details are skipped for this step as well; be sure to consult the AWS documentation on setting up IAM permissions for help with creating API keys. Another, simpler approach is to reuse a pre-existing key with admin credentials (not recommended).
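As a rough sketch, a minimal IAM policy for this script might look something like the following: it allows listing zones and records, plus writing objects to the backup bucket. Treat it as a starting point and swap in your own bucket name:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "route53:ListHostedZones",
        "route53:ListResourceRecordSets"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket>/*"
    }
  ]
}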