Curl on Windows using a Docker wrapper

Does the Windows built-in version of “curl” confuse or intimidate you?  Maybe you come from a Linux or Unix background and yearn for some of your favorite go-to tools?  Newer versions of Powershell include a cmdlet for interacting with the web called Invoke-WebRequest, which is useful, but it is not a great drop-in replacement for those with experience in non-Windows environments.  The Powershell cmdlets are a move in the right direction toward unifying CLI experiences, but there are still many folks, myself included, who have become attached to curl over the years.  It is worth noting that a Windows-compatible version of curl has existed for a long time; however, it has always been a nuisance dealing with the zip file, just as using SSH has always been a hassle on Windows.  It has always been possible to use the *nix-equivalent tools, it is just clunky.

I found a low-effort solution for adding curl to my Windows CLI flow that acts as a nice middle ground between learning Invoke-WebRequest and installing the curl binaries directly, which I’d like to share.  This alias trick is a simple way to use curl for working with APIs and other web testing in Windows environments without getting tangled up in managing versions and dealing with vulnerabilities.  Just pull the latest Docker image to update curl to the newest version, and don’t worry about its implementation across different systems.

Prerequisites are light.  First, make sure to have the Docker for Windows app installed (stable or beta are both fine) as well as a semi-recent version of Powershell.

Next step.  If you haven’t set up a Powershell profile, there are lots of links and resources about how to do it (I even wrote about it recently), so I am skipping that step here.  Start by adding the following snippet to your Powershell profile (by default located at C:\Users\<user>\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1) and saving.

# Curl alias using docker
function Docker-Curl {
   docker run --rm byrnedo/alpine-curl $args
}

# Aliases
New-Alias dcurl Docker-Curl

Then source your profile (or open a new terminal) and run the curl command that was just created.

dcurl -h

One issue you might notice in the snippet above is that the Docker image is not an “official” image.  If this bothers you (security concerns, etc.), it is really easy to create your own, secure image.  There are lots of examples of how to create minimal images with curl pre-installed.  Just be aware that your custom image will need to be maintained and occasionally rebuilt/published to guard against future vulnerabilities.  For brevity, I have skipped that process, but here’s an example of creating a custom image.
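As a rough sketch (the base image, tag and image name below are placeholders, and this assumes a bash shell such as WSL, Git Bash or a Linux box for the heredoc), building your own image can be as simple as:

# Write a minimal Dockerfile and build a curl image that you control
cat > Dockerfile <<'EOF'
FROM alpine:3.18
RUN apk add --no-cache curl ca-certificates
ENTRYPOINT ["curl"]
EOF

docker build -t <your-registry>/alpine-curl .

Once the image is built and pushed somewhere you trust, just swap the new tag into the Powershell alias in place of byrnedo/alpine-curl.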

Optional

To update curl, just run the docker pull command.

docker pull byrnedo/alpine-curl

Now you have the best of both worlds.  The built-in Invoke-WebRequest cmdlet provided by Powershell is available, as well as the venerable curl command.
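For example, a couple of typical calls using the alias from above (the endpoints are just illustrations):

# Grab the response headers from a site
dcurl -I https://example.com

# Hit a public API endpoint and print the response body
dcurl -s https://api.github.com/zen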

My number one reason for using curl in a container is that curl has been around for such a long time (fewer bugs and edge cases) and it can be used for nearly any web-related task.  It is also much handier for those with a background in *nix systems to reach for curl, rather than digging around in unfamiliar Powershell docs for similar functionality.  Having the ability to run some of my favorite tools in an easy, reproducible way on Windows has been a refreshing experience while sliding back into the Windows world.

Backup Route 53 zones

We have all heard about DNS catastrophes.  I just read about a horror story on reddit the other day, where an Azure root DNS zone was accidentally deleted with no backup.  I experienced a similar disaster a few years ago – a simple DNS change managed to knock out internal DNS for an entire domain, which contained hundreds of records.  Reading the post hit close to home, stirring up some of my own past anxiety, so I began poking around for solutions.  Immediately, I noticed that backing up DNS records is usually skipped over as part of the backup process.  Folks just tend to never do it, for whatever reason.

I did discover, though, that backing up DNS is easy.  So I decided to fix the problem.

I wrote a simple shell script that dumps all Route53 zones for a given AWS account to JSON files and uploads them to an S3 bucket.  The script is only a handful of lines, which is perfect because it doesn’t take much effort to potentially save your bacon.

If you don’t host DNS in AWS, the script can be modified to work with other DNS providers (assuming they have public APIs).

Here’s the script:

#!/usr/bin/env bash

set -e

# Dump route 53 zones to a text file and upload to S3.

BACKUP_DIR=/home/<user>/dns-backup
BACKUP_BUCKET=<bucket>
# Use full paths for cron
CLIPATH="/usr/local/bin"

# Dump all zones to a file and upload to s3
function backup_all_zones () {
  local zones
  # Enumerate all zones
  zones=$($CLIPATH/aws route53 list-hosted-zones | jq -r '.HostedZones[].Id' | sed "s/\/hostedzone\///")
  for zone in $zones; do
    echo "Backing up zone $zone"
    $CLIPATH/aws route53 list-resource-record-sets --hosted-zone-id $zone > $BACKUP_DIR/$zone.json
  done

  # Upload backups to s3
  $CLIPATH/aws s3 cp $BACKUP_DIR s3://$BACKUP_BUCKET --recursive --sse
}

# Create backup directory if it doesn't exist
mkdir -p $BACKUP_DIR
# Back up all the things
time backup_all_zones

Be sure to update the <user> and <bucket> in the script to match your own environment settings.  Dumping the DNS records to JSON is nice because it allows for a more programmatic way of working with the data.  This script can be run manually, but it is much more useful if run automatically.  Just add the script to a cronjob and schedule it to dump DNS periodically.
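For example, a crontab entry along these lines (the script name and paths here are just illustrative) will dump the zones once a day:

# Run the Route53 backup every day at 02:30 and log the output
30 2 * * * /home/<user>/dns-backup/backup-route53.sh >> /home/<user>/dns-backup/backup.log 2>&1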

For this script to work, the aws cli and jq need to be installed.  The installation is skipped in this post, but is trivial.  Refer to the links for instructions.
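On a Debian/Ubuntu style box the installs boil down to something like the following (package manager commands will vary by distro):

# Install the aws cli with pip and jq from the distro repos
pip install awscli
sudo apt-get install -y jq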

The aws cli needs to be configured to use an API key with read access from Route53 and the ability to write to S3.  Details are skipped for this step as well – be sure to consult the AWS documentation on setting up IAM permissions for help with setting up API keys.  Another, simplified approach is to use a pre-existing key with admin credentials (not recommended).
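If you want to go the least-privilege route, a policy along these lines (policy name and bucket are placeholders, and this is a trimmed sketch rather than a battle-tested policy) covers what the script needs: listing the zones and records, and writing objects to the backup bucket.

# Create a minimal IAM policy for the backup script
cat > route53-backup-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["route53:ListHostedZones", "route53:ListResourceRecordSets"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket>/*"
    }
  ]
}
EOF

aws iam create-policy --policy-name route53-backup --policy-document file://route53-backup-policy.json

Attach the policy to whichever IAM user or role owns the API keys the script will run with.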

Hide file extensions in PowerShell tab completion

One thing I quickly discovered as I got acclimated to my new Windows machine is that, by default, the Windows Powershell CLI appends the executable file extension to the command that gets run, which is not the case on Linux or OS X.  That got me wondering if it is possible to modify this default behavior and remove the extension.  I’m going to ruin the surprise and let everybody know that it is definitely possible to change this behavior, thanks to the flexibility of Powershell and friends.  Now that the surprise is ruined, read on to find out how this solution works.

To check which file types Windows considers to be executable you can type $Env:PathExt.

PS > $Env:PathExt
.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY;.PYW;.CPL

Similarly, you can type $Env:Path to get a list of places that Windows will look for files to execute by default.

PS > $Env:PATH
C:\Program Files\Docker\Docker\Resources\bin;C:\Python35\Scripts\;C:\Python35\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\NVIDIA Co
ram Files\nodejs\;C:\Program Files\Git\cmd;C:\Program Files (x86)\Skype\Phone\;C:\Users\jmreicha\AppData\Local\Microsoft\WindowsApps;C:\Users\jmreicha\AppData\Local\atom\bin;C:\Users\jmreicha\AppData\Roaming\npm

The problem, though, is that when you start typing the name of an executable that lives in this path, say python, and tab complete it, Windows will automatically append the file extension to the executable.  Since I am more comfortable using a *nix style shell, it is an annoyance having to deal with the file extensions.

Below I will show you a hack for hiding these extensions from your Powershell prompt.  It is actually much more work than I thought it would be to add this behavior, but with some help from some folks over at stackoverflow, we can add it.  Basically, we need to override the default Powershell tab completion with our own logic, and then have that override loaded into the Powershell session when it starts, via a custom profile.

To get this working, the first step is to look at what the default tab completion does.

(Get-Command 'TabExpansion2').ScriptBlock

This will spit out the code that handles the tab completion behavior.  To get our custom behavior we need to override the original code with our own logic, which I have below (I wish I had come up with this myself, but alas).  This is not the full code, just the custom logic.  The full script is posted below.

$field = [System.Management.Automation.CompletionResult].GetField('completionText', 'Instance, NonPublic')
$source.CompletionMatches | % {
    If ($_.ResultType -eq 'Command' -and [io.file]::Exists($_.ToolTip)) {
        $field.SetValue($_, [io.path]::GetFileNameWithoutExtension($_.CompletionText))
    }
}
Return $source

The code looks a little bit intimidating, but it is basically just checking whether the completion result is a command that exists on disk and, if it is, stripping the extension from the completion text.

So to get this all working, we need to create a file with the logic, and have Powershell read it at load time.  Go ahead and paste the following code into a file like no_ext_tabs.ps1.  I place this in the Powershell path (~/Documents/WindowsPowerShell), but you can put it anywhere.

Function TabExpansion2 {
    [CmdletBinding(DefaultParameterSetName = 'ScriptInputSet')]
    Param(
        [Parameter(ParameterSetName = 'ScriptInputSet', Mandatory = $true, Position = 0)]
        [string] $inputScript,

        [Parameter(ParameterSetName = 'ScriptInputSet', Mandatory = $true, Position = 1)]
        [int] $cursorColumn,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 0)]
        [System.Management.Automation.Language.Ast] $ast,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 1)]
        [System.Management.Automation.Language.Token[]] $tokens,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 2)]
        [System.Management.Automation.Language.IScriptPosition] $positionOfCursor,

        [Parameter(ParameterSetName = 'ScriptInputSet', Position = 2)]
        [Parameter(ParameterSetName = 'AstInputSet', Position = 3)]
        [Hashtable] $options = $null
    )

    End
    {
        $source = $null
        if ($psCmdlet.ParameterSetName -eq 'ScriptInputSet')
        {
            $source = [System.Management.Automation.CommandCompletion]::CompleteInput(
                <#inputScript#>  $inputScript,
                <#cursorColumn#> $cursorColumn,
                <#options#>      $options)
        }
        else
        {
            $source = [System.Management.Automation.CommandCompletion]::CompleteInput(
                <#ast#>              $ast,
                <#tokens#>           $tokens,
                <#positionOfCursor#> $positionOfCursor,
                <#options#>          $options)
        }
        $field = [System.Management.Automation.CompletionResult].GetField('completionText', 'Instance, NonPublic')
        $source.CompletionMatches | % {
            If ($_.ResultType -eq 'Command' -and [io.file]::Exists($_.ToolTip)) {
                $field.SetValue($_, [io.path]::GetFileNameWithoutExtension($_.CompletionText))
            }
        }
        Return $source
    }    
}

To start using this tab completion override right away, just dot source the file as shown below.

. .\no_ext_tabs.ps1

If you want the extensions to be hidden every time you start a new Powershell session we just need to create a new Powershell profile (more reading on creating Powershell profiles here if you’re interested) and have it load our script. If you already have a custom profile you can skip this step.

New-Item -path $profile -type file -force

After you create the profile go ahead and edit it by adding the following configuration.

# Dot source no_ext_tabs to remove file extensions from executables in path
. C:\Users\jmreicha\Documents\WindowsPowerShell\no_ext_tabs.ps1

Close your shell and open it again and you should no longer see the file extensions.

There is one last little, unrelated tidbit that I discovered through this process but thought was pretty handy and worth sharing with other Powershell N00bs.

Powershell 3 and above provides some nice key bindings for jumping around the CLI, similar to a bash based shell if you are familiar or have a background using *nix systems.

[Screenshot: Powershell key shortcuts]

You can check the full list of these key bindings by typing ctrl+alt+shift+? in your Powershell prompt (thanks Keith Hill for this trick).

Backing up Jenkins configurations to S3

If you have worked with Jenkins for any extended length of time you quickly realize that the Jenkins server configurations can become complicated.  If the server ever breaks and you don’t have a good backup of all the configuration files, it can be extremely painful to recreate all of the jobs that you have configured.  And, most recently, if you have started using the Jenkins workflow libraries, all of your custom scripts and code will disappear too if you don’t back them up.

Luckily, backing up your Jenkins job configurations is a fairly simple and straightforward process.  Today I will cover one quick and dirty way to back up configs using a Jenkins job.

There are some AWS plugins that will back up your Jenkins configurations, but I found that it was just as easy to write a little bit of bash to do the backup, especially since I wanted to back up to S3, which none of the plugins I looked at handle.  In general, the plugins I looked at either felt a little bit too heavy for what I was trying to accomplish or didn’t offer the functionality I was looking for.

If you are still interested in using a plugin, here are a few to check out:

Keep reading if none of the above plugins look like a good fit.

The first step is to install the needed dependencies on your Jenkins server.  For the backup method that I will be covering, the only tools that need to be installed are the aws cli, tar and rsync.  Tar and rsync should already be installed; to get the aws cli, you can install it with pip on the Jenkins server that holds the configurations you want to back up.

pip install awscli

After the prerequisites have been installed, you will need to create your Jenkins job.  Click New Item -> Freestyle project and input a name for the new job.

[Screenshot: Jenkins job name]

Then you will need to configure the job.

The first step is figuring out how often you want to run this backup.  A simple strategy is to back up once a day.  The once-per-day schedule is illustrated below.

[Screenshot: backup periodically]

Note that the ‘H’ above tells Jenkins to randomize when the job runs within the hour, so that if other jobs are configured on the same schedule the load gets spread out.
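In text form, the “Build periodically” schedule for a once-a-day run looks something like this:

# Jenkins cron syntax: run once a day at a hashed (randomized) hour and minute
H H * * *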

The next step is to back up the Jenkins files.  The logic is all written in bash, so if you are familiar with bash it should be easy to follow along.

# Delete all files in the workspace
rm -rf *

# Create a directory for the job definitions
mkdir -p $BUILD_ID/jobs

# Copy global configuration files into the workspace
cp $JENKINS_HOME/*.xml $BUILD_ID/

# Copy keys and secrets into the workspace
cp $JENKINS_HOME/identity.key.enc $BUILD_ID/
cp $JENKINS_HOME/secret.key $BUILD_ID/
cp $JENKINS_HOME/secret.key.not-so-secret $BUILD_ID/
cp -r $JENKINS_HOME/secrets $BUILD_ID/

# Copy user configuration files into the workspace
cp -r $JENKINS_HOME/users $BUILD_ID/

# Copy custom Pipeline workflow libraries
cp -r $JENKINS_HOME/workflow-libs $BUILD_ID

# Copy job definitions into the workspace
rsync -am --include='config.xml' --include='*/' --prune-empty-dirs --exclude='*' $JENKINS_HOME/jobs/ $BUILD_ID/jobs/

# Create an archive from all copied files (since the S3 plugin cannot copy folders recursively)
tar czf jenkins-configuration.tar.gz $BUILD_ID/

# Remove the directory so only the tar.gz gets copied to S3
rm -rf $BUILD_ID

Note that I am not backing up the job history because the history isn’t important for my uses.  If the history IS important, make sure to add a few lines to back up those locations too.  Likewise, feel free to modify and/or update anything else in the script if it suits your needs better.
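For example, a rough (untested) addition like the following would pull in the build directories as well:

# Also copy build history for each job (optional; builds can get big)
for job in $JENKINS_HOME/jobs/*/; do
  name=$(basename "$job")
  if [ -d "$job/builds" ]; then
    mkdir -p "$BUILD_ID/jobs/$name"
    cp -r "$job/builds" "$BUILD_ID/jobs/$name/"
  fi
done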

The last step is to copy the backup to another location.  This is why we installed aws cli earlier.  So here I am just uploading the tar file to an S3 bucket, which is versioned (look up how to configure bucket versioning if you’re not familiar).

export AWS_DEFAULT_REGION="xxx"
export AWS_ACCESS_KEY_ID="xxx"
export AWS_SECRET_ACCESS_KEY="xxx"

# Upload archive to S3
echo "Uploading archive to S3"
aws s3 cp jenkins-configuration.tar.gz s3://<bucket>/jenkins-backup/

# Remove tar.gz after it gets uploaded to S3
rm -rf jenkins-configuration.tar.gz

Replace AWS_DEFAULT_REGION with the region where the bucket lives (typically us-east-1), and make sure AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY belong to an account with access to write to S3 (not covered here).  Finally, replace <bucket> with your own bucket name.
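If the bucket doesn’t have versioning turned on yet, it can be enabled with a one-liner (bucket name is a placeholder):

# Enable versioning on the backup bucket so older backups are retained
aws s3api put-bucket-versioning --bucket <bucket> --versioning-configuration Status=Enabled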

The backup process itself is usually pretty fast unless the Jenkins server has a massive number of jobs and configurations.  Once you have configured the job, feel free to run it once to test whether it works.  If the job completes successfully, go check your S3 bucket and make sure the tar.gz file was uploaded.  If you are using versioning there should just be one file, and if you choose the “show versions” option you will see something similar to the following.

[Screenshot: S3 backup]

If everything went okay with your backup and upload to S3, you are done.  Common issues when configuring this backup method are choosing the correct AWS bucket, region and credentials.  Also, double check where all of your Jenkins configurations live in case they aren’t in a standard location.

Generate Certbot certificates with a container

This is a little bit of a follow-up to the original post about generating certs with the DNS challenge.  I decided to create a little container that can be used to generate a certificate based on the newly renamed dehydrated script, with the extras needed to make DNS provisioning easy.

A few things have changed in the evolution of Let’s Encrypt and its tooling since the last post was written.  First, some of the tools have been renamed, so I’ll try to clear up the names in case there is any confusion.  The official Let’s Encrypt client has been renamed to Certbot.  The shell script used to provision the certificates has been renamed as well: what used to be called letsencrypt.sh is now dehydrated.

The Docker image can be found here.  The image is essentially the dehydrated script plus a few other dependencies needed to make the DNS challenge work, including Ruby, a Ruby DNS hook script and a few gems that the hook relies on.

The following is an example of how to run the script:

docker run -it --rm \
    -v $(pwd):/dehydrated \
    -e AWS_ACCESS_KEY_ID="XXX" \
    -e AWS_SECRET_ACCESS_KEY="XXX" \
    jmreicha/dehydrated-dns --cron --domain test.example.com --hook ./route53.rb --challenge dns-01

Just replace test.example.com with the desired domain.  Make sure that you have the DNS zone added to Route53, and make sure the AWS credentials used have the appropriate permissions to read and write records in that Route53 zone.
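A quick sanity check that the zone exists and that the credentials can see it might look like this (the domain is an example):

# Confirm the hosted zone is visible with the supplied credentials
AWS_ACCESS_KEY_ID="XXX" AWS_SECRET_ACCESS_KEY="XXX" \
  aws route53 list-hosted-zones-by-name --dns-name example.com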

The command is essentially the same as the command in the original post, but it is a lot more convenient to run now because you can specify where on your local system the generated certificates should be dumped, and you can also easily specify/update the AWS credentials.

I’d like to quickly explain the decision to containerize this process.  Obviously the dehydrated tool has been designed and written to be a standalone tool, but generating certificates using the DNS challenge requires a few extra tidbits to be added.  Cooking all of the requirements into a container makes the setup portable, so it can be easily automated across different environments, and flexible, so it can be run in a variety of setups with different domain names and AWS credentials.  With the container approach, the certs could even be dropped onto a Windows machine running Docker for Windows, for example.

tl;dr This setup may be overkill for some, but it has worked out well for my purposes.  Feel free to give it a try if you want to test out creating Certbot certs with the dehydrated tool in a container.
