Curl on Windows using a Docker wrapper

Does the Windows built-in version of “curl” confuse or intimidate you?  Maybe you come from a Linux or Unix background and yearn for some of your favorite go-to tools?  Newer versions of Powershell include a cmdlet for interacting with the web called Invoke-WebRequest, which is useful, but it is not a great drop-in replacement for those with experience in non-Windows environments.  The Powershell cmdlets are a move in the right direction toward unifying CLI experiences, but there are still many folks, myself included, who have become attached to curl over the years.  It is worth noting that a Windows-compatible version of curl has existed for a long time, but dealing with the zip file has always been a nuisance, just as using SSH has always been a hassle on Windows.  It has always been possible to use the *nix equivalent tools, it is just clunky.

I found a low effort solution for adding curl to my Windows CLI flow that acts as a nice middle ground between learning Invoke-WebRequest and installing the curl binaries directly, which I’d like to share.  This alias trick is a simple way to use curl for working with APIs and other web testing in Windows environments without getting tangled up in managing versions and dealing with vulnerabilities.  Just pull the latest Docker image to update curl to the newest version, and don’t worry about its implementation across different systems.

Prerequisites are light.  First, make sure to have the Docker for Windows app installed (stable or beta are both fine) as well as a semi-recent version of Powershell.

Next, if you haven’t set up a Powershell profile, there are lots of links and resources about how to do it.  I even wrote about it recently, so I am skipping that step here.  Start by adding the following snippet to your Powershell profile (by default located at C:\Users\<user>\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1) and saving.

# Curl alias using docker
function Docker-Curl {
   docker run --rm byrnedo/alpine-curl $args
}

# Aliases
New-Alias dcurl Docker-Curl

Then reload your profile and run the curl command that was just created.
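
You can reload the profile in place by dot sourcing it (or just open a new terminal window):

. $PROFILE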

dcurl -h
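
Anything you pass to dcurl gets handed straight through to curl inside the container, so a quick smoke test against any HTTP endpoint works just like native curl (httpbin.org below is only an example endpoint):

dcurl -s https://httpbin.org/get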

One issue you might notice from the snippet above is that the Docker image is not an “official” image.  If this bothers you (security concerns, etc.), it is really easy to create your own, secure image.  There are lots of examples of how to create minimal images with Curl pre-installed.  Just be aware that your custom image will need to be maintained and occasionally rebuilt/published to guard against future vulnerabilities.  For brevity, I have skipped this process, but here’s an example of creating a custom image.
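
To give a rough idea, a bare bones image really only needs curl on top of a small base.  A minimal Dockerfile sketch (untested, and the name you publish it under is up to you) could look something like this:

FROM alpine
RUN apk add --no-cache curl
ENTRYPOINT ["curl"]

Build and push it under your own name with docker build/docker push, then swap it in for byrnedo/alpine-curl in the Docker-Curl function above.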

Optional

To update curl, just run the docker pull command.

docker pull byrnedo/alpine-curl

Now you have the best of both worlds.  The built-in Invoke-WebRequest cmdlet provided by Powershell is available, as well as the venerable curl command.

My number one reason for using curl in a container is that curl has been around for such a long time (fewer bugs and edge cases) and it can be used for nearly any web related task.  It is also much handier for those with a background in *nix systems to reach for curl, rather than digging around in unfamiliar Powershell docs for similar functionality.  Having the ability to run some of my favorite tools in an easy, reproducible way on Windows has been a refreshing experience while sliding back into the Windows world.

Read More

Hide file extensions in PowerShell tab completion

One thing I quickly discovered as I get acclimated to my new Windows machine is that by default the Windows Powershell CLI appends the executable file extension to the command that gets run, which is not the case on Linux or OSX.  That got me wondering if it is possible to modify this default behavior and remove the extension.  I’m going to ruin the surprise and let everybody know that it is definitely possible to change this behavior, thanks to the flexibility of Powershell and friends.  Now that the surprise is ruined, read on to find out how this solution works.

To check which file types Windows considers to be executable you can type $Env:PathExt.

PS > $Env:PathExt
.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY;.PYW;.CPL

Similarly, you can type $Env:Path to get a list of places that Windows will look for files to execute by default.

PS > $Env:PATH
C:\Program Files\Docker\Docker\Resources\bin;C:\Python35\Scripts\;C:\Python35\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\NVIDIA Co
ram Files\nodejs\;C:\Program Files\Git\cmd;C:\Program Files (x86)\Skype\Phone\;C:\Users\jmreicha\AppData\Local\Microsoft\WindowsApps;C:\Users\jmreicha\AppData\Local\atom\bin;C:\Users\jmreicha\AppData\Roaming\npm
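
As a side note, the raw value wraps awkwardly in the console.  If you want one path entry per line, you can split it on the separator:

$Env:PATH -split ';'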

The problem, though, is that when you start typing the name of an executable that lives on this path, say python, and tab complete it, Windows will automatically append the file extension to the executable.  Since I am more comfortable using a *nix style shell, it is an annoyance having to deal with the file extensions.

Below I will show you a hack for hiding these extensions from your Powershell prompt.  It is actually much more work than I expected to add this behavior, but with some help from some folks over at stackoverflow, we can add it.  Basically, we need to override the default Powershell tab completion functionality with our own, and then have that override get loaded into the Powershell prompt when it starts, via a custom Profile.ps1 file.

To get this working, the first step is to look at what the default tab completion does.

(Get-Command 'TabExpansion2').ScriptBlock

This will spit out the code that handles the tab completion behavior.  To get our custom behavior we need to override the original code with our own logic, which I have below (I wish I had come up with this myself, but alas).  This is not the full code, just the custom logic; the full script is posted below.

$field = [System.Management.Automation.CompletionResult].GetField('completionText', 'Instance, NonPublic')
$source.CompletionMatches | % {
    If ($_.ResultType -eq 'Command' -and [io.file]::Exists($_.ToolTip)) {
        $field.SetValue($_, [io.path]::GetFileNameWithoutExtension($_.CompletionText))
    }
}
Return $source

The code looks a little bit intimidating, but it is basically just checking whether each completion result is a command that resolves to an executable file on the system path, and if it is, stripping out the extension.

So to get this all working, we need to create a file with the logic, and have Powershell read it at load time.  Go ahead and paste the following code into a file like no_ext_tabs.ps1.  I place this in the Powershell path (~/Documents/WindowsPowerShell), but you can put it anywhere.

Function TabExpansion2 {
    [CmdletBinding(DefaultParameterSetName = 'ScriptInputSet')]
    Param(
        [Parameter(ParameterSetName = 'ScriptInputSet', Mandatory = $true, Position = 0)]
        [string] $inputScript,

        [Parameter(ParameterSetName = 'ScriptInputSet', Mandatory = $true, Position = 1)]
        [int] $cursorColumn,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 0)]
        [System.Management.Automation.Language.Ast] $ast,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 1)]
        [System.Management.Automation.Language.Token[]] $tokens,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 2)]
        [System.Management.Automation.Language.IScriptPosition] $positionOfCursor,

        [Parameter(ParameterSetName = 'ScriptInputSet', Position = 2)]
        [Parameter(ParameterSetName = 'AstInputSet', Position = 3)]
        [Hashtable] $options = $null
    )

    End
    {
        $source = $null
        if ($psCmdlet.ParameterSetName -eq 'ScriptInputSet')
        {
            $source = [System.Management.Automation.CommandCompletion]::CompleteInput(
                <#inputScript#>  $inputScript,
                <#cursorColumn#> $cursorColumn,
                <#options#>      $options)
        }
        else
        {
            $source = [System.Management.Automation.CommandCompletion]::CompleteInput(
                <#ast#>              $ast,
                <#tokens#>           $tokens,
                <#positionOfCursor#> $positionOfCursor,
                <#options#>          $options)
        }
        $field = [System.Management.Automation.CompletionResult].GetField('completionText', 'Instance, NonPublic')
        $source.CompletionMatches | % {
            If ($_.ResultType -eq 'Command' -and [io.file]::Exists($_.ToolTip)) {
                $field.SetValue($_, [io.path]::GetFileNameWithoutExtension($_.CompletionText))
            }
        }
        Return $source
    }    
}

To start using this tab completion override right away, just dot source the file as shown below.

. .\no_ext_tabs.ps1

If you want the extensions to be hidden every time you start a new Powershell session we just need to create a new Powershell profile (more reading on creating Powershell profiles here if you’re interested) and have it load our script. If you already have a custom profile you can skip this step.

New-Item -path $profile -type file -force

After you create the profile go ahead and edit it by adding the following configuration.

# Dot source no_ext_tabs to remove file extensions from executables in path
. C:\Users\jmreicha\Documents\WindowsPowerShell\no_ext_tabs.ps1

Close your shell and open it again and you should no longer see the file extensions.

There is one last little, unrelated tidbit that I discovered through this process but thought was pretty handy and worth sharing with other Powershell N00bs.

Powershell 3 and above provides some nice key bindings for jumping around the CLI, similar to a bash based shell if you are familiar or have a background using *nix systems.

Powershell key shortcuts

You can check the full list of these key bindings by typing ctrl+alt+shift+? in your Powershell prompt (thanks Keith Hill for this trick).

Read More

Quickly get Node.js up and running on Windows

Installing software on Windows in an automatable, repeatable and easy way has always been painful in the past.  Luckily, in recent years there have been some really nice additions to Windows and its ecosystem that have improved the process significantly.  The main tools that ease this process are Powershell and Chocolatey, and they have significantly improved the developer and administrative experiences on Windows.

In the past, in order to install something like a programming language and its environment, you would have to manually download the zip or tar file, extract it, put it in the correct place, and set up environment variables and system paths yourself.  Things would also break pretty easily and it was just painful in general to work with.

Hopefully you are already familiar with Powershell, because I won’t be covering it much in this post.  If you have any recent version of Windows you should have Powershell.  Below I describe Chocolatey a little bit and why it is useful; you can also check out the Chocolatey website, which does a much better job of explaining its benefits, how it is used and why package managers are good.

Update Windows execution policy

This process is pretty straightforward.  Make sure you open up a Powershell prompt with admin privileges, otherwise you will run into problems.  The first step is to change the default system execution policy (if you haven’t already).  On a fresh install of Windows, you will need to loosen up the security in order to install Chocolatey, which will be used to install and manage Node.js.  Luckily there are just a few Powershell commands that need to run.  To check the status of the execution policy, run the following.

Get-ExecutionPolicy
Restricted

This should tell you what your execution policy is currently set to.  To loosen the policy for Choco, run the following command.

Set-ExecutionPolicy -ExecutionPolicy RemoteSigned

Follow the prompt and choose [Y] to update the policy.  Now, if you run Get-ExecutionPolicy you should see RemoteSigned.

Get-ExecutionPolicy
RemoteSigned

If you don’t have your execution policy opened up to at least RemoteSigned, you will have trouble installing things from the internet, including Chocolatey.  You can find more information about execution policies here if you don’t trust me or just want a better idea of how they work.

Install Chocolatey

If you aren’t familiar, Chocolatey is a package manager for Windows, similar to apt-get on Debian based Linux systems or yum on Redhat based systems.  It allows users to quickly and easily install and manage software packages on Windows platforms through Powershell.

The command to install Chocolatey is listed below.

iwr https://chocolatey.org/install.ps1 -UseBasicParsing | iex

This command will take care of pretty much all of the setup so just watch it do its thing.  Again, make sure you are inside of an elevated admin shell, otherwise you will likely have problems with the installation.
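
Once the script finishes, a quick version check is an easy way to confirm the choco command is available (your version number will differ):

choco --version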

Install Node.js

The last step (finally) is to install Node.js.  Luckily this is the easiest part.  Just run the following command.

choco install nodejs.install

Choose [Y] to accept that you want to run the install script and let it run.  There should be some colored output, and when it is done Node should be installed on your system.  You will need to close and re-open your Powershell prompt for the Node binaries to be picked up on your PATH, or refresh the session by running “RefreshEnv” to pick up the new path.  If you are in an admin shell, I would recommend dropping out of it by simply closing the current session and opening up a new, non-privileged session.


Once you have a fresh shell you can test that Node installed properly.

node -v
v6.6.0
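
The nodejs.install package normally bundles npm as well, so it is worth confirming that it made it onto the PATH too:

npm -v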

Now you are ready to go.  It only took a few minutes with the Choco package manager.  If you are new to Node in general and are looking for a good resource, the learnyounode project on github is pretty decent.

Let me know if you have any caveats to add to this method.  It is the easiest and fastest way I have found to install Node, as well as other pieces of software, on Windows without any hassle.

Read More

Exchange Transport Service won’t start

Due to an outage this weekend, I’d like to take a minute to briefly describe the scenario that occurred and how it was resolved.  If you are having trouble starting your Exchange Transport service then you may be running into the same issue I was having during the outage.  Luckily there is an easy remedy for the service failing to start.  Basically, the Exchange message queue database was beginning to fail due to some sort of corruption, causing the Transport service to fail.  Because the Transport service wasn’t running, the Edge Sync process was failing, causing external mail delivery to fail.  Obviously this is a big issue, since you cannot receive any email from external domains if it is not working correctly.

To troubleshoot this, there are a few obvious signs that you should look at first.  The main thing to check is your disk sizes; I wrote about this in my previous post.  If your disks are full or are filling up then you are pretty much dead in the water and will need to fix the disk issue.  In my scenario the disk sizes were not an issue, so the next place I turned was the logs.  I found a number of interesting entries in the Windows Application Event logs that gave me some clues.  I want to detail as many of these messages as I can so that people who are having similar issues know what to look for.

(Screenshots: Transport service error entries from the Application Event Log)

There are a few possible resolutions to this problem.  Through some Google searches, one solution I found is that you can attempt to repair the corruption in the queue database by running it through eseutil.  There is no guarantee this will work and it can potentially take a lot of time, depending on the size of your queue database.  There is some good information here about the mail queue and how it works.

If you decide to repair the database, the mail queue file is located in the following location:

C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\data\Queue

Inside this directory is a file called tmp.edb.  This is the file that you will need to repair.
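
If you go the repair route, the command looks roughly like the following, run from inside the Queue directory.  This is only a sketch: /p is the brute force repair mode, so make a copy of the folder first and point eseutil at whichever database file is actually corrupt.

eseutil /p .\tmp.edb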

The other method is much simpler and was the solution I went with.  Instead of attempting to repair the database corruption, simply copy and rename the Queue folder and restart the Transport service.  Doing this will force the Transport service to create a new, fresh copy of the queue database along with all of the accompanying config files and associated items that are required to get things up and running.  It is faster and simpler, IMO.  The only problem with this approach is that items that were stuck in the queue when the database corruption occurred will be lost.  For me, this was an acceptable loss.  If not, you will probably have to use the first method and attempt to repair the database, or try to work with a shadow copy or backup to get unstuck.
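
As a rough sketch of the copy-and-rename approach (service and path names may differ slightly in your environment), the sequence is to stop the Transport service, move the Queue folder out of the way and start the service back up:

Stop-Service MSExchangeTransport
Rename-Item "C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\data\Queue" "Queue.old"
# A fresh Queue folder and database are created when the service starts
Start-Service MSExchangeTransport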

Read More

Monitor your Exchange disk sizes

A word to the wise: if you are all of a sudden unable to send and receive email messages in your Exchange environment, take a look and make sure the Exchange server disks aren’t being filled up.  Today I ran across an interesting issue (and by interesting I mean it could have caused a serious outage) where Windows updates were routinely being downloaded for our next patch management installation cycle but were also, unknowingly, causing our email services to stop functioning correctly.  I am thankful the scenario didn’t get ugly, and luckily this event gives me the opportunity to talk about a few things that I think might be useful for readers and other admins.

It turns out that this month’s wave of Windows updates caused the disks on our Hub Transport servers to quietly fill up during the day, unbeknownst to any of the admins.  In normal circumstances this process is by design and almost never becomes an issue; in this case, however, there was not enough disk available for Exchange to work correctly.  This could have been disastrous had we not known that the disk was starting to fill up.  We could have been chasing our tails for a much longer period of time and the situation could have escalated into something much more stressful.  For some reason, the company likes to be able to send and receive emails.  Thank god for monitoring that works.

There are a couple of things that need to be investigated at this point.  First, had we not known that the Windows updates were what was causing the disk to fill up, a logical place to start looking for clues would be the log files on the suspect servers.  I would like to take a little bit of time to quickly go over some steps for looking at logs in an Exchange environment.  When thinking about potential disk space issues, a few things come to mind: are log files growing rapidly?  Did somebody turn on verbose logging and accidentally forget to turn it off?  To verify the logs aren’t the issue, there are a few places that are good to look.  If you are familiar with or have ever used message tracking in Exchange, you know how powerful it can be.  Sometimes it can also be a cause of your disk filling up.  Here is the location where the message tracking logs are stored:

C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\Logs\MessageTracking

Another location that gets used when you turn on verbose logging for troubleshooting send or receive connectors is the ProtocolLog directory, which holds the SmtpSend and SmtpReceive logs.  These can fill up quite quickly if you forget to turn off verbose logging on a send or receive connector when you are done troubleshooting.  This location is here:

C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\Logs\ProtocolLog
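
If a connector did get left in verbose mode, flipping it back off is a one-liner in the Exchange Management Shell (the connector names below are just examples):

Set-SendConnector -Identity "Outbound to Internet" -ProtocolLoggingLevel None
Set-ReceiveConnector -Identity "HUB01\Default HUB01" -ProtocolLoggingLevel None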

Finally, there is a location for logging protocol settings on the hub transport.  These logs can be found here:

C:\Program Files\Microsoft\Exchange Server\V14\TransportRoles\Logs\ProtocolLog

I would like to quickly point out that any and all of the behaviors of these logging methods can be modified using the Exchange Management Shell, and some of the more detailed settings can only be modified through the EMS.
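
For example, capping the message tracking log directory or moving it onto a bigger disk is a single Set-TransportServer call (the server name and values here are only illustrative):

Set-TransportServer -Identity HUB01 -MessageTrackingLogMaxDirectorySize 1GB -MessageTrackingLogPath "D:\Logs\MessageTracking"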

If these quick spot checks don’t uncover any immediate problems, another good technique to help gain some insight into where your disk space is going is to use a tool that enumerates file locations and file sizes.  There are a few tools available; one that I like to use is Space Sniffer.  It is fast, easy to use and gives a good visual representation of directory and file sizes.  The tool can do much more, but in this case we are just interested in finding the disk issue quickly.  We were able to quickly see that the size and contents of the %windir%\softwaredistribution\download folder were growing rapidly.  I just happen to know that this is the temporary location that Windows uses to store Windows update files before they are installed.
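
If you prefer to size up that folder without a GUI tool, a quick one-liner does the trick as well:

(Get-ChildItem "$env:windir\SoftwareDistribution\Download" -Recurse -File | Measure-Object Length -Sum).Sum / 1GB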

There are a few things that can be done here.  You can clear out the temporary Windows update files, delete other unnecessary files, or grow your disks.  We were lucky because our Hub Transport servers are VMs, and increasing the disk size of these servers is simple.  That seems like the best option if it is a possibility; just in case something like this happens again, we will have the additional space so the Exchange servers won’t bog down.
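
If you do go the cleanup route instead, the standard approach is to stop the Windows Update service, clear out the download cache and start it back up (run this from an elevated prompt):

Stop-Service wuauserv
Remove-Item "$env:windir\SoftwareDistribution\Download\*" -Recurse -Force
Start-Service wuauserv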

Ultimately we prevented the disaster from occurring, but the incident is a great illustration of the lesson I’d like to share: make sure you have a good monitoring and alerting solution in place.  Otherwise you may not have any clue where to start looking.  If we did not have a reliable monitoring tool in place, it would have been much more difficult to track this problem down, because our Exchange environment is large and complex.  Because we have good monitoring tools, we were able to quickly identify the problem and resolve it before anything bad happened.  On a side note, I am still thinking about how we can take this monitoring and alerting one step further to become proactive instead of reactive, but for now the monitoring tools are doing their job, and because of this we avoided a potential disaster.  If you have any thoughts on proactive monitoring and alerting related to these types of disk issues, let me know, I’d love to hear how you handle it.

Read More