Defaulting Google search results to the past year

If you spend as much time looking around the internet for answers to obscure questions as I do, you quickly realize that Google will often happily present you with results that are many years old and out of date.  This is especially frustrating when you finally find an answer after a long search, only to realize it is five years old.  For more on efficient searching, check out this post on the 10 tab rule, which has some useful ideas for better searching in general.

[Image: xkcd comic]

There is a trick that lets you customize Google's default results.  The key to this mapping is really just the URL: Google uses query parameters for searching, so you can do some really cool things with them.  This is a powerful concept, and a lot of useful searches can be mapped this way.  You can take the idea further and map keywords to other searches, like YouTube, Google Maps, Stack Overflow, or basically any site that provides a search interface.

I have only tested these keyword/search mappings for Google search results on Google Chrome; if you use another browser there may be a similar trick, I just haven't attempted it.  Open Chrome settings and navigate to the "Search" section.

This will pull up a dialog box with a list of default search engines.  Scroll to the bottom of the list and add a new search engine entry, filling in the name, keyword, and URL fields (an example is shown below).
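The original screenshot is not reproduced here, but an entry along these lines should do the trick; the tbs=qdr:y parameter is what restricts results to the past year, and the name is just a label:

Name:    Google recent
Keyword: >
URL:     https://www.google.com/search?q=%s&tbs=qdr:y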

The keyword here is just a ">" symbol, but it can really be anything; I chose ">" because it is quick and easy to type.  The rest is pretty self explanatory.  After the entry has been added, scroll through the search engines and find the new "Google recent" entry.  Hover over it and a "Make default" button appears; click that and then click done.

Now when you do a Google search from your search bar it will default to items from the past year.

Bonus

You can extend this trick further by mapping keys in your Google search bar to other searches.  For example, you can map a key (or word) to search for image results.  In the example below, I am using "I" as the keyword.
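Again the screenshot is omitted, but an image search entry might look something like the following sketch; the tbm=isch parameter is what switches Google into image results mode:

Name:    Google images
Keyword: I
URL:     https://www.google.com/search?tbm=isch&q=%s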

After adding the above entry to the search settings, you can go to the search bar, type I followed by a tab and then the term to search for, and it will automatically do an image search.  The key to making the mapping work as a tab completion in the search bar is the q=%s part of the URL.

The last bonus search that has worked for me is the “feeling lucky” search.
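The screenshot is omitted here as well, but a rough sketch of the entry would be something like the following; the btnI parameter is what triggers the "I'm Feeling Lucky" redirect:

Name:    Feeling lucky
Keyword: ~
URL:     https://www.google.com/search?q=%s&btnI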

That keyword (I used a tilde) can once again be anything, but it should preferably be quick to reach to make searching easier.

One final note

Sometimes you actually do want results that are more than a year old.  This is true of information that doesn't really change often.  So if you are having trouble finding a website you think should be at the top of the results, make sure the default search is set back to any time.  Ideally you would make another key mapping to handle this searching behavior, as sketched below.
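For example, a plain "any time" mapping is just the standard search URL with no time filter at all (the keyword here is only a suggestion):

Name:    Google any time
Keyword: g
URL:     https://www.google.com/search?q=%s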


Configure a Rancher HAProxy health check

If you are familiar with ELB and ALB you will know that there are slight idiosyncrasies between the two.  For example, ELB allows you to health check a back end server by TCP port, basically letting you check whether a back end comes up and is listening on a specified port.  ALB is slightly different in its method for health checking: it uses HTTP checks (layer 7) to ensure back end instances are up and listening.

This becomes a problem in Rancher, when you have multiple stacks in a single environment that are fronted by the Rancher HAProxy load balancer.  By default, the HAProxy config does not have a health check endpoint configured, so ALB is never able to know if the back end server is actually up and listening for requests.

A colleague and I recently discovered a neat trick for solving this problem if you are fronting your environment with an ALB.  The solution to this conundrum is to sprinkle a little bit of custom configuration into the Rancher HAProxy config.

In Rancher, you can modify the live settings without downtime.  Click on the load balancer that sits behind the ALB and navigate to the Custom haproxy.cfg tab.

[Screenshot: Custom haproxy.cfg tab in Rancher]

Modify the HAProxy config by adding the following:

# Use to report haproxy's status
defaults
    mode http
    monitor-uri /_ping

Click the “Edit” button to apply these changes and you should be all set.

Next, find the health check configuration for the associated ALB in the AWS console and add a check for the /_ping path on port 80 (or whichever port you are exposing/plan to listen on).  It should look similar to the following example.

[Screenshot: ALB health check settings]
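If you prefer the CLI over the console, the same change can be made with something along these lines (the target group ARN below is a placeholder for your own):

aws elbv2 modify-target-group \
    --target-group-arn arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/rancher-lb/abc123def456 \
    --health-check-path /_ping \
    --health-check-port 80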

Below is an example that maps a DNS name to an internal Nginx container that is listening for requests on port 80.

[Screenshot: Rancher HAProxy configuration]
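The Rancher UI builds this mapping for you, but conceptually the rules it generates boil down to HAProxy host-based routing.  A minimal sketch of that idea, with a made up hostname and backend address, looks something like this:

frontend http
    bind *:80
    acl is_app hdr(host) -i app.example.com
    use_backend app_nginx if is_app

backend app_nginx
    server nginx1 10.42.0.10:80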

The check in ALB ensures that the HAProxy load balancer in Rancher is up and running before allowing traffic to be routed to it.  You can verify that your Rancher load balancer is working if the instances behind your ALB start showing a status of healthy in the AWS console.

NOTE: If you don’t have any apps initially behind the Rancher load balancer (or that are listening on the port specified in the health check) the AWS instances behind ALB will remain unhealthy until you add configuration in Rancher for the stacks to be exposed, as pictured above.

After setting up HAProxy, publicly accessible services in private Rancher environments can easily be managed by updating the HAProxy config.  Just add a DNS name and a service to link to, and HAProxy figures out how and where to route requests.  To map other services that aren't listening on port 80, the process is very similar: use the above as a guideline and simply update the target port to whichever port the app is listening on internally.


Powershell for Linux!

Microsoft has been making a lot of inroads in the Open Source and Linux communities lately.  Linux and Unix purists undoubtedly have been skeptical of this recent shift.  Canonical, for example, has caught flak for partnering with Microsoft recently.  But the times are changing, so instead of resenting this progress, I chose to embrace it.  I'll even admit that I actually like many of the Open Source contributions Microsoft has been making, including a flourishing Github account as well as an increasingly rich, cross platform set of tools that includes Visual Studio Code, Ubuntu/Bash for Windows, .NET Core and many others.

If you want to take the latest and greatest in Powershell v6 for a spin on a Linux system, I recommend using a Docker container if available.  Otherwise just spin up an Ubuntu (14.04+) VM and you should be ready.  I do not recommend trying out Powershell for any type of workload outside of experimentation, as it is still in alpha for Linux.  The beta v6 release (with Linux support) is around the corner but there is still a lot of ground that needs to be covered to get there.  Since Powershell is Open Source you can follow the progress on Github!

If you use the Docker method, just pull and run the container:

docker run -it --rm ubuntu:16.04 bash

Then add the Microsoft Ubuntu repo:

# apt-transport-https is needed for connecting to the MS repo
apt-get update && apt-get install -y curl apt-transport-https
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | tee /etc/apt/sources.list.d/microsoft.list

Update and install Powershell (drop the sudo if you are running as root inside the container):

sudo apt-get update
sudo apt-get install -y powershell

Finally, start up Powershell:

powershell

If it worked, you should see a message for Powershell and a new command prompt:

# powershell
PowerShell
Copyright (C) 2016 Microsoft Corporation. All rights reserved.

PS />

Congratulations, you now have Powershell running in Linux.  To take it for a spin, try a few commands out.

Write-Host "Hello World!"

This should print out a hello world message.  The Linux release is still in alpha, so there will surely be some discrepancies between Linux and Windows based systems, but the majority of cmdlets should work the same way.  For example, I noticed in my testing that the terminal was very flaky: reverse search (ctrl+r) and the Get-History cmdlet worked well, but arrow key scrolling through history did not.
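A few more standard cmdlets are worth trying as a quick sanity check; nothing here is specific to the Linux port, they are just common built-ins:

# List running processes
Get-Process
# List the contents of the root filesystem
Get-ChildItem /
# Count the cmdlets and functions currently available
(Get-Command).Count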

You can even run Powershell on OSX now if you choose to.  I haven't tried it yet, but it is an option for those that are curious.  Needless to say, I am looking forward to seeing where Powershell on Linux goes from here.


Backup Route 53 zones

We have all heard about DNS catastrophes.  I just read a horror story on reddit the other day, where an Azure root DNS zone was accidentally deleted with no backup.  I experienced a similar disaster a few years ago: a simple DNS change managed to knock out internal DNS for an entire domain, which contained hundreds of records.  Reading the post hit close to home, stirring up some of my own past anxiety, so I began poking around for solutions.  Immediately, I noticed that backing up DNS records is usually skipped over as part of the backup process.  Folks just tend to never do it, for whatever reason.

I did discover, though, that backing up DNS is easy.  So I decided to fix the problem.

I wrote a simple shell script that dumps all Route53 zones for a given AWS account to JSON files and uploads them to an S3 bucket.  The script is only a handful of lines, which is perfect because it doesn't take much effort to potentially save your bacon.

If you don't host DNS in AWS, the script can be modified to work with other DNS providers (assuming they have public APIs).

Here’s the script:

#!/usr/bin/env bash

set -e

# Dump route 53 zones to a text file and upload to S3.

BACKUP_DIR=/home/<user>/dns-backup
BACKUP_BUCKET=<bucket>
# Use full paths for cron
CLIPATH="/usr/local/bin"

# Dump all zones to a file and upload to s3
function backup_all_zones () {
  local zones
  # Enumerate all zones
  zones=$($CLIPATH/aws route53 list-hosted-zones | jq -r '.HostedZones[].Id' | sed "s/\/hostedzone\///")
  for zone in $zones; do
    echo "Backing up zone $zone"
    $CLIPATH/aws route53 list-resource-record-sets --hosted-zone-id "$zone" > "$BACKUP_DIR/$zone.json"
  done

  # Upload backups to s3
  $CLIPATH/aws s3 cp $BACKUP_DIR s3://$BACKUP_BUCKET --recursive --sse
}

# Create backup directory if it doesn't exist
mkdir -p $BACKUP_DIR
# Back up all the things
time backup_all_zones

Be sure to update the <user> and <bucket> placeholders in the script to match your own environment settings.  Dumping the DNS records to JSON is nice because it allows for a more programmatic way of working with the data.  The script can be run manually, but it is much more useful if run automatically: just add it to a cronjob and schedule it to dump DNS periodically, as in the example below.
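For example, a crontab entry along these lines would run the backup nightly; the script path is hypothetical, so adjust it to wherever you saved the script:

# Run the Route 53 backup every night at 2:00 AM
0 2 * * * /home/<user>/bin/backup-route53.sh >> /var/log/dns-backup.log 2>&1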

For this script to work, the aws cli and jq need to be installed.  The installation is skipped in this post, but it is trivial; refer to the links for instructions.
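For reference, on an Ubuntu style system the installation is usually something like the following (package names vary by distro, and the aws cli can also be installed with its bundled installer instead of pip):

# Install jq from the distro repos and the aws cli via pip
sudo apt-get update && sudo apt-get install -y jq python-pip
sudo pip install awscli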

The aws cli needs to be configured with an API key that has read access to Route53 and the ability to write to S3.  Details are skipped for this step as well; be sure to consult the AWS documentation on setting up IAM permissions for help with creating API keys.  Another, simpler approach is to use a pre-existing key with admin credentials (not recommended).
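As a rough sketch, an IAM policy scoped for this script might look something like the following; the bucket name is a placeholder and the resources can be tightened further:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "route53:ListHostedZones",
        "route53:ListResourceRecordSets"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::<bucket>/*"
    }
  ]
}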


Hide file extensions in PowerShell tab completion

One thing I have quickly discovered as I get acclimated to my new Windows machine is that by default the Windows Powershell CLI appends the executable file extension to commands when they are tab completed, which is not the case on Linux or OSX.  That got me wondering whether it is possible to modify this default behavior and remove the extension.  I'm going to ruin the surprise and let everybody know that it is definitely possible to change this behavior, thanks to the flexibility of Powershell and friends.  Now that the surprise is ruined, read on to find out how this solution works.

To check which file types Windows considers to be executable you can type $Env:PathExt.

PS > $Env:PathExt
.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC;.PY;.PYW;.CPL

Similarly, you can type $Env:Path to get a list of places that Windows will look for files to execute by default.

PS > $Env:PATH
C:\Program Files\Docker\Docker\Resources\bin;C:\Python35\Scripts\;C:\Python35\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files (x86)\NVIDIA Co
ram Files\nodejs\;C:\Program Files\Git\cmd;C:\Program Files (x86)\Skype\Phone\;C:\Users\jmreicha\AppData\Local\Microsoft\WindowsApps;C:\Users\jmreicha\AppData\Local\atom\bin;C:\Users\jmreicha\AppData\Roaming\npm

The problem, though, is that when you start typing the name of one of these executables, say python, and tab complete it, Windows will automatically append the file extension to the command.  Since I am more comfortable using a *nix style shell, having to deal with the file extensions is an annoyance.

Below I will show you a hack for hiding these extensions from your Powershell prompt.  It is actually much more work than I expected to add this behavior, but with some help from some folks over at Stack Overflow, we can do it.  Basically, we need to override the default Powershell tab completion with our own logic, and then have that override loaded into the Powershell prompt when it starts, via a custom Profile.ps1 file.

To get this working, the first step is to look at what the default tab completion does.

(Get-Command 'TabExpansion2').ScriptBlock

This will spit out the code that handles the tab completion behavior.  To get our custom behavior we need to override the original code with our own logic, which I have below (I wish I could say I came up with this myself, but alas).  This is not the full code, just the custom logic; the full script is posted further down.

$field = [System.Management.Automation.CompletionResult].GetField('completionText', 'Instance, NonPublic')
$source.CompletionMatches | % {
        If ($_.ResultType -eq 'Command' -and [io.file]::Exists($_.ToolTip)) {
            $field.SetValue($_, [io.path]::GetFileNameWithoutExtension($_.CompletionText))
        }
    }
Return $source

The code looks a little bit intimidating, but it is basically just checking whether each completion result is an executable command on the system path, and if it is, stripping the extension out of the completion text.

So to get this all working, we need to create a file with this logic and have Powershell read it at load time.  Go ahead and paste the following code into a file like no_ext_tabs.ps1.  I place this in the Powershell path (~/Documents/WindowsPowerShell), but you can put it anywhere.

Function TabExpansion2 {
    [CmdletBinding(DefaultParameterSetName = 'ScriptInputSet')]
    Param(
        [Parameter(ParameterSetName = 'ScriptInputSet', Mandatory = $true, Position = 0)]
        [string] $inputScript,

        [Parameter(ParameterSetName = 'ScriptInputSet', Mandatory = $true, Position = 1)]
        [int] $cursorColumn,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 0)]
        [System.Management.Automation.Language.Ast] $ast,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 1)]
        [System.Management.Automation.Language.Token[]] $tokens,

        [Parameter(ParameterSetName = 'AstInputSet', Mandatory = $true, Position = 2)]
        [System.Management.Automation.Language.IScriptPosition] $positionOfCursor,

        [Parameter(ParameterSetName = 'ScriptInputSet', Position = 2)]
        [Parameter(ParameterSetName = 'AstInputSet', Position = 3)]
        [Hashtable] $options = $null
    )

    End
    {
        $source = $null
        if ($psCmdlet.ParameterSetName -eq 'ScriptInputSet')
        {
            $source = [System.Management.Automation.CommandCompletion]::CompleteInput(
                <#inputScript#>  $inputScript,
                <#cursorColumn#> $cursorColumn,
                <#options#>      $options)
        }
        else
        {
            $source = [System.Management.Automation.CommandCompletion]::CompleteInput(
                <#ast#>              $ast,
                <#tokens#>           $tokens,
                <#positionOfCursor#> $positionOfCursor,
                <#options#>          $options)
        }
        $field = [System.Management.Automation.CompletionResult].GetField('completionText', 'Instance, NonPublic')
        $source.CompletionMatches | % {
            If ($_.ResultType -eq 'Command' -and [io.file]::Exists($_.ToolTip)) {
                $field.SetValue($_, [io.path]::GetFileNameWithoutExtension($_.CompletionText))
            }
        }
        Return $source
    }    
}

To start using this tab completion override right away, just dot source the file as shown below.

. .\no_ext_tabs.ps1

If you want the extensions to be hidden every time you start a new Powershell session, we just need to create a new Powershell profile (there is more reading on creating Powershell profiles here if you're interested) and have it load our script.  If you already have a custom profile you can skip this step.

New-Item -path $profile -type file -force

After you create the profile go ahead and edit it by adding the following configuration.

# Dot source no_ext_tabs to remove file extensions from executables in path
. C:\Users\jmreicha\Documents\WindowsPowerShell\no_ext_tabs.ps1

Close your shell and open it again and you should no longer see the file extensions.

There is one last little, unrelated tidbit that I discovered through this process but thought was pretty handy and worth sharing with other Powershell N00bs.

Powershell 3 and above provides some nice key bindings for jumping around the CLI, similar to a bash based shell if you are familiar or have a background using *nix systems.

[Image: Powershell key shortcuts]

You can check the full list of these key bindings by typing ctrl+alt+shift+? in your Powershell prompt (thanks Keith Hill for this trick).
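If the PSReadline module is loaded (it ships with newer Powershell versions), you can also list the current bindings programmatically, for example:

# Show all currently bound key handlers
Get-PSReadlineKeyHandler
# Narrow it down to history related bindings
Get-PSReadlineKeyHandler | Where-Object { $_.Function -like '*History*' }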
