Every once in a while you will probably encounter a situation where you need to enable and then use telnet in a security focused environment. In certain situations telnet can be a great tool for testing the functionality of a firewall rule. If you aren’t certain whether or not a rule is working, telnet can be a great way to help debug. The problem in Server 2008 and above is that telnet isn’t enabled by default. Luckily, with PowerShell it is easy to enable the telnet functionality.
The following set of commands shows how you can enable telnet from a PowerShell prompt so that you can test connectivity to specific ports. Try it out.
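A minimal sketch of what that looks like, assuming Server 2008 R2 or later where the ServerManager module provides the feature cmdlets:

# Load the Server Manager module so the feature cmdlets are available
Import-Module ServerManager
# Install the telnet client feature
Add-WindowsFeature Telnet-Client

Once the feature is installed, testing a port is as simple as telnet followed by the host and port number.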
Bam! As always, it is easier to stay at the command prompt, and this is a great way to test port connectivity. I can understand why telnet is disabled by default on fresh server builds, but sometimes it is useful to have telnet as a tool to test connectivity. If you would like to debate the merits of disabling/enabling telnet on a server, just drop me a line; I obviously will not be focusing on that aspect here. Anyway, just as easily as telnet can be enabled through PowerShell, it can be disabled with the following commands. If you already have the Server Manager module imported, skip to the second command.
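Something along these lines should take care of it (again assuming the ServerManager module is available):

# Load the Server Manager module if it isn't already imported
Import-Module ServerManager
# Remove the telnet client feature
Remove-WindowsFeature Telnet-Client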
It seems like everybody is using git these days. And most people are not stuck using Windows in their day to day workflow. Unfortunately, I am. So that means it is much more painful to get up and running with a lot of the coolest and best open source projects offered by members of GitHub and other online code repositories shared via git. However, there is hope, and it is possible for Windows users to join the git party. So in this post I would like to describe just how to do that, and it should only take a few minutes if done correctly. I will mention beforehand that there are a few steps that need to be completed for this technique to work successfully, steps that are typically taken care of already in a Linux or OS X environment.
The goal of this post is to work through these steps and get users up and running as quickly and easily as possible, reducing the amount of confusion and fumbling around with settings. This post is designed for beginners that are just getting their feet wet with git, but hopefully others can use it as a resource if they are coming from a different environment and are confused by the Windows way of doing things.
First step – Download and install the git port for Windows.
This is pretty straightforward. Download and run the executable to install git for Windows. If you just want to get up and running, or are lazy, you can leave all of the defaults when you run through the installation wizard.
Second step – Add the git binaries to your system path variable.
This is the most important step, because out of the box git won’t work in your ordinary PowerShell prompt; it only works in its own separate shell. To fix this and make all the necessary binaries available everywhere, open up your environment variables (in Windows 8) and append the git binary directories to your system path.
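If you would rather script it than click through the GUI, something like this works from an elevated PowerShell prompt. The paths below assume a default Git for Windows install location; adjust them if you installed somewhere else.

# Append the git binary directories to the machine-level PATH (default install location assumed)
$gitPaths = ";C:\Program Files (x86)\Git\bin;C:\Program Files (x86)\Git\cmd"
$machinePath = [Environment]::GetEnvironmentVariable("Path", "Machine")
[Environment]::SetEnvironmentVariable("Path", $machinePath + $gitPaths, "Machine")

Open a new PowerShell window afterwards so the updated path is picked up.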
Third step (optional) – Download and install posh-git for better PowerShell and git integration.
I have highlighted part of this process before in an older post but will go through the steps again because it is pretty straightforward. To be able to get posh-git you need a PowerShell package management tool called PsGet (instructions here). To get this tool, run the following command from your PowerShell prompt.
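The install one-liner from the PsGet instructions looks roughly like this (check their page for the current version before running it):

# Download and run the PsGet bootstrap script
(new-object Net.WebClient).DownloadString("http://psget.net/GetPsGet.ps1") | iex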
Once the command has completed you should be able to simply run this install command and be finished.
install-module posh-git
That should be it. With these simple steps you should be able to use git from the command line just like you are accustomed to on other operating systems. As I said, there is a tad more legwork, but you can really leverage the flexibility of PowerShell to get things working. I hope this helps, and as always let me know if you have any tips or questions.
Here is a PowerShell script you can use to quickly determine the total amount of space taken up by all of your Exchange database files (EDB files) on an Exchange server. I’d like to note that this may not be a 100% accurate representation, but it is a great way to get a ballpark number without having to add the figures up yourself, manually.
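Here is roughly what it looks like. This is a minimal sketch; it assumes you run it from the Exchange Management Shell on the mailbox server itself, so that each database's EdbFilePath resolves to a local file.

# Sum up the size of every mailbox database EDB file
$total = 0
foreach ($db in Get-MailboxDatabase) {
    # Add the size of this database file to the running total
    $total += (Get-Item $db.EdbFilePath).Length
}
"Total EDB size: {0:N2} GB" -f ($total / 1GB)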
I noticed the other day that I had no way to calculate the total amount of space being used by my Exchange databases. Even after scouring the Googles I was unable to find what I was looking for quickly, so I wrote this script to fix that problem. Just copy the previous bit of code into a .ps1 file with Notepad and execute the script from your EMS. It is a super simple way to iterate through all the databases, add their sizes to a variable, and then spit that variable out when it is complete.
Offline Files in Windows is a set of features that essentially gives users the ability to work with files outside of the network. So, for example, if a user had a laptop with a mapped drive or network share and took that computer outside of the network, the features offered by Offline Files would allow the user to continue working with those files. I will not cover the details of how all of this magic works in this post; I just want to show people the best way I found to disable this feature with the least amount of problems. If you want to go straight to the source, here is the original article that gave me about 95% of the information necessary for accomplishing this task.
The remainder of this post will detail my findings and experience from the link above. This feature (Offline Files) is enabled by default in Windows 7. Here is a good overview of the benefits of Offline Files. However, for me personally as an admin, this feature has so far caused much confusion for users who are not accustomed to having it as we move towards Windows 7.
These settings can of course be controlled on a per-user basis by changing the settings and configuration of the “Sync Center” tool in Windows. But when you are working in a larger environment and need this sort of process automated for many users, Group Policy becomes the most effective way to handle the problem. There are a few steps to get offline folders disabled correctly, so I thought I would share all the pieces in case somebody runs across a similar need. The first step to disable the Offline Files features is to adjust the following settings in Group Policy:
Allow or Disallow use of the Offline Files feature: Disabled
Prohibit user configuration of Offline Files: Enabled
Sync all offline files when logging on: Disabled
Sync all offline files before logging off: Disabled
Sync offline files before suspend: Disabled
Remove “Make available offline” command: Enabled
Prevent use of Offline Files folder: Enabled
Next, we need to tell Group Policy to shut off the Offline Files service and disable it on all Windows machines that have the service installed (Windows XP, 7, and 8 machines). To do this you will need to modify your Group Policy settings, through RSAT, on a machine that already has the service installed. This is an important step: you will not be able to find this service if you are adjusting the GP settings from a server. This service is located in the following location:
Computer Configuration -> Windows Settings -> Security Settings -> System Services
The specific service we are looking for is the “cscservice“, which corresponds to the service labeled “Offline Files” in the Windows services list.
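If you want to verify the behavior on a single test machine before rolling the policy out, you can flip the service by hand from an elevated PowerShell prompt. This is just a sketch; CscService is the service name that sits behind the “Offline Files” display name.

# Stop the Offline Files service and prevent it from starting again
Stop-Service CscService -Force
Set-Service CscService -StartupType Disabled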
The last step to get this policy working correctly is to add a registry key that will fix machines that have already been used to cache certain network resources. Essentially, adding this registry key tells the machine to blow away its database of offline files and to remove the cached files as well. To configure this setting we need to add a custom registry entry:
Computer Configuration -> Preferences -> Windows Settings -> Registry
Key: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\CSC\Parameters
Value name: FormatDatabase
Value type: DWORD
Value data: 1
Here is a good article with instructions on how to change the registry settings by hand, and a screenshot of my own GP environment showing how the settings should look in the GP Management Console.
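For a one-off test machine, the same value can also be set from PowerShell rather than regedit. A quick sketch, run as administrator:

# Create the Parameters key if it isn't there yet, then add the FormatDatabase value
New-Item -Path "HKLM:\SYSTEM\CurrentControlSet\services\CSC\Parameters" -Force | Out-Null
New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\services\CSC\Parameters" -Name FormatDatabase -PropertyType DWORD -Value 1 -Force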
That should be all of the necessary changes. If I missed anything let me know; hopefully this will save people some time in the future.
I recently returned from my trip to Orlando and the Cisco Live! 2013 conference, so I thought I would take some time to reflect on the experience and report on some key highlights I was able to take away from this year’s conference. This was my first Cisco Live! event and I have to say I was really impressed with the experience as a whole. There were a few gripes here and there, but overall the event in its entirety was pretty awesome. So in this post I just want to discuss some of the details of the event further. I don’t have a lot else to report on, so I will go ahead and get going, beginning with some of the Cisco-specific trends that I noticed. Of course this is all a subjective experience and some may disagree, but here is what I felt to be generally true throughout much of the event.
Technological takeaways (in my personal experience).
Cisco is placing a huge bet that SDN is going to take off in the immediate future.
Mobility is going to continue to explode and increase the diversity of networks so we need to prepare and build our network infrastructures to handle the drive towards mobility.
Cisco really drove home the concept of the future connectedness of devices by pushing their idea of “the Internet of Everything”. This is the concept that technological experiences will converge and become tightly coupled; one example presented was a seamless experience at a hotel. The real takeaway is that as technology continues to evolve, we will need to adapt our networks to suit these needs.
In general, I felt these were the main drivers and ideas for a lot of what will be happening in the future of networking, at Cisco and beyond. Obviously Cisco was there to push their products, so I will go ahead and cover a few of the key ideas and products that Cisco believes will help drive these future changes.
The maturation of the ISE platform. This will be the convergence of a number of disparate technologies Cisco currently offers into a unified identity and access platform, which goes hand in hand with the growth of mobile and the BYOD movement.
The SDN components. Essentially this is the Cisco ONE line of products focused on the evolving SDN space, including the onePK toolkit for OpenFlow development and the ONE Controller for OpenFlow traffic control. There were more SDN components; I just can’t think of them right now.
New product introductions and evolutions. The Nexus 7710 and 7718 for scaling out the data center, and the 6800 series to augment the capabilities of the 6500 series, improving performance, scale, and speed.
There were many more announcements and products covered, but to me these were the main focus and effort. If I missed anything you thought was important, let me know. Now that I have the big announcements covered, I’d like to go over some of the other key highlights from the event.
The Good
Organization. Everything from hotel shuttles and information kiosks to a very helpful event staff. I must say the event planners and organizers really thought things through (for the most part).
Deep dive sessions. The presenters were often the people who helped create the RFCs or were responsible for writing the code. You can’t get much closer to the source than that. The few presenters I spoke with were all super nice people as well.
Free certification tests. These ranged from the CCNA all the way up to CCIE tests.
Universal Studios. Free food, amazing rides, it was just a great all around experience. Plus free booze, so you know, that was pretty awesome.
Journey. Do I even need to say more?
World of Solutions. This was the product and demo floor. Other than the fact that I sold my soul, I learned a lot here and was introduced to a ton of new products I otherwise would not have known about, plus I got about 20 t-shirts. Also free booze here as well.
Keynote speech by Sir Richard Branson. It took the format of a question-and-answer interview, and it was really cool to hear Branson talk and answer questions so candidly. No free booze, but I gained some respect for him.
The Bad
The mobile apps. It almost seemed like these were an afterthought, because much of the functionality either didn’t work at all or was crippled. It was a good idea but the execution was lacking; I’m sure this will get fixed next year.
The website was down the first day due to a broken load balancer. This caused a lot of confusion and problems, but I was able to print my schedule at a kiosk, so it wasn’t a huge issue for me.
The shuttle to and from Universal was a disaster for me. Many others didn’t experience this issue but it took about 45 minutes to get to the theme park and at least an hour to get back to my hotel. I can’t really complain looking back but it was frustrating at the time.
I would definitely recommend that anybody responsible for supporting a network in any capacity attend this event at least once. One nice thing about this event is that it doesn’t matter what skill level you are at; all ranges were covered and represented. I am lucky that I was able to attend this year and am very thankful. This was a great experience; it was incredibly eye opening, and the positive effect it had on my own thought process can’t be overstated. I think it will benefit me throughout my career and hopefully can be used to create opportunities for myself in the future.