Fix Xbox Strict NAT on PFSense

Out of the box, it turns out that PFSense is not configured to handle some connection settings for Xbox Live.  Unfortunately I couldn’t find much of an explanation as to what this message actually means in terms of degraded online performance, but I noticed that I would randomly get kicked out of games, get disconnected from Xbox Live and have communication issues every once in a while, so I decided to take a look at what was actually going on because these issues were starting to get annoying.

I figured it should be easy enough to fix, but I couldn’t find a definitive guide on how to resolve this issue, so I figured I would make sure it is clear for those who find this post and are having the same problem.  I tried a few different combinations, including port forward combinations mentioned in some forums, firewall rule changes, various UPnP settings, etc., but none of these worked and the instructions weren’t very clear either.

Eventually I found this guide, which works and is great but doesn’t show how to set everything up.  There are a few steps to get this working correctly, so I will briefly describe them below.

Verify the IP address of your Xbox 360.  There is documentation around for finding it, but essentially go to System -> Network -> Advanced and it should give you the information.  You may want to set a static IP for your Xbox, but I won’t cover that here.  Ask me if you have issues.

Now you will need to modify your firewall settings (Firewall -> NAT).  Choose the “Outbound” tab and change the mode to Manual Outbound NAT rule generation.  After you have saved the settings, create an entry for your Xbox, using its IP address with a mask of /32.

Firewall rule

Once this rule has been created, move it up to the top of the rule list.  You should have something similar to the following when done.

Firewall rules

Next, modify UPnP settings (Services -> UPnP & NAT-PMP) and select the following settings.

  • Enable UPnP & NAT-PMP
  • Allow UPnP port mapping
  • External Interface -> WAN
  • Interfaces -> LAN
  • User specified permissions 1 -> allow 88-65535 192.168.39.17/32 88-65535 (use your Xbox’s IP address here)

It should look something like this.

UPnP settings

Go ahead and save the settings and restart your Xbox (just turn it off and on) to make sure the settings get picked up, and that should be it.  I’m not entirely sure the user permissions need to be this wide open, but it works so it stays for now.  I will update the post if I find any evidence that the settings need to be modified.
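If you want to double check that the UPnP daemon actually picked up the new permissions, you can peek under the hood from Diagnostics -> Command Prompt or an SSH session.  This is just a quick sanity check; pfSense generates the UPnP daemon’s config for you, and the path below is an assumption on my part, so adjust it if your version puts the file somewhere else.

  pgrep miniupnpd                      # make sure the UPnP daemon (miniupnpd) is running
  grep allow /var/etc/miniupnpd.conf   # the permission entry from above should show up here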


Document Storage: Part 5

Document Storage Project

This is Part 5: Uploading Scanned Images.

There’s two components to this part: configuring somewhere for the files to be uploaded to and setting up your MFD to upload to them. Most modern MFDs will upload to a CIFS share, which is what we’re going to use here. First thing’s first, we need to install Samba:

apt-get install samba

Now we need to set up Samba. We’ll have user-level security (it’ll be much easier to lock things down if we want to increase security at a later date, and besides share-level security went out with the Ark) and a single share called incoming. We also need a user for the MFD to log into Samba with; we’ll call this user “scanner”. We’ll also have a group called “scanner” so we can be a little more flexible over who can access this share should we wish.

Edit /etc/samba/smb.conf as follows:

......

# "security = user" is always a good idea. This will require a Unix account
# in this server for every user accessing the server. See
# /usr/share/doc/samba-doc/htmldocs/Samba3-HOWTO/ServerType.html
# in the samba-doc package for details.
   security = user

......

[incoming]
        path = /home/incoming
        guest ok = no
        browseable = no
        read only = no
        valid users = @scanner
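Before moving on it’s worth sanity-checking the edited file and reloading Samba so the new share takes effect. testparm ships with Samba; the restart command below assumes the stock Debian init script, so adapt it if your release names the service differently.

  testparm                   # validate smb.conf and show the effective settings
  /etc/init.d/samba restart  # reload the configuration (service name may vary by release)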

Now, we need a new user for the MFD. Samba requires that users also have corresponding Unix accounts, so first we create a Unix account, then we set their Samba password. We also need to ensure the permissions on /home/incoming are correct – the following commands deal with this:

  useradd scanner
  smbpasswd scanner
  chgrp scanner /home/incoming
  chmod g+rwx /home/incoming

Make sure you choose a password that is not only secure, but possible to type in on your MFD! Check this works by connecting to the following folder in Windows:

\\(hostname)\incoming

You’ll need to use the username/password for the scanner user you set up.

For the final part of this, you need to set up your MFD to scan to this directory.

I’ve chosen an Oki MB451 multifunction unit for a number of reasons:

  • It’s cheap.
  • It has a double-sided document feeder for scanning. More and more documents are being sent double-sided; it seems like a step back to have a document feeder that can’t deal with this.
  • It supports scanning directly to email and CIFS share without requiring extra software on the PC. (This is important; certainly a few years ago a lot of manufacturers claimed their products could do this but it wasn’t apparent until after you’d taken it out of the box that their product didn’t do any of it without additional software on your PC. Certain large photocopier-type units still have this restriction, though sometimes you can buy an optional bolt-on to overcome it. I prefer avoiding the need for extra bolt-ons because they’re usually extortionately priced and often difficult to source).
  • It has a nice big display. These units can be a pig to set up at the best of times; a large display often goes some way to alleviate this problem.
  • You can set up lots of profiles – preconfigured shortcuts that say “everything scanned under this profile should be stored under this name in this share accessed with this username and password; files should have this format”. Unfortunately you can’t nail a profile to say “everything scanned under this profile is double-sided” but you can’t have everything!
  • The printer supports Postscript, which means it’ll be pretty much guaranteed to work under any OS I can throw at it for a long time to come.

I won’t go into detail regarding MFD configuration – there’s simply too many on the market and they all vary. It’s enough to explain that I’ve set up a profile called “Correspondence” and I’ve pointed it at \\(hostname)\incoming.

With the profile I’ve set up, scanned documents will be stored under \\(hostname)\incoming\Correspondence-#####.pdf.

Test this all works by scanning a document and making sure it appears in the /home/incoming directory on your Linux box.

There’s only one thing left to do – tie all this together so incoming documents are automatically OCR’d, made available via Apache and OCR’d so they’re indexable in Sphider….


How to rescue your data on a failing hard drive with dd_rescue

I just got done answering this question for someone at work:

What was that app that allowed you to duplicate the failing hard drive?

So I figured, why not put it out there for the Internets to see?

It’s the dd_rescue command within Knoppix. (I’ve always used 5.0 or 6.1 – I think dd_rescue is still included in version 7)

You need to boot to the Knoppix CD/DVD and run dd_rescue on the DISK, not the PARTITION.

Syntax:

dd_rescue /dev/sda /dev/sdb

Here /dev/sda is your source drive and /dev/sdb is the destination drive.

You can check your disk configuration like this:

fdisk -l

Read the output from there to see which disk is which. Your best bet is to use a sanitized drive as the destination so it’s easier to tell which disk has stuff on it and which doesn’t.

Since this is all command line, I HIGHLY recommend running a practice job using data that doesn’t matter before going forward with real data. This process can easily ruin good data if you have the command backwards.

Additional switches:

You can use these switches to “sandwich” a bad spot on the drive and pick up as much data as possible; there’s a quick sketch of the idea after the examples below.

-s defines the start position.
Ex: (will start at 15G and go forwards)

dd_rescue -s 15G /dev/sda /dev/sdb

-r defines working in reverse, from back to front
Ex: (will start at 15G and go in reverse, back to the beginning of the drive)
(-r does not use DMA buffer, which can increase error correcting abilities)

dd_rescue -r -s 15G /dev/sda /dev/sdb
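To make the “sandwich” idea concrete, here is roughly what the two passes look like back to back. The 15G offset is only an example; in practice you would pick a point comfortably past the bad area that the forward pass choked on.

  dd_rescue /dev/sda /dev/sdb             # forward pass: copies from the start until it hits the bad area
  dd_rescue -r -s 15G /dev/sda /dev/sdb   # reverse pass: starts past the bad area and works back toward it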


Top to bottom troubleshooting: Part 2

In this post I will be going over the main methods that I use to remove infections from Windows-based computers.  Like the hardware approach, this is a best-case process that works on roughly 9 out of 10 machines I see, so it covers the majority of the common infection issues users are likely to run into.

I like to start at the lowest level I can when troubleshooting these types of issues and work my way up, using a similar approach to how I troubleshoot hardware problems.  That way it is much more difficult to miss things.  If you skip over issues, you will almost always run into problems down the road.  That is especially true when rootkit infections are present.

As much as we’d like to say that rootkits are going the way of the curb with the adoption of 64-bit operating systems, we are finding time and again that this simply is not true.  Malware creators are finding creative ways to bypass the 64-bit security mechanisms, making it important to check for their presence lest you get burnt later on.

Note: I almost always run these tools in safe mode first.  I know there are user-level rootkits and the argument can be made to clean in normal mode, but for me, cleaning in safe mode gives a much better idea of what to look for and where to look for it.

Infection Removal

Probably the best all-around tool for infection removal out there is a handy piece of software called ComboFix by sUBs.  It is always the first thing I run and is the heavy hitter of my removal tools.  This tool has rootkit detection built in, so it will alert you if it detects suspicious activity (this is actually the first thing to inspect for, but we will get to that in a bit).  I don’t know how many times this tool has saved me an excessive amount of time and effort.  It is pretty much automatic: download it, make sure it is up to date and then let it do its thing.  It spits out a log file when it is finished, giving clues as to what exactly it was able to remove.

If ComboFix detects the presence of a rootkit and says it has successfully removed it, be sure to double check with another, different scanner.  ComboFix is good at what it does, but it is not designed to remove some of the more advanced rootkits, so be thorough in your removal process.

Rootkit Inspection

There are a few really handy tools for dealing with rootkits that make your life easier when removing these bastards.  The first is a nice little program called TDSSKiller, which can detect and remove a variety of different rootkits.  It runs a quick scan for a number of well known rootkits and attempts to remove them, producing a log file afterwards for later analysis.  If the program is unable to remove the rootkit(s), it will at least give you clues about where to look with the more advanced tools mentioned in this section.

The one I always find myself coming back to when a machine is hosed is called GMER.  This tool gives you a quick way to detect the presence of rootkits when you fire it up.  If it detects rootkit activity it will usually tell you and list the items in red within the Rootkit/Malware tab of the program.  There are many advanced features and uses for this program which I may cover in other posts, but they are out of scope for this topic.

Another solid rootkit scanner I have had success with is RootRepeal.  If I am still suspicious of rootkits after a basic analysis with GMER, I will usually run a full scan with RootRepeal, mainly because it doesn’t take nearly as long as a full scan in GMER.

Temp File Cleaning

Once we have cleaned out rootkits, it is a good idea to clean out temporary files.  Infections are commonly hidden inside temp files and folders, so be sure to check them just in case.  Another good reason to clean out temp files at this point is that it speeds up the malware scans that follow, since these potentially malicious files and folders will already have been removed.  At the very least, temp file cleaning helps improve the performance of the computer and free up space if there is an ungodly number of temp files (believe me, I have seen some crazy %&#$).  I have had the best results with TFC by OldTimer for this type of thing.  It is fast and powerful.

Virus Cleaning

We’re almost done, so bear with me.  The final step in the process is to clean up all the loose ends left behind from the removal process.  Usually there are bad registry keys or trojan remnants or whatever else left over that still linger after the main cleaning process has been run.

At this stage there are a couple of handy tools.  The first is the Malwarebytes free scanner.  This is another essential tool that I use on nearly every infection I work on.  If I believe a machine has been cleaned up enough at this point, I will run a quick scan, and if no malware traces are found I will call it a day.  If the quick scan reveals traces, I will remove them, reboot the machine and run a full scan to search for further traces.

Another good second-opinion scanner is Hitman Pro, a free cloud-based scanner that does an excellent job of analyzing leftover malware traces.

Once these scans come back clean, all we need to do is put reliable anti-virus software on the computer and call it a day.  This, however, can be tricky.  Without starting a holy war here, I have to say I have had luck recently with Microsoft Security Essentials (which is free for up to 8 licenses or something).  I’ve heard good things about the full paid version of Malwarebytes for real-time protection, as well as the Kaspersky, NOD32 and Avira anti-virus products.  The reason I have been recommending MSE is its light installation, low background noise, freeness and decent detection rates (flame me if you must).

As a side note, one thing I will say with full certainty is that I absolutely abhor Symantec anti-anything as well as McAfee anything.  There are programs designed specifically to remove them because they are so bad.  They are expensive in memory and CPU overhead, take up tons of space and get in the way of everything a user does.

Conclusion

There are a ton of viruses and malware out there which are continually evolving and expanding into new areas.  Likewise, there are a ton of tools out there to combat the bad guys.  Some of these tools are better than others, but generally speaking the combination of tools I have outlined here will combat the majority of malicious code out there targeted towards the average user.  These tools have kept up with the malware writers and while they don’t offer a perfect solution they do a pretty good job.

So to reiterate, here is the general order of my cleaning process:

  • Combofix
    • TDSSKiller
    • GMER
    • RootRepeal
  • TFC
  • Malwarebytes
  • Hitman Pro
  • Microsoft Security Essentials (virus protection)


Top to bottom troubleshooting: Part 1

In this post I will be discussing the techniques that I have found to be the most tried and true for fixing broken Windows machines.  Granted, there are a million and one ways that things can go wrong (as we all know), but I have found that this approach can cure 9 out of 10 computers that make their way to me.  For the other 1 in 10, I haven’t really discovered a bulletproof fix, as each of those issues is usually something entirely unique and may lead to future posts.

The first and absolute most important step in the troubleshooting process is ensuring that the hardware of a system is functioning properly.  The reason is simple: a machine can often display wacky symptoms due to bad hardware, and you end up chasing the symptom rather than the core problem.  I have seen it (and learned it painfully myself) too many times: a piece of hardware is faulty and causes the computer to fail at different points, making it impossible to isolate problems effectively and causing many headaches in the troubleshooting process.

Physical Inspection

The easiest and often most overlooked way to fix an issue quickly is to simply crack open the case and check for symptoms.  You would be surprised how filthy a neglected case can become over time, so vacuuming and spraying out the case with compressed air is just as important as any other step in the process, laptops included.

After cleaning out the case, inspect for physical damage to internal components.  So many times I have seen leaky capacitors cause sporadic issues on otherwise perfectly running machines.  At this point you should also check that the fans are running (especially on graphics cards), that the CPU heat sink is clean, that components are wired correctly, etc.

Power Supply

This step can be tricky, and is something that you just seem to get a feel for over time (it certainly doesn’t hurt to try at any stage, but it can sometimes be more work than is necessary).  I have been seeing fewer and fewer bad power supplies recently, so I don’t know if manufacturing quality has gone up or if I have just had good luck.

There are power supply testers that can be purchased, but I will usually just grab a known good power supply off the shelf and hook it up to test whether the original power supply is bad or in the process of failing.  Simple enough.  This issue is pretty straightforward: either it works or it doesn’t.  Green means good.  If the power supply doesn’t work, you won’t be able to test anything else.

Hard Drive (doesn’t apply to SSD drives)

This is the stage where I see the majority of problems.  It is vital to ensure the hard drive is healthy and working properly.  The most battle tested and reliable tool in my bag for hardware troubleshooting is a tool called “Drive Fitness Test” from Hitachi.

Essentially this tool scans the hard drive for bad sectors as well as testing the drive’s S.M.A.R.T. features.  It is simple in function but is so often overlooked in the process.  Another tool for testing drives is “MHDD”.  This tool is VERY comprehensive in its analysis of the drive but unfortunately lacks good documentation (it was made by some mad scientist Russian dude), so there is somewhat of a learning curve to it.
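As a side note, if you happen to have a Linux live environment handy, the smartmontools package offers a quick way to read a drive’s S.M.A.R.T. data directly.  This is a supplement to the bootable tools above rather than a replacement:

  smartctl -H /dev/sda         # overall health self-assessment
  smartctl -a /dev/sda         # full SMART attributes; watch for reallocated or pending sectors
  smartctl -t short /dev/sda   # kick off the drive's built-in short self-test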

Memory

Another important step is testing the memory modules of a misbehaving machine.  Although this is the least common cause of failures, it is an important step in the process because skipping it may come back to bite you later, the same way a leaky cap or some other simple, overlooked check can.  The go-to tool for testing RAM is “Memtest86”, which is found on most Linux distributions these days, so if you have an old disc lying around you are ready to rock.

Conclusion

These are some of the fundamentals that I have painstakingly learned the hard way over the past 5 years.  There are many, many other tools for testing out faulty pieces of hardware.  Even with so many options for testing tools, I keep coming back to these basics time and time again.  They have really become a foundation in my troubleshooting techniques and just seem to get the job done.

So to recap, here is the general order that things should be checked when testing for broken hardware:

  • Physical Inspection
  • Power Supply
  • Hard Drive
  • Memory

Next Up: Infection Removal
