If you are doing large-scale SEO work with 301 redirects and similar rules, I recommend using curl on Linux to test them. I have been doing this for the past three years now, and I am surprised that I have never blogged about it. curl lets you specify the user-agent, which is ideal for simulating search engine crawls of your website when you have rules that redirect search engine bots but not human visitors.
This simple example illustrates how you can simulate a Googlebot crawl of jamesattard.com:
curl -A "Googlebot" www.jamesattard.com
This shows how jamesattard.com looks to Googlebot.