Source: http://www.free-seo-news.com/newsletter718.htm#facts
Every website owner wants to see their most important pages at the top of search results. If you guide search engine
bots, your valuable pages will be crawled every time the bots visit your site.
Here are some ways to guide search engine bots to the right pages:
1. Use the robots.txt file and the meta noindex tag correctly – Your robots.txt file tells search engines which parts of your website they may crawl. Using this file, you can set disallow rules for folders and files that you don’t want bots to visit.
The meta noindex tag excludes individual pages from the search index. Note that robots.txt controls crawling, not indexing: a page that is blocked in robots.txt can still end up in the index if other sites link to it, and search engines can only see a noindex tag on pages they are allowed to crawl. So use robots.txt to steer crawling and the noindex tag to keep pages out of the index; see the examples after this item.
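For example, a minimal robots.txt might look like this (the folder names are placeholders; use your own):

User-agent: *
Disallow: /admin/
Disallow: /temp/

To keep an individual, crawlable page out of the index, add this tag to the page's <head> section:

<meta name="robots" content="noindex">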
2. Check your website structure – If your pages are properly categorized, search engine bots can easily find your important pages. These pages should be accessible within a few clicks of your home page.
3. Make sure your internal links work properly – Search engine bots follow the links on your pages. Broken links signal a low-quality site and waste crawl visits on pages that do not exist. Fixing your links prevents Googlebot from crawling non-existent pages; a simple link-checker sketch follows below.
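If you want to check your links yourself, the sketch below shows one way to do it with Python's standard library. It is a minimal example, not a full crawler: the start URL https://www.example.com/ is a placeholder, it only checks links found on that one page, and a real site may also need redirect handling, delays, and retries.

import urllib.error
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    # Collects the href value of every <a> tag on a page.
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def check_internal_links(start_url):
    # Fetch the page and collect the links it contains.
    with urllib.request.urlopen(start_url) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)

    site = urllib.parse.urlparse(start_url).netloc
    for href in collector.links:
        url = urllib.parse.urljoin(start_url, href)
        if urllib.parse.urlparse(url).netloc != site:
            continue  # skip external links
        try:
            # A HEAD request is enough to see whether the page exists.
            request = urllib.request.Request(url, method="HEAD")
            urllib.request.urlopen(request)
        except urllib.error.HTTPError as error:
            print("Broken link (%s): %s" % (error.code, url))
        except urllib.error.URLError as error:
            print("Unreachable: %s (%s)" % (url, error.reason))

check_internal_links("https://www.example.com/")

Running the script prints every internal link on the start page that returns an HTTP error, so you can fix or remove it.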