Michael Wyszomierski of the Google Search Quality Team talks about the crawl and indexing errors that Google will report for your website in the Webmaster Control Panel.
My Comments
There are two types of crawl errors: site errors and specific URL errors.
Site errors mean that Googlebot cannot access your website at all, and they usually stem from a rule in your robots.txt file. You should give these errors top priority and resolve them quickly.
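If you want a quick way to see whether your robots.txt file is the culprit, a minimal sketch like the one below uses Python's standard urllib.robotparser module to test whether Googlebot is allowed to fetch a given page. The example.com addresses are placeholders for your own site.

```python
# Minimal sketch: check whether robots.txt blocks Googlebot from a URL.
# The example.com URLs are placeholders; substitute your own site.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt file

for url in ("https://www.example.com/", "https://www.example.com/private/page.html"):
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked by robots.txt")
```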
The most common URL error is a “not found” (404) error. This means a page is missing from your site or its URL has changed; updating your sitemap.xml file to remove the outdated address will stop these error messages from being triggered.
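You can also catch stale sitemap entries before Google flags them. The sketch below requests each URL listed in a standard sitemap.xml and prints the ones that come back as “not found”; the sitemap address is a placeholder to change for your own site.

```python
# Minimal sketch: flag sitemap.xml entries that return "not found" (404).
# SITEMAP_URL is a placeholder; point it at your own sitemap.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

for loc in tree.findall(".//sm:loc", NS):
    url = loc.text.strip()
    try:
        with urllib.request.urlopen(url) as page:
            status = page.status
    except urllib.error.HTTPError as err:
        status = err.code
    if status == 404:
        print("Remove from sitemap:", url)
```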
Having specific URL errors on your site is completely natural and nothing to worry about, but best practice is to review and clean up these problems periodically.
Site errors are the more important of the two. Although a site error can be caused by your server being temporarily unavailable due to a glitch or network problem, these errors should be investigated to make sure your site is not blocking the search engine robots from crawling it. You may even need to work with your web host if server errors happen frequently.
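One simple way to rule out server-side problems is to request a page the way a crawler would and look at the HTTP status code: anything in the 5xx range points back to your web host. In this sketch the URL is a placeholder and the User-Agent string merely mimics Googlebot's published one.

```python
# Minimal sketch: fetch a page like a crawler would and report the status.
# The URL is a placeholder; the User-Agent string imitates Googlebot.
import urllib.error
import urllib.request

req = urllib.request.Request(
    "https://www.example.com/",  # placeholder for a page on your site
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("HTTP status:", resp.status)
except urllib.error.HTTPError as err:
    print("HTTP error:", err.code)  # 4xx = blocked or missing page, 5xx = server trouble
except urllib.error.URLError as err:
    print("Could not reach the server:", err.reason)  # network glitch or outage
```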