Google describes Sitemaps as "an easy way for you to help improve your coverage in the Google index" — a collaborative crawling system that lets you communicate directly with Google, keeping it informed of all your web pages and of when you make changes to those pages.
Basically, you submit a simple XML file to Google, so that it has a better grip on the structure and scope of your site. Sounds like a great idea.
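For reference, a minimal sitemap file looks something like this. This is a sketch based on the sitemaps.org schema; the exact namespace Google required at launch may differ, and the URLs and date are just illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://darrenbarefoot.com/</loc>
    <lastmod>2005-06-03</lastmod>
  </url>
  <!-- ...one url node per page on the site... -->
</urlset>
```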
I see an execution gap here, though. DarrenBarefoot.com is, what, 2600 pages? I'm obviously not going to build that XML file manually (with one node for each page). Google does provide a Sitemap Generator, but it's Python code meant to be run on my web server. My Python skills are nil, so that route isn't viable for me either. I expect there are a good many 'webmasters' (as in, people who design and run websites) who don't know Python from Perl.
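For the curious, the core of what a generator like that does is simple: walk a list of page URLs and emit one node per page. This is a minimal sketch of the idea, not Google's actual tool — the function name and sample URLs are made up for illustration:

```python
# Minimal sketch of a sitemap generator: one <url>/<loc> node per page.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(page_urls):
    """Return a sitemap XML string with one <url> entry per page URL."""
    # Register the sitemap namespace as the default so the output
    # uses a plain xmlns attribute instead of a prefix.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in page_urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS)
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")


if __name__ == "__main__":
    print(build_sitemap([
        "https://darrenbarefoot.com/",
        "https://darrenbarefoot.com/archives/",
    ]))
```

A real generator would also crawl the filesystem or database to discover those 2600 pages, which is the part that actually requires running code on the server.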
This gap may be temporary, though, because Google Sitemaps has been released under the Attribution/Share Alike Creative Commons license. Maybe somebody will grab the code and build an idiot-proof solution for the Sitemap Generator.
UPDATE: I also meant to mention that there's an interview with Shiva Shivakumar, engineering director and technical lead for Google Sitemaps.