Google testing sitemap indexing support
03 Jun 2005

Google has launched an experiment designed to speed the flow of Web site information to the search giant’s index.
Sitemaps, which is currently in beta, calls for Web administrators to place a Sitemaps-formatted file on their Web servers. This allows the Google crawlers to see which pages are present on a site and which have been changed.
c|net also has an article that covers it briefly at a high level.
A slashdot reader had this to say “Google has launched Google Sitemaps. It seems to be a service that allows webmasters to define how often their sites’ content is going to change, to give Google a better idea of what to index. It uses some basic XML as the method of submitting a sitemap. More information on the protocol is available in an FAQ. What’s most interesting is that Google is licensing the idea under the Attribution/Share Alike Creative Commons license. According to the Google Blog, this is being done ‘…so that other search engines can do a better job as well. Eventually we hope this will be supported natively in webservers (e.g. Apache, Lotus Notes, IIS).’ They even offer an open source client in Python.”
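For reference, the Sitemaps file itself is just a small XML document listing a site’s URLs, optionally with a last-modified date, an expected change frequency, and a relative priority. A minimal sketch (the URL is hypothetical, and the namespace reflects the current 0.84 beta schema) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-03</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The `changefreq` element is what lets webmasters tell Google how often the content is expected to change, as the Slashdot summary above describes.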
I’ve been trying to get into the Google Sitemaps site with little success. I imagine it’s being hit fairly hard right now and is timing out on 9/10 attempts. If I can ever get in, I wouldn’t mind trying it out on one of my domains, but it’s unclear what benefit I would get. I imagine the site would be better indexed, which is always good. Google says up front that it’s not going to improve your page ranking.
If this takes off, it would be useful to have a sitemap generation tool built into the blogging engines (WordPress, Movable Type, etc.).
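A blogging-engine plugin along those lines wouldn’t need much: it would pull each post’s permalink and last-modified date from the database and emit the XML. A rough sketch in Python, with a hypothetical hard-coded post list standing in for the database query (URLs and dates are made up for illustration):

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical stand-in for a query against the blog's post table.
posts = [
    ("https://example.com/2005/06/google-sitemaps", date(2005, 6, 3)),
    ("https://example.com/about", date(2005, 1, 15)),
]

def build_sitemap(entries):
    """Build a minimal Sitemaps-style XML document from (url, lastmod) pairs."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">',
    ]
    for url, lastmod in entries:
        lines.append("  <url>")
        lines.append("    <loc>%s</loc>" % escape(url))       # escape &, <, > in URLs
        lines.append("    <lastmod>%s</lastmod>" % lastmod.isoformat())
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_sitemap(posts))
```

A real plugin would write this out to something like `sitemap.xml` in the site root whenever a post is published or edited, so the file stays current without any manual step.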