I'd recommend signing up and running that script to anyone. Sounds like you have little to lose.
We're undertaking an experiment called Google Sitemaps that will either fail miserably, or succeed beyond our wildest dreams, in making the web better for webmasters and users alike. It's a beta "ecosystem" that may help webmasters with two current challenges: keeping Google informed about all of your new web pages or updates, and increasing the coverage of your web pages in the Google index.[...]
(taken from the Google blog)
short blurb about it:
They also have a Python script for generating the sitemap XML:
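Google's actual script isn't reproduced here, but as a rough sketch of what generating that XML involves, here's a minimal version using only the standard library. The URL list, dates, and function name are made up for illustration; the real script walks your site instead:

```python
# Minimal sketch of generating sitemap-style XML with the standard library.
# The URLs and dates are hypothetical; Google's own script discovers your
# pages (e.g. from the filesystem or access logs) rather than taking a list.
import xml.etree.ElementTree as ET

def build_sitemap(urls, ns="http://www.sitemaps.org/schemas/sitemap/0.9"):
    """Return a sitemap XML string for an iterable of (loc, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("http://www.example.com/", "2005-06-03"),
    ("http://www.example.com/deep/item?id=42", "2005-06-02"),
]
print(build_sitemap(pages))
```

Once you have the XML, you upload it (or point Google at its URL) through the Sitemaps interface.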
[This is a plain FYI thread. Dunno if it's worth a shot.]
It's obvious this is phase 1 of something bigger. I expect later versions will have more ways to describe your site, and imagine if browsers could use this information to pre-cache pages, or to allow you to subscribe to a site just like you can with RSS now.
[Web Developer and RPG Fanatic]
AW Dot Com
Originally Posted by whisperstorm
"later versions will have more ways to describe your site"
More than keywords, the text, and links? I don't think an individual could do more than that.

Originally Posted by whisperstorm
"imagine if browsers could use this information to pre-cache pages"
Pre-cache pages? This was attempted in the days of yore, and it led to massive congestion of servers. The Web thrives on the fact that not everyone visits every page. And even if pre-caching were brought back, you could use the links on the current page as references and pre-cache from there. After all, you can't go somewhere on a site if there aren't links to it! No XML required!

Originally Posted by whisperstorm
"or to allow you to subscribe to a site just like you can with RSS now"
Or, you could ... subscribe to a site just like you can with RSS now!
This Google Sitemaps thing is an attempt to reach 'deep' pages that a normal spider crawl cannot find. For example, if you have a database of 10,000 items that can only be reached through a text search, submitting a sitemap listing all of those pages means search engines will be able to find and index them.
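For the curious, a sitemap file for such deep pages looks roughly like this. The URLs, dates, and values below are made up, and the namespace shown is the later sitemaps.org one (the beta advertised Google's own schema URL, though the element structure is essentially the same):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical example: two "deep" database pages a crawler would
     never reach by following links alone. changefreq and priority
     are optional hints. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/catalog?item=1</loc>
    <lastmod>2005-06-03</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/catalog?item=2</loc>
    <lastmod>2005-06-03</lastmod>
  </url>
</urlset>
```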
As long as all your pages can be found by a normal crawl (meaning they are linked to from somewhere), you don't have to bother with this. A normal HTML sitemap page that links to all your other pages is another valid way of exposing your pages to search engines.
From the Google Sitemaps About page:
Originally Posted by Sillysoft
"Google Sitemaps is intended for all web site owners, from those with a single web page to companies with millions of ever-changing pages."
Dustin is pretty much spot on, I think.
But you can give your pages different priorities, and Google's index could (theoretically) be more up to date. That's the main difference. You can also be 100% sure (in theory, that is) that there aren't any blind spots anymore.
Maybe it's a good idea to run that script after every (bigger/important) site update.
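If you do rerun it after each update, you can also tell Google to re-fetch the sitemap. A small sketch of building the resubmission URL — the ping endpoint below is from memory of the beta docs and is an assumption, so verify it before relying on it:

```python
import urllib.parse

# Hypothetical resubmission endpoint from the Sitemaps beta docs;
# check the current documentation before using it.
PING_BASE = "http://www.google.com/webmasters/sitemaps/ping"

def ping_url(sitemap_url):
    """Build the URL to fetch after regenerating and re-uploading the sitemap."""
    return PING_BASE + "?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

print(ping_url("http://www.example.com/sitemap.xml"))
```

Fetching that URL from a post-deploy hook would make the whole thing hands-off.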