If you think the big three Search Engines cannot work together, just read this post
So I went ahead and checked out http://www.sitemaps.org/
The protocol differs from the Google sitemap by just one line: only the definition line is different.
Now I have several sites, including a recent one.
Since that one doesn't have many visitors yet, I decided to run a test there :-)
Testing the new Sitemaps protocol.
It is a Joomla-based website with the Open-Sef component.
For those of you who don't know this component: it has a built-in Google Sitemap XML creator.
I generated the file, which I have now named sitemap.xml instead of gsitemap.xml.
The generated file has the Google definition, so I needed to change that.
I downloaded the file, changed the protocol header to xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" and uploaded the file again.
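For reference, the only change is the namespace on the root element. After the edit, a sitemap opens like this (the URL and date below are made-up placeholders, not from my actual file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-16</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```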
Via Google's webmaster central I did a resubmit to see if I got any errors.
Result: no errors, it worked straight away.
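If you want to sanity-check the namespace locally before resubmitting, a few lines of Python will do it. This is just a hypothetical helper of my own, not part of Open-Sef or any of the search engines' tools:

```python
# Check that a sitemap's root <urlset> uses the sitemaps.org 0.9 namespace.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_namespace_ok(xml_text: str) -> bool:
    """Return True if the root element is <urlset> in the 0.9 namespace."""
    root = ET.fromstring(xml_text)
    # ElementTree reports namespaced tags as "{namespace}localname"
    return root.tag == "{%s}urlset" % SITEMAP_NS

example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
</urlset>"""

print(sitemap_namespace_ok(example))  # True
```

Run it against your own file's contents and it will tell you whether the header change took.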
Then I went over to Yahoo's Site Explorer, which already has a backlink to Sitemaps.org!
I entered the path to the sitemap.xml file in the Feed….
It didn't give me any errors….
For Live.com I am waiting and reading Live Search Weblog to see how they progress at Microsoft.
Make Sitemap.xml work the same way as Robots.txt
What I would like most is for all of them to handle this file the way they handle the robots.txt file!
To everyone making this a standard: take some of our labour away by standardizing where and how the file is placed.
Make it a standard crawl! Instruct your robots to look for a file called sitemap.xml the same way they look for robots.txt.
Now that would be really, really Cool.
Robots.txt tells the spiders what not to crawl; Sitemap.xml tells them what they should crawl…
Then it would be a real win-win situation and we would even benefit more from this initiative.
Update: Read Autodiscovery of Sitemap.xml via Robots.txt
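The autodiscovery the update points to works exactly in the spirit of my wish above: one extra line in robots.txt that points the crawlers at the sitemap. A minimal sketch (the host is a placeholder, not one of my sites):

```
# robots.txt
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

With that line in place, a crawler that fetches robots.txt can find the sitemap on its own, with no manual submission to each engine.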