Yes!! They finally got it!!
In my post of November last year I asked that the search engines “Make it a standard crawl !! Instruct your robots to look for a file called sitemap.xml the way they look for robots.txt.”
Well, now they are going to do it… in an even better way, too.
You can read about the announcement on The Ask Blog: Sitemaps Autodiscovery. They will tell you more at Search Engine Strategies in New York.
It is going to be a simple addition to your robots.txt file, like:
SITEMAP: http://www.the URL of your sitemap here.xml
You need to put in the full URL of your sitemap.
Put your sitemap.xml link in your robots.txt file now!
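To make that concrete, here is a minimal sketch of what a robots.txt file with the new directive could look like. The domain and sitemap location are made-up placeholders; substitute your own full URL:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap line is independent of the User-agent sections, so it can go anywhere in the file.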
If you have multiple sitemap.xml files, you have to put the paths into a sitemap index file and point to that file.
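A sitemap index file is just a small XML file listing your individual sitemaps. Here is a minimal sketch (the domain and the two sitemap filenames are invented for illustration; the namespace is the standard sitemaps.org 0.9 schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```

Then your robots.txt Sitemap line points at the index file instead of the individual sitemaps.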
If you don't want to wait for the autodiscovery, you can still submit your sitemap directly:
To Google via Webmaster Central,
To Yahoo as a feed in their Site Explorer,
To Ask via the command:
http://submissions.ask.com/ping?sitemap=http://www.the URL of your sitemap here.xml
To Live? I guess just via your robots.txt file…
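One thing to watch with the Ask ping command: the sitemap URL goes into a query string, so it should be percent-encoded. A small Python sketch (the example.com sitemap URL is a made-up placeholder) that builds the ping URL safely:

```python
from urllib.parse import quote

def build_ask_ping_url(sitemap_url: str) -> str:
    """Build the Ask.com ping URL for a given sitemap URL.

    The sitemap URL is percent-encoded (safe="" encodes even '/'
    and ':') so characters like '?' or '&' in it cannot break
    the query string.
    """
    return "http://submissions.ask.com/ping?sitemap=" + quote(sitemap_url, safe="")

# Hypothetical sitemap URL, for illustration only.
print(build_ask_ping_url("http://www.example.com/sitemap.xml"))
```

Fetching the resulting URL (in a browser or with any HTTP client) is what actually notifies Ask.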