It appears that Google, along with the other major search engines, has agreed to change the way they handle sitemaps. Previously, you had to submit the location of your sitemap file (usually an .xml file) to each major engine separately so it could be crawled. Now there is an even better way.

Robots.txt Used to Locate Sitemap

Now you can place a single line of text in your robots.txt file so that all spiders (that understand what to look for) can locate your sitemap file easily. This is what your new entry in the robots.txt file should look like:

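A minimal example, following the sitemaps.org autodiscovery convention (the domain here is a placeholder, and the directive can sit anywhere in the file, independent of any User-agent section):

```
Sitemap: http://www.example.com/sitemap.xml
```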

You can still submit your sitemap directly through each search engine's webmaster console; however, this is a one-stop solution that helps the spiders find what they need. Of course, it also assumes you use sitemap.xml files to begin with. Many SEOs now advocate skipping the whole 'add sitemap' step, because by submitting a sitemap you lose certain information, such as where your weaker pages (PageRank-wise) stand.
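To make the discovery mechanism concrete, here is a rough sketch of how a spider might pull sitemap locations out of a robots.txt file. The helper function and the sample content are illustrative, not part of any engine's actual crawler:

```python
def find_sitemaps(robots_txt: str) -> list[str]:
    """Return every URL declared with a 'Sitemap:' directive.

    Per the sitemaps.org convention, the directive name is
    case-insensitive and may appear anywhere in the file.
    """
    sitemaps = []
    for line in robots_txt.splitlines():
        # Strip any trailing comment, then surrounding whitespace.
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("sitemap:"):
            # Keep everything after the first colon as the URL.
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps


# Hypothetical robots.txt content for illustration.
example = """\
User-agent: *
Disallow: /private/

Sitemap: http://www.example.com/sitemap.xml
"""
print(find_sitemaps(example))  # → ['http://www.example.com/sitemap.xml']
```

A real crawler would fetch robots.txt over HTTP first, but the extraction step itself is this simple.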

More on the news here:

[tags]sitemaps, site map xml, sitemaps robots.txt, sitemap robots, sitemap protocol, google sitemap[/tags]