I thought it was:
Sitemap: http://www.example.com/sitemap.xml
http://www.sitemaps.org/protocol.html#submit_robots
http://ysearchblog.com/archives/000437.html |
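For anyone copying this, a minimal robots.txt with the autodiscovery line might look like the sketch below (the sitemap URL is a placeholder, of course). Per sitemaps.org, the `Sitemap:` line is independent of the `User-agent:` sections and must use a fully qualified URL:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```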
Sorry Brent, I fixed my post now! |
Great, all my sites already have a sitemap in that very same place :) |
Phillip, sure, you may not have much reason to use sitemaps. But for huge ecommerce or news sites with constantly changing content, a sitemap will make sure you get quick and accurate indexing.
I hope the next version of the blog tools we all use comes with built-in sitemap support. |
I wonder what to do with a gzipped sitemap. At least Google Webmaster Tools read it just fine (a couple of years ago) as sitemaps.xml.gz. |
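Since the gzip question comes up, here's a minimal sketch of producing a compressed sitemap in Python (the filename, URL, and single-entry sitemap are placeholders; real sitemaps list every page). The sitemaps.org protocol allows serving the file gzipped:

```python
import gzip

# A tiny one-URL sitemap as a sketch; real files list every page.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
</urlset>
"""

# Write the compressed file that crawlers would fetch as sitemap.xml.gz.
with gzip.open("sitemap.xml.gz", "wt", encoding="utf-8") as f:
    f.write(sitemap_xml)

# Crawlers decompress transparently; verify the round trip locally.
with gzip.open("sitemap.xml.gz", "rt", encoding="utf-8") as f:
    assert f.read() == sitemap_xml
```

The uncompressed-size limits from the protocol still apply to the decompressed content, so compression only saves bandwidth, not capacity.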
This is great. All those per-engine tags are a waste of bandwidth.
Also, MSN/Live Search is a member of sitemaps.org, but I've never seen a place to ping them. It's moot now, but it makes me wonder if they just automatically GET /sitemap.xml?
I wonder if I can use duplicate sitemaps on one site now, hehe. |
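On pinging: the sitemaps.org protocol does define an HTTP submission mechanism of the form `<searchengine_URL>/ping?sitemap=<url-encoded sitemap location>`. A small sketch of building such a request URL — the engine endpoint below is hypothetical, since each engine publishes its own ping address:

```python
from urllib.parse import quote

def build_ping_url(engine_ping_base, sitemap_url):
    """Build a sitemap submission URL per the sitemaps.org protocol:
    <searchengine_URL>/ping?sitemap=<url-encoded sitemap location>."""
    return engine_ping_base + "?sitemap=" + quote(sitemap_url, safe="")

# Hypothetical engine endpoint used purely for illustration.
ping = build_ping_url("http://searchengine.example/ping",
                      "http://www.example.com/sitemap.xml")
print(ping)
```

A plain GET to the resulting URL is all a ping is; the engine then schedules a fetch of the sitemap itself.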
If you add this line to your robots.txt and use Google Webmaster Tools' "robots.txt analysis" tool to validate it, you get a "Syntax not understood" error.
<american>What gives?</american> |
Sitemaps let me get more of my dynamic content into Google's index. I have a genealogy site with over 12,000 links, and Google's crawling wasn't reaching all of my content. With Sitemaps I now have better coverage in searches of my site. |
Great move but there are pitfalls and a few things to consider before one makes use of autodiscovery. http://sebastianx.blogspot.com/2007/04/is-xml-sitemap-autodiscovery-for.html
Heather Paquinas: MSN has not yet implemented sitemaps, but they parse XML documents and follow the URLs in them, so just use the submission form to announce your sitemap like an RSS feed until they eventually ship native sitemaps support.
Pasi Savolainen: That's part of the standard, so all engines should read gzipped sitemaps.
Tony Ruscoe: That's on the to-do list. |
I don't see that many pitfalls in having an autodiscovery feature in the robots.txt file. It lets multiple search engines find your sitemap file without your needing an account at each one, like Webmaster Tools (Google) or Site Explorer (Yahoo).
As for people not having enough technical knowledge, there are plenty of tutorials online about robots.txt creation and usage, and the sitemaps.org site has tutorials on how to create a sitemap. |
... and every webmaster, and every non-geeky site owner as well, is skilled enough to recognize and fix, for example, canonical issues to avoid implicit pitfalls? I doubt it. The protocols aren't the problem; as usual, server configurations can lead to problems. |
We're adding support for the new instruction in our robots.txt analysis tool. The tool should be updated soon! |
Vanessa, it would be nice if you'd let the many people asking the same question over at the OFFICIAL Google Webmaster Help forum know the answer as well. |
Thanks for your answer, Vanessa. It's the same answer I've read on Matt's blog, on Google Groups, .... you're everywhere! :-D |