Tuesday, July 8, 2008
Matt Cutts Q&A (Video)
Danny Sullivan of Search Engine Land interviewed Google’s anti-webspam boss Matt Cutts at the SMX conference on June 3rd. The topics covered in the first half (the second half will likely go live some days later) include:
- The dark craft of creating fake stories to get backlinks (the “13-year-old steals dad’s credit card to buy hookers” incident), and Google’s reaction to it. Matt: “... to have that land in your lap and not take any action on it at all I think would have been a little irresponsible.” He adds though, “We’re not gonna, you know, be going out on patrol trying to figure out what’s true and what’s not true.”
- Short-term Search Engine Optimization gains vs building long-term, future-proof trust. Matt thinks of short-term SEO as the “Milli Vanilli” way...
- People scraping PageRank values, e.g. to build a PageRank information archive. Matt tells the story of how Google once sent out fake PageRank numbers to a guy doing such scraping, as his scraper was easily identified by the way it requested information...
- “Widgetbait”, like web counters which are distributed with a link included. Google looks at different signals to decide what is blackhat, like checking how on-topic the target page and anchor text are. If you have an Ubuntu release countdown widget and its backlink goes to Ubuntu.com, that may be OK... whereas if you have a countdown widget and the link goes to an unrelated “debt consolidation” type of page (or worse, the link is hidden), it may not be OK.
- Matt: “We’re also always, in web spam, willing to take both manual and algorithmic action to try to improve the relevancy of our search results.”
- As we’ve known for some time, Google doesn’t want search result pages indexed in its own search results. On the other hand, the Googlebot sometimes fills out forms itself, which may also lead the bot to search result pages. Matt says, “Google in some very limited situations can fill out a form, and we do that more to find new pages on a site, not to get the search result pages.”
- Matt differentiates between two types of rules in the webmaster guidelines: those which are mere gray-zonish suggestions (the technical guidelines), and those which are to be taken more literally (the quality guidelines). “So, one of the example technical guideline is, ’don’t have more than 100 links on your page.’ And that’s not a hard [rule] ... like ’If you put 102 links, your site is gone!’ Right, we don’t do it that way.”