Google Manually Alters Organic SERPs if a Domain Is Too Dominant

Search-Engines-Web.com [PersonRank 10]

Saturday, January 5, 2008

SEOs have suggested this before, but now two senior sources at Google confirm it:

Peter Norvig, formerly the Director of Search Quality at Google and now its Director of Research
http://radar.oreilly.com/archives/2008/01/human_vs_machine_google_wallstreet.html

<blockquote>For what it's worth, while Google strongly favors a proprietary algorithmic approach (much like hedge funds and Wall Street firms trading for their own account), they also recognize the importance of human intervention. Peter Norvig, formerly the Director of Search Quality at Google and now its Director of Research, pointed out to me that there is a small percentage of Google pages that dramatically demonstrate human intervention by the search quality team. As it turns out, a search for "O'Reilly" produces one of those special pages. Driven by PageRank and other algorithms, my company, O'Reilly Media, used to occupy most of the top spots, with a few for Bill O'Reilly, the conservative pundit. It took human intervention to get O'Reilly Auto Parts, a Fortune-500 company, onto the first page of search results. There's a special split-screen format for cases like this.</blockquote>
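To make the mechanism concrete, here is a minimal sketch of how a manual-override layer could sit on top of algorithmic ranking, loosely modeled on the behavior Norvig describes. All names (OVERRIDES, apply_overrides, the example URLs) are illustrative assumptions, not Google internals:

```python
# Hypothetical sketch: a human-curated exceptions table that pins
# certain URLs onto page one for specific queries, while the rest of
# the page keeps its algorithmic order. Not Google's actual system.

# Query -> URLs a human rater pinned onto the first page.
OVERRIDES = {
    "o'reilly": ["oreillyauto.com"],
}

def apply_overrides(query, ranked_urls, page_size=10):
    """Force human-pinned URLs onto page one, preserving
    algorithmic order for everything else."""
    pinned = OVERRIDES.get(query.lower(), [])
    rest = [u for u in ranked_urls if u not in pinned]
    # Pinned results take the bottom slots of page one.
    organic_slots = page_size - len(pinned)
    return rest[:organic_slots] + pinned + rest[organic_slots:]

print(apply_overrides("O'Reilly",
                      ["oreilly.com", "oreilly.com/radar",
                       "billoreilly.com", "oreillyauto.com"]))
```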

Here is a related post from December by Matt Cutts, Google's webspam chief ...
http://www.mattcutts.com/blog/subdomains-and-subdirectories/

<blockquote>For several years Google has used something called “host crowding,” which means that Google will show up to two results from each hostname/subdomain of a domain name. That approach works very well to show 1-2 results from a subdomain, but we did hear complaints that for some types of searches (e.g. esoteric or long-tail searches), Google could return a search page with lots of results all from one domain. In the last few weeks we changed our algorithms to make that less likely to happen in the future.

This change doesn’t apply across the board; if a particular domain is really relevant, we may still return several results from that domain. For example, with a search query like [ibm] the user probably likes/wants to see several results from ibm.com. Note that this is a pretty subtle change, and it doesn’t affect a majority of our queries. In fact, this change has been live for a couple weeks or so now and no one noticed. :) The only reason I talked about the subject at PubCon at all was because someone asked for my advice on subdomains vs. subdirectories.</blockquote>
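The host-crowding rule Cutts describes is simple enough to sketch. Below is a minimal, assumed implementation that caps results per hostname at two while preserving rank order; the real system is obviously far more nuanced (e.g. relaxing the cap for highly relevant domains like ibm.com), so treat this as illustration only:

```python
# Sketch of "host crowding": show at most `per_host` results from each
# hostname, demoting (not deleting) the overflow. Illustrative only.
from collections import Counter
from urllib.parse import urlparse

def host_crowd(ranked_urls, per_host=2):
    seen = Counter()
    kept, overflow = [], []
    for url in ranked_urls:
        host = urlparse(url).hostname
        if seen[host] < per_host:
            kept.append(url)
            seen[host] += 1
        else:
            overflow.append(url)  # pushed below the crowded host's cap
    return kept + overflow

results = [
    "https://example.com/a", "https://example.com/b",
    "https://example.com/c", "https://other.org/x",
]
print(host_crowd(results))
# ['https://example.com/a', 'https://example.com/b',
#  'https://other.org/x', 'https://example.com/c']
```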

Search-Engines-Web.com [PersonRank 10]

16 years ago

Peter Norvig reveals more tidbits about Google

http://www.technologyreview.com/Infotech/19868/

<blockquote>TR: You claim that Google's accuracy is pretty good. How do you know how good it is, and how do you make it better?

PN: We test it in lots of ways. At the grossest level, we track what users are clicking on. If they click on the number-one result, and then they're done, that probably means they got what they wanted. If they're scrolling down, page after page, and reformulating the query, then we know the results aren't what they wanted. Another way we do it is to randomly select specific queries and hire people to say how good our results are. These are just contractors that we hire who give their judgment. We train them on how to identify spam and other bad sites, and then we record their judgments and track against that. It's more of a gold standard because it's someone giving a real opinion, but of course, since there's a human in the loop, we can't afford to do as much of it. We also invite people into the labs, or sometimes we go into homes and observe them as they do searches. It provides insight into what people are having difficulty with.</blockquote>
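Norvig's click signal is easy to caricature in code. Here is a toy version of that heuristic: a session where the user clicks result #1 and stops looks satisfied; paging through results and reformulating the query looks unsatisfied. The session fields are invented for the sketch, not anything Google has published:

```python
# Toy satisfaction classifier based on the click behavior Norvig
# sketches. Field names ('clicks', 'pages_viewed', 'reformulated')
# are assumptions made up for this example.

def looks_satisfied(session):
    """session: dict with 'clicks' (result positions clicked, in order),
    'pages_viewed' (int), 'reformulated' (bool)."""
    if session["reformulated"] or session["pages_viewed"] > 1:
        return False
    return session["clicks"] == [1]  # clicked the top result, then done

sessions = [
    {"clicks": [1], "pages_viewed": 1, "reformulated": False},
    {"clicks": [3, 7], "pages_viewed": 4, "reformulated": True},
]
satisfied = sum(looks_satisfied(s) for s in sessions)
print(f"{satisfied}/{len(sessions)} sessions look satisfied")
```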
