Google Blogoscoped

Two Alleged SEO Spam Tricks

SEO/SEM blog [PersonRank 0]

Wednesday, October 8, 2008
15 years ago · 5,179 views

Wow, those are some tactics. Using cheap manpower to influence online positions. How clever, yet dangerous. What does Google have to say about it?

TOMHTML [PersonRank 10]

15 years ago #

Last year there was already spam on Google Maps:
http://blumenthals.com/blog/2007/07/27/first-case-of-large-scale-abuse-at-google-maps/

Tomas Kapler [PersonRank 1]

15 years ago #

Actually, this is not true. There was a big search engine in 1997 that calculated positions using click tracking, and it went bankrupt after a few weeks because it was too easy to write a click bot.

Google uses clicks, that's right. But they do not put them in the search engine ranking algorithm; they use them to control the algorithm – as a way to find out whether changes to the algorithm produced the appropriate and wanted results.
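
A rough sketch of what "using clicks to control the algorithm" could look like in practice – the log format, the variant labels and the top-3 cutoff are assumptions for illustration, nothing Google has published – is comparing click-through between the old and the new ranking side by side:

from collections import defaultdict

def top3_click_through(log_rows):
    # log_rows: dicts like {"variant": "old" or "new",
    #                       "clicked_position": 1-based rank clicked, or None}
    shown = defaultdict(int)
    clicked_top3 = defaultdict(int)
    for row in log_rows:
        shown[row["variant"]] += 1
        pos = row["clicked_position"]
        if pos is not None and pos <= 3:
            clicked_top3[row["variant"]] += 1
    return {v: clicked_top3[v] / shown[v] for v in shown}

sample = [
    {"variant": "old", "clicked_position": 5},
    {"variant": "old", "clicked_position": None},
    {"variant": "new", "clicked_position": 1},
    {"variant": "new", "clicked_position": 2},
]
print(top3_click_through(sample))  # e.g. {'old': 0.0, 'new': 1.0}

A higher top-3 click-through for the new variant would suggest the change helped, without any single click ever moving a result up or down.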

Why do they move up? It is easy. They are older, they invest money in promotion, they watch SEO trends, they are better known, and so they grow, because in many sectors there are not many companies investing money in promotion and watching SEO trends – or there are only, say, 5 or 10 such companies. So it is quite easy to go from page 200 to the second page, but it is not so easy to go from, e.g., the 7th position to the second.

Philipp Lenssen [PersonRank 10]

15 years ago #

> Google uses clicks, that's right. But they do not put them in
> the search engine ranking algorithm; they use them to control
> the algorithm – as a way to find out whether changes to the
> algorithm produced the appropriate and wanted results.

Tomas, this sounds interesting – do you have some links & references?

Tomas Kapler [PersonRank 1]

15 years ago #

To Philipp: from speaking with a Google guy at a conference a year ago, and from well-known inside knowledge of one other big search engine. And more importantly, it is also the only logical way. So even if the guy did not tell the truth, or something has changed in the last year, I believe this is not the case.
One of the problems with putting clicks into search ranking algorithms is the VERY BIG difference in the natural ways people arrive at servers – there are many servers which are visited not because someone searched for them, but because they are well known (e.g. services/servers like Yahoo), because they have other natural sources of visitors, or because they do a lot of advertising.
The second VERY BIG problem is the causality spiral – preferring servers with high click counts would put them higher, so they would naturally get even more clicks, so you would put them higher still ... and you are then unable to tell whether they are visited because they rank high, because they simply have a better title and description, or because they are more popular.
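
A toy simulation of that spiral – with invented sites and click probabilities, not any real engine's numbers – shows how an early leader stays on top even when a competitor is objectively better, once clicks feed straight back into the ranking:

import random

# Two sites: site_b is actually the better result, but site_a starts with
# more accumulated clicks. Ranking purely by clicks locks in the head start.
quality = {"site_a": 0.050, "site_b": 0.055}   # probability a viewer clicks
clicks = {"site_a": 100, "site_b": 10}

for day in range(30):
    ranking = sorted(clicks, key=clicks.get, reverse=True)
    for position, site in enumerate(ranking):
        impressions = 1000 // (position + 1)   # top spot gets far more eyeballs
        clicks[site] += sum(random.random() < quality[site]
                            for _ in range(impressions))

print(clicks)  # site_a keeps the lead despite being the worse result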

Tomas Kapler [PersonRank 1]

15 years ago #

Post scriptum: just to explain my position – I am now CTO of one of the bigger web development companies (in terms of our small country of 10 million inhabitants – 40 employees) (http://www.developstudio.com), so I have been working with this for 13 years.
I haven't tried to employ Chinese students, but of course we have written and tried a clicking bot ;) which used many computers, just a day after we realised that Google is measuring clicks – and nothing has changed.
But of course something could have changed, or they may have a very sophisticated anti-bot solution, or it may work only in some situations, etc., so I may not be right. But as I have described above, I believe that it would not be a good idea to put clicks into the algorithm.

Mike Blumenthal [PersonRank 1]

15 years ago #

Being close to the centroid used to be the main ranking factor in all Maps products.

Google is now using Location Prominence (similar to PageRank) to rank businesses in the Local Universal results. It appears that while distance to the centroid might still have some weight, it is much less than in the past. This was confirmed by Carter Maslan in his interview with Eric Enge in the spring, and confirmed in research that we have done. Location Prominence uses a number of factors, including geo references & in-bound links, for the ranking calculation.

That being said, on non-competitive phrases and in markets with very few listings, where Location Prominence cannot be used, Google goes back to distance from the centroid as the primary ranking factor.
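
Read as pseudologic – the weights, the signal threshold and the rough degrees-to-km conversion below are made up for illustration, not anything Google has published – the described behaviour might look roughly like this:

import math

def local_rank_score(business, centroid, min_signals=3):
    # business: {"latlng": (lat, lng), "geo_references": int, "inbound_links": int}
    distance_km = math.dist(business["latlng"], centroid) * 111  # crude deg -> km
    prominence_signals = business["geo_references"] + business["inbound_links"]

    if prominence_signals < min_signals:
        # sparse market / non-competitive phrase: fall back to pure distance
        return -distance_km
    # competitive market: prominence dominates, distance keeps a small weight
    return 0.8 * math.log1p(prominence_signals) - 0.2 * distance_km

bakery = {"latlng": (40.72, -74.00), "geo_references": 12, "inbound_links": 30}
print(local_rank_score(bakery, centroid=(40.71, -74.01)))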

Mike Blumenthal
Understanding Google Maps and Yahoo Local

http://blumenthals.com/blog/

[added link to Mike's blog as it is relevant here]

Don Draper [PersonRank 0]

15 years ago #

If they're using clicks, then it's likely they're looking for a diversity of users providing the clicks so a clickbot from a small group of IP addresses wouldn't make an impact. A better experiment would be to use something like Amazon Mechanical Turk to have a few thousand people do a search and click a result to see if it moves.
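
If that guess about diversity is right, one crude way an engine might discount a small cluster of machines – a purely speculative sketch, assuming IPv4 addresses and a made-up helper name – is to count distinct networks rather than raw clicks:

from ipaddress import ip_address

def distinct_network_clicks(click_ips):
    # Collapse each IPv4 address to its /24 network, so a bot cycling through
    # a handful of machines counts once or twice, while a few thousand
    # Mechanical Turk workers on different home connections still count.
    networks = {int(ip_address(ip)) >> 8 for ip in click_ips}
    return len(networks)

print(distinct_network_clicks(["10.0.0.1", "10.0.0.2", "203.0.113.7"]))  # -> 2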

The flip side is what if Google penalizes a site if they think a click bot is operating? That would be dangerous, because then you could just turn your clickbots loose on your competitors.

Ant Onaf [PersonRank 0]

15 years ago #

I can concur that the second rumor seems to work. I have done it in the past and it had success, although I did have other campaigning going on, so I can't say with full confidence that my improvement in rankings was due entirely to logging into different systems, searching on my keyword, and then clicking my listing.

I had the ability to do this myself back in 2002 when I worked at a company where I traveled. Basically, I traveled daily and worked as an IT consultant, so I had an unlimited arsenal of systems I could log into and get click-happy with. Anyway, I suspect you'd be able to test the same thing without hiring bikers or traveling from station to station – instead, from the comfort of your home, use the many proxy sites available on the web. Google may have a list of IP addresses for the major proxies or heavily used proxy sites, though, so I'd go with the less abused proxy sites – those either just starting out or those buried deep in the search results. They probably get the least traffic and use, which makes things look more natural ... from an IP perspective.

Andy Wong [PersonRank 10]

15 years ago #

The massive-people tactic (the "people flood" tactic) has been working well for the last 10 years, and it is not the algorithm's job to beat such tactics, especially when there is cheap human labour around and it is pretty easy to reorganize the tactic to beat an algorithm change on the server side. I am pretty sure the Google guys are well aware of this; they just need to put such abuse under control.

beussery [PersonRank 10]

15 years ago #

Google Maps rankings are based on relevance, geo distance and other factors. Distance from the center of town can be important.

http://maps.google.com/support/bin/answer.py?answer=7091&topic=13435

barbol [PersonRank 0]

15 years ago #

Google Maps rankings are also related to user comments (number and quality) – and not only those on Google Maps, but also those on other social media and yellow-pages sites.

DoFollow [PersonRank 0]

15 years ago #

The second tactic is not a rumor. It is true and it's being practiced daily.
