Google Blogoscoped

Forum

First Click Free for Web Search

WebSonic.nl [PersonRank 10]

Friday, October 17, 2008
15 years ago · 13,534 views

"While working on our mission to organize the world's information and make it universally accessible and useful, we sometimes run into situations where important content is not publicly available. In order to help users find and access content that may require registration or a subscription, Google offers an option to web and news publishers called "First Click Free." First Click Free has two main goals:..."

http://googlewebmastercentral.blogspot.com/2008/10/first-click-free-for-web-search.html

The above comment was made in the forum before this was blogged.

Ludwik Trammer [PersonRank 10]

15 years ago #

It seems every time I read about Google changing a policy, it's bad news. I have to say – I liked the old Google better...

mak [PersonRank 5]

15 years ago #

I can see this turning into "evil".
Nice thoughts on how this thing may be good/bad, Philipp.

Ziv Levin [PersonRank 0]

15 years ago #

Google is going down the Microsoft path.

They're afraid several start-ups are going to catch up to them – something which seems more and more likely in the past year. So they're leveraging their power (and losing lots of PR along the way) to try and bind current users to the Googlesphere.

Btw, I wonder if Google will urge webmasters to treat Chrome users the same as users coming from the Google SERP?

Eugene Villar [PersonRank 5]

15 years ago #

Excellent analysis of the implications. I don't think the positive gains outweigh the negative ones.

Mary Nicole Hicks [PersonRank 1]

15 years ago #

This is the one thing I hate most about search results. I want a result now, not after I:

Click the first result
See I have to register
Find the Google Cache blocked
Register
Wait 10 minutes for confirmation email
Login
Find it is not what I wanted
Endure Spam at the email I gave

Repeat for the next 5 results...

Can't Google offer me a little search option to say:
Exclude pages requiring registration... Tick.

JohnMu [PersonRank 10]

15 years ago #

@Mary Nicole Hicks: You can report pages like that as spam – you should always be able to see the same page Googlebot saw when you visit it from the search results. Any site that shows a registration page instead of the page indexed is cloaking. For information on reporting spam, please see http://www.google.com/support/webmasters/bin/answer.py?answer=35265 (or use the preferred way: through a Webmaster Tools account).

@Philipp: While I agree that you could look at this in a negative way (some sites are excluding other search engines), I think it's better to make information on these semi-private sites available instead of hiding it away altogether. The sites are free to provide the same content for other search engine users. The only thing we request is that Google users are able to see the content Google indexes.

The alternative for us would be to not include this content. Seeing how – in general – content that some users are willing to pay for is often valuable, I think that making some of it available to all Google users is an important step.

T L Holaday [PersonRank 0]

15 years ago #

As you point out, a page with FCF will rank lower than an open page with the same information because it can have no useful links from human authors. Google is doing the right thing here.

Marcin Sochacki (Wanted) [PersonRank 10]

15 years ago #

I think FCF really is an evil feature; too bad Google bowed to online marketers' wishes and made it official.

Many times I have seen pages in Google's index that were clearly cloaking, even though it is clearly forbidden. I reported those sites via the spam reporting tool, but I have a feeling nobody at Google takes those reports seriously unless they come in masses – i.e., my single report for a given URL will probably not get misbehaving sites delisted.

@JohnMu:
"The sites are free to provide the same content for other search engine users."
Right, but if I wanted to create a new search engine, I would have to make such a deal with every website with an FCF policy. That is a significant barrier to entry, because as a small startup I would probably be ignored by those webmasters. As a result I would simply choose the obvious and lame solution: crawl the web with a Googlebot user agent. Is user-agent spam something Google really wants? E.g. Yahoo may soon start indexing with this UA:

Mozilla/5.0 (compatible; Yahoo! Slurp, like Googlebot; http://help.yahoo.com/help/us/ysearch/slurp)

This madness started with all the browsers including "Mozilla" in their UA strings, even though they are not based on Mozilla code. Then Safari took it to a new level with the "(KHTML, like Gecko)" misnomer. And then Google Chrome went totally overboard by cramming every possible name into its UA, including "Safari" even though it's not related to the Safari browser ("Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.A.B.C Safari/525.13").
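The user-agent masquerading described above is exactly what breaks naive bot sniffing. A minimal sketch (the "like Googlebot" Yahoo UA is Marcin's hypothetical; the checks mimic the lazy substring tests many sites actually use):

```python
# Why naive substring sniffing of User-Agent strings fails once
# everyone claims to be everyone else.
YAHOO_UA = ("Mozilla/5.0 (compatible; Yahoo! Slurp, like Googlebot; "
            "http://help.yahoo.com/help/us/ysearch/slurp)")
CHROME_UA = ("Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) "
             "AppleWebKit/525.13 (KHTML, like Gecko) "
             "Chrome/0.A.B.C Safari/525.13")

def naive_is_googlebot(ua: str) -> bool:
    # The lazy check: just look for the token anywhere in the string.
    return "Googlebot" in ua

def naive_is_safari(ua: str) -> bool:
    return "Safari" in ua

print(naive_is_googlebot(YAHOO_UA))  # True -- a false positive
print(naive_is_safari(CHROME_UA))    # True -- Chrome is not Safari
```

A site granting FCF access on such a check would hand its premium content to anything that claims to be "like Googlebot".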

On the one hand, Google is promoting net neutrality and trying to prevent the DRM-ification of the Internet, but on the other hand the FCF feature is a kind of deal between two parties to increase visitors and revenue while screwing end users and other search engines.

Kevin Skobac [PersonRank 0]

15 years ago #

Haven't they already been doing this with Google News – offering some content that's traditionally behind a paywall, when you pass through a Google News link?

Ionut Alex. Chitu [PersonRank 10]

15 years ago #

I think Philipp overreacted. This move is likely to add new high-quality web sites to Google's index and it will probably impact the results for obscure/specialized queries.

How would you convince a high-quality website that requires paid registration to provide that information for free? Including the web site in Google's results will add more traffic and some new subscribers, while Google users get access to useful information.

Hanan Cohen [PersonRank 7]

15 years ago #

There is a difference between what things smell like and what they actually are.

To me, FCF smells like another attempt to Balkanize the web.

Marcin Sochacki (Wanted) [PersonRank 10]

15 years ago #

@Ionut:
How about the additional high-quality junk in the index?
See one of the comments below the Google blog post:
"""
Having optimized for subscription based companies (Classmates.com and Smartsheet) I have been struggling with providing content for the search engines without compromising what drives revenue.
"""

Clearly, the major incentive behind FCF is driving traffic and revenue, NOT providing high-quality content. Obviously, some websites do provide useful content behind a registration box, but they are a minority. It's all about indexing as many pages as possible, regardless of quality and user frustration.

Seth Finkelstein [PersonRank 10]

15 years ago #

Good points. Manual trackback:

"Google, Wikipedia – seeing RE-INTERMEDIATION in action!"
http://sethf.com/infothought/blog/archives/001396.html

Paul C [PersonRank 1]

15 years ago #

So what happens when I change my HTTP user agent to Googlebot? Does that mean I automatically get into any page on these sites without registering?

Philipp Lenssen [PersonRank 10]

15 years ago #

Paul, for users surfing the web:
I guess depending on your tools you could fake the referrer, making it look like you've just come from www.google.com?q=foobar. The site could make some educated guesses about whether that referrer is real, though – e.g. it could check whether it considers "foobar" a realistic keyword for that landing page. A smart referrer faker could even provide a dynamic string, using e.g. the "feeling lucky" URL mentioned. Of course, even if you solved that problem on your end, you still couldn't safely send such URLs around or post them on the web, because others may not have that workaround in their browser. Also, if people kept surfing with that string enabled, the faking would do its part to distort webmaster statistics for everyone, not just first-click-free sites.
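The referrer trick Philipp sketches is a one-liner in most HTTP tools. An illustration (the article URL and the "foobar" query are made up; the request is only constructed, never sent):

```python
import urllib.parse
import urllib.request

def forged_serp_referer(query: str) -> str:
    # Build a Referer that looks like a Google results page for `query`.
    return "http://www.google.com/search?" + urllib.parse.urlencode({"q": query})

req = urllib.request.Request(
    "http://example.com/premium-article",  # hypothetical FCF page
    headers={"Referer": forged_serp_referer("foobar")},
)
print(req.get_header("Referer"))  # http://www.google.com/search?q=foobar
```

Since the Referer header is entirely under the client's control, nothing stops a browser extension or proxy from attaching one like this to every request.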

For developers writing a crawler:
Google outlines a process through which the crawler can be verified (http://www.google.com/support/webmasters/bin/answer.py?answer=80553), so as far as I understand it, it would take more than just faking the user-agent string to "Googlebot". Now what if someone writes a crawler but fakes the referrer in that crawler, as Marcin indicated? Perhaps that would be a possibility, though a site could probably limit the number of pages crawled that way in a short time, ruling out any realistic chance of crawling.

JohnMu [PersonRank 10]

15 years ago #

I think what you are worried about is that more and more sites will use FCF as an excuse to start charging (or requiring registration) for their mediocre content. I agree, that's something to keep in mind, but realistically speaking, if the content is that bad, it generally won't rank very high anyway, paid or not. Placing content behind a barrier does not make it better; it does not automatically make it rank higher, nor does it attract links (and visitors) the way great, publicly available content does :-) (I'd even argue the reverse is true: it makes it much harder to rank well.)

While I agree that it would be absolutely fantastic if all content were freely available, in practice this is not always possible. One argument I've seen is that an owner says they're not getting a lot of traffic from Google anyway, so why should they make more of their content available for free? We're hoping that once they see more and more Google users interested in their content, they might reconsider and make it available to more users, even those who do not come in from Google.

@Marcin: Make sure that you report spam through your Webmaster Tools account (or create one if you don't have one yet). Those spam reports are all reviewed and acted upon. Sometimes it's not so clear based on the URLs alone, so it's always good to provide enough context in your reports.

@Philipp: FCF needs to be implemented in a way that all users are allowed to access the content, regardless of whether or not the keyword is actually on the page (it might rank for a variety of reasons even if the keyword is not mentioned).

Matt Cutts [PersonRank 10]

15 years ago #

Publishers with premium content have historically not had any sanctioned way to surface that content to Google websearch users. They could cloak, but that put them at risk of having their sites removed--and certainly Google has removed both magazines and newspaper content from our index because of cloaking.

What I like about First Click Free (FCF) is that it provides a way that publishers can surface that content without cloaking (because the page that a user coming from Google sees is the same that Googlebot saw). The user sees the page that Googlebot saw and can get useful information, yet when the user clicks to a new article, the publisher gets a chance to offer a subscription or payment for additional premium content.

I'm glad that John did a post on the official Google webmaster blog last week to lay out the ideas of FCF. That said, I'd be interested to hear other people's thoughts on ways to surface premium content to users without publishers doing things like cloaking. We're always open to hearing ways where we can improve.

Roger Browne [PersonRank 10]

15 years ago #

JohnMu wrote:
> The only thing we request is that Google users are able to
> see the content Google indexes.

It can't work that way, because not all browsers are configured to send the Referer header, so not all Google users will see the content Google indexes.

But I guess most sites will handle it like Britannica did: they will give the first click free to users arriving from ANY URL other than their own site. This would give Google no special advantage.
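Roger's "any external referrer" variant is simple to state in code. A sketch (the site hostname is hypothetical; whether a missing Referer gets the free click is a policy choice, here answered with no):

```python
from urllib.parse import urlparse

OWN_HOST = "www.my-subscription-site.example"  # hypothetical publisher

def first_click_is_free(referer):
    # Grant the free click whenever the visitor arrives from any
    # host other than our own -- Google gets no special treatment.
    if not referer:
        return False
    return (urlparse(referer).hostname or "") != OWN_HOST

print(first_click_is_free("http://www.google.com/search?q=foo"))        # True
print(first_click_is_free("http://www.my-subscription-site.example/"))  # False
print(first_click_is_free(None))                                        # False
```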

Philipp Lenssen [PersonRank 10]

15 years ago #

Matt:
> What I like about First Click Free (FCF) is that it provides a
> way that publishers can surface that content without cloaking
> (because the page that a user coming from Google sees is
> the same that Googlebot saw).

Google cloaking guidelines:
> Cloaking refers to the practice of presenting different content
> or URLs to users and search engines.

Does this mean when Google talks about "users" they just mean "Google users"?

Matt Cutts [PersonRank 10]

15 years ago #

Philipp, that section is intended to give a general idea of what cloaking is, and I think it does convey that idea. To determine cloaking for Google's purposes, we look at the difference between what Googlebot saw and what our users saw.

I think we have to define cloaking in terms of Google and its users, and it's not our place to define cloaking in terms of a site's behavior toward Slurp, MSNbot, Twiceler, or the interaction of those bots with Yahoo, MSN, Cuil, etc. Those search engines will have their own policies on what they consider cloaking and are willing to take action on in their search engines.

Now here's my reply question. :) Google has given its most recent thoughts on the policies to allow publishers to surface premium content while still providing a good user experience. Have you contacted other search engines regarding their policies on cloaking or First Click Free, and if so--what have other search engines said about their policies?

Philipp Lenssen [PersonRank 10]

15 years ago #

> I think we have to define cloaking in terms of Google and its
> users, and it's not our place to define cloaking in terms of a
> site's behavior toward Slurp, MSNbot, Twiceler...

.... blogs, links sent in emails, links sent in chat, news articles linking, forum comments, new crawlers starting out today, ...

Which all confirms that – should "FCF" ever take off – Googlebot will indeed start to see a very different web from much of the rest of us... a distortion, for better or worse. And from the arguments I'm seeing here, I wonder whether Google is OK with starting an initiative that may affect the web at large while arguing it's out of their scope to care what happens with other sites. I also wonder what Larry and Sergey would say to this initiative were they starting to write a web crawler today. What's your guess, Matt?

> Have you contacted other search engines regarding
> their policies on cloaking or First Click Free, and if
> so--what have other search engines said about
> their policies?

The Yahoo guidelines say "Some, but not all, examples of the more common types of content that Yahoo! does not want include ... Pages that give the search engine different content than what the end user sees (cloaking)". I don't see any detailed guidelines in regards to cloaking at MSN (http://help.live.com/help.aspx?mkt=en-US&project=wl_webmasters). I can ask them, though I don't think they recently announced a policy change at their blogs like Google did?

Steve [PersonRank 0]

15 years ago #

I'd like to introduce the concept of "Steve Free Click First". To use Steve FCF, you have to allow full access to Steve (and his friends), but you can make everyone else register if you want. I don't provide any code to make this work, and I don't actually provide any benefit, but it would be rather jolly if you did it.

Using my system (which is totally different to Google's, naturally), my users will be able to search "site:www.my-restricted-content-site.com" and open all documents in a new tab to get all FCF stuff free with no registration / payment / whatever.

Now, who wants to rush to implement Steve FCF on their registration site?

(Any similarity to Google FCF is coincidental)

Why would someone want to give Google customers full access to documents that they had already decided were for subscription / registration only?

The business model still requires people to sign up to make it work, and it seems to be based on the idea that people won't think of just using Google to navigate registration websites instead of the website's own navigation system.

Well done Google, you just convinced some people to off-site their own website navigation, and give their valuable registration-only content away for free.

Matt Cutts [PersonRank 10]

15 years ago #

"I don't think they recently announced a policy change at their blogs like Google did?"

We recently posted, but this wasn't a policy change. Regarding the idea that Google defines cloaking between Googlebot vs. our users, I believe that definition goes back to GoogleGuy days. I'm in a meeting, but I'll see if I can find where GoogleGuy said that.

As far as FCF being usable for web search, we've given that guidance since 2007 to many publishers that asked and I remember highlighting that point specifically at SMX Advanced this past June. Here's a comment where I said the same thing: http://www.seroundtable.com/archives/017331.html#comment-958767

Colin Colehour [PersonRank 10]

15 years ago #

I think FCF is a great thing for users. It opens up more content to at least the Googlebot crawler – content it never saw before because of registration requirements or blocking forms.

This sounds pretty similar to what we've already seen in Google News Archive. Content is crawled but it might be registration restricted or even require payment before you can see the full content.

What I would love to see is a way for all search engines to do things like this. Find a way to make this an open standard, like Sitemaps, and open it up for more competition.

Affan Laghari [PersonRank 1]

15 years ago #

My two cents:

1) If I have a website that requires registration to view my content, FCF means I can set one (or more) of my pages to get indexed by Google, so that these pages will show in the SERPs and people coming from Google will be able to view them. But what EXCLUSIVE benefit do I get by showing that content to Google's users BUT hiding it from Yahoo, MSN, and other engines' users? I mean, if I decide to open one of my pages to the outer world, why shouldn't I open it to users of ALL SEARCH ENGINES rather than just Google's users? If I open that FCF page to other engines too, I have greater chances of getting more traffic, as the page will get indexed in all search engines rather than just Google. I need the visitors I am targeting; I WON'T care whether they come from Google, Yahoo, MSN, Wikia, or Hakia. So why the special VIP treatment for Google?

2) How about rankings for this FCF content? Incoming links are one of the most important ranking factors, and these links come from websites and blogs, not from Google's SERPs.

So if I link to an FCF page, I would be giving my users a bad experience. And if I know that, then I won't link to that FCF page, at least not without something like the foobar referrer Philipp mentioned. And ethically speaking, FCF pages should show that they are FCF pages so people won't Digg them, Stumble them, link to them, etc., only to give other netizens a bad experience.

OK. So no links means no anchor text exclusively for that page. Which means if you want to evaluate the quality of that page, you have to rely on on-page factors – which brings back the same problems PageRank came to solve.
OK, so maybe I am not that smart, and you will use links (and off-page factors) to the domain in general, evaluating the page based on the trust/authority of that domain. That leaves users as the losers.
It means authority sites can make some of their content subscription-based, create a couple of FCF pages that suck, and get them ranked anyway because they have used all the on-page optimization factors well and have a highly trusted authority domain.

Bob [PersonRank 0]

15 years ago #

Suckage begins

Philipp Lenssen [PersonRank 10]

15 years ago #

Matt, I have a question... per Google guidelines, are webmasters (who want to be indexed in Google) allowed to provide any content they want as the "FCF fallback for non-Google users", or does it have to be a registration/payment page? E.g. would one be allowed, per Google guidelines, to create a site praising Obama which, when someone sends the link around after finding it in Google, displays McCain praise to the non-Google visitor... or adult content, or triggers a software download, or displays propaganda, or a campaign against Google, and what-not (and would anything change if that page also included a registration box somewhere which would retrieve the original content)? In the model I'm describing, Googlebot and "Google users" would all see the same thing.

Matt Cutts [PersonRank 10]

15 years ago #

Philipp, my quick answer would be that we haven't seen this happen in practice (radically different content shown to a user typing in a url directly vs. visiting a url from Google). If it became a problem (in terms of deceptive/abusive/malicious/spammy content) and users were angry and complaining, then we might consider looking at it, but our primary concern is that a user clicking on a search result should see what Googlebot saw. FCF provides a mechanism to allow that while also giving publishers a chance to surface content that users would find helpful but that would normally require registration; in that sense, it exposes more content that would normally be locked behind walls to users.

Affan, I do think your point #1 is interesting; if a publisher does this for Google, why not do it for every search engine? I'd hope that they would, which is why I asked Philipp if he'd asked any other search engines whether first-click-free would be fine under their policies.

Matt Cutts [PersonRank 10]

15 years ago #

And if any other search engine wants to weigh in with their thoughts on how to surface premium content while staying within search engines' quality guidelines, I'd be interested to know their thoughts.

mbegin [PersonRank 10]

15 years ago #

Why does "Premium Content" need to be indexed in Google? With all the "normal webpages" that Google indexes, there must be another "free content page" that could answer the user's query...

And if not, too bad – The premium content owner should find their own way to "market" their content – Google shouldn't worry about it.

Matt Cutts [PersonRank 10]

15 years ago #

'Why does "Premium Content" need to be indexed in Google?'

mbegin, I think if a publisher doesn't want to surface that content, that's their choice and I absolutely support that. But if there's really useful content that is currently part of the "invisible web" or "deep web" that a publisher does want to surface, we wanted to describe how to do that without violating our guidelines (e.g. cloaking, which as I've mentioned we have removed large websites for doing).

mbegin [PersonRank 10]

15 years ago #

Matt:
> To determine cloaking for Google's purposes, we look at the
> difference between what Googlebot saw and what our users saw.

I'm a Google user, but what if my friend sends me an email and I click his link, and it turns out it leads to an FCF page? Even though I'm generally a "Google user", I don't get the first click free here!

Or what if I click a link in a blog or somewhere else on the web – Again, I'm a Google user, but no first click free! :(

Even worse, what if I found a great article or website using a Google search, then I bookmark that link or email it to my friend, or blog about it, etc....only to realize later that it was a stupid FCF page.....

mbegin [PersonRank 10]

15 years ago #

If a publisher wants to surface premium content that's currently part of the "invisible web" or "deep web", I would suggest Google AdWords.

I don't think "premium content" really belongs in with the natural search results... It should at least be flagged with a "Premium Content" tag, or FCF tag or something similar to the "Sponsored Links" tag you see with AdWords. When a user clicks the link from the search results, it could take them to an intermediary page that explains it's a first click free page so they don't end up bookmarking it, blogging it, emailing the link, etc.

With this new "First Click Free for Web Search" policy, premium content providers are essentially "buying" inclusion in the Google search results – not with money, but with "special code" on their websites that allows a Google search user to see what Googlebot sees.

Andy Wong [PersonRank 10]

15 years ago #

Aha, catch you, google, u r becoming evil.

jake [PersonRank 0]

15 years ago #

@Matt Cutts

SpringerLink, IEEE Xplore and other scientific journal sites show links to their papers as PDFs, but when you click on them they are not the actual documents – they are landing pages. These get returned when trying a filetype:PDF search (a poor man's filter for sites with paywalls). Is this behavior acceptable according to Google's cloaking policy?

In addition, keywords displayed in the search results for a given hit are not present in the page returned to a Google user (on IEEE Xplore results, for example). Is this behavior acceptable?

Scientific journals seem to have been getting some kind of special deal from Google for years, letting them do some of the most blatant cloaking with impunity. Will Google disclose which sites are getting 'special exceptions' to this policy?

Robert [PersonRank 0]

15 years ago #

Business idea: run a referer-faking proxy. Call it "All Clicks Free".

The core three lines of code were already written in 2003: http://curl.haxx.se/mail/curlphp-2003-02/0080.html
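The linked snippet sets a forged referrer on a curl handle; the equivalent outbound step for such a proxy might look like this in Python (a sketch: deriving the query from the URL's last path segment is just a plausibility trick, and nothing is actually fetched):

```python
import urllib.parse
import urllib.request

GOOGLE_SERP = "http://www.google.com/search?q="

def all_clicks_free(url: str) -> urllib.request.Request:
    # Derive a plausible query from the target URL, then claim the
    # click came from a Google results page for that query.
    slug = urllib.parse.urlparse(url).path.rsplit("/", 1)[-1]
    query = slug.replace("-", " ") or "news"
    return urllib.request.Request(
        url, headers={"Referer": GOOGLE_SERP + urllib.parse.quote_plus(query)}
    )

req = all_clicks_free("http://example.com/articles/first-click-free")
print(req.get_header("Referer"))
# http://www.google.com/search?q=first+click+free
```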

nilsito [PersonRank 0]

15 years ago #

In my opinion, the only user-friendly implementation of "first click free" would allow the (search engine) user to filter out any results from sites that use "first click free" (or, if you want, "second click non-free").

However, that would require a more sophisticated protocol telling Googlebot whether a site uses this mechanism. Right now, there is no direct way to check that.

[Anonymous] [PersonRank 4]

15 years ago #

Free if you use Google.

Two-tier systems never work in the end.

"Is the content of that URL accessible or not?"

Users need to know... otherwise some webmasters will post links to content they believe is free, only for users to find they have to pay for it. That is free advertising for a paid product.

Affiliate publishers, look out... why should I pay you a commission to link to my eBook when I can get those links for free?

betamoo [PersonRank 0]

15 years ago #

This blog entry is quite misleading because it makes it sound like Google is allowing sites to do this for the first time.

The reality is that sites like the Wall Street Journal have been doing this for a while, and not just for Google. The WSJ allows major search engines and social news sites (like Digg) to click through to content that would otherwise require registration.

Instead, Google is now providing a standard way to do this. What Google is doing is simply saying, "Ok, you're doing this right now and a lot of you are doing it in a stupid way. If you had just asked me, I could've explained a better way..."

The design of the web means that content providers can always determine where the user is originating from. At least this way there can be a standard.

I do agree with nilsito's comment suggesting that Google allow users the option of filtering this content or alerting users to this type of content. And I think Google is trying to do exactly that by making a standard way for sites to do this.

Rodriguez [PersonRank 0]

15 years ago #

"The design of the web means that content providers can always determine where the user is originating from. At least this way there can be a standard."

If I had to sum it up, I'd say that's almost the exact opposite of the design of the web – often developers have to work quite hard to get around its limitations.

Users can provide that information to content providers if they wish, or they can provide false or no information. Since HTTP is stateless, the only information a server can have about previous requests is what the browser (chooses to) tell it.

There really aren't that many ways content providers could go about this – using the HTTP referer is about all they have to go on, so it's not really even a case of Google providing a standard way.

What it is, is Google officially allowing people to cloak website content under certain conditions. Which is what some people have evidently been doing anyway.

Philipp Lenssen [PersonRank 10]

15 years ago #

> This blog entry is quite misleading because it makes
> it sound like Google is allowing sites to do this for
> the first time.

It's possible there were cases of this before – also see last year's WebmasterWorld reports on this blog* – and as the post mentioned, it was already available in Google News, but it was never a move this official and explicit for the organic web**. E.g. Matt above talks about highlighting this point at a search conference (which only some people attend), but that's something different from putting it in an official post on Google's webmaster blog.

*WMW is actually hiding secondary clicks from Google results for me (i.e. the second "first click", after you've already visited another page of theirs from the SERPs), so it's also going against FCF, but I'm hearing from someone else that they can't reproduce this. Might it be location-specific? But I guess it's not too useful checking who cloaks in Google anymore, as a) it turns out the guidelines aren't clear about what cloaking is (because Google doesn't clearly differentiate between "users" and "Google users"), and b) a whole lot of cases can now, since FCF, be interpreted by Google as mere falsely-configured-but-honest implementations of FCF which aren't worth following up on. Also, one might ask: if Google doesn't care about cloaking *when done to non-Google users*, why should they get any help from people who sometimes use Google but are also "non-Google users" a large part of the day?

**In fact, depending on your reading of the Google webmaster guidelines, FCF *still* collides with Google's definition of cloaking which is live on the site. Google says: "Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index." They do not refer to "Google users" and "Google" but "users" and "search engines". FCF however includes showing "users" different things (in FCF, it's only forbidden to show *Google users* something else).

Sanket [PersonRank 0]

15 years ago #

I don't understand why a website would want to open up content that was previously hidden behind a registration/payment wall. Wouldn't that reduce the number of site subscriptions and the revenue? This would not be acceptable to webmasters, in my opinion, unless Google paid them to compensate for their losses.
Also, as you mentioned, it will be a nightmare for people who embed links to make sure they work before publishing content.

Eddie [PersonRank 0]

15 years ago #

Great idea, but unfortunately, as is, a poor implementation.

The main question is: what will keep people from using the "site:" search modifier, or faking the referrer, to see my entire site's premium content for free? It doesn't have to be a perfect solution, but as is, it's just too easy to defeat to be useful.

One idea is to make FCF available only to users logged in to Google. Google would then pass an anonymous reference to the Google user ID to the content provider's site, and the owner of the site could decide how many FCFs to give each user (1, 3, 5, 10, who knows). Google would make a web service available to verify that the user ID being supplied was recently active in a Google search. Of course this locks the user even more into the Googlesphere, and restricts opening up FCF to users from other search engines. Now if everyone were on OpenID, that problem too would be solved.

Using the referrer alone is certainly not protective enough to be useful.
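Eddie's anonymous-reference scheme would need the token to be unforgeable; one standard way is an HMAC signature that the verification service (or the publisher, with a shared key) can check. A purely hypothetical sketch – no such Google service exists:

```python
import hashlib
import hmac

SECRET = b"key-held-by-the-verification-service"  # hypothetical

def mint_token(anon_user_ref: str, timestamp: int) -> str:
    # The search engine signs (opaque user reference, click time).
    msg = f"{anon_user_ref}|{timestamp}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify_token(anon_user_ref: str, timestamp: int, token: str) -> bool:
    # The publisher (or the web service Eddie proposes) re-computes
    # the signature; any tampering with the fields breaks it.
    return hmac.compare_digest(mint_token(anon_user_ref, timestamp), token)

tok = mint_token("user-ref-42", 1224201600)
print(verify_token("user-ref-42", 1224201600, tok))  # True
print(verify_token("user-ref-42", 1224201601, tok))  # False
```

The publisher could then count free clicks per opaque reference without ever learning who the user is – though, as Robert notes below, this effectively puts Google in the business of authenticating the whole web.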

Robert [PersonRank 0]

15 years ago #

> google would then pass an anonymous reference to the google user ID

...which would be protected against faking by what? Do you want Google to handle authentication for the whole web?

Eddie [PersonRank 0]

15 years ago #

> google would make a web service available to verify that the user ID being supplied was recently active in a google search

Eddie [PersonRank 0]

15 years ago #

"web service available to verify that the user ID being supplied" means the "web service available to verify that the anonymous reference to the user ID being supplied"
