
Google’s concession to Murdoch unlikely to be enough

by Scott Bicheno on 2 December 2009, 10:08

Tags: Google (NASDAQ:GOOG)

Quick Link: HEXUS.net/qau5s


Token effort

Following recent reports that media mogul Rupert Murdoch is considering making all News International content available exclusively on Microsoft's Bing search engine, in exchange for a nice lot of cash, Google has announced changes to its controversial First Click Free programme, which allows people to access premium content for free.

In a blog post, Josh Cohen, senior business product manager at Google News, started by stressing that Google has strict policies about 'cloaking' - the practice of showing different versions of a web page to the indexing software and to the end-user. "We do this so that users aren't deceived into clicking through to a site that's not what they were expecting," he said.

This creates issues when it comes to premium - i.e. paid-for - content. For it to be indexed, Google needs full access to it. So Google developed a programme called First Click Free, under which Google's crawler, and thus end-users arriving from a Google search, can access a full piece of premium content for free, but can't then navigate further through the premium site in question without paying.

The problem with this is that much news content is reached through various Google searches, so a high proportion of stories on premium sites like wsj.com is currently being read for free, undermining the premium business model.

So Google has now altered First Click Free to give publishers the ability to limit a single user to five free pieces of content per day.
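As a rough illustration of the mechanics - purely a hypothetical sketch assuming a referrer check plus a per-user daily counter, not Google's or any publisher's actual code - a paywalled site enforcing the new five-a-day allowance might do something like this:

from datetime import date

DAILY_FREE_LIMIT = 5  # the cap the revised First Click Free programme allows

# per-user counters: user_id -> (day, number of free articles served that day)
_free_views = {}

def should_serve_full_article(user_id, referrer):
    """Decide whether this visitor sees the full article without paying."""
    if "google." not in referrer.lower():
        return False  # not a Google-referred click: show the paywall instead

    today = date.today()
    last_day, count = _free_views.get(user_id, (today, 0))
    if last_day != today:
        count = 0  # new day, reset the allowance

    if count >= DAILY_FREE_LIMIT:
        return False  # free allowance used up for today

    _free_views[user_id] = (today, count + 1)
    return True

On the sixth Google-referred click in a day the check falls through to the paywall, which is exactly the behaviour the change is intended to enable.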

We think this is unlikely to appease the likes of Murdoch. A big driver of demand for sites like wsj.com is exclusives. In fact, for many consumers, the only reason to pay for something, especially online, is if it's unique and they can't access it any other way. If we can still read five scoops per day for free, it's unlikely many people will feel compelled to pay for the sixth.

 



HEXUS Forums :: 4 Comments

Murdoch can keep his scoops!
If Murdoch doesn't want his sites shown on Google, there's a very simple method called putting a robots.txt file in your root directory and disallowing whichever areas of your site you want. Google's webcrawlers will respect the robots.txt disallows and leave Murdoch's sites alone.
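For example, a minimal robots.txt along those lines might look like this (the blanket Disallow is just an illustration, not the real layout of any News Corp site):

User-agent: Googlebot
Disallow: /

That tells Google's crawler to stay away from the whole site; swap the blanket Disallow for specific paths (e.g. Disallow: /premium/) to keep only parts of the site out of the index.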

However, since he's too stupid to do that, why doesn't Google just help Murdoch out by blocking all News Corp sites from being indexed at all? They could even go a step further and ban all News Corp IPs from accessing any of Google's services.
latrosicarius wrote:
If Murdoch doesn't want his sites shown on Google, there's a very simple method called putting a robots.txt file in your root directory and disallowing whichever areas of your site you want. Google's webcrawlers will respect the robots.txt disallows and leave Murdoch's sites alone.

However, since he's too stupid to do that, why doesn't Google just help Murdoch out by blocking all News Corp sites from being indexed at all? They could even go a step further and ban all News Corp IPs from accessing any of Google's services.


Actually, he's not too stupid to do that; that's the whole point of the article. News Corp has been talking about delisting from Google, which is more or less exactly what you're suggesting in the first part.
reiella wrote:
Actually, he's not too stupid to do that; that's the whole point of the article. News Corp has been talking about delisting from Google, which is more or less exactly what you're suggesting in the first part.

I know he wants to de-list his sites…

The point was he's too slow, so he needs a bit of help, such as Google banning him.