

By Thomas Baekdal - August 2010

We Can Save Newspapers by Destroying The Web

The crap the old media people put out is simply staggering, especially the constant barrage from people trying to save the newspapers. The latest is this version of "We need to change copyright laws to save newspapers."

First, Eric Clemons names the enemy: the news aggregators. You know, Google!

Using aggregators like Google and others, I can access essentially in real time the lead paragraphs of almost any story ...Not surprisingly, traditional print media publications are dying, and not surprisingly their owners' online dotcom alternatives are generating far too little revenue to pick up the slack; why pay for any content when the essence of everything is available immediately, and free, elsewhere.

Why would people indeed!

But the monstrous failure here is not the news aggregators. The monstrous failure is that he doesn't realize the real enemy is "everything is available immediately, and free, elsewhere." He is confusing content aggregation with link aggregation.

This is business 101. If two companies make the same product, you get a pricing war. If, as in the case of newspapers, several thousand newspapers make nearly identical stories, the pricing war drives the price to zero.

The link aggregators provide free exposure for your stories! If you fail to sell a product, even when you get massive exposure through tons of channels, then you need to rethink the product you make.

I simply do not understand why the newspaper people don't get this. This is not rocket science, or even complicated math. It's purely common sense.

Eric Clemons then goes on to suggest that the solution is a tightening of copyright law:

A first suggestion would be to provide newspaper and other journalistic content special protection, so that no part of any story from any daily periodical could be reposted in an online aggregator, or used online for any use other than commentary on the article, for 24 hours; similarly, no part of any story from any weekly publication could be reposted in an online aggregator or for any use purpose other than commentary, for one week.

Followed up by the most misguided conclusion I have heard yet:

However, the net is a pretty robust institution by now, and if we were suddenly not able to access articles from the Post (Washington or New York) until they were 24 hours old the net would, indeed, survive.

What? ...I ......what???

The net would indeed survive if you blocked content from appearing for 24 hours? The net would survive if people could not point readers to a new story for 24 hours? The net would survive if we could not share a story for 24 hours?

What kind of silly parallel universe does he come from?

This would destroy any form of viral sharing of stories.

Here, new articles are usually shared by a relatively small group of core readers. They then provide the exposure needed to get an article moving. After a few days, many people are sharing the articles, but it all starts with the core readers.

If the core group of readers weren't allowed to share stories for 24 hours, then they wouldn't share any story at all. Nobody goes back to a site to share something they found yesterday. And with no initial sharing, there is no follow-up by a much larger group of readers.

The articles never grow!

You know, let's do it their way

I have a suggestion. Let's agree to do it their way. Let's invent a new meta tag that defines from what point in time content may be shared. Something like:
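For instance, a purely hypothetical tag along these lines (the `sharing` name, its attributes, and the date are all invented for illustration; this is not an existing standard):

```html
<!-- Hypothetical example: no reuse of the content,
     but sharing/linking is allowed from the given date -->
<meta name="sharing" content="reuse=none; share-from=2010-08-24" />
```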

This says that the content may not be reused in any way, but it may be shared on e.g. Twitter from a certain date.

And let's put it into copyright law that all sites, news aggregators, and social services must adhere to it.

Then these silly newspapers can dig their own grave by shutting themselves off completely from the internet.

Because while they are using the new meta tags to block readers from finding their articles, you and I - who actually know how the internet works - can use the same meta tags to our advantage.
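For instance, a purely hypothetical tag along these lines (the name and attributes are invented for illustration):

```html
<!-- Hypothetical example: everything is allowed except full republishing,
     and quoting requires the quoting site to adopt the same license -->
<meta name="sharing" content="republish=none; quote=share-alike" />
```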

This says that while you may not republish the article in full, it may be used in all other respects. But you may not quote the content unless the destination site also applies the same licensing rules to the target article.

This would mean that newspapers who block their content from the internet cannot use content from e.g. blogs. Newspapers would have to write all their articles themselves, without taking quotes from other content sources!

This would give us a serious competitive advantage. The sites that allow sharing and aggregation can take advantage of the full power of the internet, while the old newspapers who want to control consumption can do so, but at the cost of not being able to use other content sources in their articles.

They can't have both strict control over their articles and the freedom to use other people's content to make them.

We could even have a little fun with the old newspapers by doing this:
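For instance, a purely hypothetical tag along these lines (the name, attributes, and date are invented for illustration):

```html
<!-- Hypothetical example: quoting requires adopting the same rules,
     until a date after which everything is allowed -->
<meta name="sharing" content="quote=share-alike; free-from=2010-09-23" />
```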

This means that you can quote content from the article *only* if you adopt the same rules. But it is fully allowed from September 23rd (a month later).

Imagine if Google did this on their blog. Then Techcrunch, Mashable, you, and I could write about it immediately, and include quotes provided by Google to enhance our articles.

But newspapers, with a more restrictive license, would not be allowed to write about it until the next month. They could link to it (and give Google free exposure), but not write articles to be sold behind their paywalls.

Let's see how their so-called net would "indeed survive" then.

The bottom line is...

The average newspaper's product sucks. It is massive duplication of available content: redrafted press releases, republished stories, and quotes from other sources. And even when they produce content on their own, what they are really doing is tapping into the world of citizen reporting.

Not all newspapers are doing this. Many news sources have changed a lot in recent years, and are focusing more on creating truly unique content.

News aggregators aren't a threat to unique content; they help us get massive exposure and sharing. They even help us understand the popularity of an article better, providing us with a vital tool for improving our next stories.

News aggregators are only a threat to the newspapers who have nothing to tell, who are merely making noise. If those newspapers want protection from being included in our social news streams, then that's fine by me.

That just means that I will get less noise and more substance. Hurrah for that!

Of course, I hope we do not have to go this far. It's a bit extreme to pass a new copyright law to show the old media that it would just kill them faster...



