Welcome back to another edition.
I'm currently in the middle of writing one of my big Plus reports. I write about 25 of these per year, so there is always about a two-week gap between them. In the meantime, here are three things that came up this week that we need to talk about.
If you follow me on Twitter, you might have seen how I often call out studies where we need to be careful about inferring causation.
The problem is that any study that isn't based on actual quantified data, or on measuring people's behavior, is likely to be influenced by the questions in the study itself.
This is part of a broader problem we have with journalism. Whenever journalists need information for a story, the culture is to go out and ask people "what they think". The problem is that opinions are not data.
One example was a study I came across a few months back (sorry, I didn't save the link) where a newspaper had tried to figure out why people became anti-vaxxers. And not surprisingly, the results came back that it was "because of YouTube".
But just stop and think about this for a moment. How would you ask that question in a neutral way? That's pretty much impossible.
The exchange would have gone something like this: the interviewer asks why the person became an anti-vaxxer, the person hesitates because they don't really know, and the interviewer helpfully suggests, "Was it something you saw on YouTube?"
What's happening here is that people actually don't know, and they are also not comfortable in answering the question to begin with. So, as soon as you give them a way out, they jump on it.
The reality is that the actual data point is "people don't know"; anything beyond that is shaped by the answer options you give people.
This is just one of many such examples.
Another example I came across back in August was a study from Canada about news personalization. It came to this conclusion:
Readers are concerned personalized news keeps them in content silos. 48% of Canadian adults have concerns that personalized news may mean they miss out on certain stories or perspectives.
But here again we have the same problem. By simply asking this question, you are also creating the answer.
Before the study, people were most likely not concerned about this at all because they weren't even thinking about it. They were just going about their day, getting the news from whatever source they used, and not really thinking about it too much.
But then, when they were called by the survey company, and asked: "Are you concerned about missing out on certain stories?" ... people started to think about it, and the question itself introduces a level of doubt, and suddenly people go:
OMG yes... what if I am missing out on something? Yes, I am worried about this.
The study likely created this specific outcome simply because they asked.
Another example of this was a study that recently came out from Pew Research Center about news consumption on social media: "Americans Are Wary of the Role Social Media Sites Play in Delivering the News", which was then followed up by two other stories from Nieman Lab.
You can see that there is a common theme here. However, we see the same problem as before.
For instance, one of the questions asked was whether people felt the news on social channels was 'one-sided'. Again, the way people answered (83% thought it was) is likely to have been influenced simply by being asked the question.
Another example is the question of political bias. In the Pew study they wrote this:
Americans are most concerned about biased and inaccurate news on social media; many say the news content they see leans left.
The problem here is the question itself: by asking it, we force people to 'pick a side', even if there isn't one.
Imagine asking people "Do you think news leans left or right?"
By asking this, you are putting this concept into people's heads, forcing them to think about the news in a political sense, rather than as just news.
In other words, the question itself creates a bias towards thinking that the news is either left- or right-leaning, even if it isn't. But more than that, by doing a study like this, and by reporting it to the public, we further reinforce the idea that news is indeed biased.
It's the same problem as before. If we hadn't pointed this out, would the majority of people even think this way? So, a lot of these studies aren't actually measuring what people think, but are instead measuring what we made them think.
There is one more problem I want to mention.
As you can see from the headlines above, all of these studies focus (primarily) on Facebook, and on how bad it is: how people are wary of it, how it's making them depressed, and so forth.
But the problem with this is that we have a focus bias.
For instance, in the study about people not using Facebook for a week, the researchers came to this conclusion:
Overall, the effects our study finds on news awareness, news consumption, feelings of depression, and daily activities show that Facebook has significant effects on important aspects of life not directly related to building and supporting social networks.
The problem, however, is that I have seen many other studies around news avoidance over the years, some that had nothing to do with Facebook, and they all found the same thing.
In other words, people feel depressed about the news regardless of where they read it. And people who stop reading the news, whether it is via social channels or via newspapers directly, report feeling less stressed.
This means the overall conclusion would be something like this:
Overall, the effects our study finds on news awareness, news consumption, feelings of depression, and daily activities show that news has significant effects on important aspects of life.
See what I mean?
I tell you all this because I come across this problem every single week. In fact, most 'media related' studies have these problems. As I said in the beginning, in the media industry, we have a journalistic culture of focusing our stories on what people think instead of looking at the actual data or measuring actual behavior.
We try to ask people good and neutral questions, but by asking them in the first place, we put these ideas into people's heads.
We are creating a negative feedback loop.
First we ask people a question that makes them think that something is a problem, even if they had never considered it before. Then we report that we found this result, which means even more people think this is the case, then we do another study which now finds that even more think this way, which we then report ... and so on.
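To make this loop concrete, here is a toy simulation. All of the numbers in it are purely hypothetical, chosen only to illustrate the mechanism, not taken from any real study: assume a small share of people start out genuinely concerned, a large share have never thought about the issue, the question itself sways some of the undecided, and each published result converts more of the undecided before the next survey.

```python
# Toy model of the survey feedback loop described above.
# All numbers are hypothetical, chosen only to illustrate the mechanism.

def run_survey(concerned: float, undecided: float) -> float:
    """People who are genuinely concerned say 'yes'; half of the
    undecided, prompted by the question itself, also say 'yes'."""
    return concerned + 0.5 * undecided

def simulate(rounds: int = 4) -> list:
    concerned = 0.10   # 10% genuinely concerned at the start
    undecided = 0.60   # 60% have never thought about it
    results = []
    for _ in range(rounds):
        reported = run_survey(concerned, undecided)
        results.append(round(reported, 3))
        # Reporting the result converts some undecided people
        # into genuinely concerned ones before the next survey.
        converted = 0.2 * undecided
        concerned += converted
        undecided -= converted
    return results

print(simulate())  # reported concern climbs every round
```

Under these made-up assumptions, the reported level of concern rises in every round even though nothing in the real world has changed; only the act of surveying and reporting moves the number.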
The solution to this is to instead focus our efforts on studies that have quantifiable data. For instance, don't ask people whether they think crime is getting worse, look at the actual crime data and report what is actually happening.
And more importantly, don't report on it. If you report that "most people think crime is getting worse", you are reinforcing that belief ... even if you also fact-check it.
We see this problem with so many things in our society.
BTW: This also links to my Plus article: "How Editorial Analytics can Help you Define your Editorial Strategy", where I talk about using concepts like this to drive the editorial focus. So if you haven't read that already, please do!
As mentioned, there are two more things I quickly want to cover.
The Drum recently reported on a new ruling from the Court of Justice of the European Union about how to interpret GDPR consent dialogs.
The ruling is this:
The court decides that the consent which a website user must give to the storage of and access to cookies on his or her equipment is not validly constituted by way of a pre-checked checkbox which that user must deselect to refuse his or her consent.
That decision is unaffected by whether or not the information stored or accessed on the user's equipment is personal data. EU law aims to protect the user from any interference with his or her private life, in particular, from the risk that hidden identifiers and other similar devices enter those users' terminal equipment without their knowledge.
Essentially, what this means is that almost all GDPR consent dialog boxes used by publishers today are illegal. Even the new GDPR consent dialog boxes from the IAB aren't legal because, fundamentally, they are based on the same thing.
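The principle behind the ruling can be boiled down to a simple rule: a checkbox state only counts as consent if the user actively put it in that state. Here is a minimal sketch of that logic (a hypothetical illustration of the principle, not legal advice and not any real consent framework's API):

```python
# Hypothetical sketch of the principle behind the CJEU ruling:
# consent is valid only if the user actively opted in. A pre-checked
# box that the user merely failed to untick is not valid consent.

from dataclasses import dataclass

@dataclass
class ConsentEvent:
    checked: bool      # final state of the checkbox
    user_acted: bool   # did the user actively tick it themselves?

def is_valid_consent(event: ConsentEvent) -> bool:
    # Both conditions must hold: the box is checked AND the user
    # checked it. A pre-checked default fails the second condition.
    return event.checked and event.user_acted

# A pre-checked box the user never touched: not valid consent.
print(is_valid_consent(ConsentEvent(checked=True, user_acted=False)))  # False
# An unchecked-by-default box the user actively ticked: valid.
print(is_valid_consent(ConsentEvent(checked=True, user_acted=True)))   # True
```

Seen this way, it's obvious why a pre-checked box fails: the final state of the checkbox tells you nothing about what the user actually chose.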
We have talked about this many times before. But, if you are a publisher, you really need to change your business mindset around this.
We need a new system. I don't just mean a slightly redesigned dialog box. I mean, we need to rethink the whole "let's send data to 200 third parties to show ads"-model.
Speaking of the EU, there is one last thing I want to mention. As you know, the EU copyright law is not going the way EU publishers had planned, because Google has flatly refused to pay for snippets and has instead told publishers that it will only show plain links to their content in the future.
I was asked to comment on this, so I wrote a thread about it on Twitter (which ended up getting more than 180,000 views).
There is also a slightly easier to read version here: The thing about the EU copyright law and Google refusing to pay
"Thomas Baekdal is one of Scandinavia's most sought-after experts in the digitization of media companies. He has made himself known for his analysis of how digitization has changed the way we consume media."
Swedish business magazine, Resumé