Welcome back to the Baekdal Plus newsletter. Today, I have two main topics for you. The first one is the latest Plus report about personalization. And then we need to talk about polarization, trust in science, misleading studies, and how we cover climate change.
The way most publishers think about personalization is like a filter. You write 100 articles, and then you try to figure out what a specific person is interested in by looking at what they have read before, and then you give them maybe 10 out of your 100 articles that fit their interests.
This is also how personalization works on social channels. Here you might be following 1,000 people, but you can't look at 1,000 posts all at once, so the social networks use their algorithms to pick out the ones they think you will engage with the most.
But, if you think about this for a second, you realize that this is a really inefficient form of personalization. You wrote 100 articles, but only 10 are used (for each person). What about the other 90 articles? Working like this makes no sense. Why would you not try to make all your articles personally relevant to people?
In my latest Plus article, I talk about personalization for publishers, and specifically, that real personalization is not a filter, but a focus. So, take a look at: "Most publishers get personalization wrong. So what is it?"
I want to tell you a story about how reporting about a survey can end up polarizing the public, and why that also impacts our climate coverage.
The story starts with this graph, which I came across because several of my friends in the media were retweeting it.
And when I saw this, I retweeted it as well (sadly), talking about the polarization that seemed to be evident in it.
However, it didn't take long for some people to point out its flaws. One of them was Chris Bail, Duke professor and Director of the Polarization Lab. And they were right. The above graph is really misleading. Not only that, it also cherry-picks the metrics in a way that drives polarization of the public.
And you can see this on Twitter. If you look up how people are reacting to this graph, the reactions are all exceptionally polarizing.
So, what is the problem with this graph? Well, quite a lot actually, but I will focus on just four things.
The first problem is simply the Y-axis. FiveThirtyEight chose a truncated Y-axis that made it look as if Republicans had almost zero trust in science, while Democrats had almost all of it. But if you look at the numbers, the two groups are only about 30 percentage points apart.
So the design of this graph created a far more shocking image than reality supports.
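To make the axis problem concrete, here is a minimal sketch in Python (using matplotlib, with made-up placeholder values roughly 30 points apart, not FiveThirtyEight's actual data) showing how the very same two numbers read on a truncated axis versus a full 0-100% axis:

```python
# A minimal sketch, not FiveThirtyEight's code or data: the two values are
# illustrative placeholders, roughly 30 percentage points apart.
import matplotlib.pyplot as plt

groups = ["Democrats", "Republicans"]
share_great_deal = [64, 34]  # hypothetical shares answering "a great deal"

fig, (ax_truncated, ax_full) = plt.subplots(1, 2, figsize=(8, 4))

# Left: the axis starts just below the lowest value, which visually
# exaggerates the gap into an almost all-or-nothing divide.
ax_truncated.bar(groups, share_great_deal)
ax_truncated.set_ylim(30, 70)
ax_truncated.set_title("Truncated axis (30-70%)")

# Right: the axis runs the full 0-100%, so the same 30-point gap is
# shown in proportion to the whole scale.
ax_full.bar(groups, share_great_deal)
ax_full.set_ylim(0, 100)
ax_full.set_title("Full axis (0-100%)")

plt.tight_layout()
plt.show()
```

Same numbers, very different emotional impact.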
The second problem is the question itself. In the graph above, you are merely told that it is the "share of Americans who said they had 'a great deal' of confidence in the scientific community". What it doesn't tell you is that this isn't actually what the survey asked.
Instead, the actual survey gave people four options:
So what does this graph actually look like if we map out the full survey response? Well, I took a look at the raw numbers and this is the result:
This is fascinating because it gives us a completely different picture than the graph above. As you can see here, most people across the political spectrum in the US do believe in science. The vast majority either have "a great deal" of confidence in it or at least "only some".
The share of people who actually say they don't believe in science is tiny. Yes, there has been a polarizing shift within the past year, and yes, the share of anti-science Republicans has gone up (now 13% of voters). But it's far from the picture you see in the first graph.
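If you want to do this yourself, the idea is simply to tabulate every answer option per group instead of charting only the "a great deal" share. A rough sketch of that (with hypothetical column names and rows, not the actual survey data) could look like this:

```python
# A rough sketch of tabulating the full response distribution per group.
# The column names and rows here are hypothetical, not the actual GSS data.
import pandas as pd

responses = pd.DataFrame({
    "party":      ["Democrat", "Democrat", "Republican", "Republican", "Independent"],
    "confidence": ["A great deal", "Only some", "Only some", "Hardly any", "A great deal"],
})

# Share of each answer option within each party (each row sums to 100%).
full_picture = (
    pd.crosstab(responses["party"], responses["confidence"], normalize="index") * 100
)
print(full_picture.round(1))
```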
But here is a question. What if we ignore the politics and instead just look at other demographic breakdowns? For instance, what does it look like when we break it down by age?
The answer is this:
As you can see, generally speaking, there is really not that big of a problem with acceptance of science in the US. The vast majority believe in it across all age groups.
We see the same thing when we compare men and women. It's very similar, although interestingly, women are slightly more skeptical than men.
But this leads us to the third problem.
This was pointed out by Chris Bail in his tweet. There is a huge problem with how the answer options are weighted.
To illustrate why, let's ask the question in a different way. Imagine that we instead used a scale from 1 to 10. On this scale, where would you put "a great deal"?
Well, it's pretty high up, right? I mean, a great deal is not 5 or 6 ... so we would probably put it above 8. Similarly, where would you put "hardly any"? Well, hardly any means almost zero confidence, so we would put that somewhere near zero.
The result is this:
As you can see, this is a bit of a problem, because now we have this gigantic void in the center defined as "only some", and so much nuance is lost in it. What does "only some" actually mean? Does it mean people trust science only a little, or quite a lot?
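Just to make the weighting issue tangible, here is a tiny sketch (the cut-off points are my own rough guesses, not part of the survey) of how much of a 1-10 confidence scale each label has to absorb:

```python
# A tiny illustration of the weighting problem. The cut-offs are rough,
# hypothetical guesses about where each label sits on a 1-10 scale.
label_ranges = {
    "Hardly any":   (1, 2),   # almost zero confidence
    "Only some":    (3, 8),   # a huge, undifferentiated middle
    "A great deal": (9, 10),  # near-total confidence
}

for label, (low, high) in label_ranges.items():
    width = high - low + 1
    print(f"{label:>12}: covers {low}-{high} ({width} of 10 points)")
```

The middle option swallows more than half the scale, which is exactly the nuance the survey can't capture.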
And this then leads us to the fourth problem.
The fourth problem is that they didn't actually ask people whether they believed in science. Instead, the question was whether the public had confidence in the "scientific community".
In theory, that should be the same thing, but in reality, it's nothing of the kind.
We have seen this first-hand during the pandemic. A simple question like "Should you wear a mask?" was reported to the public as a wildly contested topic on which scientists held completely different views.
And when you combine this with how the whole thing was also presented as a party-political division, it's not surprising that, for a small part of the public, belief in the scientific community has become a reflection of party politics.
But this isn't science. This is something else. This is the political game, and this is what is important to remember.
So, think about what just happened here. I started this article with a graph from FiveThirtyEight, which made you think that Republican voters had given up on science, while the Democrats had become science super-fans ... but now, just two minutes later, you see a picture that is nowhere near as polarizing or outrage-fueled.
This is a problem ... and more so, it's a media problem.
It's a media problem because of two things. First of all, it's a presentation problem. When FiveThirtyEight created that graph, they did what many newspapers do, which is to focus on the data point that looks like the biggest problem. But presented this way, it creates even more polarization.
We see this so often, not just with this graph, but with so many stories. I was reminded of a similar story where someone had done a study about immigrants and found that a huge number had been imprisoned. Many newspapers just 'ran with it'. They saw a scary number, and that became the story.
And Rasmus Kleis Nielsen, from the Reuters Institute, tweeted another example just the other day. We see this every day. It's part of the culture of how things have always been reported.
The problem with this is that we know this fuels more polarization, and so instead of helping society heal and find a better way, we are actively part of the problem that is driving it apart.
And mind you, in most of these cases, we (the press) are not trying to be misleading or polarizing. Even in the case of the FiveThirtyEight graph above, I don't think they intended it to be misleading. They just tried to simplify one data element of a much bigger study.
But it's these unintentional actions that we know cause harm.
The second problem, however, is about the journalistic focus. We have to ask: why is science polarizing (as in this example)? The reason is not the science itself. It is that every time the public comes into contact with it, it's always part of some party-political framing. It's that association with politics that causes the problems.
If we know that certain topics are actively being polarized in a way that drives people apart because of their association with politics, then, as the press, we really need to think about whether there is another way of covering them.
A better way!
I want to illustrate this in relation to climate change. As we all know, climate change is going to be the biggest societal change (both positive and negative) over the next couple of decades. So many things have to change, and so many elements have to be innovated. And, because it is so important and impacts every one of us, it's absolutely vital that we keep the public involved.
This is not someone else's problem. This is our problem and one that we must solve together on every level as a society.
So the worst thing we can possibly do is to focus our climate coverage on the drama and intrigue of party-political games, where the political focus polarizes the public.
Right? That would be a disaster for any form of effective climate action. In fact, if we were to do that, we would end up with this:
This is what a politically polarized debate looks like. But climate change doesn't care about political sides. It's not like the price of a gallon of petrol is higher for Democratic voters than it is for Republican voters. And rising food prices impact voters on both sides equally.
So what I see here is a disconnect. This topic no longer follows the reality for the public, and the press needs to pay attention to this.
But today, we are not doing this. Instead, we are actually fueling this problem with our coverage.
Take a look, for instance, at this screenshot from the climate section of one of the largest newspapers in my country. I have highlighted the articles that focus on 'the government' (which turns out to be all of them).
This is astonishing because it illustrates very clearly just how blinded we in the press are by politics. This newspaper proclaims itself to be the 'leading climate newspaper' ... but it isn't actually a climate newspaper at all. It's still just a politically focused newspaper.
Mind you, in my country we don't have the same levels of polarization as in the US (yet), but this type of reporting tells the public that climate change is entirely a political thing, and that whatever action is being taken is either good or bad depending on whether you like the government or the opposition.
This is not a good way to involve the public. This will drive up polarization, and it will make it far more difficult to talk about climate change from the perspective of science and innovation.
This is a real problem. If we keep turning every scientific discussion into a party-political game, we make it so much harder for the public to trust in science.
This is a journalistic approach that we need to rethink.
If you want a deeper look into how we need to rethink climate coverage, take a look at my Plus article:
Also, remember that while this newsletter is free for anyone to read, it's paid for by my subscribers to Baekdal Plus. So if you want to support this type of analysis and advice, subscribe to Baekdal Plus, which will also give you access to all my Plus reports (more than 300), and all the new ones (about 25 reports per year).