

By Thomas Baekdal - November 2019

The thing about Facebook and political advertising

As you have probably heard (if you are working in the media), there has been a lot of noise around political advertising on social media channels. It started when Mark Zuckerberg from Facebook (in case you didn't know ;)) said that they were not going to take down political ads that were misleading, not even if they clearly promoted lies.

This created a lot of angry debate, and justly so. Regardless of what you think about free speech, the simple fact that Zuckerberg went out and told politicians "Hey, you are free to lie on Facebook!" is just about the most idiotic thing he could have done. It's an open invitation for bad actors to use Facebook for the worst kind of political advertising possible.

What was funny though, was that after Facebook said this, some people decided to test it by creating a political ad with an obvious lie ... and then Facebook took it down.

This led to mocking tweets from people like Jeet Heer:

Yeah, it's a total mess!

And then Jack Dorsey, Twitter's CEO, came out and announced that Twitter would ban all political ads. Not just the lying ones, but all of them! ... This gave Jack a lot of free press, and suddenly we had this 'fight' between Twitter and Facebook.

Now, I'm not impressed by what Jack is doing. Remember, it was only a few months back that Twitter said they wouldn't take down bad tweets (like the ones containing hate speech) from political leaders. So, it feels like what Jack did was mostly just a PR stunt.

I could be wrong. Maybe Jack has seen the light, and maybe this is the first step to a better Twitter. But I need to see more positive steps from Twitter before I believe that.

But there is a much bigger debate here that we need to talk about. And I want to take a step back and talk about the larger perspective, and whether we are actually solving the problem ... because I don't think we are.

Three ways to talk about this

When it comes to things like this, there are three ways we can think about it.

Let's quickly talk about each one.

The first question we need to ask is: Should platforms like Facebook moderate political advertising?

The short answer to this is: "Yes, they should!"

Let me explain why.

There is a very strong free speech argument to be made for not moderating political advertising. Back in 2014, for instance, Ohio had a law that tried to require truth in political advertising, but it was struck down by a federal court.

The federal District Court Judge Timothy S. Black said that: "We do not want the government (i.e., the Ohio Elections Commission) deciding what is political truth - for fear that the government might persecute those who criticize it. Instead, in a democracy, the voters should decide."

This is a very sensible thing to say, and he is right. We don't want governments to dictate what is true or false.

I'm reminded here of something that happened in India. A friend of mine, Matilde Giglio, spent half a year in India working as the head of the digital campaign for the Indian National Congress (INC), and in a very interesting article, she recounts what that was like.

For example, one problem was that all campaign messages had to be pre-approved by the Indian Electoral Commission.

Here is how she describes it:

The problem was made much worse by the Indian Electoral Commission. Every piece of digital content had to be approved by a bureaucrat in an office in Delhi. I visited them in a dark building with a hundred officials in little cubicles. I would take six colleagues with me to carry print-outs of each digital message. We would wait hours to see someone who might then take a week to give their approval.

I remember sitting there for four hours on my first visit to get a message approved. It blamed Modi for unemployment going up. The official looked at me as if I was crazy. 'This can't be approved,' he said. 'The first rule of the Electoral Commission is that you can't criticise the government'. 'But what can I talk about?' I asked, 'We are the opposition - our job is to criticise the government!'. He replied, 'Why don't you talk about what great changes you are going to make?'. I said, 'But every BJP advert I see attacks the Congress?'. 'Ah', he said, 'But you are not the government'.

This is insane.

And so, federal District Court Judge Timothy S. Black is right in saying that we don't want to create anything like this. The idea of an electoral commission with the power to reject advertising is scary. Politicians should not be limited this way; instead, the voters should decide.

However... this doesn't mean that Facebook shouldn't do anything.

The problem here is that if it's the voters who should decide, it also requires that the voters are informed. And this is where Facebook completely fails.

Politicians have always lied, but in the old world of media, everyone saw the same message. For instance, if a politician advertised in the print version of the Chicago Tribune, every reader across Chicago would see the same message. And if the politician lied, there would be consequences.

On Facebook, however, politicians now have the power to do micro-targeting, sending different ads to different groups of voters based on data profiling (like what we saw with Cambridge Analytica). This means that the voters only see the message intended for them, and suddenly they have no idea what else is going on.

In other words, the voters are no longer informed. And the system that is supposed to keep the politicians in check no longer works.

This is a gigantic problem, and one that Facebook needs to fix. And there are two things that Facebook needs to do.

The first thing is to stop politicians from being able to micro-target their lies to different groups of voters. This should not be allowed for political advertising.

Former Facebook security chief, Alex Stamos, also talked about something similar in a recent discussion with Mathew Ingram.

Politicians should not be allowed to send out different messages to different people. Instead, when a politician advertises on Facebook, they should only be allowed to target the general area (like the country or a state), but then within that, the ads should be delivered to people randomly.

So, if a politician buys a million ad views, then Facebook should show those to a million random people, without applying any form of targeting, neither by Facebook nor by the politician.

This might sound extreme. But this is how print newspapers work. When a politician advertises in print news or on a billboard, they have no idea who is going to see it.

This is how advertising used to work.

Mind you, I'm actually including journalism in this. When we as publishers write stories about politicians, then we too should not be allowed to micro-target this to different groups of voters.

For instance, you don't want Fox News to be able to only promote their articles to Republican voters. It's the same problem. We, as publishers, are just as much a part of this as the politicians. We have a responsibility to remain impartial (but focused on facts), which we can only do if we have an impartial audience.

So nobody should be able to promote political information to segments of voters, regardless of whether it is a political ad or a news article.

This is the change that Mark Zuckerberg needs to make. He created this problem because of how Facebook designed their ad product. It's his role to fix it!

The second thing that Facebook needs to do is to live up to their responsibility as a publisher.

There has been a long debate about whether Facebook is just a platform or whether they are a publisher, but the answer to this is pretty simple. Facebook is a publisher!

When you visit your Facebook News Feed, Facebook has edited it, changing the order, ranking, and importance of what you see.

This is what it means to be a publisher, and it's the same thing that newspapers do. When you go to a newspaper, it has made determinations as to which articles you should see.

It doesn't matter that Facebook is doing this algorithmically. The simple fact that they are changing the order of the News Feed also means that they need to take on the responsibility for what people end up seeing.

You can't change the order of what people see and then also claim not to be responsible for it.

So, recently, I asked on Twitter how newspapers were dealing with this, and one of the people to respond was Mike Orren, Chief Product Officer at The Dallas Morning News.

He wrote this:

We review all political ads and if in doubt send to news. All opinions allowed. No clearly false statements allowed. Review starts with me and I send newsroom maybe 3-4 a year to review and reject another couple obviously wrong without even asking.

I just love this. This is how every responsible publisher should behave.

Obviously, we all know that many news publishers don't behave this way. There are plenty of examples of bad behavior in the press, and local newspapers and local TV stations routinely sell ad space to deceptive and outright misleading advertising.

But this is how we are supposed to do things, whether you are Facebook or a newspaper. If you are editing what people see, then you also have to take on the responsibility that comes with it.

So, if we look at Facebook in isolation, then yes, there are several things that Mark Zuckerberg needs to fix, and his current argument that he shouldn't have to do anything because of 'free speech' doesn't hold up.

However, there is also a much bigger question here, one that doesn't really have that much to do with Facebook at all.

Zooming out

The problem is that, if you start to think about political advertising as a whole, Facebook does not really have that much of an impact. Right now when you see articles about Facebook and politics, you almost get the idea that Facebook has single-handedly destroyed everything.

But when you look at the numbers and the behavior, this type of worldview just doesn't hold up.

There are several reasons why Facebook is nowhere near as influential as many people seem to think. One reason is that digital political ad spend is not actually that high.

In 2020, Kantar Media estimates that political ad spend will reach $6 billion (which is insane, but that's another discussion). But only 20% of that will be 'digital'.

And that 20% covers all digital channels, so Facebook's part is only a share of it.

Facebook itself has said that "We estimate these ads from politicians will be less than 0.5% of our revenue next year." That's around $350 million. This number is somewhat misleading because it only represents direct political ad spending, and not ad spend from, for instance, super PACs ... but the figures above also exclude super PACs.

This means that Facebook's share of political ad spend is only 5.8% overall. It's not that much.
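For transparency, here is the arithmetic behind that 5.8% figure, using only the two estimates quoted above (Kantar Media's projection and Facebook's own statement); it's a back-of-the-envelope check, not independent data:

```python
# Back-of-the-envelope check of the ad-spend shares quoted in the text.
# Figures are the cited estimates: Kantar Media's 2020 projection and
# Facebook's own statement about political ad revenue.

total_spend = 6_000    # projected 2020 US political ad spend, in $ millions
digital_share = 0.20   # the portion of that expected to go to digital
facebook_spend = 350   # Facebook's estimate of its political ad revenue, $ millions

digital_spend = total_spend * digital_share            # $1,200 million
facebook_share_of_digital = facebook_spend / digital_spend
facebook_share_overall = facebook_spend / total_spend

print(f"Total digital political ad spend: ${digital_spend:,.0f}M")
print(f"Facebook's share of digital: {facebook_share_of_digital:.1%}")
print(f"Facebook's share overall: {facebook_share_overall:.1%}")
# → Facebook's share overall: 5.8%
```

In other words, Facebook takes a bit under a third of the digital slice, and that digital slice is itself only a fifth of the total.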

But, over the past several months, I have seen many people across the media who seemingly have the idea that all of the problems with misleading advertising are because of Facebook, or that Facebook is somehow dominating this space.

It is not. Facebook is a tiny part of the world of political ad spend.

And this is not just true for political advertising, we see the same misconception with a lot of other things that people are currently discussing about Facebook. Take the example of Russian interference in the US election. We see stories about this almost every week, and from the way it is reported, it sounds like Russia is single-handedly swaying US voters by posting things on Facebook.

But then, when you look at the numbers, there is not really that much going on.

Take the latest example. Facebook told us that they had taken down 93 accounts and 17 Facebook pages, as well as four Instagram accounts from Iran and another network operating on Instagram from Russia.

In total, all of these accounts had 240,000 followers, of which 60% were from the US.

"OMG", the media said. "Russia and Iran are influencing our elections!" ... and there was much drama and anguish.

But just stop and look at the numbers. 240,000 followers across at least 115 accounts, that's 2,086 people per account.

It's nothing!

In comparison, when Jennifer Aniston created her Instagram account, she gained 10 million followers in just one day, and now, a couple of weeks later, she has 18 million followers.

So, if you think these accounts from Russia and Iran are able to sway the US election, then using the same logic, Jennifer Aniston would be 75 times more influential.
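The arithmetic behind that comparison is simple, using the numbers quoted above (and nothing else):

```python
# The follower arithmetic behind the comparison in the text, using the
# numbers quoted there: 240,000 followers across at least 115 removed
# accounts, versus Jennifer Aniston's ~18 million Instagram followers.

followers_removed = 240_000    # total followers of all removed accounts
accounts = 115                 # 93 accounts + 17 pages + the Instagram networks (at least)
aniston_followers = 18_000_000

per_account = followers_removed / accounts
ratio = aniston_followers / followers_removed

print(f"Followers per removed account: ~{per_account:,.0f}")
print(f"Aniston's reach vs. all removed accounts combined: {ratio:.0f}x")
# → roughly 2,087 followers per account, and a 75x ratio
```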

But here is the thing, even if Jennifer Aniston were to post a political message to get people in the US to vote in certain ways, would that really change anything?

The answer is no. Even with 18 million followers, Jennifer has very little influence over the US election, because people don't react this way. So, these pages from Iran and Russia have very little impact.

Don't get me wrong, it's good that Facebook takes these accounts down.

But this again illustrates the difference between talking about a problem in isolation, and talking about whether it actually has an impact overall. Facebook does not have much impact.

Another problem for Facebook is how we interact with it, and how that also causes it to have a much lower impact.

This is something I have talked about before. When you think about advertising, there is a very big difference between exposure and persistence.

On Facebook, it's very easy to post something that gets a lot of views, but that's mainly because a view is counted when people just casually scroll through their news feed without really paying attention to anything.

In other words, it's what we call a low-intent micro-moment.

For instance, can you name the last five posts you saw on Facebook? You can't ... right?

In comparison, watching local broadcast TV is a macro-moment. You are suddenly sitting in front of the TV for hours!

So let's do a simple comparison. Let's say we take a voter from the older generation, who is spending 2 hours per day watching, say, Fox News ... and compare that to the same person randomly coming across a post from a fake news site on Facebook in their News Feed.

Which one of these two would have the biggest impact? The answer is obvious. Watching Fox News every day is persistent and focused exposure, and this has a million times more impact than whatever you come across on Facebook.

It's the same with political advertising on local TV. During an election, especially in the swing states, people are inundated by political advertising on local TV, on local billboards, via robocalls ... and everywhere. And this combined 'push' creates an extremely powerful persistent exposure.

In comparison, coming across a random political ad on Facebook does not create this effect. As a whole, Facebook is part of the problem, but in isolation, Facebook has very little impact compared to other channels.

It's the persistence that makes a difference, not the exposure.

The problem isn't even the ads

This leads me to my last point, which is perhaps the most important one, which is that political advertising isn't even the biggest problem.

So, let's back up a bit. Right now, we see many people demanding that Facebook should check and ban misleading political advertising.

Okay, but why are people saying that?

Well, it's because we know that these ads are bad, and we want Facebook to prevent them from being exposed to the public. This is why we want them taken down, to stop them from being seen.

Okay... so... let me give you an example of a misleading political ad campaign.

As you might remember, back during the Brexit referendum in 2016, Boris Johnson toured the UK in a bus with a big slogan on the side saying: "We send the EU £350 million a week. Let's fund our NHS (National Health Service) instead."

First of all, this was a lie. Not only was the number a "clear misuse of official statistics", as many fact checkers pointed out, it also wouldn't be possible to just send this money to the NHS.

So, this entire bus is one big misleading advertisement. It's exactly the kind of thing that, if it were an ad on Facebook, people would now be demanding that Mark Zuckerberg take down.

So, what happened?

Well, Boris Johnson traveled around the UK with this bus, and every time it stopped, he would jump out, stand right in front of this big slogan, and make some kind of political speech.

And every time, we (meaning journalists) swarmed him. We took pictures from every conceivable angle, we covered it extensively, we put it on the front pages of almost every newspaper ... and every story included a picture of Boris standing in front of this bus.

Don't believe me? Just look at how many newspaper articles show up in a Google Image Search for it.

The result of this is that every single person in the UK, and most of Europe, has somehow seen this bus. Not because they saw it on Facebook. Not because it was promoted as an ad. But because of campaign tactics that caused the media to use it in every article.

As an ad, this was a brilliant tactic. I can't remember any bus ever getting this much exposure. But remember, it was a lie, and we in the press helped it to get all the exposure it ever needed.

This is the real problem that we face today. Yes, Facebook needs to do better, but it would not make any difference if Boris Johnson had promoted this bus in an ad on Facebook.

He didn't need Facebook to spread this message. We did it for him.

And this is something we need to learn about today's world. We now live in an attention economy, where the more we report things, even when we report critically, the more the bad people get to expose their views to their supporters.

It's attention through polarization.

And this is not unique to the UK. We see this in every single country. In my country (Denmark), we have seen a sharp increase in negative sentiment toward foreigners, because of a relentless campaign by several political parties. This should never have happened, because the number of people actually seeking asylum is extremely low.

In the year leading up to the last Danish election, both the politicians and the press claimed that refugees were the most important topic to discuss. And yet, when we look at the numbers, only about 4,000 people were seeking asylum in Denmark (in Danish).

How is this the most important topic? True, this might have been an issue back in 2015, but why was it still an issue in 2019?

Notice: The light-pink bar illustrates the number of applicants, while the darker-pink bar illustrates how many were processed each year. For 2019, only the months from January to August are included.

So, again, why did this happen? Did this happen because of Facebook and how Danish politicians were posting anti-refugee posts? Did it happen because Danish politicians were running misleading ad campaigns?

Well, I'm sure some politicians did actually do that, although I can't remember ever having seen one in my Facebook News Feed.

But the reality is that we in the press have managed to misinform ourselves. During the last Danish election, I heard editors say that immigration was the most important news story. I saw newspapers create topic pages for covering the "immigration debate".

But it wasn't important.

Pretty much anything else would have been more important to talk about: infrastructure, climate change, the lack of electric cars on Danish roads, young people's inability to buy a house, the way we pay taxes, the waiting times at Danish hospitals, the future of the Danish pension system as the population gets older, etc.

All of these things have a much larger impact than the 4,000 foreigners who were seeking asylum.

You see what is happening here?

The problem we face today is that attention now works differently from how it used to. In the past, the traditional thinking was that journalists would bring attention to bad things, which would then cause people to feel so ashamed that they fixed them.

Today, this doesn't work. Today, the more attention we bring to something, the more it grows, and then the media starts thinking that it's even more important, and we end up with a twisted, self-sustaining attention loop.

This is where the real problem is. And yes, this problem also exists on Facebook, but it's everywhere.

And the only way to get out of this loop is to change the way we think about attention. Whenever bad people do something, or whenever a politician lies, we cannot fix that by bringing attention to it. That only makes the problem worse.

Instead, we need to starve the attention.

Take Brexit and Boris Johnson's bus.

Pretty much from the start, every single journalist and editor knew that it was a lie. We also knew that the reason Boris Johnson wanted to stand in front of it was exactly so that the press would use that as a backdrop for every single article about him.

And it worked...

But what we should have done was refuse to take part in that game. As editors, we should have refused to use any image of that bus.

Just like we are demanding that Facebook take down misleading ads to remove them from public view, we should refuse to bring that picture to the public as well. And we should have focused our reporting on issues that were more important to the readers, rather than on what the politicians wanted us to talk about.

Think about what would have happened if we had done that instead. Only the people who were physically there (which was a very small group of people) would have seen the bus. And the rest of the UK (not to mention everyone in Europe) would never have known that it was there.

If we had done this, Boris Johnson would have realized that his strategy wasn't working. That spending his time standing in front of this bus didn't actually help him get his picture in the newspaper. And he would have seen that his campaign slogan never became the center of the news reports.

Boris might then get frustrated by it, and instead turn to Facebook. Since the media wasn't playing his game, he would just skip the media and appeal directly to people via promoted posts on Facebook.

Well, then we would go to Facebook, and we would point out why we weren't doing this in the press, and demand that Facebook also live up to its responsibility to not provide a platform for politicians to lie.

Whether Mark Zuckerberg would actually do that is unknown. But think about how big a difference this would actually make. Imagine what would happen if the politicians realized that they would not get attention when they lied.

I know this is hard to embrace, because it's the opposite of how journalism works (and has always worked). But this is where the real problem is today, and this is how we solve it.

We need to remove attention from the people who don't deserve it, and we need to remove this from every channel. We cannot solve it by just pointing our fingers at Facebook. Mark Zuckerberg made a big mistake when he told politicians that they were free to lie on Facebook, but if we really want to fix this, we all have a role to play.



Thomas Baekdal

Founder, media analyst, author, and publisher.

"Thomas Baekdal is one of Scandinavia's most sought-after experts in the digitization of media companies. He has made ​​himself known for his analysis of how digitization has changed the way we consume media."
Swedish business magazine, Resumé

