There are many important issues to discuss in relation to the story about Facebook and Cambridge Analytica. But there is also a problem with how we are focusing on this story in the media. And this is particularly noticeable within 'media Twitter', where we as 'media people' discuss our opinions.
We have a tendency to lose our objective focus and get so caught up in wanting to be anti-tech that we start to twist the stories in our favor (i.e., let's damage Facebook, get politicians to regulate it, and get our market share back), rather than actually having a serious discussion.
I'm not saying this in order to point any fingers, because I have been a part of this media frenzy as well.
For instance, when one of my followers sent me the video below, I thought it was a perfect analogy for Mark Zuckerberg promising to do something about a problem he said he wasn't aware of.
I found this video to be hilarious. So I retweeted it as part of my contribution to speaking out against what has happened.
But this 'media focus' is not necessarily a good thing.
One simple example is the way we have covered Facebook's response versus what we could get from the 'whistleblower'. The coverage has been very one-sided.
One of the big problems with whistleblowers is that they often get drunk on their own fame, and they start telling journalists what journalists want to hear.
We have seen this many times with people like Edward Snowden, to the point where, in response to the Facebook/Cambridge Analytica story, he tweeted this:
This is not a true statement. Facebook is not 'selling intimate details', and Edward Snowden knows this. But he has become so 'high' on his fame that his objectivity is starting to slip (and has been for a long time).
The same thing seems to be happening with the whistleblower who revealed the scandal about Cambridge Analytica. There are very clear signs that he too is currently 'high on fame'.
I'm not trying to discredit him or even say that he is lying. I have no basis for saying either. What I am saying is that I'm not seeing the objective care that I would expect from journalists when dealing with him as a source.
From what I have seen, most stories have taken everything this whistleblower has said at face value, without questioning it in any way, while anything said by Facebook is scrutinized or even disregarded. And there is a tendency in the media to focus more on punishing Facebook, rather than taking a step back and talking about the real issues.
Mind you, I'm not trying to defend Facebook. My role as a media analyst is not to analyze the news; my job is to analyze the media: how we work, the quality of our journalism, what is broken and needs to be fixed, and what can be changed to make things better. And this whole discussion about Facebook and Cambridge Analytica is missing the point, because the problem is somewhere else.
Specifically, it's about three other things:
The first big problem is companies like Cambridge Analytica and how they have provided politicians with the tools to micro-target and optimize their messages to win more votes.
This is obviously a massive problem, because the idea that a politician can tell one group of voters one thing and another group of voters another in order to win more votes is just pure political corruption. It's incredibly damaging to the principles of a fair election and the democratic process.
Obviously this should not be allowed. It's insane that it's even a thing.
But how do you stop this? Do you regulate Facebook?
Imagine Facebook were to disappear tomorrow. Would that solve this problem? No, because the politicians would just find some other data and some other tools ... and keep on telling different voters different things.
The only way to actually solve this is to regulate the politicians. The politicians are the ones who shouldn't be allowed to do this.
Also, the problem here isn't the targeting itself, because 99% of the time, micro-targeting is genuinely useful for everyone. For instance, if you have a small knitting company and you want to promote your new design, it is far more useful to target that only to people who care about knitting than to waste money showing ads to people who don't want to see them at all.
Same thing if you were Nike. It's massively useful, both for Nike and for you, that Nike can target its running shoes to people who like to run and its basketball shoes to those who like to play basketball.
This creates better ROI for brands, but it also creates a far better experience for you, because it means you don't have to be constantly annoyed by ads that you don't care about.
So, targeting isn't the problem. In most cases it works exactly the way it's supposed to work. The problem is that there are edge cases where this can be used for really bad things ... and one of them is when politicians can use it to tell different groups of voters different things.
You don't solve this by regulating Facebook. You solve it by regulating the politicians and other edge cases where this might be a problem (like racial profiling in housing ads).
But that's not what we are writing about in the media. We are so obsessed with Facebook that we aren't really looking at what the problem is.
And the politicians are happy to point to Facebook as well, because the more the politicians can convince you that this is Facebook's problem, and show you that they are doing something about it by calling on Facebook to testify to Congress, the more we in the media forget that we should actually be asking the politicians to regulate themselves instead.
This whole thing is just one massive diversion.
This leads us to the second issue.
One problem that does exist with Facebook is the whole concept of 3rd party data sharing, where people other than you have the ability to decide who your data should be given to.
This concept is completely and utterly insane.
Why have we created a system where other people have the right to give away your data? Who the heck came up with that idea?
Well, as Harry McCracken, Fast Company's Technology Editor, tweeted, here is a story from 1994:
This is a problem that started a long time before Facebook, but since then things have just gotten worse.
On Facebook today, there is a setting that looks like this:
This is completely insane. Why do other people have the right to take information about my interests, my political views, my posts (some of which might be set to private), and more, and give it to random apps that they happen to be using (like a quiz they decided to take for fun)?
This is not acceptable.
I am the only one who can make this decision. If a friend wants to take my data and give that to someone else, they have to get my approval first. The idea that they can just do it is complete nonsense.
This should not be a thing.
And I'm not talking about Facebook just making this setting more visible to people (like Mark Zuckerberg is now saying he will). No, this should not exist at all.
If you ask me, 3rd party sharing should be illegal. If I have not given my consent, on a case-by-case basis, nobody should have the right to share my data.
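The consent rule argued for here can be stated precisely: data about a person may only flow to a recipient that this person has approved, no matter who is making the request. A minimal sketch of that rule, with hypothetical names and a made-up consent table (not any real API):

```python
# Minimal model of per-case consent: data about `owner` may only be shared
# with `recipient` if the owner has explicitly approved that recipient.
# All names and the consent table are hypothetical, for illustration only.

consents = {
    ("alice", "quiz-app"): False,  # Alice never approved the quiz app
    ("bob", "quiz-app"): True,     # Bob installed it and consented for himself
}

def may_share(owner: str, recipient: str, requested_by: str) -> bool:
    """Only the owner's own consent counts; `requested_by` is deliberately
    ignored, because a friend's consent should never cover your data."""
    return consents.get((owner, recipient), False)

# Bob takes a quiz; the app asks for data about Bob *and* his friend Alice.
print(may_share("bob", "quiz-app", requested_by="bob"))    # True: Bob consented
print(may_share("alice", "quiz-app", requested_by="bob"))  # False: Alice did not
```

The point of the sketch is the ignored parameter: under the model criticized above, `requested_by` is what decides, and Alice's data leaves with Bob's quiz.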
But here we get to the bigger problem, because as soon as we start to have a serious discussion about this, we also realize this isn't just a problem on Facebook. It's a problem everywhere ... including in the media.
I can illustrate this in a very simple way.
If you use a tool like Ghostery, you can see exactly what 3rd party tools sites are using to track you. And here is where things get scary.
Since this whole story started with The Guardian, let's look at them (but this is true for any media site).
You see, if you go to The Guardian and you block 3rd party tracking you get this:
As you can see here, The Guardian has implemented 12 different 3rd party tools, which in some form are able to track what its readers are doing on its site.
This, in itself, might not be that big a problem, if the data was handled appropriately. The problem is that this is rarely the case.
Basically, The Guardian is hoping that these services will keep this data to themselves and not share it with others, whom The Guardian would have no control over.
But this is where it gets really bad, because if you then allow these 12 trackers to run, you get this:
Now, the total number of trackers is suddenly 58. How did that happen?
The reason is that those 12 trackers that initially loaded have themselves loaded an additional 46 trackers, which are now also tracking what The Guardian's readers look at.
And most likely, these 58 trackers are then, behind the scenes, selling and sharing the data with even more data brokers.
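The mechanism behind those numbers is a cascade: each script a publisher embeds can inject further scripts the publisher never audited. A toy Python sketch of that fan-out, using made-up domains and an invented "loads" graph (not real Guardian data):

```python
# Toy model of third-party tracker cascades. Each embedded script can itself
# load more scripts. Domains and the "loads" graph below are invented.

loads = {
    "publisher.example": ["analytics-a.example", "ads-b.example"],  # directly embedded
    "analytics-a.example": ["broker-1.example", "broker-2.example"],
    "ads-b.example": ["exchange-1.example"],
    "exchange-1.example": ["broker-3.example"],
}

def all_trackers(page: str) -> set[str]:
    """Follow the load graph and collect every third-party domain reached."""
    seen: set[str] = set()
    frontier = list(loads.get(page, []))
    while frontier:
        tracker = frontier.pop()
        if tracker not in seen:
            seen.add(tracker)
            frontier.extend(loads.get(tracker, []))
    return seen

direct = set(loads["publisher.example"])
total = all_trackers("publisher.example")
print(len(direct), len(total))  # prints "2 6": the publisher embedded 2, but 6 run
```

The publisher only chose the first tier; everything below it loads outside its control, which is exactly the jump from 12 trackers to 58 described above.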
You see what's happening here? This is exactly the same as what happened on Facebook.
Facebook allowed apps to get data from people's profiles, hoping that these apps would keep it to themselves and not share it with others ... but, of course they did.
The Guardian has allowed external services to get data about what people are doing on its site, also hoping that they will keep it to themselves ... which, of course, they don't.
You might argue that the story with Cambridge Analytica was much worse because Facebook has more data, or that it's more personal, or some other excuse, but that's semantics. The concept is exactly the same. And like Facebook, The Guardian has no control over, or even knowledge of, how many parties actually have this data.
You might also say that it's not The Guardian's fault, and this is how the internet works... but that's exactly the excuse that Facebook is using.
This is not a Facebook problem, it's an industry problem. Everyone is allowing 3rd party sharing without people's consent. So the discussion is now not about whether Facebook should be allowed to do this. It's about whether 3rd parties should be allowed at all.
In fact, the EU is already in the process of putting an end to this with the General Data Protection Regulation (GDPR), which in its simplest form requires that personal data cannot be collected or shared without the individual's explicit consent.
Which leads us to the third issue.
From a trend perspective, one thing that has been very clear over the past several years is that the public is fed up with how things work online, but not in the way we talk about in the media.
For instance, most people outside of the media aren't really that concerned about Facebook and Cambridge Analytica.
As Esther Kezia Thorpe tweeted:
I have seen exactly the same thing. As soon as you look outside of 'media Twitter', people have generally been indifferent to this whole story. And the reason isn't that people don't care about privacy, but that this one story doesn't represent the problem as a whole.
People know that even if we stop Cambridge Analytica, it's not really going to change anything.
But, what we are seeing is a long-term and very persistent trend around privacy.
It started 10 years ago with ad-blockers. In the beginning, people used ad-blockers to get rid of annoying ads, but today, most people use them to get rid of tracking.
Go to any school and ask young people why they use an ad-blocker. The majority will tell you that it's more about blocking tracking than it is about blocking ads.
So, people do care. They just want to fix it everywhere.
Another part of this trend is the reason tools like Snapchat became so popular. Snapchat's main feature is allowing people to share pictures with their friends without those pictures being turned into data.
When you share a picture on Snapchat, only your approved friends can see it, and it automatically disappears a short time later.
This means that your Snapchat posts can't be used by 3rd party apps, nor can they suddenly end up outside of Snapchat as part of a dataset for political manipulation.
And this isn't just Snapchat; we see the same trend with Instagram Stories, with WhatsApp, and even on streaming channels like Twitch.
Many young people are turning to live streaming exactly because it only exists in that moment and isn't saved. This means that you can share something with your audience, without it later being turned into something else that you didn't intend.
So, the trend around privacy is incredibly strong, and it's obvious where things are heading. 3rd party tracking doesn't have much of a future. The only question is how long it will take to really change.
My point here, though, is that this is a much bigger concern for publishers than it is for Facebook or the other big platforms (Google, YouTube, Amazon, etc.).
Most of Facebook's ad revenue comes from direct interactions, as in people seeing ads directly on Facebook. That is 1st party data. Facebook makes almost no money from 3rd party apps having access to its data.
It's the same with Google. Google makes almost all of its money from interactions that happen directly on its sites, like when you see an ad in Google Search. Again, this is 1st party data.
So, when 3rd party data gets regulated, neither Facebook nor Google really has a problem. But for publishers, this is entirely different, because many publishers rely almost entirely on 3rd parties to drive their ad revenue.
So, three things need to change:
First, publishers need to rethink the entire ad model of the internet, and shift it to a 1st party data model rather than the 3rd party model that exists today.
I'll not go into details about how this could be done in this article (I might write about this later), but we need to change the industry here before the public does it for us.
Secondly, publishers need to change the way they report these stories, because, today, their Facebook-centric focus is out of tune with the real problem.
I'm not saying that Facebook should just be allowed to do whatever it wants, but we shouldn't let our annoyance with Facebook, and how it threatens the media industry, distract us from the real issues.
There are plenty of things we need to discuss and that the media needs to cover, but most of them extend far beyond Facebook.
And more to the point, it's not up to Facebook to define the future of society. As Mark Zuckerberg said:
You know, what I would really like to do is find a way to get our policies set in the way that reflects the values of the community, so I'm not the one making those decisions. Right? I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world. So, there are going to be things that we never allow, right like terrorist recruitment and ... We do, I think, in terms of the different issues that come up, a relatively very good job on making sure that terrorist content is off the platform. But things like where is the line on hate speech? I mean, who chose me to be the person that [decides this].
Mark is exactly right about this. Obviously Facebook (and Twitter) need to do a lot more to stop hate speech and other bad things from proliferating on their platforms, but they should not be the ones who define what it is.
This idea that politicians (and the media) are basically outsourcing regulation of hate speech to private platforms is utterly insane.
It's the same thing with fake news. Why should it be up to Facebook to define what is fake news or not? Why are we even calling it fake news?
Should we not instead talk about fraud? Because there is a really big difference between someone posting an opinion (which may or may not be accurate), and someone deliberately misleading people for profit (like Cambridge Analytica).
Why are we demanding that Facebook define what is or isn't fraud? This is insane.
So, let's change the narrative and fight the real problem.
And thirdly, the tech industry needs to change.
Christina is perfectly on point here. She is specifically talking about tech companies in the healthcare industry, where privacy and data protection are paramount, but it's the same everywhere else.
How can we trust an industry that has a culture of using people's private data, without consent, as a commodity?
Mind you, the problem is not that companies have data. For instance, it's extremely useful that Domino's Pizza knows what pizza you prefer, because that means you can use their Amazon Alexa plugin to make your 'easy order'.
It is a problem when tech companies start to share this data with people outside of our direct interactions. And as Christina correctly observes, tech companies need to rethink this.
Data is no longer the 'wild west of the internet'.
Founder, media analyst, author, and publisher.
"Thomas Baekdal is one of Scandinavia's most sought-after experts in the digitization of media companies. He has made himself known for his analysis of how digitization has changed the way we consume media."
Swedish business magazine, Resumé