The Road to Hell Is Paved With Likes


We’ll start with a blunt statement: the way we use technology now harms public debate. In this post we will lay out how the exploitation of human nature for the profit of Big Tech contributes to the growing spread of disinformation, to the detriment of public debate.

In short the argument is as follows:

1. Internet companies sell attention 
2. They seek to maximize engagement because this gives them more attention to sell 
3. To achieve that they harness human competitiveness and the desire to be liked
4. By ranking participants according to their ability to create engagement, they validate information from participants who are good at grabbing attention
5. Because of the human tendency to prefer engaging with catchy slogans and simple answers, that is what gets preference on the platforms

People Need to Know How to Trust Information

While access to information has become easier and easier over time, the spreading of information has always been far more centralized. One big reason for this is the practical impossibility for everybody to weigh and verify all the information they receive. This problem is exacerbated by the exponential explosion of knowledge and interconnection facilitated by ever-improving means of information transfer.

Journalists to The Rescue

This led to the emergence of a class of gatekeepers that would collect information, sort out the unimportant or unwanted, and distribute the important and desired bits to the greater public. Their reputation (and with it their influence) was based on the appearance of being unbiased.

Of course there were blind spots, but journalistic guidelines sought to ensure the isolation of reporters from advertising departments, politicians and other people with an interest in manipulating the news.

What You Know Will Bias You

Then came the Internet. Technologically, the most important change for the distribution of information was that reach could be measured at a much finer level than before. Now the “clicks” on an article were visible, and marketing companies jumped at the opportunity. No longer did they have to guess how many people would see an ad in the New Yorker. They could get immediate feedback on which outlets and which articles would give them the most exposure.

The New Economy Enters the Stage

It wasn’t long before Internet companies that were built on venture capital without any apparent business model succumbed to the siren song of ad revenue. As digital natives they had the engineering chops to optimize the shit out of this dynamic. All they had to do was optimize their algorithms for “engagement”. More engagement means more eyeballs, means more possible clicks on ads, means more revenue. The nature of this engagement is immaterial. A thoughtful rebuttal of a misleading statement is not worth more than mindlessly forwarding it to your friends. In fact it is worth less, as it is less likely to generate further engagement.
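To make the dynamic concrete, here is a deliberately toy sketch of an engagement-optimized ranker. Every name and weight is hypothetical; real platforms use far more complex (and secret) models. The point is only what the objective measures, and what it doesn’t.

```python
# Toy illustration of an engagement-optimized feed ranker.
# All names and weights are invented for illustration; this is
# not any platform's actual algorithm.

def engagement_score(post):
    """Score a post purely by the interactions it generates.
    Note what is absent: nothing here measures accuracy,
    nuance, or the quality of the discussion."""
    return (1.0 * post["likes"]
            + 3.0 * post["shares"]      # shares spread the post further
            + 2.0 * post["comments"])   # any comment counts, even rebuttals

def rank_feed(posts):
    """Order the feed by engagement, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "thoughtful-rebuttal", "likes": 40, "shares": 2,  "comments": 10},
    {"id": "catchy-slogan",       "likes": 90, "shares": 50, "comments": 30},
]
print([p["id"] for p in rank_feed(posts)])
# → ['catchy-slogan', 'thoughtful-rebuttal']
```

The thoughtful rebuttal loses not because anyone judged it worse, but because the objective function never asked the question.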

The Engagement Trap

And that’s where the engagement trap lies. To make people engage more, the Internet companies set up a global competition: the hunt for likes. One of the big goals for social media companies is to be habit-forming. That just means they are actively trying to make their platforms addictive, solely in pursuit of bolstering their bottom line. To achieve that they gamify their platforms to stir our competitiveness and play on our desire for attention.

New Gatekeepers

Their stance is that they are not newsrooms but just conduits for information, and so cannot be held to journalistic standards. The winners in the engagement contest are chosen by the users of the platform, so they are just mirroring the wishes of the users. “We are democratizing publishing” was the spin they put on it.

What these efforts did not achieve, however, is doing away with gatekeepers to information, because the core problem still exists. Everybody needs someone to weigh and verify information for them on some or most topics. Instead they handed the authority to define the world to the people who are most successful at playing the game set up for them.

In doing so the Internet companies replaced an imperfect reputation system based on journalistic ethos with one based solely on winning the engagement contest, validating any information or disinformation coming from anyone who manages to attract a big online following.

A Hell of Our Own Making

So it seems we are trapped in a hell of our own making. We elevate people to influencer status, and the Internet giants are just handing us the rope to hang ourselves with. That is only partially true, however. What we are shown on these platforms is carefully selected by sophisticated algorithms to maximize our engagement and, with that, their revenue. The winners of the game, then, are not necessarily the people who post the most enlightening content, nor even the people we would most like to hear from. They are the people who bring the most profit to the platforms.

How Does This Lead to Disinformation Spreading?

The dynamic of giving the most visibility to the people who are best at creating engagement leads inevitably to the spread of disinformation because of another trait of human nature. From Aldous Huxley’s “Brave New World”, which imagines a world where constant distraction keeps the populace entertained, to Neil Postman’s “Amusing Ourselves to Death”, in which he decries TV as the end of civilized discourse, the human desire to be distracted has been called out as the weak point of societies.

Catchy slogans and simple answers will on average resonate more with people than careful analyses of the issues that warrant a public debate. That holds true regardless of whether people agree or disagree with them. This means catchy slogans and simple answers will always create more engagement, and those who spread them will be favored by the algorithms that optimize for it. With the number of followers and likes as the only visible metric, the internet giants attach credibility to whatever they write.

But They Are Fixing It!

Note that there is no ill will or bad intent behind all this. None of these companies explicitly plan to spread disinformation. They are fundamentally data-driven and will simply create algorithms that optimize for their bottom line. That’s why I don’t think it is insincere when they publicly try to put that genie back into the bottle, though often in a ham-fisted manner.

Regardless of their professed civic engagement, however, the algorithms devised by these companies will always value the quick share from a dubious source or the half-witted put-down of a critic over a serious discussion, because it improves the number of eyeballs they can sell, and that is the metric they are optimized for. Even if we disregard the moral and civic issues of handing the power of censorship to private companies, as long as the algorithms systematically favor disinformation, blocks and bans will never be able to stem the tide. But as the saying goes, it is difficult to get someone to understand something when their salary depends on them not understanding it. This is doubly true for multi-billion-dollar corporations.

What Now?

To disrupt the dynamic I laid out, at least one of the five links in the chain has to break. Given that changing human nature is obviously an impossible task, that leaves us two angles:

1. Prevent information distributors from selling attention
2. Forbid them from displaying the number of likes and followers of a person or posting, and from sharing ad revenue based on engagement

The practical difficulties of either angle are daunting. The alternative, of course, is to reject the siren song of the free data silos and move on to a more distributed system. To achieve that, this system has to offer at least a similar degree of convenience and usability. But that will not be enough. It will also have to offer advantages that are more tangible than the abstract concept of decentralization.

Fortunately, the optimization for revenue introduces perceptible usability degradations. People complain about algorithms that do not show them important stuff, and of course about intrusive advertising that blends in more and more with organic content. Overcoming the network effect is obviously a big task, but given a convenient enough system and perceptible benefits, it might be doable.



Try it

With AllPeep, you can build private communities or a public square connected to the world on social media – and many things in between.

Find out more – set up a demo session with our team.