I spent most of the 2010s in the startup world. I was one of the first in Hungary to enter it, so I watched from the inside as the digital gold rush reached its sobering-up stage. I experienced both the valuable and the toxic sides of the culture surrounding it, and I myself followed the morally questionable product design practices whose social effects we are still groaning under today. Information technology is certainly going to be part of our future, and the quarantine has shown just how much we owe to it. But if we cannot develop a wiser culture alongside it, we will end up like a child playing with fire. And the fire will not be to blame.
“Software is eating the world”
The quote is from Marc Andreessen, and it is now clear how right he was. When we discuss technology today, we are really discussing this phenomenon. It is clear that the social impact of the Internet will be at least on a Gutenberg scale. But it could also be that we have started a revolution on a scale similar to the invention of writing, and that the changes so far are just the tip of the iceberg. For those who can see new technologies as opportunities and can review their habits to see what is possible today that was not possible yesterday, exciting times lie ahead.
This sounds great, until you realise how dramatically the Gutenberg revolution destabilised contemporary society, and how much blood was spilled to achieve a new equilibrium. That, of course, would not lead a sane person to see book printing as a harmful thing. But it would be a shame to deny that such a tsunami requires social precautions. In fact, the debate should be about the details, not about chasing utopias and machine-wrecking reflexes.
But first we need to review the current situation and separate the different layers of the problem.
Software as a Designer Drug
Take a look at your phone's home screen. Pretty, colorful, square-shaped icons smile at you seductively. Something about this design has always haunted me, but I could never have said why. It was Tristan Harris, founder of the Center for Humane Technology, who clued me in: my phone screen looks exactly like a slot machine!
The similarity is not coincidental. These products make more money the more time you spend with them. And the most effective way to maximise this is to make the product addictive in the strictest sense of the word. And if the goal is the same, sooner or later the form will be the same.
Don’t think this is some debatable conspiracy theory. An entire industry has been built on addictive product design, and its practices are openly taught. The bible of the subject is the book Hooked, which I myself used when I needed to improve the usage statistics of my own product. I can attest that it works. Moral self-reflection was completely absent from startup culture at the time, and I was busy trying to fit into this new and exciting world, so it never occurred to anyone that any of this might be problematic. We had other things to do. And so do the biggest companies in Silicon Valley.
I see a lack of moral self-reflection as the root of the whole problem. In other respects, startup culture is by far more supportive and positive than any other medium I’ve ever been in. But it has no soul. And that is a very dangerous deficiency.
The Trojan Horse of Censorship
Public opinion about the tech world changed suddenly after 2016. On the surface, all that happened was that people reassessed their views in the wake of the Cambridge Analytica scandal.
It is worth noting, however, the odd political thread running through the story, and the moral panic that Trump’s election (understandably) caused. The elites urgently needed a scapegoat if they were to dodge the obvious questions about their own responsibility.
The tech world is not blameless, of course, and the accusations are partly justified. But the scandal surrounding the issue is really a political lightning rod: a license for people to avoid facing the deeper social causes of the Trump phenomenon and the responsibility of the elite. Looking in the mirror would have been too painful, so a narrative of irresponsible tech platforms was manufactured instead. At this point the plot took an absurd turn that perhaps even Orwell would not have expected: the hysterical masses themselves began to demand their own oppression, and the outsourcing of the determination of truth to private corporations.
Notably, the tech platforms never wanted this epistemological nightmare on their backs, and the censor’s role is anything but rewarding. They would prefer to let all content through, not only because of the cost of moderation, but because under US law the editorial role is far less favorable to a company than the neutral-platform category.
But social pressure forced them to take the plunge, and the excesses appeared immediately. No wonder: the new rules create incentives under which, when in doubt, deletion is always the rational course of action. Wrongful censorship goes unpenalised, while a failure to delete is punished. And once again, this arrangement was demanded by a public whipped into hysteria by the elite!
So the so-called Techlash, the public outcry against big tech companies, conflates two issues that have nothing to do with each other, one of which rests on a misunderstanding. With all due respect to Tristan Harris, he unfortunately falls into the same trap: with one hand he rightly exposes the dangerous irresponsibility of the companies, while with the other he pushes those same companies to be even more irresponsible.
Something tells me this is not going to end well.
Colonized online space, privatized freedoms
This brings us to the most exciting part of the topic. One of the main pillars of democracy is free public discourse, and the organically evolving public opinion that follows from it. The question is what we mean by public discourse in 2020.
It is not only that we should sometimes review our old habits to discover new possibilities; we should also notice when an old habit no longer works in a new situation. Continuing last century’s public discourse on today’s private platforms is a prime example. Is it really a good thing that our entire public discourse is at the mercy of Facebook’s moderation decisions? Where can those whom the big platforms declare heretics go? Thanks to the network effect, this market has no competitors.
Of course, this is not a problem as long as we do not push the boundaries. But what if there are no longer good answers to our social problems within the “boundaries”? (We also discussed this in our video “Treasure Hunt Beyond the Overton Window”.) And what if this makes some people feel threatened and offended by experimentation? Will the almighty algorithm and the Californian rulers of the online space allow such things to be discussed?
Towards a more humane tech sector
It makes no sense to ask whether the information revolution is good for us, because whatever the answer, the process will continue anyway. Ever newer achievements of information technology will become so integral to our lives that we will no longer consider them technology, just as we no longer consider the lamp, or the book we read under it, technology.
The problems were not actually caused by the spread of IT but by the crisis of values that coincided with it, which allowed the tech sector to lose its moral compass. It is pointless to dwell on the symptoms; we need to address the root cause of the disease. Tristan Harris’s vision of a people-centred tech sector can only be realised if we make our whole culture more people-centred. And the way to do that is not to reject the current culture, but to synthesise its valuable elements.