And us? We are the 3 billion, after all. What if every Facebook user decided to be a better person: to think harder, know more, be kinder, more patient, and more tolerant? Well, we have been working on the betterment of humanity for at least 2,000 years, and it is not going very well. There is no reason to believe, even with “media literacy” efforts aimed at young people in a few wealthy countries, that we can count on human improvement, especially when Facebook is designed to harness our tendency toward impulsive, emotional, and extreme expression that our better angels would avoid.
Facebook was designed for better animals than humans. It was designed for people who don’t hate, exploit, harass or terrorize each other, like golden retrievers. But we humans are nasty beasts. We must therefore regulate and design our technologies to correct our weaknesses. The challenge is how.
We must first recognize that Facebook’s threat does not lie in some fringe aspect of its products, or even in the nature of the content it distributes. It lies in the core values Zuckerberg has embedded in every aspect of his company: a commitment to relentless growth and engagement. This is made possible by the pervasive surveillance that Facebook exploits to target ads and content.
Above all, it lies in Facebook’s global and deleterious effect on our capacity to think collectively.
This means that we cannot organize a political movement around the simple fact that Donald Trump exploited Facebook to his benefit in 2016, or that Donald Trump was kicked off Facebook in 2021, or even that Facebook directly contributed to the mass expulsion and murder of the Rohingya people in Burma. We cannot rally people around the idea that Facebook is dominant and coercive in the online advertising market around the world. We cannot explain the nuances of Section 230 and expect some sort of consensus on what to do about it (or even on whether reforming the law would make any difference to Facebook). None of this is enough.
Facebook is dangerous because of the collective impact of 3 billion people who are under constant surveillance, and whose social relationships, cultural stimuli, and political awareness are then managed by predictive algorithms biased toward constant, growing, immersive engagement. The problem is not that some crank or some president is popular on Facebook in one corner of the world. The problem with Facebook is Facebook.
Facebook is likely to remain just as powerful, maybe even more powerful, for many decades to come. As we strive to live better with it (and with each other), we must all spend the next few years imagining a more radical reform agenda. We need to strike at the root of Facebook and, while we’re at it, Google. Specifically, one recent regulatory intervention, however modest, could be a good first step.
In 2018, the European Union began to insist that all companies that collect data respect certain fundamental rights of citizens. The resulting General Data Protection Regulation grants users some autonomy over the data we generate and insists on minimal transparency when that data is used. While enforcement has been spotty, and the most visible sign of the GDPR has been a proliferation of notices requiring us to click to agree to terms, the law offers some potential to limit the power of big data vacuums like Facebook and Google. It deserves to be studied closely, strengthened, and spread around the world. If the United States Congress, and the parliaments of Canada, Australia, and India, took citizens’ data rights as seriously as they take content regulation, there might be a little hope.
Beyond the GDPR, an even more drastic and useful approach would be to limit the ability of Facebook (or any business) to track everything we do and say, and to limit the ways in which it can use our data to influence our social relations and our political activities. We could limit the reach and power of Facebook without infringing on speech rights. We could make Facebook matter less.
Imagine if we focused on how Facebook actually works and why it is as rich and powerful as it is. If we did that, instead of focusing our attention on the latest example of bad content circulating on the platform and reaching a small fraction of users, we might stand a chance. As Marshall McLuhan taught us 56 years ago, it’s the medium, not the message, that ultimately matters.