The Technocracy

October 7, 2021


A major outage made Facebook and its subsidiaries WhatsApp, Instagram, and Messenger unavailable on Monday evening. The outage started at approximately 17:30 and lasted until 23:50. Roughly 3.5 billion combined users of Facebook, WhatsApp, Instagram, and Messenger were unable to check their feeds or receive messages. The event led to a surge in activity on Twitter, where users took to vent about the outage. Personally, the outage was a wake-up call about how much I truly rely on WhatsApp for my day-to-day communication. I realized just how much power a corporate entity like Facebook exercises over our daily lives, and that realization led me to dive deeper into the realms of Facebook and its subsidiaries.

First of all, Facebook and its subsidiaries have a combined total of about 3.5 billion active users across the globe: nearly half of the world's population. These users generate a vast amount of data about their personal lives on a daily basis. Since Facebook is not a government institution, it faces fewer regulations than public institutions and is hence often able to do what it wants with this data. This problem has been apparent almost from the day Facebook started, with the main use of this data being personalized advertisements. That is not in itself a bad thing; on the contrary, it can even be a positive one, because it helps you get more information about the things you find important. The real issue arises from the combination of this data usage with a trend that has become more apparent in the last few years.

Namely, for many people Facebook serves as a news source, and it has become a powerful weapon in political campaigning. This warps people's perception of reality: their lens on the outside world is shaped by their own opinions and interests, reinforcing their existing beliefs whether or not those beliefs are true. This flow of information is so uncontrollable that it has led Facebook to actively correct for it by removing messages and people from its platform. Hence Facebook, a privately owned and relatively unregulated company, is starting to dictate what is and is not true for its 3.5 billion users. The story doesn't end here, however, because the problem keeps growing with the expansion of Facebook itself.

Because Facebook is a profit-maximizing business, it aims to have an edge over its competitors, and when it cannot compete it tries to acquire the competition. This happened with WhatsApp and Instagram, which are now wholly-owned subsidiaries of Facebook. Facebook uses the expertise gained from these acquisitions to improve its own products, but more importantly the acquisitions are aimed at increasing its user base. That in itself is not a bad thing; the problem again lies in the moderation of the social medium, because Facebook will once more moderate what it deems 'accepted' by its user base, giving its users self-affirmation of their world views.

Moreover, this moderation has led to privacy breaches in the past. One major breach of users' privacy was claimed in a ProPublica report released last month. While WhatsApp claims to be fully secure and encrypted, the extensive report states that about 1,000 moderators work for Facebook to check reported WhatsApp messages. The disturbing part is that not only the reported messages are sent for moderation, but also a subset of unreported messages. This is quite ironic, given that WhatsApp claims its chats are end-to-end encrypted with the famous words: "No one outside of this chat, not even WhatsApp, can read or listen to them." Again, the problem here is not the data collection and usage itself, but rather the censorship that results from the moderation of the messages.
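How can moderators read messages from a supposedly end-to-end encrypted chat? The key is that encryption only protects messages in transit: the recipient's own device always holds the plaintext, so a "report" button can simply forward plaintext the client already has, without the server ever touching a decryption key. The toy sketch below illustrates this idea; it is a deliberately simplified stand-in (a one-time-pad-style XOR cipher and an illustrative context window), not WhatsApp's actual protocol.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy one-time-pad-style XOR; stands in for real E2E encryption.
    return bytes(b ^ k for b, k in zip(data, key))

class Client:
    def __init__(self, shared_key: bytes):
        self.key = shared_key
        self.inbox = []  # plaintext messages, as the user sees them

    def receive(self, ciphertext: bytes):
        # Decryption happens on the endpoint, never on the server.
        self.inbox.append(xor_cipher(self.key, ciphertext).decode())

    def report(self, index: int, context: int = 4):
        # Reporting forwards plaintext the client already holds;
        # the context window size here is purely illustrative.
        start = max(0, index - context)
        return self.inbox[start:index + 1]

key = secrets.token_bytes(64)               # shared only by the two endpoints
ciphertext = xor_cipher(key, b"hello")      # the server relays only this
bob = Client(key)
bob.receive(ciphertext)
print(bob.report(0))  # ['hello'] -- plaintext leaves via the endpoint, not the wire
```

The point of the sketch: end-to-end encryption and client-side reporting are not technically contradictory, which is exactly why the marketing claim "not even WhatsApp can read them" feels misleading once moderation enters the picture.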

To conclude, what Facebook does is not fundamentally wrong; however, it should find a new way to moderate its users. More specifically, it should coordinate with the governments in its jurisdictions on whether and how its users should be moderated, instead of acting as a government itself. Moreover, data is useful in our everyday lives, since it can give us meaningful insights into how we behave. If Facebook can improve the way it moderates, I am sure it can reach a position where it shows that big social media companies can have a positive influence on our worldviews and offer opinions from multiple directions.

This article is written by Berke Aslan
