The Never Ending Problem: Facebook and Freedom of Speech

Throughout the month of July, big brands like Adidas, Coca-Cola, Ford, and many more have pledged to boycott Facebook advertising. Some companies are even reconsidering whether to advertise on the site at all. The withdrawal of multi-million dollar advertising budgets is a huge blow for Facebook, which makes the majority of its revenue from advertising.

The boycott is organised under the aegis of the Stop Hate for Profit campaign, which takes issue with Facebook’s lacklustre attempts at controlling hate speech on its platform. Organisers allege that widespread misinformation has helped incite violence against Black Lives Matter protestors.

President Trump’s own social media posts contributed to this call for action against hate speech. During the height of the protests in June, President Trump unsurprisingly took to social media to share his thoughts. His posts included statements like “when the looting starts the shooting starts” and labeling protestors as “thugs”. The ‘looting and shooting’ phrase is especially harmful to the Black community when considered in the context of police brutality. Labelling BLM protestors as thugs also feeds into the dangerous narrative that Black people engage in more crimes than their white counterparts.

Twitter was quick to label Trump’s posts as untruthful or glorifying violence. Facebook, on the other hand, has continued to claim the moral high ground of protecting freedom of speech. In October 2019, delivering a speech at Georgetown University, Facebook CEO Mark Zuckerberg said the company should “fight to uphold as wide a definition of freedom of expression as possible.” While Zuckerberg and other executives met with the organising civil rights groups last week to discuss further action, representatives from the groups said that Facebook had taken little to no tangible steps to address their concerns and meet their demands.

Freedom of Speech – A Philosophical Debate in Context

At the heart of the dilemma that Facebook faces is a simple philosophical question – should there be a limit on our rights and freedoms, and who decides that limit? A right to do or say something does not necessarily mean that the action is a good or desirable one. After all, anyone has the freedom to do something foolish or bad.

Historical philosophers like Kant believed that rights were absolute. More modern philosophers tend to lean toward the utilitarian side – that a right should not exist where it harms the wider community. People who engage in hate speech are likely to make the absolutist argument – that their right to freedom of speech protects any racist and discriminatory comments they make. Minority communities would take the utilitarian view – that absolute freedom of speech harms their welfare and livelihood.

Hate speech is not just speech. Framing the conversation about moderating hate speech solely in terms of the freedom of speech debate mischaracterises the problem. Hate speech is very much a racism and discrimination issue, and freedom of speech is merely a convenient excuse to continue allowing widespread bigotry. Allowing individuals to propagate harmful stereotypes and racial slurs lets them continue to cast minorities in the same negative light that their oppressors did. It allows systemic racism to survive.

More tangibly, hate speech on Facebook has been shown to lead to real violence. In 2017, Facebook failed to control the hate speech on its platform against Rohingya Muslims in Myanmar. The posts labelling the Rohingya people as dogs and rapists and encouraging that they be killed were not just posted by ordinary citizens but also included status updates from key government officials and military leaders. The situation eventually escalated into the widespread genocide of the Rohingya community. Aside from the scale of the consequences, how different is this from President Trump condoning the shooting of the “thug” protestors, which led to thousands of peaceful protestors being tear-gassed and shot at with rubber bullets?

It seems clear to me that Facebook should take steps to better moderate hate speech on its platforms regardless of whether it comes from elected officials or not. The current position shows a blatant disregard for the responsibility it has in ensuring its site is safe for all user groups.

Words by Dheepa M
