After Mark Zuckerberg defended himself before the US Congress in April, Facebook came under intense scrutiny and criticism. Since then, the company has taken out a huge advertising campaign, mentioning within it the problem of fake news and stating that ‘this is going to change’.
In the last few weeks, reporters have been invited into Facebook’s offices to hear how exactly the company intends to do this.
In the presentation, Facebook stated that simply being false does not violate the company’s standards. Therefore, unless a post violates one of their rules, such as those against harassment, it will not be taken down.
They did, however, explain that while they will not remove these posts entirely, distribution will be cut for pages that repeatedly put out inaccurate information. This means the content would still be allowed to exist, but it would be demoted in the feed and made less visible to followers.
This has prompted further criticism, with some questioning why they would allow pages like Alex Jones’ InfoWars to exist and not just remove them entirely.
Facebook is a company. As a company, and not a democracy bound by values of freedom of speech and equal treatment, they have every right to police the content on their site based on their own rules and values.
If, for example, they did want to say that posts saying the earth is flat would be removed from their platform, then they could absolutely do that.
However, recent events have demonstrated that Facebook’s content does have the potential to have a significant impact on a democracy.
This is not surprising: with 2.23 billion monthly users, Facebook governs a huge proportion of the world’s communication and media consumption.
The consequence of this is that when it comes to political subjects, there is always going to be controversy over any decision to police the content Facebook chooses to display.
As an example, while global warming is widely accepted as a scientific fact, there are people who deny its existence and see it as ‘fake news’.
So which one is right and should Facebook have the right to remove one over the other?
Zuckerberg seems to recognise this problem with the decision not to remove content entirely, while also acknowledging that Facebook does have a responsibility as a global communication platform.
As he put it: “The top 100 things going viral, we do have a responsibility to make sure these things are not false”.
So the question now is: should new standards be defined for platforms such as Facebook? Is it now more than just a company?
At present, it appears that one man, Mark Zuckerberg, holds tremendous power. This is something that even he seems uncomfortable with, seemingly trying to pass the buck by saying, “we should be trying to figure out how to empower and build institutions around us that can figure out what to do about truth on the internet”.
This presents a remarkable challenge: who should be responsible for this, and what will the future of social media look like?