Facebook Can't Regulate Itself
Thinking that a company can learn to regulate itself in the interest of the public good seems to be a fallacy in the case of Facebook. This goes beyond the discussion about user privacy online, which seems to be overshadowed by the spiraling concern about fake news, hate speech, conspiracy theories, and calls to incite violence all over the world.
I deleted my Facebook account, so I can't experience first-hand many of the topics discussed here, but the platform seems to be going in one direction and one direction only. The biggest concern with Facebook is that it has become an amplifier of any kind of content. It implemented content policies that allowed moderators to act on anything that didn't conform to what was expected:
Charlotte Willner joined three years later, as one of the company’s first employees to moderate content on the site. At the time, she said, the written guidelines were about a page long; around the office, they were often summarized as, “If something makes you feel bad in your gut, take it down.”
But human moderation is expensive and usually slow. It also strains the people doing it, because they are exposed to every kind of material, from child exploitation to calls for violence against groups of people.
Two months later, Gray hired a local law firm and sued Facebook in Irish High Court, alleging that his “repeated and unrelenting exposure to extremely disturbing, graphic and violent content” had caused him lasting psychological trauma. Shortly thereafter, about twenty more former Facebook moderators in Dublin contacted the law firm representing Gray to ask about possible lawsuits against the company.
There is plenty of data showing that Facebook is plainly failing at moderating content. A particular challenge appeared when a figure such as Trump posted hate speech on his account. Can you go against an elected president? And what if he were the president of a different country?
Since taking office, in 2019, he has delivered a weekly Presidential address on Facebook Live. Earlier this year, during one speech, he said of Brazil’s indigenous population, “The Indian has changed. He is evolving and becoming, more and more, a human being like us.”
Bolsonaro's office was approached before the elections about running campaigns on Facebook. It is much cheaper than TV or newspapers, and it can have a profound impact. The proposition is not unreasonable, but it also turns moderation into a conflict of interest: if you are getting paid, what incentive do you have to mute your source of income?
In 2019, Zuckerberg advanced the idea that Facebook was due for a change. This was framed within the slogan "The future is private", acknowledging a trend that some call the Dark Forest Theory. It is also important to point out that what Zuckerberg defines as "private" is not what anyone else defines as privacy, so be mindful of these subtle language differences.
“The future is private,” Zuckerberg told the crowd, noting that Facebook’s most dominant vision over the last decade was to build global communities that would bring the world together, for better or worse. “Over time, I believe that a private social platform will be even more important to our lives than our digital town squares. So today, we’re going to start talking about what this could look like as a product, what it means to have your social experience be more intimate, and how we need to change the way we run this company in order to build this.”
In Facebook's view, the privacy it is craving is related to the use of groups: potentially closed, tight communities of users that can exchange ideas, links, or whatever else. However, the only way groups can kickstart is by shoving them in users' faces.
At this point, Facebook is working so hard to ignore expert advice on how to reduce toxicity that it looks like it doesn't want to improve in any meaningful way. Its leadership simply doesn't seem to care how much harm the platform causes as long as the money keeps rolling in.
Groups on Facebook can still share any type of content; just because they may be "private" doesn't mean that hate speech suddenly disappears. The real issue with siloing the same kind of information among the same groups of people is that it becomes almost impossible for other users to flag content (they won't see it) or to judge what the platform is doing.
How much blame does Facebook deserve for QAnon’s growth?
When Facebook changed its focus to encourage people to gravitate to smaller, more intimate groups, it inadvertently created safe havens for people to discuss how to spread QAnon theories.
Facebook needs to ask itself if it has a responsibility for fueling QAnon and think through the consequences of that.
So far, every step Facebook has taken points toward optimizing its revenue without even tangentially addressing the problems it generates, not only online but also in the real world.