The Facebook Files: A Clarion Call for Real Accountability and Transparency
Technology Policy Brief #63 | By: Scout Burchill | October 18, 2021
A remarkable investigative series published last month by the Wall Street Journal reveals the profound, and deeply disturbing, ways Facebook is warping our society. The Facebook Files, as they are called, expose bombshell revelations about the harms the biggest social media company in the world knowingly perpetrates. While much ink has been spilled on this topic before, the Facebook Files are already shaping up to be the most damning scandal to rock the company since Cambridge Analytica.
According to the Wall Street Journal, which received a wide-ranging cache of internal documents from Frances Haugen, a whistleblower within the company, Facebook maintains a secretive system of exclusive content moderation rules called ‘XCheck’ that applies only to about 5.8 million specially designated VIP users. A number of the reports, which feature company-conducted research, also confirm that Facebook is well aware of the harms its platforms have unleashed on society. For example, researchers found that Instagram has had a profoundly negative effect on teenage girls, with close to one-third of teen girls stating that when they feel bad about their bodies, Instagram makes them feel even worse. Among teens who reported having suicidal thoughts, 13% of British teens and 6% of American teens traced the desire to kill themselves back to Instagram.
Further internal documents reveal that 87% of all platform moderators’ time is spent on content in the United States, a massive imbalance considering that 90% of Facebook users live outside the United States and Canada. As a result of this underinvestment, the company knew about drug cartel operations and human trafficking in developing regions but did little about it. Equally damning, internal memos show that Facebook’s 2018 algorithm tweaks, which were made under the guise of incentivizing more meaningful social interactions, actually made the platform a measurably more effective manufacturer of outrage. Divisive and politically destabilizing content became even more rewarded. In fact, one Polish political party estimated that the 2018 algorithm tweaks shifted its content from about 50% negative to about 80% negative.
If there is one takeaway from the Facebook Files it is the undeniable confirmation that Facebook is well aware of the harms of its platforms, and is either unable or unwilling to fix them. Its own workforce is now raising the alarm, calling for sorely needed accountability.
Facebook is increasingly living up to the precedent set by the ‘Big Tech’ moniker. Like Big Oil or Big Tobacco, the social media company is extracting massive profits at the expense of society’s well-being. It is knowingly addicting users to cycles of manufactured outrage, polluting our mental health and poisoning our information ecosystems, all while engaging in sophisticated public relations campaigns and cover-ups. The key here is not merely that Facebook’s business model adversely affects society, but that Facebook knows far more about how its business adversely affects society than it cares to admit. Worse, the Facebook Files actually contradict much of what Facebook has publicly told us about its content moderation policies, algorithmic incentives and mental health impacts.
The Facebook Files, taken together with recent developments at Facebook, paint a picture of a company that is trying very hard to appear accountable, trustworthy and responsible, while actively undermining any real efforts to be so. Over the summer, Facebook banned a group of New York University researchers who were attempting to study political advertising on the platform. The researchers were crowdsourcing data on political ads from users through a browser extension called Ad Observer. Facebook claimed that the project put users’ information at risk by collecting their data; however, the extension’s open-source code shows this is simply not true. The reality is that Facebook is prioritizing the privacy of advertisers, who pay big bucks for targeted political ads, over true transparency.
Shortly after this incident, Facebook released a quarterly transparency report that proved to be anything but transparent. Not long after its release, the New York Times revealed that an earlier version of the report had been shelved out of fear it would make the company look bad.
For over a decade Facebook has tricked the public, and probably even itself for a time, into thinking that everything bad about Facebook can be fixed with a tweak of the algorithm. One upshot of this thinking is the assumption that Facebook is the actor best equipped to solve the problems of Facebook, because at the end of the day the platform’s problems are mainly technical in nature, like a lever gone awry or a dial turned too far up in the algorithmic amplification machine. Facebook goes to great lengths to prop up this fantasy because it allows the company to obscure its real business, which is fundamentally driven by profit, not good intentions. The Facebook Files shed light not only on the myriad ways Facebook’s policies affect society, but also on the company’s true priorities, which have always been maximizing profits by any means necessary.
The Facebook Files (writer’s source):
Facebook Bans Researchers (writer’s source):
Facebook’s Transparency Report (writer’s source):