Human trafficking, warmongering? All this whistleblower says is wrong with Facebook
The world changed in 2004 when Harvard student Mark Zuckerberg, along with friends Eduardo Saverin, Dustin Moskovitz, and Chris Hughes, founded the social media giant Facebook. It set out to connect students across campus through features such as profiles, the Wall, and Status updates. The service grew rapidly, and in 2006 the company opened it to anyone over thirteen, not just students.
The attractiveness of Facebook stems in part from Zuckerberg’s insistence from the beginning that members be transparent about who they are. Management argued that transparency is necessary for forming personal relationships, sharing ideas, and building society. However, does that transparency apply to everyone? Does the company itself follow that rule? What’s wrong with Facebook?
Privacy remains an ongoing problem for Facebook. It first became a serious issue for the company in 2006, when Facebook introduced News Feed, which broadcast every change that a user's friends had made to their pages. After the company was ranked near the bottom in consumer surveys over privacy concerns, Facebook swiftly implemented privacy controls that let users decide what content appeared in News Feed.
More recently, Cambridge Analytica collected the personal data of millions of Facebook users through an app called "This Is Your Digital Life". The app used a series of questions to build psychological profiles and harvested personal data for ad targeting. The firm also provided analytical assistance to the 2016 presidential campaigns of Ted Cruz and Donald Trump.
In May of 2018, Cambridge Analytica filed for Chapter 7 bankruptcy. In July of 2019, the Federal Trade Commission fined Facebook $5 billion for privacy violations, finding that the company's breach of privacy exposed its users to "serious risk of harm".
What’s wrong with Facebook?
Frances Haugen, 37, is a data scientist from Iowa with a degree in computer engineering and a master's degree in business from Harvard. Over fifteen years, she has worked for companies including Google and Pinterest. She quit her most recent position, at Facebook, in May, taking with her a trove of internal documents to support her claim that the company wasn't honest with people about who it really was.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money,” she said. She claimed that the company is lying to the public about making significant progress against hate, violence, and misinformation.
After she came forward in the United States, British lawmakers took an interest in what she had to say, particularly a parliamentary committee that is further along in drafting legislation to crack down on social media platforms. She told the committee that Facebook Groups amplify hate online, claiming that the platform's algorithms prioritize engagement over mainstream interests and can push users toward radicalization.
“Unquestionably, it’s making hate worse,” she said.
She further illustrated what's wrong with Facebook by pointing to the company's decision to add 10,000 engineers to work on what it calls the "metaverse," which it believes is the next online trend. Her response: "Wow, do you know what we could have done with safety if we had 10,000 more engineers? It would be amazing."
Incapable of stopping worse than hate
Facebook has struggled for years to crack down on matters worse than hate speech. The company has known that human traffickers use the platform to sell people, especially teenage girls, into what they call "domestic servitude." The problem grew so severe overseas that in 2019, Apple threatened to pull Facebook, Messenger, and Instagram from its App Store. The result would have been devastating to Facebook's business.
“We prohibit human exploitation in no uncertain terms,” spokesperson Andy Stone said. “We’ve been combatting human trafficking on our platform for many years, and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.” While they continue to work on stopping issues like human trafficking, hate speech, and radicalization, gaps still exist.
Facebook kept these struggles secret from its investors, Haugen claimed: "Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products." Had Apple pulled the platform, the stock would likely have dropped alarmingly fast.
As more claims surface about Facebook and its practices, it becomes increasingly questionable whether the company follows Zuckerberg's original mission of transparency. The problem of hate speech on Facebook, and the radicalization of formerly moderate views, raises the question of how social media affects free speech, and of what role a company plays in its users' illegal actions.
It also raises the question of what happens when a company forgets its original mission and abandons it in the quest for more profit. As for what's wrong with Facebook, Frances Haugen believes it made that change long ago.