FBI warning prompted Facebook to suppress legitimate news story
What I learned from Joe Rogan's conversation with Mark Zuckerberg about censorship and content moderation.
Facebook founder Mark Zuckerberg recently appeared on the Joe Rogan Experience, on August 25th. For three hours, they explored a wide range of topics. I listened to the podcast twice, and it changed many of my perceptions of Zuckerberg, of Facebook, and of how they are portrayed in the media on censorship and content moderation. Zuckerberg founded Facebook back in 2004 in his dorm room at Harvard, with college students as the original target audience. Facebook exploded in popularity, expanding first to other colleges and then to the general population. He never imagined it would become the social media empire it is today, with over 3 billion active users worldwide. Over the years, Facebook has been no stranger to criticism and controversy as it has become the world’s premier social media outlet.
1. Zuckerberg is more interested in technology and innovation than content moderation.
According to Zuckerberg, his main focus has always been creating new technology that empowers people, connects them, gives them the content they want, and allows them to express themselves. The interview essentially splits into two parts. In the first, they talk about Meta’s upcoming Oculus device, the Metaverse, virtual vs. augmented reality, technology’s role in society, and the future of technology. In the second, they move to more controversial subjects: tech censorship, misinformation, algorithms, tech addiction, and the darker sides of tech. Zuckerberg is visibly more engaged and passionate when talking about new technology and its applications; you can see it in his animated hand gestures and his enthusiasm. As the conversation transitions to the latter half, that enthusiasm drops off noticeably, and he even remarks that he would much rather talk about technological innovation than censorship.
2. Algorithms are neither good nor bad, but they do have a big impact on the content that you see.
Over the years, Facebook has faced a litany of criticism over its use of algorithms and how they influence users. Zuckerberg explained how the platform’s ranking systems curate content, surfacing certain posts at the top of your feed. The question he posed was how to curate content so that you are more likely to see the posts you care about and that interest you. He acknowledged that these algorithms aren’t perfect: they show you the content they predict you will like, based on your search history, your likes, and so on. Rogan told a story about a friend who looked only at puppy posts; before long, his feed showed nothing but puppy-related content. Zuckerberg argued that these algorithms empower people to express what they want and get the content they want. They aren’t perfect, but some sort of ranking system needs to be in place. The alternatives would be entirely random content or a strictly chronological feed, which businesses could exploit by spamming. “Whenever we don’t show people what they want, we lose in the marketplace,” Zuckerberg argued. “Our role is not to tell people what to think but to empower people to pursue their interests and express themselves.” To me, algorithms are just a tool, and the criticism of algorithms is really a criticism of human behavior: they give you what you want. If you are a staunch conservative or liberal, you will probably see posts reinforcing your values and beliefs, not challenging them. The question then becomes: how do you find the right balance when it comes to introducing new content? That question has no simple answer.
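To make this concrete, here is a minimal sketch in Python of how an engagement-based ranking loop behaves. This is my own illustration, not Facebook’s actual system; the topics, counts, and weights are all made up.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    recency: float  # 0..1, newer is higher

# Hypothetical engagement history: counts of past interactions per topic.
engagement = {"puppies": 42, "politics": 3, "cooking": 1}

def predicted_interest(post: Post) -> float:
    """Score a post by how often the user engaged with its topic before."""
    total = sum(engagement.values())
    return engagement.get(post.topic, 0) / total

def rank_feed(posts: list[Post]) -> list[Post]:
    # Blend predicted interest with recency; the 0.8/0.2 weights are arbitrary.
    return sorted(
        posts,
        key=lambda p: 0.8 * predicted_interest(p) + 0.2 * p.recency,
        reverse=True,
    )

feed = rank_feed([
    Post("politics", recency=0.9),
    Post("puppies", recency=0.2),
    Post("cooking", recency=0.8),
])
print([p.topic for p in feed])  # puppies ranks first, despite being the oldest post
```

Every new interaction feeds back into the engagement counts, which is why a feed like the one in Rogan’s story converges on puppies: the more you click on a topic, the higher it scores the next time the feed is ranked.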
3. Facebook has been criticized from both sides of the political aisle for the way it handles censorship and misinformation.
Zuckerberg is in a tough spot. He leads a platform that hosts over 3 billion people, and Facebook has the daunting task of regulating what billions of people can and can’t say on its platform. How do you decide what gets deleted, shadow banned, or banned entirely? Here we have the two competing values of free speech and public and individual safety. Now we ask: what content is acceptable, and what is not? I agree with Zuckerberg when he says we can all agree on most of the content that shouldn’t be allowed: pornography, threats of violence, calls to violence, racial slurs, and the like. The line gets blurry when you have ideas that are open to interpretation, not so black-and-white, or not yet confirmed. For example, when the Hunter Biden laptop story broke (more on this later), conservatives criticized Facebook for shadow banning the story, while liberals criticized it for not banning the story altogether, as Twitter did. This sparked a storm of controversy and debate over free speech and censorship. Zuckerberg expressed remorse, acknowledging that Facebook sometimes makes mistakes in censoring information and regulating content: “There’s no ideological bent. When we don’t get it right, that sucks.”
4. Zuckerberg has mostly removed himself from the content moderation process.
Zuckerberg’s approach has been to establish what he calls “principles of governance” in an effort to de-concentrate decision making. He likened his approach to the federal government’s separation of powers. He has outsourced content moderation to independent, third-party oversight boards. These outside institutions act as fact-checkers, labeling what they deem false or inaccurate information as ‘misinformation.’ The problem here is that the truth is not always black-and-white. Many of these so-called ‘facts’ have a lot of nuance and are open to interpretation. How do you define what is true on a subject where many experts don’t even agree? Even worse, posts previously labeled as misinformation have later turned out to be true. Who is fact-checking the fact-checkers? Are these people really unbiased? How do you choose these fact-checkers? All fair points brought up by Rogan that I agree with. Facebook may not have a bias, but these ‘fact-checkers’ definitely might.
5. The FBI influenced Facebook’s content moderation.
The most shocking revelation came when we learned that the FBI had warned Facebook to be on the lookout for ‘Russian misinformation.’ Shortly after, the infamous Hunter Biden laptop story broke in October 2020, just weeks before the presidential election between incumbent President Donald Trump and Joe Biden. Facebook didn’t go as far as banning the story, but Zuckerberg admitted to limiting its spread so that fewer people would see it. That story turned out to be true. The most powerful federal law enforcement agency in the country influenced two of the largest social media platforms to censor a legitimate news story in the months preceding a presidential election. This raises the questions: did the FBI know the story would break, and did it try to snuff the story out? Was this a conspiratorial attempt to influence the outcome of the 2020 election? We don’t know for sure, but this is a red flag, no doubt!
So how should content be moderated, if at all? What content is acceptable, and what is not? There is a lot of gray area here. It’s important to think these issues through carefully before coming to any conclusion. Again, complex issues like these don’t have simple solutions. What are your thoughts? Let me know.