Mark Zuckerberg’s company says Facebook is not so bad: It’s just that people are careless.
At a conference last fall, Mark Zuckerberg said his company is “certainly not trying to undermine or basically, you know, throw aside the entire democratic process here.” If only that were true. Between a rampant disinformation campaign, alleged federal election interference and an erosion of privacy standards, Facebook has, for years, helped undermine democracies all over the world. Its B.S. defense — that you, dear user, have been careless, misinformed and clueless — has only compounded the problem.
But what do you expect from an executive whose most famous imperative (“move fast and break things”) is essentially: “Don’t bother me about the little stuff. I’ve got big ideas that I will think are right, often without really knowing why. And because they’re big, anyone who tries to put up obstacles is going to cause a big headache for everyone around us. So don’t be that guy. Come along for the ride and keep your mouth closed.”
Voters in the 2020 presidential election appeared to be better informed than in 2016, thanks to a hard-won determination not to get snowed. But big ideas are hard to undo. They continue to travel, especially in a system like Facebook’s, where ideas have a way of sliding from information to emotion to accepted fact.
“In places where fundamentally we did not have much to do with spreading misinformation, there has been a lot of, kind of, correction,” Mark Zuckerberg told ABC News last month, when asked about Facebook’s election-season performance in more than 80 countries around the world. Note the hedging (“fundamentally,” “much,” “kind of”), phrasing that conveys uncertainty and, especially now, comes off as defensive. Strip away the hedges, though, and the effect of the statement still holds: Even in places where Zuckerberg’s company hadn’t done a tremendous amount to disrupt the democratic process, democracy had already been sufficiently eroded.
In the United States, by all accounts, Facebook did do a tremendous amount to disrupt the democratic process — and continues to do so. Only recently did Facebook admit that roughly 80,000 posts produced by the Kremlin-linked Internet Research Agency may have been seen by 126 million people. These posts, designed to sow confusion and dissent among politically active Americans, could have proved decisive in certain areas and districts, and may therefore have had a real impact on the outcome of the 2016 presidential election. Since then, more than $5 billion in hardware, equipment and cybersecurity professionals has been allocated to securing the 2020 election, not least to prevent, identify and repair Facebook’s and other social media platforms’ contribution to the next Russian cyberattack on American democracy.
But Zuckerberg has been so careful not to admit that Facebook bears significant responsibility for the content posted on its platform that it sounded as though he was handing that particular link in the chain of accountability to someone else. In reality, he was trying to portray his company as a neutral arbiter of election fairness, a service that merely aggregates and disseminates information to the electorate, rather than as the powerful piece of technological infrastructure that it is.
Of course, Zuckerberg also benefits from framing the problem this way. Framed as user carelessness, it becomes a problem he can fix — while still convincing investors that Facebook is the indispensable intersection between people and brands. Between February 2020 and February 2021, his social media company’s stock price jumped more than 70 percent. He’s now worth more than $100 billion.
And in his own recent reckoning of the company’s performance, Zuckerberg explained that one of his personal goals for 2021 was to push back on “the idea that social media is inherently bad for well-being.” Excuse me?
Since its founding, Facebook has been an architecture of progressive disconnection. Part of the social contract it brokered was that each person would not have to endure the slings and arrows that the rest of humanity must bear. Facebook would be an unfettered, context-free environment. It wouldn’t be filtering shared ideas, tweaking or reconfiguring user experiences on the fly. It would not be defining, redefining or surveilling its users with a fair degree of precision.
Collaboration and information flow were intentionally cut off at the pass. Psychologists, linguists and social engineers quickly and quietly figured out how to turn that engine of emotional isolation into a megaproduction of both propaganda and profit. They understood that whoever controls access to the means of social communication controls not only what we say but also, in a very real sense, precisely how we speak.
For more than a decade Zuckerberg has been trying to persuade us to overlook the fractured digital sepsis beneath his domain’s sanitized surfaces. But because he, like all of us, is living in denial, his quarantine is starting to bite.
Nervously, the man who once told the world his mission was to “accelerate progress in every new field and on every new horizon” has finally come clean. Facebook, he says, can indeed be bad for your (mental) well-being.
Was it just that Facebook users have been careless — the company’s most recent defense — or does Facebook itself need some serious supervision?
The editors at Sputnik think so. Sputnik is funded by the Russian government and is considered by the United States government to be a disinformation outlet. Its top editors have been indicted by the U.S. Department of Justice for wire fraud, money laundering and conspiracy.
No one is suggesting that American publishers shouldn’t use Sputnik’s content or employ its journalists. Their mission is, to quote a bit from Jimmy Breslin’s classic “The Gang That Couldn’t Shoot Straight,” “the big story of our time”: the moment-by-moment breakdown of the future that information technology and the internet have precipitated. To understand the shape of that future, it seems both crucial and wise to intercept, synthesize, thrash and assimilate other journalists’ content and to seek out other voices and viewpoints. That kind of engagement can bring more to the table than either the bad guys or native intelligence sources might want to admit.
But if Sputnik’s content, observed over time, comes to fulfill its editors’ dismal mission and filters down through the system, resulting in electoral victory by subversion, conspiracy, wire fraud and money laundering — well, that would be a totally different story.
In the end, we may all be a bit careless. But before that happens to us — and before companies like Facebook make it seem normal — we need to make more serious demands of the social media platforms that have digitized our tribal instincts and monetized our most intimate personal relationships. We will have to push back more forcefully against the ambiguities and tribalist fallacies of an information-energy production line that attempts to camouflage itself as a democratic service.