'History Will Not Judge Us Kindly'
Thousands of pages of internal documents offer the clearest picture yet of how Facebook endangers American democracy—and show that the company's own employees know it.

About the author: Adrienne LaFrance is the executive editor of The Atlantic. She was previously a senior editor and staff writer at The Atlantic, and the editor of TheAtlantic.com.
Before I tell you what happened at exactly 2:28 p.m. on Wednesday, January 6, 2021, at the White House—and how it elicited a very specific reaction, some 2,400 miles away, in Menlo Park, California—you need to remember the commotion of that day, the exuberance of the mob as it gave itself over to violence, and how several things seemed to happen all at once.
At 2:10 p.m., a live microphone captured a Senate aide's panicked warning that "protesters are in the building," and both houses of Congress began evacuating.
At 2:13 p.m., Vice President Mike Pence was hurried off the Senate floor and out of the chamber.
At 2:15 p.m., thunderous chants were heard: "Hang Mike Pence! Hang Mike Pence!"
At the White House, President Donald Trump was watching the coup live on television. The spectacle excited him. Which brings us to 2:28 p.m., the moment when Trump shared with his 35 million Facebook followers a message he had just tweeted: "Mike Pence didn't have the courage to do what should have been done to protect our Country and our Constitution … USA demands the truth!"
Even for Americans inured to the president's thumbed outbursts, Trump's attack against his own vice president—at a moment when Pence was being hunted by the mob Trump had sent to the Capitol—was something else entirely. Horrified Facebook employees scrambled to enact "break the glass" measures, steps they could take to quell the further use of their platform for inciting violence. That evening, Mark Zuckerberg, Facebook's founder and CEO, posted a message on Facebook's internal chat platform, known as Workplace, under the heading "Employee FYI."
"This is a dark moment in our nation's history," Zuckerberg wrote, "and I know many of you are frightened and concerned about what's happening in Washington, DC. I'm personally saddened by this mob violence."
Facebook staffers weren't sad, though. They were angry, and they were very specifically angry at Facebook. Their message was clear: This is our fault.
Chief Technology Officer Mike Schroepfer asked employees to "hang in there" as the company figured out its response. "We have been 'hanging in there' for years," one person replied. "We must demand more action from our leaders. At this point, faith alone is not sufficient."
"All due respect, but haven't we had enough time to figure out how to manage discourse without enabling violence?" another staffer responded. "We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."
"I'm tired of platitudes; I want action items," another staffer wrote. "We're not a neutral entity."
"One of the darkest days in the history of democracy and self-governance," yet another staffer wrote. "History will not judge us kindly."
Facebook employees have long understood that their company undermines democratic norms and restraints in America and across the world. Facebook's hypocrisies, and its hunger for power and market domination, are not secret. Nor is the company's conflation of free speech and algorithmic amplification. But the events of January 6 proved for many people—including many in Facebook's workforce—to be a breaking point.
The Atlantic reviewed thousands of pages of documents from Facebook, including internal conversations and research conducted by the company, from 2017 to 2021. Frances Haugen, the whistleblower and former Facebook employee who testified before Congress earlier this month, filed a series of disclosures about Facebook to the Securities and Exchange Commission and to Congress before her testimony. Redacted versions of those documents were obtained by a consortium of more than a dozen news organizations, including The Atlantic. The names of Facebook employees are mostly blacked out.
The documents are astonishing for two reasons: First, because their sheer volume is unbelievable. And second, because these documents leave little room for doubt about Facebook's crucial role in advancing the cause of authoritarianism in America and around the world. Authoritarianism predates the rise of Facebook, of course. But Facebook makes it much easier for authoritarians to win.
Again and again, the Facebook Papers show staffers sounding alarms about the dangers posed by the platform—how Facebook amplifies extremism and misinformation, how it incites violence, how it encourages radicalization and political polarization. Again and again, staffers reckon with the ways in which Facebook's decisions stoke these harms, and they plead with leadership to do more.
And again and again, staffers say, Facebook's leaders ignore them.
By nightfall on January 6, 2021, the siege had ended, though not without fatalities. Washington's mayor had issued a citywide curfew and the National Guard was patrolling the streets. Facebook announced that it would lock Trump's account, effectively preventing him from posting on the platform for 24 hours.
"Do you genuinely think 24 hours is a meaningful ban?" one Facebook staffer wrote on an internal message board. The staffer then turned, just as others had, to the years of failures and inaction that had preceded that day. "How are we expected to ignore when leadership overrides research based policy decisions to better serve people like the groups inciting violence today. Rank and file workers have done their part to identify changes to improve our platform but have been actively held back. Can you offer any reason we can expect this to change in the future."
It was a question without a question mark. The employee seemed to know that there wouldn't be a satisfying answer.
Facebook later extended the ban at least until the end of Trump's presidential term, then, when Facebook's Oversight Board ruled against imposing an indefinite ban, it extended the temporary ban until at least January 7, 2023. But for some Facebook employees, the decision to crack down on Trump for inciting violence was comically overdue. Facebook had finally acted, but to many at the company, it was too little, too late. For months, Trump had incited the insurrection—in plain sight, on Facebook.
Facebook has dismissed the concerns of its employees in manifold ways. One of its cleverer tactics is to argue that staffers who have raised the alarm about the damage done by their employer are simply enjoying Facebook's "very open culture," in which people are encouraged to share their opinions, a spokesperson told me. This stance allows Facebook to claim transparency while ignoring the substance of the complaints, and the implication of the complaints: that many of Facebook's employees believe their company operates without a moral compass.
"Employees have been crying out for months to start treating high-level political figures the same way we treat each other on the platform," one employee wrote in the January 6 chat. "That's all we're asking for … Today, an insurrection was attempted against the United States. I hope the circumstances aren't even more dire next time we speak."
Rewind two months to November 4, 2020, the day after the presidential election. The outcome of the election was still unknown when a 30-year-old political activist created a Facebook group called "Stop the Steal."
"Democrats are scheming to disenfranchise and nullify Republican votes," the group's manifesto read. "It's up to us, the American people, to fight and to put a stop to it." Within hours, "Stop the Steal" was growing at a mind-scrambling rate. At one point it was acquiring 100 new members every 10 seconds. It soon became one of the fastest-growing groups in Facebook history.
As "Stop the Steal" metastasized, Facebook employees traded messages on the company's internal chat platform, expressing anxiety about their role in spreading election misinformation. "Not only do we not do something about combustible election misinformation in comments," one wrote on November 5; "we amplify them and give them broader distribution. Why?"
By then, less than 24 hours after the group's creation, "Stop the Steal" had grown to 333,000 members, and the group's administrator couldn't keep up with the pace of commenting. Facebook employees were worried that "Stop the Steal" members were inciting violence, and the group came to the attention of executives. Facebook, to its credit, promptly shut down the group. But we now know that "Stop the Steal" had already reached too many people, too quickly, to be contained. The movement jumped from one platform to another. And even after the group was removed by Facebook, the platform remained a key hub for people to coordinate the attack on the U.S. Capitol.
After the best-known "Stop the Steal" Facebook group was dismantled, copycat groups sprang up. All the while, the movement was encouraged by President Trump, who posted to Facebook and Twitter, sometimes a dozen times a day, his complaint always the same—he won, and Joe Biden lost. His demand was always the same too: It was time for his supporters to fight for him and for their country.
Never before in the history of the Justice Department has an investigation been so tangled up with social media. Facebook is omnipresent in the related court documents, woven throughout the stories of how people came to be involved in the riot in the first place, and reappearing in accounts of chaos and death. More than 600 people have been charged with crimes in connection with January 6. Court documents also detail how Facebook provided investigators with identifying data about its users, as well as metadata that investigators used to confirm alleged perpetrators' whereabouts that day. Taken in aggregate, these court documents from January 6 are themselves a kind of facebook, one filled with selfies posted on Facebook apps over the course of the coup.
On a bright, chilly Wednesday weeks after the insurrection, when FBI agents finally rolled up to Russell Dean Alford's Paint & Body Shop in Hokes Bluff, Alabama, they said Alford's reaction was this: "I wondered when y'all were going to show up. Guess you've seen the videos on my Facebook page." Alford pleaded not guilty to four federal charges, including knowingly entering a restricted building and disorderly conduct.
Not only were the perpetrators live-streaming their crimes as they committed them, but federal court records show that those who have been indicted spent many weeks stoking violence on Facebook with posts such as "NO EXCUSES! NO RETREAT! NO SURRENDER! TAKE THE STREETS! TAKE BACK OUR COUNTRY! 1/6/2021=7/4/1776" and "Grow a pair of balls and take back your government!"
When you stitch together the stories that spanned the period between Joe Biden's election and his inauguration, it's easy to see Facebook as instrumental to the attack on January 6. (A spokesperson told me that the notion that Facebook played an instrumental role in the insurrection is "absurd.") Consider, for example, the case of Daniel Paul Gray. According to an FBI agent's affidavit, Gray posted several times on Facebook in December about his plans for January 6, commenting on one post, "On the 6th a f[*]cking sh[*]t ton of us are going to Washington to shut the entire city down. It's gonna be insane I literally can't wait." In a private message, he bragged that he'd just joined a militia and also sent a message saying, "are you gonna be in DC on the 6th like trump asked us to be?" Gray was later indicted on nine federal charges, including obstruction of an official proceeding, engaging in acts of physical violence, violent entry, assault, and obstruction of law enforcement. He has pleaded not guilty to all of them.
Then there's the case of Cody Page Carter Connell, who allegedly encouraged his Facebook friends to join him in D.C. on January 6. Connell ended up charged with eight federal crimes, and he pleaded not guilty to all of them. After the coup, according to an FBI affidavit, he boasted on Facebook about what he'd done.
"We pushed the cops against the wall, they dropped all their gear and left," he wrote in one message.
"Yall boys something serious, lol," someone replied. "It lookin like a civil war yet?"
Connell's response: "It's gonna come to it."
All over America, people used Facebook to organize caravans to D.C., and to fill the buses they rented for their trips. Facebook users shared and reshared messages like this one, which appeared before dawn on Christmas Eve in a Facebook group for the Lebanon Maine Truth Seekers:
This election was stolen and we are being slow walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people. Would there be an interest locally in organizing a caravan to Washington DC for the Electoral College vote count on Jan 6th, 2021? I am arranging the time off and will be a driver if anyone wishes to hitch a ride, or a lead for a caravan of vehicles. If a call went out for able bodies, would there be an answer? Merry Christmas.
The post was signed by Kyle Fitzsimons, who was later indicted on charges including attacking police officers on January 6. Fitzsimons has pleaded not guilty to all eight federal charges against him.
You may be thinking: It's 2021; of course people used Facebook to plan the coup. It's what they use to plan all aspects of their lives. But what emerges from a close reading of Facebook documents, and observation of the manner in which the company connects large groups of people quickly, is that Facebook isn't a passive tool but a catalyst. Had the organizers tried to plan the rally using other technologies of earlier eras, such as telephones, they would have had to identify and reach out individually to each prospective participant, then persuade them to travel to Washington. Facebook made people's efforts at coordination highly visible on a global scale. The platform not only helped them recruit participants but offered people a sense of strength in numbers. Facebook proved to be the perfect hype machine for the coup-inclined.
Among those charged with answering Trump's call for revolution were 17 people from Florida, Ohio, North Carolina, Georgia, Alabama, Texas, and Virginia who allegedly coordinated on Facebook and other social platforms to join forces with the far-right militia known as the Oath Keepers. One of these people, 52-year-old Kelly Meggs from rural Florida, allegedly participated with his wife in weapons training to prepare for January 6.
"Trump said It's gonna be wild!!!!!!!" Meggs wrote in a Facebook message on December 22, according to an indictment. "It's gonna be wild!!!!!!! He wants us to make it WILD that's what he's saying. He called us all to the Capitol and wants us to make it wild!!! Sir Yes Sir!!! Gentlemen we are heading to DC pack your shit!!" Meggs and his Facebook friends arrived in Washington with paramilitary gear and battle-ready supplies—including radio equipment, camouflage combat uniforms, helmets, eye protection, and tactical vests with plates. They're charged with conspiracy against the United States. Meggs has pleaded not guilty to all charges. His wife, Connie Meggs, has a trial date set for January 2022.
Ronald Mele, a 51-year-old California man, also used Facebook to share his plans for the insurrection, writing in a December Facebook post that he was taking a road trip to Washington "to support our President on the 6th and days to follow just in case," according to his federal indictment. Prosecutors say he and five other men mostly used the chat app Telegram to make their plans—debating which firearms, shotgun shells, and other weapons to bring with them and referring to themselves as soldiers in the "DC Brigade"—and three of them posted to Instagram and Facebook about their plans as well. On January 2, four members of the group met at Mele's house in Temecula, about an hour north of San Diego. Before they loaded into an SUV and set out across the country, someone suggested that they take a group photo. The men posed together, making hand gestures associated with the Three Percenters, a far-right militia movement that's classified as a terrorist organization in Canada. (Mele has pleaded not guilty to all four charges against him.)
On January 6, federal prosecutors say, members of the DC Brigade were among the rioters who broke through the final police line, giving the mob access to the West Terrace of the Capitol. At 2:30 p.m., just after President Trump egged on the rioters on Facebook, Mele and company were on the West Terrace celebrating, taking selfies, and shouting at fellow rioters to go ahead and enter the Capitol. One of the men in the group, Alan Hostetter, a 56-year-old from San Clemente, posted a selfie to his Instagram account, with a crowd of rioters in the background. Hostetter, who has pleaded not guilty to all charges, tapped out a caption to go with the photo: "This was the 'shot heard 'round the world!' … the 2021 version of 1776. That war lasted 8 years. We are just getting warmed up."
In November 2019, Facebook staffers noticed they had a serious problem. Facebook offers a collection of one-tap emoji reactions. Today, they include "like," "love," "care," "haha," "wow," "sad," and "angry." Company researchers had found that the posts dominated by "angry" reactions were substantially more likely to go against community standards, including prohibitions on various types of misinformation, according to internal documents.
But Facebook was slow to act. In July 2020, researchers presented the findings of a series of experiments. At the time, Facebook was already weighting the reactions other than "like" more heavily in its algorithm—meaning posts that got an "angry" reaction were more likely to show up in users' News Feeds than posts that simply got a "like." Anger-inducing content didn't spread just because people were more likely to share things that made them angry; the algorithm gave anger-inducing content an edge. Facebook's Integrity workers—employees tasked with tackling problems such as misinformation and espionage on the platform—concluded that they had good reason to believe targeting posts that induced anger would help stop the spread of harmful content.
By dialing anger's weight back to zero in the algorithm, the researchers found, they could keep posts to which people reacted angrily from being viewed by as many users. That, in turn, translated to a significant (up to 5 percent) reduction in the hate speech, civic misinformation, bullying, and violent posts—all of which are correlated with offline violence—to which users were exposed. Facebook rolled out the change in early September 2020, documents show; a Facebook spokesperson confirmed that the change has remained in effect. It was a real victory for employees of the Integrity team.
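To make the mechanism concrete: the documents describe reaction weighting only at a high level, but the idea reduces to a weighted sum. Below is a minimal, hypothetical Python sketch of reaction-weighted ranking and of what dialing the "angry" weight to zero does. The specific weights, function names, and example posts are assumptions for illustration—none of this is Facebook's actual code.

```python
# Hypothetical sketch of reaction-weighted feed ranking; not Facebook's code.
# Each post earns an engagement score from weighted reaction counts, and the
# feed surfaces the highest-scoring posts first.

REACTION_WEIGHTS = {
    "like": 1.0,
    "love": 5.0,   # assumed weights: emoji reactions count more than a "like"
    "haha": 5.0,
    "wow": 5.0,
    "sad": 5.0,
    "angry": 5.0,  # before the change, anger boosts distribution too
}

def engagement_score(reactions, weights):
    """Sum each reaction count multiplied by its weight."""
    return sum(weights.get(name, 0.0) * count for name, count in reactions.items())

def rank_feed(posts, weights):
    """Order candidate posts by descending engagement score."""
    return sorted(posts, key=lambda p: engagement_score(p["reactions"], weights),
                  reverse=True)

# The Integrity team's fix, as the documents describe it: dial anger's
# weight back to zero, so angry reactions no longer amplify a post.
fixed_weights = {**REACTION_WEIGHTS, "angry": 0.0}

posts = [
    {"id": "outrage-bait", "reactions": {"angry": 900, "like": 50}},
    {"id": "benign-update", "reactions": {"like": 600, "love": 80}},
]

print([p["id"] for p in rank_feed(posts, REACTION_WEIGHTS)])
# -> ['outrage-bait', 'benign-update']
print([p["id"] for p in rank_feed(posts, fixed_weights)])
# -> ['benign-update', 'outrage-bait']
```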
But it doesn't usually work out that way. In April 2020, according to Frances Haugen's filings with the SEC, Facebook employees had recommended tweaking the algorithm so that the News Feed would deprioritize the surfacing of content for people based on their Facebook friends' behavior. The idea was that a person's News Feed should be shaped more by people and groups that a person had chosen to follow. Up until that point, if your Facebook friend saw a conspiracy theory and reacted to it, Facebook's algorithm might show it to you, too. The algorithm treated any engagement in your network as a signal that something was worth sharing. But now Facebook workers wanted to build circuit breakers to slow this form of sharing.
Experiments showed that this change would impede the distribution of hateful, polarizing, and violence-inciting content in people's News Feeds. But Zuckerberg "rejected this intervention that could have reduced the risk of violence in the 2020 election," Haugen's SEC filing says. An internal message characterizing Zuckerberg's reasoning says he wanted to avoid new features that would get in the way of "meaningful social interactions." But according to Facebook's definition, its employees say, engagement is considered "meaningful" even when it entails bullying, hate speech, and reshares of harmful content.
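Haugen's filings describe the proposed circuit breaker only in outline. As a rough illustration, assuming a feed that tags each candidate post with the reason it was selected, the recommended tweak might look like the sketch below; the class, field names, and demotion factor are all hypothetical.

```python
# Hypothetical sketch of the proposed "circuit breaker"; not Facebook's code.
# Candidate posts carry a tag explaining why they were selected: the user
# follows the author or group, or a friend merely engaged with the post.

from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    source: str    # "followed" or "friend_engaged"
    score: float   # engagement-based ranking score

def apply_circuit_breaker(candidates, demotion=0.1):
    """Demote posts surfaced only because friends engaged with them, so the
    feed is shaped more by people and groups the user chose to follow."""
    for c in candidates:
        if c.source == "friend_engaged":
            c.score *= demotion  # friends' reactions become a weak signal
    return sorted(candidates, key=lambda c: c.score, reverse=True)

feed = apply_circuit_breaker([
    Candidate("conspiracy-post", source="friend_engaged", score=8.0),
    Candidate("followed-page-post", source="followed", score=3.0),
])
print([c.post_id for c in feed])
# -> ['followed-page-post', 'conspiracy-post']
```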
This episode, like Facebook's response to the incitement that proliferated between the election and January 6, reflects a fundamental problem with the platform. Facebook's megascale allows the company to influence the speech and thought patterns of billions of people. What the world is seeing now, through the window provided by reams of internal documents, is that Facebook catalogs and studies the harm it inflicts on people. Then it keeps harming people anyway.
"I am worried that Mark's continuing pattern of answering a different question than the question that was asked is a symptom of some larger problem," wrote one Facebook employee in an internal post in June 2020, referring to Zuckerberg. "I sincerely hope that I am wrong, and I'm still hopeful for progress. But I also fully understand my colleagues who have given up on this company, and I can't blame them for leaving. Facebook is not neutral, and working here isn't either."
"I just wish we could hear the truth directly," another added. "Everything feels like we (the employees) are being intentionally deceived."
I've been covering Facebook for a decade now, and the challenges it must navigate are novel and singularly complex. One of the most important, and heartening, revelations of the Facebook Papers is that many Facebook workers are trying conscientiously to solve these problems. One of the disheartening features of these documents is that these same employees have little or no faith in Facebook leadership. It is quite a thing to see the sheer number of Facebook employees—people who presumably understand their company as well as or better than outside observers do—who believe their employer to be morally bankrupt.
I spoke with several former Facebook employees who described the company's metrics-driven culture as extreme, even by Silicon Valley standards. (I agreed not to name them, because they feared retaliation and ostracization from Facebook for talking about the company's inner workings.) Facebook workers are under tremendous pressure to quantitatively demonstrate their individual contributions to the company's growth goals, they told me. New products and features aren't approved unless the staffers pitching them demonstrate how they will drive engagement. As a result, Facebook has stoked an algorithm arms race within its ranks, pitting core product-and-engineering teams, such as the News Feed team, against their colleagues on Integrity teams, who are tasked with mitigating harm on the platform. These teams establish goals that are often in direct conflict with each other.
One of Facebook's Integrity staffers wrote at length about this dynamic in a goodbye note to colleagues in August 2020, describing how risks to Facebook users "fester" because of the "asymmetric" burden placed on employees to "demonstrate legitimacy and user value" before launching any harm-mitigation tactics—a burden not shared by those developing new features or algorithm changes with growth and engagement in mind. The note said:
We were willing to act only after things had spiraled into a dire state … Personally, during the time that we hesitated, I've seen folks from my hometown go further and further down the rabbithole of QAnon and Covid anti-mask/anti-vax conspiracy on FB. It has been painful to observe.
Current and former Facebook employees describe the same fundamentally broken culture—one in which effective tactics for making Facebook safer are rolled back by leadership or never approved in the first place. (A Facebook spokesperson rejected the notion that the company deprioritizes the well-being of its users.) That broken culture has produced a broken platform: an algorithmic ecosystem in which users are pushed toward ever more extreme content, and where Facebook knowingly exposes its users to conspiracy theories, disinformation, and incitement to violence.
One example is a program that amounts to a whitelist for VIPs on Facebook, allowing some of the users most likely to spread misinformation to break Facebook's rules without facing consequences. Under the program, internal documents show, millions of high-profile users—including politicians—are left alone by Facebook even when they incite violence. Some employees have flagged for their superiors how dangerous this is, explaining in one internal document that Facebook had solid evidence showing that when "a piece of content is shared by a co-partisan politician, it tends to be perceived as more trustworthy, interesting, and helpful than if it's shared by an ordinary citizen." In other words, whitelisting influential users with massive followings on Facebook isn't just a secret and uneven application of Facebook's rules; it amounts to "protecting content that is especially likely to deceive, and hence to harm, people on our platforms."
Facebook workers tried and failed to end the program. Only when its existence was reported in September by The Wall Street Journal did Facebook's Oversight Board ask leadership for more information about the practice. Last week, the board publicly rebuked Facebook for not being "fully forthcoming" about the program. (Although Oversight Board members are selected by Facebook and paid by Facebook, the company characterizes their work as independent.)
The Facebook Papers show that workers agonized over trade-offs between what they saw as doing the right thing for the world and doing the right thing for their employer. "I am so torn," one employee wrote in December 2020 in response to a colleague's comments on how to fight Trump's hate speech and incitements to violence. "Following these recommendations could hasten our own demise in a variety of ways, which might interfere [with] all the other good we do in the world. How do you weigh these impacts?" Messages show workers wanting Facebook to make honorable choices, and worrying that leadership is incapable of doing so. At the same time, many clearly believe that Facebook is still a net force for good, and they also worry about hurting the platform's growth.
These worries have been exacerbated lately by fears about a decline in new posts on Facebook, two former employees who left the company in recent years told me. People are posting new material less frequently to Facebook, and its users are on average older than those of other social platforms. The explosive popularity of platforms such as TikTok, especially among younger people, has rattled Facebook leadership. All of this makes the platform rely more heavily on ways it can manipulate what its users see in order to reach its goals. This explains why Facebook is so dependent on the infrastructure of groups, as well as on making reshares highly visible, to keep people hooked.
But this approach poses a major problem for the overall quality of the site, and former Facebook employees repeatedly told me that groups pose one of the biggest threats of all to Facebook users. In a particularly fascinating document, Facebook workers outline the downsides of "community," a buzzword Zuckerberg often deploys as a way to justify the platform's existence. Zuckerberg has defined Facebook's mission as building "social infrastructure to give people the power to build a global community that works for all of us," but in internal research documents his employees point out that communities aren't always good for society:
When part of a community, individuals typically act in a prosocial manner. They conform, they forge alliances, they cooperate, they organize, they display loyalty, they expect obedience, they share information, they influence others, and so on. Being in a group changes their behavior, their abilities, and, importantly, their capability to harm themselves or others … Thus, when people come together and form communities around harmful topics or identities, the potential for harm can be greater.
The infrastructure choices that Facebook is making to keep its platform relevant are driving down the quality of the site, and exposing its users to more dangers. Those dangers are also unevenly distributed, because of the way in which certain subpopulations are algorithmically ushered toward like-minded groups. And the subpopulations of Facebook users who are most exposed to dangerous content are also most likely to be in groups where it won't get reported.
Many Facebook employees believe that their company is hurting people. Many have believed this for years. And even they can't stop it. "We can't pretend we don't see information consumption patterns, and how deeply problematic they are for the longevity of democratic discourse," a user-experience researcher wrote in an internal comment thread in 2019, in response to a now-infamous memo from Andrew "Boz" Bosworth, a longtime Facebook executive. "There is no neutral position at this stage, it would be powerfully immoral to commit to amorality."
In the months since January 6, Mark Zuckerberg has made a point of highlighting Facebook's willingness to help federal investigators with their work. "I believe that the former president should be responsible for his words, and the people who broke the law should be responsible for their actions," Zuckerberg said in congressional testimony last spring. "So that leaves the question of the broader information ecosystem. Now, I can't speak for everyone else—the TV channels, radio stations, news outlets, websites, and other apps. But I can tell you what we did. Before January 6, we worked with law enforcement to identify and address threats. During and after the attack, we provided extensive support in identifying the insurrectionists, and removed posts supporting violence. We didn't catch everything, but we made our services inhospitable to those who might do harm."
Zuckerberg's positioning of Facebook's role in the insurrection is odd. He lumps his company in with traditional media organizations—something he's ordinarily loath to do, lest the platform be expected to take more responsibility for the quality of the content that appears on it—and suggests that Facebook did more, and did better, than journalism outlets in its response to January 6. What he fails to say is that journalism outlets would never be in the position to help investigators this way, because insurrectionists don't typically use newspapers and magazines to recruit people for coups.
In hindsight, it is easy to say that Facebook should have made itself far more hostile to insurrectionists before they carried out their attack. But people post passionately about lawful protests all the time. How is Facebook to know which protests will spill into violence and which won't? The answer here is simple: because its own staffers have obsessively studied this question, and they're confident that they've already found ways to make Facebook safer.
Facebook wants people to believe that the public must choose between Facebook as it is, on the one hand, and free speech, on the other. This is a false choice. Facebook has a sophisticated understanding of measures it could take to make its platform safer without resorting to broad or ideologically driven censorship tactics.
Facebook knows that no two people see the same version of the platform, and that certain subpopulations experience far more dangerous versions than others do. Facebook knows that people who are isolated—recently widowed or divorced, say, or geographically distant from loved ones—are unduly at risk of being exposed to harmful content on the platform. It knows that repeat offenders are disproportionately responsible for spreading misinformation. And it knows that 3 percent of Facebook users in the United States are super-consumers of conspiracy theories, accounting for 37 percent of known consumption of misinformation on the platform.
The most viral content on Facebook is basically untouchable—some is so viral that even turning down the distribution knob by 90 percent wouldn't make a dent in its ability to ricochet around the internet. (A Facebook spokesperson told me that although the platform sometimes reduces how often people see content that has been shared by a chain of two or more people, it is reluctant to apply that solution more broadly: "While we have other systems that demote content that might violate our specific policies, like hate speech or nudity, this intervention reduces all content with equal strength. Because it is so blunt, and reduces positive and completely benign speech alongside potentially inflammatory or violent rhetoric, we use it sparingly.")
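The intervention the spokesperson describes—demoting content that reaches a user through a chain of two or more reshares—has a simple shape. Here is a minimal sketch, with an assumed demotion factor, of how such a blunt rule might work; it is an illustration, not Facebook's actual system.

```python
# Hypothetical sketch of reshare-depth demotion; not Facebook's actual system.
# Content arriving via a chain of two or more reshares is demoted regardless
# of topic—which is why the company calls the intervention blunt.

def demote_deep_reshares(score, reshare_depth, factor=0.5):
    """Reduce a post's ranking score once it arrives via two or more reshares.
    Benign and inflammatory posts are affected with equal strength."""
    return score * factor if reshare_depth >= 2 else score
```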
Facebook knows that there are harmful activities taking place on the platform that don't break any rules, including much of the coordination leading up to January 6. And it knows that its interventions touch only a minuscule fraction of Facebook content anyway. Facebook knows that it is sometimes used to facilitate large-scale societal violence. And it knows that it has acted too slowly to prevent such violence in the past.
Facebook could ban reshares. It could consistently enforce its policies regardless of a user's political power. It could choose to optimize its platform for safety and quality rather than for growth. It could tweak its algorithm to prevent widespread distribution of harmful content. Facebook could create a transparent dashboard so that all of its users can see what's going viral in real time. It could make public its rules for how frequently groups can post and how quickly they can grow. It could also automatically throttle groups when they're growing too fast, and cap the rate of virality for content that's spreading too rapidly.
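To make those last two ideas concrete, here is a minimal sketch of an automatic group-growth throttle and a virality cap. The thresholds and function names are invented for illustration; nothing here reflects a real Facebook feature.

```python
# Hypothetical sketch of automatic throttles; not a real Facebook feature.
# Two simple rate checks: pause growth for groups adding members too fast,
# and withhold amplification from posts whose share rate exceeds a ceiling.

MAX_JOINS_PER_HOUR = 5_000    # illustrative threshold
MAX_SHARES_PER_HOUR = 20_000  # illustrative threshold

def should_throttle_group(joins_last_hour):
    """Pause invitations and joins when a group grows past the hourly cap."""
    return joins_last_hour > MAX_JOINS_PER_HOUR

def capped_boost(shares_last_hour, boost):
    """Withhold algorithmic amplification from content spreading too rapidly;
    it can still be shared manually, just not boosted."""
    return 0.0 if shares_last_hour > MAX_SHARES_PER_HOUR else boost
```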
Facebook could shift the burden of proof toward people and communities to demonstrate that they're good actors—and treat reach as a privilege, not a right. Facebook could say that its platform is not for everyone. It could sound an alarm for those who wander into the most dangerous corners of Facebook, and those who encounter disproportionately high levels of harmful content. It could hold its employees accountable for preventing users from finding these too-harmful versions of the platform, thereby preventing those versions from existing.
It could do all of these things. But it doesn't.
Facebook certainly isn't the only harmful entity on the social web. Extremism thrives on other social platforms as well, and plenty of them are fueled by algorithms that are equally opaque. Lately, people have been debating just how nefarious Facebook really is. One argument goes something like this: Facebook's algorithms aren't magic, its ad targeting isn't even that good, and most people aren't that stupid.
All of this may be true, but that shouldn't be reassuring. An algorithm may just be a big dumb means to an end, a clunky way of maneuvering a massive, dynamic network toward a desired outcome. But Facebook's enormous size gives it tremendous, unstable power. Facebook takes whole populations of people, pushes them toward radicalism, and then steers the radicalized toward one another. For those who found themselves in the "Stop the Steal" corners of Facebook in November and December of last year, the enthusiasm, the sense of solidarity, must have been overwhelming and thrilling. Facebook had taken warped reality and distributed it at scale.
I've sometimes compared Facebook to a Doomsday Machine in that it is technologically simple and unbelievably dangerous—a black box of sensors designed to suck in environmental cues and deliver mutually assured destruction. When the most powerful company in the world possesses an instrument for manipulating billions of people—an instrument that only it can control, and that its own employees say is badly broken and dangerous—we should take notice.
The lesson for individuals is this: You must be vigilant about the informational streams you swim in, deliberate about how you spend your precious attention, unforgiving of those who weaponize your emotions and cognition for their own profit, and deeply untrusting of any scenario in which you're surrounded by a mob of people who agree with everything you're saying.
And the lesson for Facebook is that the public is beginning to recognize that it deserves much greater insight into how the platform's machinery is designed and deployed. Indeed, that's the only way to avert further catastrophe. Without seeing how Facebook works at a finer resolution, in real time, we won't be able to understand how to make the social web compatible with democracy.
Source: https://www.theatlantic.com/ideas/archive/2021/10/facebook-papers-democracy-election-zuckerberg/620478/