If the internet has largely been lost to the culture war engulfing much of the Western world, Wikipedia has proven to be an unlikely Switzerland. The open-source encyclopedia, once maligned as the harbinger of a future in which no one had authority over the facts, now looks, ironically, like the last, best hope for a global digital portal to the truth. But behind the unadorned pages that have earned the trust of millions of readers, an argument rages that threatens to drag the project into the muck with the other major social platforms. And it began, as these fights so often do, with something as seemingly simple as a user ban.
On June 10, the Wikimedia Foundation did something unprecedented in its decade-and-a-half history: It banned a user from the English-language Wikipedia for a year. The San Francisco–based nonprofit that hosts the world’s greatest information resource has historically kept its hands off the individuals who use and edit it. Penalties for bad behavior on the English Wikipedia are typically determined and meted out by the community itself, often represented by the Arbitration Committee, the 15-person all-volunteer body elected by fellow Wikipedians. ArbCom is commonly referred to as “Wikipedia’s Supreme Court.”
But the foundation is a higher power. The 300-person organization, which in fiscal year 2017–2018 received more than $100 million in donations, can make unilateral decisions about users. These cases are rare, referred to in the community as a “nuclear option.” Though the foundation does not disclose the nature of the offenses it investigates, it is widely held among Wikipedians that “office actions” apply only to extreme cases: child pornography, pedophilia advocacy, terrorism, realistic threats. And they sometimes come only after a referral from ArbCom. The community that labors every day to polish the crown jewel of the collaborative internet fiercely guards its ability to police itself. As befits the most committed members of a project dedicated to decentralization and transparency, Wikipedians don’t take well to top-down decisions.
Indeed, the foundation has given out only 36 global bans since 2012, and never, until now, a temporary one. A permanent ban is a lifetime prohibition from participation in any Wikimedia Foundation website, a death penalty. This was something else: a jail sentence with a release date, imposed from above, without a trial.
The foundation could hardly have picked a more conspicuous target for this new kind of intervention: Fram, one of its best-known administrators. (Many editors and admins on Wikipedia work anonymously, a fact that has embarrassed the platform in the past.) When word got out — the foundation didn’t make a statement — the reaction from the Wikipedia community was fast and fierce.
“What the hell?” wrote Iridescent, an admin, on a community response Wiki. “There had better be a damn good explanation; Fram is arguably the best admin in Wikipedia’s history. … I find it hard to imagine problems that are simultaneously so bad they warrant an emergency ban without discussion but simultaneously so unproblematic that the ban will auto-expire in a year.”
Iridescent’s comment, echoed by many, went straight to the heart of the issue: If Fram hadn’t done something worthy of a lifetime ban, the foundation had unceremoniously arrogated for itself the kind of everyday, shades-of-gray decision-making that was supposed to belong to Wikipedians themselves. It looked like a power grab.
“We, the people, are being systematically brainwashed into giving up … all of our precious freedoms,” wrote one editor. “It is an orchestrated self perpetuating cultural shift away from aspirational and community empowered governing bodies toward protective, moralizing and pushy governing bodies.”
Unmotivated by profit and maintained by a volunteer army of idealists, Wikipedia has so far escaped the fate of the other user-generated content giants, now locked in public, years-long, brutally specific battles over content policies and moderation. But now, with one decision, the Wikimedia Foundation seemed to have plunged the project into the familiar world of strikes and suspensions, martyrdom and harassment. It finds itself in the painful position that the YouTubes and Twitters of the world have been unable to escape: in open conflict with some of its most devoted users, without whom its scale and success would be unimaginable, but whose sometimes toxic culture threatens its long-term health.
Because Wikipedia is so thoroughly animated by an ideal — to create, as founder Jimmy Wales put it in 2004, “a world in which every single person on the planet is given free access to the sum of all human knowledge” with, as he wrote in a 2001 statement of principles, “no cabal … no elites … [and] no hierarchy” — a decision that might be justified as pragmatic at another organization has taken on enormous symbolic weight; a crucial subset of its community sees the action in quasi-existential terms. Indeed, the ban has raised fundamental questions about the governance of one of the 21st century’s few democratic achievements, touching off a familiar tech-world culture war between the platform’s libertarian roots and the foundation’s egalitarian aspirations. And in the middle of this war are the platform’s most zealous volunteers, who, depending on your perspective, are either guardians of the utopian vision that brought Wikipedia to life, or obstacles in the way of a community that better represents the world it purports to explain.
The English-language Wikipedia currently has 1,161 administrators, users with special powers that include the ability to block and unblock other users, to globally limit edits to specific pages, and to hide and delete revisions. (Editors become admins following a rigorous discussion between editors aimed at reaching consensus.) It’s with these few tools that a tiny fraction (about 1%) of active users have managed the trolling, tendentiousness, and poor quality that are the open-source encyclopedia’s natural weak spots.
It’s easy to forget that, a little more than a decade ago, some regarded the idea of a free encyclopedia editable by anyone as a public hazard. In 2007, then-senator Ted Stevens introduced the Protecting Children in the 21st Century Act, which would have barred minors from using public computers to view social networks, including Wikipedia. Educators publicly weighed the merits of blocking the site. “A professor who encourages the use of Wikipedia is the intellectual equivalent of a dietician who recommends a steady diet of Big Macs with everything,” wrote Michael Gorman, former president of the American Library Association, around the same time.
Today, Wikipedia is the web’s de facto starting point for most everyone’s research, and its value as a tertiary source that organizes secondary ones is unquestioned. A world without it is unthinkable, and not just for settling debates. Services from Google’s Knowledge Graph to the voice assistants Echo and Siri crib from Wikipedia to deliver information quickly to their users. The site has become such a trusted resource that other giant platforms have turned to it to combat their own problems with bad information.
Much of the hard work that has gotten Wikipedia to this point has been done by people like Fram. Within the Wikipedia community, Fram is known as a rigorous and prolific administrator with a special talent for quality control: removing spam, handling copyright issues, and, ironically, booting banned users who post under new names. He’s exactly the kind of diligent, obsessive volunteer that Wikipedia needed to thrive. (Fram declined to speak to BuzzFeed News for this story.)
Fram is also known within the community as an asshole. “He’s like Inspector Javert,” one Wikipedian wrote of Fram recently, comparing him to the ruthless and inflexible antagonist of Les Misérables. “Brusque, bordering on rude sometimes,” another longtime admin, Floquenbeam, told BuzzFeed News. “He has a reputation for almost always being right on the underlying merits in a dispute, but going about it in a fairly obnoxious way.” Over the years, Fram has clashed with other admins, with editors, with ArbCom, and with the foundation itself. Still, he remains part of a caste of old-school admins, with nearly 15 years of social capital in the community.
The foundation banned Fram shortly before 6 p.m. on June 10. Within an hour, admins had left dozens of messages on their private noticeboard demanding an explanation. That night, the foundation released a short statement explaining that the ban had originated in complaints from the Wikipedia community. It did not, per its own safety guidelines, disclose either the complainant or the complaint. The statement made things worse. So did a statement from Fram, the next day, on his Wikimedia Commons page, where he, confusingly, had not been banned.
Fram explained that he had received two previous “conduct warnings” from the foundation’s Trust and Safety team for his uncivil style toward other Wikipedians. He then claimed that the foundation told him he had been banned for a single edit to the Wikipedia entry for the Arbitration Committee itself, which began, “Fuck Arbcom.” Once he had received the conduct warnings, he wrote, any “flimsy justification” for banning him would do.
“I’m not a model admin or editor,” he wrote, “but I believe I was steadily improving. But that’s not for [English-language Wikipedia] to decide apparently.” The real reason behind his ban, he said, was his history of sparring with the foundation over the technical details of software updates to the platform.
Angry at the lack of specifics from the foundation and convinced that Fram was undeserving of the ban, the community decided the next morning to act. An editor proposed a resolution to unblock Fram — after all, that was one of every admin’s fundamental powers. Wales, by now aware of the unfolding crisis, implored the community not to do anything rash.
“Rather than cloud the waters and make it even harder (emotionally) for a backdown (if such is warranted – we don’t know yet!), it will be best to take the high road and wait until a more appropriate time,” Wales wrote in a comment on the resolution.
That afternoon, Floquenbeam unbanned Fram.
“I believed the unblock was necessary to force the WMF to take that overwhelming community support for an unblock seriously,” Floquenbeam told BuzzFeed News in an email. “Historically, the WMF has been fairly immune to people just saying they’re unhappy.”
Shortly after midnight on June 12, the foundation rebanned Fram, and banned Floquenbeam for a month for reversing its action. That morning, another admin, Bishonen, unblocked Fram yet again, accusing the foundation of starting a dreaded “Wheel War”: a fight in which two or more admins repeatedly undo each other’s actions.
The foundation had faced a handful of admin revolts in the past, and it has historically capitulated to its power users in these situations. But this situation was worse, and it was spiraling. The foundation showed no signs of backing down. Even cooler heads, like the highly respected admin and former ArbCom member Risker, voiced unease. Wikipedia has, over the years, produced its own vast universe of policies. Risker was concerned that the foundation had taken a dramatic and unprecedented action without communicating any kind of normative process.
“It comes across as a FUD [fear, uncertainty, and doubt] campaign,” she wrote. “We’ll temporarily ban people who did something wrong according to rules we haven’t shared, but we won’t tell you what they did, what can be done to prevent similar actions, or whether we’ll change the [unshared] rules again without telling you. This is why even people who don’t like Fram, and even those who think Fram was behaving unacceptably, are having a hard time with this ban. Bluntly put, I feel much less safe working on a Wikimedia project today than I did a week ago, because one of the most fundamental understandings I had about working here has now been proven wrong.”
Wikipedia cofounder Jimmy Wales poses during a photo session at the VivaTech fair in Paris on May 16.
As existential concerns about Wikipedia’s accuracy have faded into the background, demography has emerged as the most serious threat to the project’s legitimacy; it can hardly aspire to be the sum of human knowledge if only white guys create and manage it. Studies have found that up to 90% of Wikipedia’s editors are male. (Similar numbers don’t exist for the racial composition of editors, but it’s widely held within the Wikipedia community that people of color are seriously underrepresented.) That’s led to some awkward discoveries, like the fact that the platform’s “List of Pornographic Actresses” as recently as 2015 had more edits and editors than its “List of Female Poets.” Another study, from 2017, found that 77% of Wikipedia articles are written by 1% of its users. A picture has emerged over the past half decade of a platform controlled by a small group of white men that is unwelcoming, if not hostile, to newcomers and women.
Those dynamics are central to Fram’s ban. Egged on by Fram’s insistence that the foundation had actually banned him because of a grudge, and stymied by the foundation’s refusal to name the complainant, Wikipedians began to scour his history on the platform, looking for someone to blame.
Much of that blame fell, perhaps predictably, on a woman and a transgender editor. In 2017, a fledgling Wikipedian accused Fram of monitoring her activity on the site to such an extent that it felt like harassment. The editor, whose contributions focused on women athletes, lesbian history, and abortion rights, felt that Fram’s pattern of correcting her spelling and deleting her stubs — short, unfinished articles that are culled when they sit dormant for too long — demonstrated a lack of good faith.
“Stay off my talk page Fram,” she wrote at the time. “If you have a problem with my work, then you need to talk to another admin and have them handle the problem. It should not be you.”
More recently, Fram had an acrimonious semantic debate with a high-profile transgender editor over whether referring to them as “xe” constituted misgendering. It culminated in an ugly claim by Fram that he would not be misracing a black person by calling them the n-word, only being racist.
On Wikipedia and in the forums of Wikipediocracy, a site where Wikipedians gather to discuss and criticize Wikipedia, users speculated about a secret romantic connection between the woman editor and a member of the Wikimedia Foundation board and about whether the trans editor might’ve been pretending to be trans to win a fight with Fram. The vitriol toward those two users grew so intense that Risker chastised some Wikipedians in her critical note about the ban.
“Please, stop being cruel to individuals whose names have come up in the course of this issue,” Risker wrote. “If ever you wondered why User:WMFOffice exists, those of you who have overpersonalized this situation have illustrated the point quite well.”
Concern that incivility on Wikipedia is driving away new users has preoccupied the foundation since at least 2014, when Wales addressed the topic at the Wikimania conference.
“There are users in the community who have a reputation for creating good content, and for being incredibly toxic personalities,” Wales said. “On this issue, I have a very simple view that most of these editors actually cost us more than they’re actually worth.” In 2016, the Wikimedia Board of Trustees resolved to address toxic behavior in the community.
Getting a handle on the size and severity of the toxicity problem in the Wikipedia community is difficult. The relatively small number of admins and active editors of Wikipedia compared to the number of active users on a major social network means the scale of harassment is necessarily smaller.
“Harassment is a problem, but for us it’s small,” Katherine Maher, the executive director of the Wikimedia Foundation, told Slate’s If Then podcast last year.
But it can be severe. In 2016, an editor said the toxicity of the community had led him to contemplate suicide. And abuse on Wikipedia can be baked into the tools used by admins themselves.
A source familiar with the Wikimedia Foundation’s thinking about the ban told BuzzFeed News that one of the standards admins prize, prolificacy, can be counterproductive. An admin — like Fram — might be extremely prolific in applying the speedy deletion policy to new articles in the name of regular maintenance. Such rigor might, however, prevent new users from taking the time necessary to create a full entry and alienate them in the process. The same source said that the temporary ban policy, introduced in February of this year and rolled out for the first time against Fram, was a way to add another step in the escalation from conduct warning to permanent ban.
Indeed, Fram seemed like the perfect test case for a new kind of enforcement from the foundation — a prolific user whose bad behavior warranted a severe sanction short of a lifetime ban. But as is the case in so many enforcement decisions on social platforms, the ban created more questions than it answered. Would a new gradient of enforcement lead to more office actions overall? What separated a permanently bannable offense from a temporarily bannable one?
And, as Tom Fish, a former member of ArbCom, put it to BuzzFeed News, “If you’re going to step in for this, what other things would you step in for?”
The most profound question raised by the ban, though, has to do with ArbCom. Why hadn’t the Fram situation been dealt with by the community?
The source familiar with the Trust and Safety team said that the foundation had always investigated complaints that came directly to it from the community; there was nothing new about the Fram ban except the severity of the enforcement and the high profile of the target.
Members of the community were quick to point out that the Trust and Safety team sometimes passed complaints back to ArbCom to handle locally. In a statement to the community, the foundation said it couldn’t do so in Fram’s case for two reasons: “privacy provisions,” and the fact that the committee itself had been targeted by Fram’s “Fuck ArbCom” comment, creating “the appearance of a conflict of interest.”
For those inclined to see it that way, the foundation’s justifications seemed like a mere pretext to introduce a new era of enforcement: one in which ArbCom couldn’t be trusted to handle dispute resolution, one in which a proudly decentralized community gives up a degree of its autonomy to a distant authority, one that looked more like the other platform giants.
“This is a shot across the bow for English Wikipedia,” Fish told BuzzFeed News.
Even the foundation seemed to recognize that it had, wittingly or unwittingly, undermined the self-governance ArbCom represents.
“I realize that this situation has been difficult for the English Wikipedia’s Arbitration Committee (ArbCom),” Jan Eissfeldt, the foundation’s lead manager of Trust and Safety, wrote in a note to the community. “The Trust & Safety team apologizes for not working more closely with them in the lead-up to this point.”
According to Molly White, who serves on ArbCom under the username GorillaWarfare, opinions about the ban vary within the body, though she added that she hopes communication from the foundation about such situations improves.
A much quieter group in the community was thankful for the ban. BU Rob13, a former member of ArbCom who recently retired from administration, said that Fram’s behavior toward him, including “taking shots” at him in an edit summary and following him to unrelated cases, felt like harassment.
“These actions, and the Arbitration Committee’s failure to act promptly in condemning them, were a major factor that led to my resignation,” BU Rob13 told BuzzFeed News. “It is also a major reason why I no longer believe the current Arbitration Committee can handle harassment.”
The real cause of the Fram flare-up wasn’t the sudden overreach by the foundation, BU Rob13 said, but the community’s own laissez-faire attitude about toxic users.
“The community is currently blaming the foundation for their own mess, in my opinion,” he wrote, “which was caused by our abject failure to develop procedures to enforce civility without Foundation intervention.”
Two weeks after the ban, anger in the community persists. An official response from the Wikimedia Foundation Board, promised by Wales on June 21, has not been forthcoming. Editors and admins have proposed various protest actions, including a work stoppage, freezing the main site page, and forking all of English Wikipedia. Nine admins have resigned. And the Wikipedia page about Fram’s ban now runs to more than 100,000 words of text: claims and counterclaims, proposals and counterproposals, recriminations, calls for patience and calls for impatience, sarcastic references to a forthcoming Nobel Peace Prize, lessons about the history of authoritarianism, and an entire section titled, simply, “No good is coming out of this.” ●
Joe Bernstein is a senior technology reporter for BuzzFeed News and is based in New York.