The Wikipedia community is bracing for a potential flood of fake accounts, with millions of users being targeted.
The Wikimedia Foundation is now planning to deploy a software program to block and delete accounts, and has been preparing for the inevitable since the hack of its network.
The hack has brought a new level of scrutiny to Wikipedia, which already faced the threat of legal action after Wikimedia Foundation staffers were accused of violating its ethics policy.
The community has been concerned that the fake accounts may be trying to sway readers’ opinions by promoting conspiracy theories and pushing misinformation.
But in the wake of the hack, the Wikimedia Community Team (MCNT), which maintains Wikipedia’s core software, has decided to deploy a “community-monitoring tool.”
The aim of the program is to identify accounts that may be engaging in “fake-account activity,” including attempts to manipulate users’ opinions, and to block them from Wikipedia.
The MCNT will then send an alert to those accounts, warning them that they are in violation of the terms of service and will be blocked.
In an interview with Wired, MCNT co-founder and chief executive officer Matt Cutts said the program will “help the community stay vigilant against malicious activity.”
The company says the program should be deployed across Wikipedia’s entire network by mid-March.
It will take a few weeks to implement, Cutts told Wired, and will require the MCNT to conduct automated checks on thousands of accounts every day.
“There are some accounts that will get flagged, some accounts will get blocked, and some will not be flagged at all,” Cutts explained.
“If we can identify that a certain account is being used, we will block that account.
We will not automatically flag accounts as spam, because that would itself be bad practice.
If we can’t identify it, we won’t flag it.”
It’s unclear whether the program’s targeting of fake accounts will be extended to all of Wikipedia’s content.
But it’s an ambitious strategy, and Cutts is clearly trying to make the system work in a way that will make the company less reliant on its staff.
In other words, he wants to get the software working as quickly as possible so that the community can do its job and prevent the kinds of bad behavior that have plagued Wikipedia in the past.
“What we’ve seen with the [hack] is that it has set off a lot of activity and engagement around that [program], and that’s why we wanted to do something,” Cutts said.
“And what we want to do is have it as simple as possible.
You can use it for the content that you care about, but it’s not going to be able to do everything.”
That’s exactly what the MCNT is trying to do.
The company has set up a webpage to inform users of the new tool.
Cutts also made clear that he wants the program to be broadly available to the entire community, so that users can use the tool to flag and block accounts that are engaging in bad behavior.
“When you’re talking about this, I want to be absolutely clear that there’s no specific requirement that it be done on a one-time basis,” Cutts said.
It’s the same strategy he has used with other tools that are available to users.
“I don’t think you should do a few days of this, and then it’s gone.
That’s not the right thing to do,” he said.
The new program will be rolled out in stages.
“The first one will be a very limited tool, because we’ve got a lot going on at the same time,” Cutts said.
He noted that the program has not yet been deployed on any Wikipedia site, so it’s unclear how many people will be affected.
Cutts acknowledged that the tool will not help Wikipedia combat the spam that’s been used to spread misinformation.
“But we’re trying to prevent this from happening again,” he added.
Cutts stressed that the MCNT will be operating under the assumption that users are using Wikipedia to make money.
“We’ve got to have a reasonable expectation that we’re not making money,” he explained.
“[We’re] trying to protect Wikipedia, and we have to make sure that we protect the content.”
Cutts was careful to note that he does not intend to use the MCNT to censor Wikipedia content.
“You don’t censor Wikipedia, you make sure it’s okay to be there,” he told Wired.
“That’s the only way you can protect Wikipedia.
That is the only thing you can do.”