Earlier this month, agents for France’s top domestic intelligence agency, the Direction Centrale du Renseignement Intérieur, or DCRI, were accused of trying to force a Wikipedia volunteer to remove a Wikipedia page describing a French military radio relay station. The volunteer, a library curator, reportedly was threatened with jail unless he complied.
Before any of the strong-arming took place, the DCRI had gone the conventional route, contacting the Wikimedia Foundation, Wikipedia’s parent organization. The Wikimedia Foundation declined to remove the material but said it would be happy to comply upon receipt of proper legal documentation.
The saga had a few unintended consequences. For one thing, the French authority hoping to stymie the spread of information about the station ended up increasing traffic to the page 1,000-fold. The events also raised interesting questions about what, exactly, constitutes a legitimate Wikipedia takedown request. If a national security-related request from France — hardly a backwater, totalitarian regime — is not heeded, then what is? What makes a takedown request legitimate?
In this TechNewsWorld podcast, we talk with Geoffrey Brigham, the general counsel and board secretary for the Wikimedia Foundation, which hosts a number of projects, including Wikipedia. Brigham explains the logistics of a Wikipedia takedown request, what the criteria for legitimacy are, and how Wikipedia’s linguistic expansion — which invariably means a geographic expansion — affects this process.
Download the podcast (14:41) or use the player:
Here are some excerpts from the podcast:
TechNewsWorld: One of the things that caught people off guard about this instance is that the story came from France, which is thought of as a very liberal, open-minded, free-speech-friendly country. And it makes you wonder — if these sorts of problems are happening in France, they must be happening quite a bit elsewhere, especially as Wikipedia expands and incorporates new languages and new countries into its network.
Tell me, first off, how you begin to sift through what must be a whole lot of requests and determine what is legitimate and what is not?
Geoffrey Brigham: Well, David, one of the first things I think you need to keep in mind is that we do not actually write Wikipedia. It’s our community. It’s tens of thousands, hundreds of thousands, millions of contributors who are putting together the biggest encyclopedia in the world. And that community exercises its editorial responsibility and discretion as you would expect with anything of this size and nature. So 99 percent of requests to remove content for various reasons go through the community. And the community is able to work through those requests based on the Wikipedia policies that are in place.
Now, a very small percentage of those requests are sometimes referred to us. And we look at those requests on a case-by-case basis. We first ask ourselves, “Are the requests consistent with the policies of the Wikipedia community — the policies of the community itself as written?” We ask ourselves, “Is there an immediate threat of harm?” in which case we would act responsibly. But it’s a case-by-case analysis based on values that our community holds dear as it’s writing a project that is based on free information, publicly available, reliable sources.
TNW: Are the criteria that the community uses to evaluate these requests — are those static across the multitude of countries and nations that Wikipedia is available in? Is it one singular set of rules? Or are these things kind of fluid according to where the information is being seen or where it originated?
Brigham: Well, our projects of course are not based on countries; they’re based on languages. And each language project will have its own set of rules. But many of them are consistent at the highest level. For example, we don’t tolerate plagiarism. Another example is, we don’t allow the use of copyrighted materials, except for certain exceptions like fair use. And our community is quite active in ensuring that there is no plagiarism, that there is no misuse of copyrighted materials.
So the good-news story about this is, actually, we get involved with very few takedown requests. In the realm of copyrighted materials, for example, we receive only 15 requests when other major websites receive tens of thousands, if not hundreds of thousands. And that’s because our community is vigilant about writing a credible encyclopedia. They are, as I say, constantly reviewing, and are actually listening to others who are suggesting that there may be content that needs to be taken down. They do the evaluation, and they take it down. …
TNW: It seems like with copyright violation, or especially plagiarism, there would be a pretty objective way to determine whether or not the rule was broken. I mean, if something is plagiarized, it will probably be Googleable — or if not on Google, it will be written somewhere else, and you can say, “Okay, this is plagiarism and we can take it down.” But in instances that are more subjective — like the threat of harm you mentioned — how do you view requests that are less clear-cut? And how are those worked through when you come to them?
Brigham: Well, our community is very smart. They know how to handle nuances and subtleties. So they will take those requests and actually have discussions, and very informed discussions, before making decisions. Like any decision in life that deals with a subtle issue, it requires reflection, and our community is very good at that. We do the same thing at the Foundation: We will evaluate the requests against the many values that we have, including values of free expression that are extremely important to us. So like any subtle request, we do a case-by-case analysis.