The Signpost

In the media

Internet companies use Wikipedia to police truth; Citogenesis proven yet again; early birthday greetings; and trains

Should Wikipedia be asked to cure the Internet?

Guy with a mop. Mopping up after plutonium production?

Bloomberg Businessweek recently published an op-ed in which the writer argued for a "Digital Protection Agency". According to the article, after social media companies make a mistake "they mop it up with Wikipedia or send out a message that reads, 'We take your privacy seriously'". This practice is becoming increasingly common as companies face backlash over videos and comments that propagate conspiracy theories and fake news. In an article published by Wired, titled Don't Ask Wikipedia to Cure the Internet, the author criticized the move, writing "Using the crowdsourced encyclopedia as a shield, platforms abdicate responsibility for their own problems." A Washington Post article noted that Wikipedia is becoming the "good cop" of the internet. These decisions, however, are not far out of line with the moves of other companies, such as Amazon and Apple, which use Wikipedia content in their digital assistants.

In October 2017, Facebook announced that it would be adding an information button (Information icon) to its news feed that users could click to read the Wikipedia page of the news organization publishing an article, in an effort to combat the spread of fake news. In March 2018, YouTube stated that text boxes called "information cues", with links to Wikipedia (and other sources), would appear next to videos to help discredit conspiracy theories. The Wikimedia Foundation said in a statement that "We were not given advance notice of this announcement", and Katherine Maher tweeted that "frankly, we don’t want you to blindly trust us. Sure, we’re mostly accurate - but not always! We want you to read Wikipedia with a critical eye. Check citations! Edit and correct inaccurate information! You can’t do that in a simple search result." A month earlier, Google (the parent company of YouTube) had decided to put a label next to state-owned media organizations, linking to the relevant Wikipedia article.

In the face of such disinformation and privacy concerns, some went so far as to propose a Facebook clone run in the same manner as Wikipedia; this hypothetical social media service was dubbed 'Wikiface'. Others raised concerns about the reliability of such measures, arguing that pulling content from Wikipedia gives conspiracy theorists an opening to spread their views, or gives vandalism a much wider audience. Such things have happened before, including an instance in which vandals caused Siri to respond to the question "What is an Indian?" by saying "they are a little brown and they smell like curry and they eat it". In late March, Wikimedia's Chief Revenue Officer complained that Apple and Amazon use Wikipedia's content without giving back to the Foundation.

Several Wikipedians gave their thoughts:

"it's a good thing if these social media companies use Wikipedia properly. If they started linking to unverified material or add content into Wikipedia themselves, then it could be bad for the companies' (and Wikipedia's) reputations. The reader would not benefit if they are directed to a poorly sourced article that itself looks like it could be a hoax. However, among some groups, the perception of Wikipedia as a reliable source is low. In an ideal world, the articles that are being linked-to would be at least of the same quality as you may expect from an article that is run for Did You Know. It would be optimal if Wikipedia could recruit experts in these subject areas that could help edit the Wikipedia articles, discrediting the hoaxes.
— Epicgenius


Most controversial topics are already semi-protected and relatively decently watched. I am hoping that this will be enough to deal with much of the potential disruption.

On the plus side this sort of exposure may bring in more people who are interested in improving or maintaining these topics as they may see Wikipedia as having a potentially greater impact.
While YouTube has announced this effort, I do not think it has rolled out yet. It would be nice to help with maintenance if they provided us with a list of articles they plan to link to. This would also allow us to determine what effect their change has on readership, if any.
We could of course potentially build something internally [by] creating a list of articles based on traffic coming from YouTube.
— Doc James


Mike Pompeo did not serve in the Gulf War after all

We hope this is actually a picture of Mike Pompeo. Can't be sure, though.

Quartz describes in this piece how an IP inserted an unsourced claim that the CIA director, and current nominee for Secretary of State, Mike Pompeo served in the Gulf War – he did not. His cavalry squadron was not one of the units sent to Iraq in 1991. The problem is that several other outlets repeated the claim, and months went by before the CIA issued a correction and the error was removed from Wikipedia. The Quartz piece said, "The situation shows how much major media outlets have come to rely on Wikipedia, a crowd-sourced encyclopedia run by the Wikimedia Foundation, a non-profit that employs less than 300 people".

The false claim was picked up by the Los Angeles Times ("an army officer who served in the 1991 Persian Gulf War"), The Wall Street Journal and The New Yorker. Trey Gowdy, in his letter of support for the nominee, wrote "Michael Pompeo spent five years serving in the United States Army, including in the Gulf War". The article was viewed over 850,000 times between December 2016, when the erroneous information was added, and April 2018, when it was corrected.

In brief

  • Penny Wong: In February, Buzzfeed and others reported that "The Victorian Department of Premier and Cabinet has launched an investigation into who in the department is editing [vandalizing] the Wikipedia page of Labor's leader in the Senate, Penny Wong". Several months later, the department gave up on the investigation.
  • Previews: It was noted by several news organizations that Wikipedia added page previews "designed to help save you from disappearing too far down internet rabbit holes".
  • The tale of a tweet: Government officials of Uttar Pradesh were left red-faced when they sent Guru Nanak birthday tweets about seven months early. They were quick to blame the mishap on Wikipedia, with one writing "Sorry for Guru Nanak Ji’s birthday tweet. The confusion happened due to Wikipedia (enclosed). Apologies to everyone".
  • They like trains: The New York Times reported on the work of two members of WikiProject New York City Public Transportation, Epicgenius and Kew Gardens 613.



Do you want to contribute to "In the media" by writing a story or even just an "in brief" item? Edit next month's edition in the Newsroom or leave a tip on the suggestions page.

Discuss this story

  • This post-publication piece may interest readers: 200,000 volunteers have become the fact checkers of the internet. Eddie891 Talk Work 22:19, 27 April 2018 (UTC)
    • Let's not give ourselves too much slobbery self-congratulation. FAC is a dysfunctional mess that merely appears to function because it passes non-controversial articles, usually on tiny subjects like a road, a town, a coin, a defunct magazine, or a species of bird or fern. It is not even vaguely competent enough to evaluate meaningful articles like Bengal famine of 1943, which is the locus of a large amount of disinformation. The FAC reviewer did not even bother to read the FAC. If FAC has no validity, then neither does Wikipedia as a whole. Axylus.arisbe (talk) 04:31, 9 May 2018 (UTC)
      • There is absolutely no incentivization for reviewers at any featured-review process to give any meaningful feedback. Any feedback beyond "fix grammar" is completely ignored by the nominator and other reviewers. Nergaal (talk) 14:45, 13 May 2018 (UTC)
  • Isn't one of the Internet's main problems that many individual contributors remain anonymous? Even on Wikipedia, users normally employ aliases, although they are encouraged to validate an associated e-mail address they own (which itself may be an alias). Perhaps Wikipedia should take the lead by verifying the true identity of users, using a type of authority control on user pages to enhance the unique ID that already exists for each user. This could be linked to other standard information such as the user's passport number or photograph. Clearly users would have to opt in to the system: I'm not suggesting (yet!) that anonymous contributions be prohibited. However, users who did take part could then be awarded an enhanced status. Incidentally, I note that currently there is not even any system in place to prevent users having multiple log-in credentials on Wikipedia, although I accept that there may be valid reasons for allowing this. Michael D. Turnbull (talk) 15:16, 9 May 2018 (UTC)
    • That's one of the theories, a more nuanced one is that when people build up a reputation in a particular community they care about that reputation - whether their identity in that community is pseudonymous or the same as the real world. We cover a lot of businesses and we have a lot of articles on controversial people, pseudonymity is our best defence for our editors on such subjects. If we agree to the Public Relations industry's request that we insist on real name editing we say goodbye to neutral point of view on large swathes of the Wikipedia and hand them to the spammers, PR flaks and anyone who wants to employ lawyers to enforce their version of events. As for abusive sockpuppetry, we don't just have systems in place to detect it, we have a long long history of sockpuppets detected and banned. ϢereSpielChequers 08:40, 15 May 2018 (UTC)
      • 1) “We cover a lot of businesses and we have a lot of articles on controversial people” True. However, Wikipedia is an encyclopaedia and one of its tenets is that every statement is backed up by a verifiable external source, preferably a secondary or tertiary source. If I, as a named individual contributor, quote a source that said something controversial but do so in a neutral tone, pointing out that there are other views (and referencing those also), what would be the problem? I don’t see the need to be anonymous.
      • 2) “[if] we insist on real name editing we say goodbye to neutral point of view.” Why? Neutrality just means presenting the evidence without adding personal bias: a contributor can do that whether anonymous or not.
      • 3) “And hand them to the spammers, PR flaks and anyone who wants to employ lawyers to enforce their version of events.” In my (idealistic, I agree) world, there would be no spam because everyone could see who the contributor was and could detect if that contributor tried to express the same view over and over again. Wikipedia can already deal with most edit wars and it would be even easier to do so if each contributor could only make contributions from one account. I don’t think that “lawyers and PR flaks” would be in a stronger position than anyone else to enforce a specific version of events, since the references would be there in the article for all to see.
      • Can you give me some specific examples that would support your view? Michael D. Turnbull (talk) 16:12, 18 May 2018 (UTC)