Wikipedia’s role in a fragmented and confusing information landscape
In a year with a U.S. presidential election and a pandemic, what we needed was reliable and consistent information. What we got was quite the opposite.
We live in an age where information lies inches from our fingertips. There is both good and bad in this; the availability of free knowledge is remarkable, but misinformation is a serious concern. Fake news has been a well-acknowledged problem for years, but tackling it has been an experience akin to fighting a hydra: every time one head is chopped off, another two grow in its place.
Twitter, for instance, began adding fact-check labels to tweets containing misleading or disputed information in May 2020. The labels prompted users to ‘get the facts,’ and were mostly applied to tweets regarding COVID-19 and, notably, those of former U.S. President Donald Trump.
Despite this, unconfirmed rumors continued to run rife. Further, many of the far-right extremists, conspiracy theorists, and political conservatives formerly on Twitter deemed the company’s actions ‘censorship’ and migrated en masse to Parler, a microblogging site founded in 2018 that markets itself as a ‘free speech’ social network (The New York Times).
But for those who don’t get their news exclusively from social media, there’s one Web destination that always seems reliable, a trusted port-of-call in the midst of uncertainty: Wikipedia. Every year, Wikipedia articles accumulate billions of views, and its most-viewed pages act as a miniature of the year’s major events. In 2020, unsurprisingly, the pandemic and the U.S. election dominated the list.
According to the free Wikimedia web traffic analytics tool Pageviews, the page ‘2019-20 coronavirus pandemic’ was the third-most viewed Wikipedia article in 2020, with over 39.5 million views. ‘Donald Trump’ took the top spot, with 52.5 million views. Entries lower down the list follow similar lines; the ‘Kamala Harris’ and ‘Joe Biden’ pages both follow directly after ‘2019-20 coronavirus pandemic,’ and ‘Spanish flu’ sits in 12th position.
The question to answer now is whether these results also show that some people, at least, were attempting to find unbiased information. The 17th most-viewed page is an interesting case study: it’s about Japanese scientist and immunologist Tasuku Honjo, who was falsely quoted as saying the novel coronavirus had been manufactured in a laboratory in Wuhan, China. One can only hope that all those who viewed the article were looking for proof, one way or the other.
The current social media business model needs change
Wikipedia co-founder Jimmy Wales has frequently spoken out about the dangers of social media algorithms: the algorithms in particular, because they were deliberate choices on the part of Facebook and its contemporaries to hold users’ attention and sell more ads. Twitter is also a breeding ground for misinformation: one 2018 MIT study found that fake stories are 70% more likely to be retweeted than true ones.
“What I always say is, if your stereotypical crazy uncle is posting anti-vaccine materials to friends and family on Facebook, that’s not really Facebook’s problem,” Wales tells Jumpstart in an interview at Web Summit 2020. The danger, he says, is that Facebook’s algorithm gives people saying ‘outrageous’ things an elevated platform and creates an environment where people only see their own beliefs reflected back at them.
The business model, he says, “keeps people on the site, keeps them seeing ads, but it’s very unhealthy for the world. It’s not a harmonious experience for anyone.”
A few weeks after this interview, Twitter and Facebook banned Trump from their platforms, a move they had avoided making for years. However, Wales is convinced that social media companies will need to make bigger, sweeping changes to ensure their continued survival.
“If people are starting to believe that you are destroying society, and leading us to a very, very bad place, that’s not good for your business in the long run,” he says. “So you need to really think, is Twitter even going to survive for five years? Is Facebook going to survive for five years? If people become convinced that you’re actually breaking the world, people will find other alternatives.”
Staying neutral in a polarized world
Wales credits Wikipedia’s reputation as a neutral source of information to its non-profit structure. Volunteer editors from around the globe work to edit, refine, and add sources to articles, and the site has a sophisticated internal criteria for moderating edits from the greater population. This runs counter to much of social media and digital media, where pay-per-click ad revenue is still the primary business model for many, and holding users’ attention is of paramount importance.
“The nature of Wikipedia is it’s very human-oriented,” Wales says. “There’s no incentive for anybody, any of the Wikipedians, to optimize for outrage and clickbait headlines.”
That said, Wikipedia itself isn’t always squeaky clean, nor does it always steer clear of political or gendered controversy. It created a stink in 2020 for neglecting to create a page for Theresa Greenfield, a U.S. Senate candidate for the state of Iowa (Wired). This resulted in a drawn-out battle over whether Greenfield met the encyclopedia’s standards for “notability,” a criterion which determines whether people merit dedicated articles.
Wikipedia also attracted the ire of a Hindu right-wing blog, OpIndia, which has written several articles railing against the website for what it perceives as biased editing of its own page and multiple other pages tied to the volatile Hindu-Muslim relationship in India. In both cases, Wales got personally involved to resolve the dispute, a rare occurrence.
“Sometimes you find that, actually, this isn’t as neutral as it could be, or this is a problem in our rules that led to a situation where people aren’t correcting something that needs to be corrected,” Wales explains. In other cases, he adds, it becomes clear quickly that third parties are attempting to push their own agendas.
The challenge, he says, lies in presenting the information “…in a way that acknowledges the uncertainty in the world, while at the same time giving people the whole story.”
To that end, Wales recommends that anyone trying to break out of the cycle of belief reinforcement on social media start by reading a variety of high-quality left- and right-leaning news sources, and exercising critical judgement.
“After a very difficult year, I think that a lot of people have learned a lot about what’s wrong in the world. And so now I just encourage people to be thoughtful and to try and work towards better solutions,” he says.
And as for Wikipedia, Wales has a firm idea for the role the website will play going forward.
“I hope that we are a calm, quiet, solid alternative,” he says, “so that instead of people screaming at each other on Twitter, or on TV news, you come to Wikipedia to get the background, and to really reflect and assess all the evidence from all angles.”
Originally published in Jumpstart Magazine Issue 31 as ‘Neutrality in the Echo Chamber’