Following Viral Misinformation – The New York Times

    Two decades ago, Wikipedia emerged on the scene as a quirky online project that aimed to crowdsource and document all human knowledge and history in real time. Skeptics feared that much of the site would contain unreliable information and often pointed to errors.

    But now, the online encyclopedia is often cited as a place that, on balance, helps fight false and misleading information disseminated elsewhere.

    Last week, the Wikimedia Foundation, the group that oversees Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has worked for years for nonprofit organizations tackling youth unemployment and women’s rights, will become its director in January.

    We spoke to her about her vision for the group and how the organization works to prevent false and misleading information on its sites and on the Internet.

    Give us an idea of your direction and vision for Wikimedia, especially in such a fraught information landscape and in this polarized world.

    There are a few core principles of Wikimedia projects, including Wikipedia, that I believe are important starting points. It is an online encyclopedia. It doesn’t try to be something else. It is in no way trying to be a traditional social media platform. It has a structure led by volunteer editors. And as you may know, the foundation has no editorial control. This is a user-led community that we support and enable.

    The lessons to be learned, not just in what we do but in how we continue to iterate and improve, begin with this idea of radical transparency. Everything on Wikipedia is cited. It’s discussed on our talk pages. So even if people have different points of view, those debates are public and transparent, and in some cases they really do just go back and forth. I think that’s what is needed in such a polarized society – you have to make room for the back and forth. But how do you do that in a way that is transparent and ultimately leads to a better product and better information?

    And the last thing I want to say is, you know, this is a community of extremely humble and honest people. Looking to the future, how can we build on those attributes in terms of what this platform can continue to bring to society and provide free access to knowledge? How do we ensure that we reach the full diversity of humanity in terms of who is invited to participate, who is written about? How do we really ensure that our collective efforts reflect more of the global south, reflect more women and reflect the diversity of human knowledge, to be more reflective of reality?

    What is your take on how Wikipedia fits into the widespread problem of online disinformation?

    Many of the core features of this platform are very different from those of the traditional social media platforms. Take misinformation about Covid: the Wikimedia Foundation partnered with the World Health Organization. A group of volunteers gathered around what is called WikiProject Medicine, which focuses on medical content and creating articles that are then checked very carefully, because these are the kinds of topics where you want to guard against misinformation.

    Another example is that the foundation put together a task force ahead of the U.S. election, again in an effort to be very proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] The fact that there were only 33 reversions on the main U.S. election page was an example of how you can focus really well on important topics where misinformation poses real risks.

    Another example that I just think is really cool is that there’s a podcast called “The World According to Wikipedia.” In one of the episodes, a volunteer is interviewed who has put a lot of effort into being one of the key watchers of the climate change pages.

    We have technology that alerts these editors when changes are made to any of the pages so they can see what the changes are. If there is a risk that misinformation will actually creep in, there is an option to temporarily lock a page. No one wants to do that unless absolutely necessary. The example of climate change is helpful because the talk pages behind it are widely debated. Our editor says, “Let’s have the debate. But this is a page that I will closely watch and monitor.”

    A major debate currently taking place around social media platforms is the censorship of information. There are those who argue that biased views take precedence on these platforms and that more conservative views are removed. When you think about how to handle these debates once you’re in charge of Wikipedia, how do you make those judgments with this in the background?

    What’s inspiring to me about this organization and these communities is that there are core pillars that were established on day one of setting up Wikipedia. One is the idea of presenting information from a neutral point of view, and that neutrality requires understanding all sides and all perspectives.

    It’s what I said before: have the debates on the talk pages, but then come to an informed, documented, verifiable, citable conclusion in the articles. I think this is a core principle that, again, may offer others something to learn from.

    Coming from a progressive organization that fights for women’s rights, have you thought about bad actors weaponizing your background to argue that it could affect the calls you make about what’s allowed on Wikipedia?

    I would say two things. The most relevant aspects of the work I’ve done in the past have been in volunteer-led movements, which is probably a lot harder than others might think, and I played a very operational role in understanding how to make systems work: building systems, building culture and building processes that I believe will be relevant to an organization and a range of communities trying to increase their scale and reach.

    The second thing I would say is, again, I have been on my own learning journey and invite you to come on a learning journey with me. How I choose to be in the world is that we interact with others on a premise of good faith and engage in a respectful and civil manner. That doesn’t mean other people are going to do the same. But I think we have to hold on to that as an aspiration and as a way of being, you know, the change we want to see in the world, too.

    When I was in college, I did a lot of my research on Wikipedia, and some of my professors said, “You know, that’s not a legit source.” But I still used it all the time. I was wondering if you had any thoughts on that!

    I think most professors now admit that they also sneak into Wikipedia to look for things!

    You know, we’re celebrating Wikipedia’s 20th anniversary this year. On the one hand, this was something people made fun of and said wasn’t going anywhere. And it has now arguably become the most referenced resource in all of human history. I can tell you from my own conversations with academics that the narrative around Wikipedia as a resource, and the use of Wikipedia, has changed.