The Many Faces Fighting Disinformation - SAFEGUARDING CIVIL SOCIETY'S ROLE IN THE RESPONSE TO INFORMATION DISORDERS - EU DisinfoLab
The Many Faces Fighting Disinformation
SAFEGUARDING CIVIL SOCIETY'S ROLE IN THE RESPONSE TO INFORMATION DISORDERS
Funded by:
Table of Contents

5  EDITOR'S NOTE
6  VerificaRTVE: META-DEBUNKING DECENTRALISED DISINFORMATION
8  Debunk EU: DISINFORMATION IS SMALL WATER DROPS THAT OVER TIME CAN HEW OUT A STONE
10 Maldita.es: NOT TO BE FOOLED
12 Who Targets Me: "WE USE TECHNOLOGY TO TELL THE STORY OF TECHNOLOGY"
14 MEMO 98: MONITORING ELECTIONS IN THE DIGITAL AGE
16 Citizen D: COUNTERING DISINFORMATION FROM A TO Z
17 WTFake!?: THE FIRST CITIZENS' ANTI-FAKE NEWS BRIGADE
18 Bellingcat: A CENTRAL NODE IN A GROWING NETWORK
20 Global Disinformation Index: "OUR GOAL IS TO DEFUND DISINFORMATION"
21 Tracking Exposed: KNOW YOUR ALGORITHM
22 Lie Detectors: NEWS LITERACY FOR ALL
24 AlgoTransparency: "ALGORITHMS NOT DESIGNED TO HAVE QUALITY INFORMATION IN THEIR OBJECTIVE FUNCTIONS WILL NATURALLY FAVOUR DISINFORMATION"
26 First Draft: ADDRESSING THE INFORMATION DISORDER ECOSYSTEM
29 Account Analysis: "MACHINE LEARNING OFTEN GETS IT WRONG, I WANTED TO GO A STEP BACKWARDS"
30 2020, A TURNING POINT IN OUR RESPONSE TO DISINFORMATION?
Key trends

• These are micro-entities. 43% of the initiatives surveyed count between 0 and 2 employees.
• They are community-reliant. 57% work with volunteers and 64% rely on crowdsourcing to some extent.
• Strained relationship with the private sector. Almost a third of the participants feel they are in opposition to the major platforms, and two thirds have a weak relationship with telecommunications actors.
• Sustainability. Only one of the initiatives surveyed said that their operation is fully sustainable.
• Security. Cybersecurity is a concern for all actors interviewed. None feels that their operation is entirely secure.

Timeline

1998 MEMO 98
2014 Bellingcat
2015 First Draft, Citizen D
2016 VerificaRTVE, Tracking Exposed, Lie Detectors
2017 Debunk EU, AlgoTransparency, WTFake?, Who Targets Me
2018 Maldita.es, Global Disinformation Index, Account Analysis
EDITOR'S NOTE

As we have noted at EU DisinfoLab, disinformation has many faces (manifestations, motives, and tools). It is only logical that the response to disinformation must have many faces as well. In this project, we seek to present a panorama of the different kinds of actors responding to disinformation today - from broadcast journalists to open source investigators to election observers to technology developers. In the report that follows, we interview 14 actors from across this emerging civil society ecosystem.

It is important to note that many of the actors interviewed do not see themselves as responding to disinformation per se. We use the term 'disinformation' as a shorthand for the many illnesses in our information ecosystem (what First Draft, whom we interview here, refers to as "information disorders"). The problem for us encompasses disinformation and misinformation, mistrust in journalism and the weaknesses or deficits in existing media, the business model behind clickbait and disinforming content, algorithmic targeting, and other opaque mechanisms in our digital information architecture.

This report is necessarily limited in scope. The interviews here represent a snapshot or cross-section of an expanding network of individuals and initiatives. Still, the findings we share are broadly representative of this space. We conducted qualitative, semi-structured interviews in order to understand how these initiatives were created and have evolved over time, and to examine in detail the difficulties these actors face in terms of sustainability, security, and impact. At the same time, these actors serve as examples of what we hope to see more of. They exhibit new types of expertise - from digital forensics, to crowdsourced research, to journalism-classroom partnerships. Though they might seem unusual today in their techniques or organisational structures, we believe that they also point towards the future of civil society engagement against disinformation.

Despite the critical work they do, many of these actors feel alone. They struggle to make their voices heard by policymakers and by the tech companies at the heart of our information ecosystem. Their long-term security and sustainability are not assured. In order to effectively counter disinformation challenges, we need an ambitious framework to sustain and further develop this civil society network. Following these interviews, we make several recommendations which we feel are necessary to safeguard a resilient, decentralized civil society ecosystem.

Last, it should be noted that this project is a view from the inside. These discussions were framed by our own experience at EU DisinfoLab, as a relatively young, small NGO, finding our way in this new environment. We consider the people featured here to be our colleagues, allies, and friends. We admire them, we learn from them, and we are continually grateful for their work and their collaboration.
META-DEBUNKING DECENTRALISED DISINFORMATION

Dr Myriam Redondo
Spain
Global

DIGITAL VERIFICATION | FACT CHECKING | JOURNALISM | ACADEMIA | RESEARCH | INTERNATIONAL RELATIONS / POLITICS

The purpose of Verifica RTVE is to change the culture of the institution.

Dr. Myriam Redondo has pioneered digital verification workshops for Spanish journalists since 2012. She came into contact with RTVE as an external trainer in 2016. An early expert in the field, she released her doctoral thesis on "Internet as a source of information for international journalism" in 2006. The team she is currently part of, the digital verification team Verifica RTVE, includes only two full time and two half time employees (though they will be adding new members in 2021). Their purpose is to change the culture of the institution as a whole by teaching digital verification. "This is transversal across RTVE and that is the success. It involves documentary experts and archive experts, designers and journalists, all positions in the team. We publish in multiple formats, radio, television, internet..." she explains.

For Myriam, the problem of false news can only be addressed transversally, through journalistic capacity building and a cultural shift. "We as journalists cannot do it all. If we fight fake to fake, we'll get tired. It's like trying to swim in a vast sea." Instead she suggests the need to meta-debunk, particularly for what she calls distributed disinformation: "We tend to analyse a fake, we take the content and debunk it, but, at least in Spain, liars are sophisticated in their activity. They publish content but it doesn't include a lie, it's just a suggestion, then a second liar goes farther, and a third one farther. You have to debunk the whole chain, the idea behind it".

Steering Clear of Amplification

RTVE faces unique challenges as a public institution formally dependent on public funds. Under heightened scrutiny from audiences and inevitable political pressure at moments, RTVE has to prove their independence day by day and maintain the public interest. In recent months, this has meant focusing more on public health and less on fact checking political statements, which Myriam perceives very often to be "noise". "Politicians from extremist parties are tempted to use our services to amplify a topic. When we verify a topic we enter their agenda."

The team is also strategic in their method of debunking to avoid sharing content more widely than is necessary; they try to respond to queries from citizens in the same channels where they are posed (directly in a WhatsApp message, for instance) and they try to reply to personal questions privately. Of course, this takes massive human resources. Not only is this kind of monitoring time consuming, but it is often not possible in closed messaging spaces. "We cannot clearly see what is happening. We are partially watching what happens in each but don't have the whole vision". Myriam explains the need for more tools, in particular tools that provide network analysis and track trends across platforms. She also needs the ability to parse more carefully between countries, to avoid unnecessary transnational amplification through fact checking.

"IT'S IMPORTANT TO DEBUNK FAKE BY FAKE BUT ALSO TO META-DEBUNK TRENDS AND INTENTIONS"

Myriam wants more collaboration between journalists and specialized institutions. She also sees a need for guaranteeing diversity in the growing industry of digital verification; political fact checking is a clear example of an area where a multiplicity of voices is needed, rather than a monopoly. "I envision a world in which journalists are doing our job, but in which we need [digital verification] organisations for deeper analysis on a given trend or topic. Also, a world in which citizens receive a more robust education on media literacy and critical thought," she concludes.

About RTVE

RTVE is the Spanish Radio and Television Corporation, founded in 1973. It is also the first Spanish national media that began training its staff on UGC (user generated content) and digital verification techniques, at a time when political fact checking was the dominant trend in the country. This was before the election of Donald Trump, "the event that changed everything".
DISINFORMATION IS SMALL WATER DROPS THAT OVER TIME CAN HEW OUT A STONE

Viktoras Daukšas
Lithuania
Lithuania, Latvia, Poland, US, Estonia, North Macedonia

DISINFORMATION ANALYSIS | MONITORING & REPORTING | TECHNOLOGY DEVELOPMENT | STRATEGIC COMMUNICATION | LARGE-SCALE MEDIA LITERACY CAMPAIGNS | FACT CHECKING | COMMUNITY TRAINING

Debunk EU defines itself as an independent technological think tank and analysis center for disinformation.

For Debunk EU, building a response to match the scale of the disinformation problem has meant merging automation and artificial intelligence with human analysis and dedicated volunteers (a network of 'elves'). "How do you find the needle in the haystack, the needle being disinformation cases that have the biggest impact? If you do it manually, you'll never see the bigger picture", explains Viktoras. Debunk EU has developed an AI-based analytics tool which spots and identifies topics of interest in online articles in real time. This means that from the 1 million pieces of content they receive each month, their analysts can focus on the most harmful ones (10,000 to 15,000 content pieces). Long term reporting and analysis are also at the core of their approach: disinformation analysts in the four countries provide thematic reports on topics and trends, which are then shared with a wide range of stakeholders. Debunk EU has applied process automation across their reporting and analysis activities, allowing them to produce around 10 reports per month.

A scalable approach

Founded in 2017, the project emerged in the Baltic countries and was supported by DELFI and the Google News Initiative. Debunk EU's initial partnership with colleagues in Latvia and Estonia was inspired by their shared history and the disinformation threat constantly coming out of Russia since the fall of the Soviet Union. In order to meet this challenge, they have drawn on their expertise in research and analysis, accumulated over many years. Currently they see potential for working with Eastern partnership countries as well. "The technology is scalable, and we can work with local partners", says Viktoras. The fact that the Debunk EU team in Poland was able to produce their first report within four weeks of its establishment in Warsaw proves that the system is not only scalable, but also able to achieve significant results in a short time.

"IF YOU ARE A BUSINESS, YOU CAN BE FUNDED BY ANYONE, BUT IF YOU ARE FIGHTING DISINFORMATION, YOU CAN WORK ONLY WITH CREDIBLE AND TRANSPARENT DONORS"

Developing automated solutions is costly and requires in-house technical expertise, which is rare in the disinformation space, and generally unheard of among NGOs of this size. Debunk EU is growing rapidly, and now sustainability is the main question. As a tech-based organisation whose infrastructure is regularly attacked, they must invest heavily in cybersecurity and monitoring of their digital ecosystem. Even the most agile project management requires fuel to run on. While they haven't found the optimal financial model yet, Viktoras says that they will be testing out new models in 2021.

[Figure: Disinformation vs. debunking, compared by cost]
• Disinformation: coordinated efforts; trained forces; strategy in place; well-funded and cheap to produce
• Debunking: fragmented efforts; difficult to work in real time; time consuming; reaching citizens is expensive
NOT TO BE FOOLED

Carlos Hernández
Spain
Spain, North and South America

JOURNALISM | FACT CHECKING | JOURNALISM TRAINING | ADVOCACY | POLICY/EXPERT ADVISORY ROLE

Maldita is a Spanish non-profit fact-checking and data journalism platform working to monitor and counter digital discourse, to fight disinformation, and to promote media literacy.

Two Spanish broadcast journalists, Clara Jiménez Cruz and Julio Montes, began the organisation in 2014. It has since grown to 20 full time employees and between 30 and 35 operating staff. The team primarily debunks stories brought to them by their volunteer community and recirculates these debunks as widely as possible. Their approach is deeply community centered, driven by a tiered volunteer participation model: as of July 2020, this counted about 40,000 General Malditos or recipients of the newsletter, 1,000 Ambassadors or financial supporters, and 2,000 active Malditos and Malditas, individuals who contribute 'superpowers' or subject area expertise to the organisation's fact checking efforts.

Since their founding and particularly during the pandemic, the mis- and disinformation problem has changed in scale and scope. "We've gone from 1.5 million unique visitors per month to sometimes 8.5 during the pandemic, which is good and bad," Carlos testifies. "We have been hiring during the pandemic, expanding our operation because the situation called for it. The audience was more engaged, but small organisations like ours are fragile. The scale of the operation was sometimes overwhelmed."

Maldita has been experimenting with artificial intelligence (AI) in the form of a chatbot, which currently has an answering rate of 8 hours. Carlos is confident that the AI will improve further as the community diversifies. Growing and diversifying the community is top of the agenda, bringing in what he describes as "the many people who care about disinformation but who aren't following every second of the news cycle." Maldita hopes to reach these wider audiences through new partnerships, for example with the popular tabloid 20 minutos. Meanwhile they're exploring other verticals, for example through a science public engagement project, a platform focused on gender-related hoaxes, and a browser extension to promote transparency in public and private institutions. Recently, they are moving more into the public policy space to weigh in on major policy issues too big to ignore.

Maldita brings clear benefits to many social media platforms. In a sense, Carlos explains, they "pay the bill" for fact checking services that platforms claim to provide. Though Carlos specifies that different platforms have different practices, in general, these relationships are far from reciprocal. At the same time, platforms are not sufficiently forthcoming with the data that is necessary for actors like Maldita to assess the effectiveness of their debunking. Still, "the task at hand is too important to forgo" in Carlos' words, and the community is in agreement. Maldita has just received foundation status in Spain, which allows them to preserve their reputation and independence - a long and expensive endeavor that was made possible through crowdfunding contributions.

"WE NEED TO FIND WAYS TO MAKE SURE WE AREN'T WORKING FOR FREE. WE HAVE TO ENSURE OUR FUTURE"

Why "Maldita"?

Maldita translates to 'damned'. The name is a reference to Maldita's first fact checking initiative, Maldita Hemeroteca (Damned Archive), a project to confront politicians with past statements they themselves made that contradict their current views.
"WE USE TECHNOLOGY TO TELL THE STORY OF TECHNOLOGY"

Sam Jeffers
United Kingdom
Global

CONSUMER LITERACY | RESEARCH | TECHNOLOGY DEVELOPMENT | POLICY/EXPERT ADVISORY ROLE | ADVOCACY

Sam Jeffers co-founded Who Targets Me in 2017, having worked previously on political campaigns which had favoured grassroots, bottom-up tactics. He'd witnessed the ecosystem shift around 2015, when parties and candidates began to buy social media ads to target small groups of voters with large quantities of tailored messaging.

Who Targets Me is interested in "practical transparency for political campaigning", in Sam's words, and most focused on transparency for people running for office. "Others might be looking for other things such as fake profiles or state influence efforts. We're looking at the very top of the tree because we think that sets the norms and examples around which the rest of the policy response work."

Who Targets Me is working in an uncrowded space, in part due to the technical challenge, and in part due to an increasingly chilly research climate around the large social media platforms. They are a small operation and all of their funding is project based. Sam and his co-founder, who has another full-time job, take on capacity and particular skills as needed, which gives them a certain lightness compared to traditional NGOs. "We're a bunch of flexible people who have interesting and innovative ideas, which we try to execute cheaply and quickly and simply." Despite their size, they are committed to staying "at the forefront of thinking on political ads and regulation". They bring a more balanced voice to the policy space, where new entrants are often eager to simply "see everything banned".

"WHILE WE DO COMPLAIN ABOUT THE PLATFORMS (A LOT), SOMETIMES THEY'RE THE ONLY ONES DOING ANYTHING"

A lack of creativity in politics and democracy implies a general need to better bridge the product-policy divide, Sam thinks. "Two years ago we could have been experimenting with labeling in an independent way and looking through a more imaginative portfolio of responses. Alternatives are possible, but no-one is really doing the work of designing what better services might actually look like." He also notes the need for longer term thinking. "We need to think about election years and cycles rather than the few months and weeks before elections. People's opinions are shaped over a longer period of time. We should be looking for longer and using the aftermath of elections to push forward with reforms."

Consumer ad literacy

Who Targets Me provides a free browser extension to help people understand more about the paid media they are exposed to on Facebook. The software shows a library of all political ads sent to them and data on who is showing them the most ads, and also helps them understand the mechanics behind the tailoring of that content. Sam describes it as a consumer ad literacy tool. "We're trying to exemplify the transparency that we want the platforms to provide."

"WE WANT TO UNDERSTAND WHAT THE BIG ACTORS ARE DOING, WHO'S FUNDING THEM, AND HOW THAT HAS AN IMPACT ON DEMOCRACY AND ON PEOPLES' UNDERSTANDING OF ISSUES."
MONITORING ELECTIONS IN THE DIGITAL AGE

Rast'o Kužel
Slovakia
Global

MEDIA MONITORING | ELECTION OBSERVATION | SOCIAL MEDIA MONITORING | JOURNALISM CAPACITY BUILDING / TRAINING | POLICY / EXPERT ADVISORY ROLE

MEMO 98 is a monitoring organisation consisting of media and election experts. What began in 1998 as a project to monitor the Slovak media prior to the parliamentary elections developed into a permanent organisation that has conducted media and election monitoring across the world.

Currently MEMO has a core team of 7 full time employees, but they work closely with local partners in a number of countries. MEMO's methodology of media monitoring focuses on content and aims above all to evaluate political and social diversity in media reporting. In recent years, they've adapted their methodology from traditional media to account for the principles of the social web, but they maintain the focus on both spheres. "This combination of traditional and social media monitoring is very important," Rast'o explains.

"WE AIM TO HIGHLIGHT THE WORK OF LOCAL PARTNERS, SERVING AS A SORT OF QUALITY GUARANTOR"

Their monitoring is designed to provide in-depth feedback on pluralism and diversity in media reporting, including coverage of particular themes (integration of minorities, corruption, etc.). MEMO does not only focus on disinformation. They have "a more holistic and general approach, assessing both the positive and negative impacts of social media platforms on election integrity" in Rast'o's words. They study three things: the actors (from both traditional media and social media), the messages and narratives (how they are used by parties to make claims and to polarize) and the messaging (how the message is amplified).

Much of MEMO's work includes capacity building and training, enhancing the media monitoring activities of local partners. They also train journalists, NGOs, regulators, and other members of the media. For Rast'o, it is critical to support these local actors who will be the ones left after international election observers leave. "It doesn't end at the end of an election cycle. It is usually the start of another one."

Where MEMO struggles, along with others in this space, is in fully understanding the mechanics of amplification online. "We focus less on inauthentic behavior - bots and trolls who amplify content. But from a different perspective, this is a critical part. This can make marginal voices more visible." The tools currently available for social media analysis (for instance, Facebook's CrowdTangle) have been a game changer for MEMO, but the data is incomplete without access to private pages and closed messaging spaces. While entering these kinds of spaces raises ethical questions for researchers, there are legitimate design and transparency issues that hinder election integrity monitoring.

Elections toolkit

In November, Rast'o published a toolkit "How to monitor media coverage of elections". The toolkit was developed within the context of the Council of Europe project "Supporting the transparency, inclusiveness, and integrity of electoral practice in Ukraine", implemented within the framework of the Council of Europe Action Plan for Ukraine 2018–2022.

"PLATFORMS SHOULD BE GEOGRAPHICALLY BLIND WHEN IT COMES TO APPLYING THEIR RULES. WE SHOULD NOT ASSUME THAT EVERY ELECTION IS LIKE THE ONE IN THE US"
COUNTERING MISINFORMATION FROM A TO Z

Domen Savic
Slovenia
Balkans

INVESTIGATIVE REPORTING | MEDIA LITERACY | CAMPAIGNING

Based in Ljubljana, Slovenia, Citizen D is a nonprofit whose core mission is the promotion of human and digital rights.

Despite counting only two full time employees, Citizen D's activities are various and in-depth. Their actions range from privacy rights monitoring and training, to investigating practices within the advertising industry and holding discussions on the purpose of mass media in a democratic society. They offer media literacy and digital privacy trainings, and they raise awareness through campaigns. One example is a national anti-hate campaign drawing attention to the Slovenian public funding of Hungarian propaganda outlets.

While Citizen D aims to encourage active citizenship and democratic participation, they take legal action on their own. "This myth around an active citizen that will react and gather sufficient information from the media to make their own case, that's not the reality we're seeing," Domen explains. To help lift the onus from citizens, they've adopted an approach of working from A to Z. "Our job doesn't end when we file a report. From that we analyze the problems, we define which decision makers are responsible, and we pursue that." Frustrated by the limited functioning of the legislation in some areas, for example around false advertising, they have also taken to proposing legislative recommendations.

In the Slovenian political context - which Domen has written about - Citizen D faces a double battle. In addition to the deeper problem of political propaganda funded by public money, they have to respond to a prevailing narrative around so-called 'fake news'. They have developed a targeted approach, following the money between all relevant public services and actors (advertisers, public funds, political parties, etc.). This posture can make financing complicated. Citizen D tries to maintain a dispersed model of funding, developing alternative sources of revenue that include commercial projects - in line with their mission and under a code of conduct. Meanwhile, the lack of outside, international media pressure on Slovenia means that local issues are often ignored. "I'd have an easier time saying there was a problem with Russian propaganda. Nobody is focused on Orban and Visegrad," says Domen.

Citizen D's ongoing campaigns

Citizen D is currently tracking two governmental ad campaigns which are being funded with tax-payer money. The web and television ads are directly related to government parties through various non-governmental organisations with strong government ties. Citizen D is relying mainly on FOIA requests to gain insight into the government's opaque process of cost tracking and selection of the ad placements.

"THE TERM 'FAKE NEWS' HAS IN OUR OPINION MORPHED INTO A CATCH-ALL PHRASE THAT DOES MORE HARM THAN GOOD"
THE FIRST CITIZENS' ANTI-FAKE NEWS BRIGADE

WTFake!?
Aude Favre
France
France

JOURNALISM | ACTIVISM

"You could say the problem I'm trying to solve is fake news, but, deeper than that, the problem is mistrust of journalists."

Aude is a journalist by training. She is also president of Fake Off, a media literacy association that works with young people in classrooms, which she founded together with a small group of journalists in the wake of the 2015 Paris attacks. Behind the crisis of media literacy and 'fake news', Aude identifies a lack of dialogue and a growing mistrust between journalists and people. Following the election of Donald Trump, she began to feel that it wasn't sufficient to intervene in classrooms, and decided to take directly to YouTube. "The motivation behind everything is anger, I see the impunity, the indecency of manipulating the public, it makes me so angry."

La rédac' WTFake ('the citizen journal') tracks and debunks conspiracy theories, primarily found on YouTube, with the support and admiration of an online following. Aude's activities mirror those of her enemies - like the conspiracy theorist Jean-Jacques Crevecoeur - in a multimedia chase that plays out across the social web. The investigations progress over several days on a Discord chat, and culminate in a revelation streamed live over Twitch. A handful of dedicated followers have become discussion moderators. Others contribute in different ways, like developing the logo. Her colleague Sylvain Louvet helps with editorial and video production. Aude now has 1411 people on her Discord, and the feedback from many of her followers suggests she is achieving the impact she hoped for; she regularly receives comments like "usually I don't like journalists, but I like you!"

"C'est le bazar" ("It's a mess")

Organisationally, things are messy, Aude admits. She set out on this alone, without a financial model, and without the kinds of contacts that many need in this space to survive. A few crowdfunding links have not yielded much yet. "I'm just a little investigative journalist, but now I find myself having to reflect like a business person," she explains. In a way, the energy of her followers motivates her to continue, but animating the community also limits her ability to strategize and grow. To devote time to fundraising would mean abandoning the community, and both activities take away from time spent on investigations. Meanwhile, there are trolls to be wary of. Aude has already changed her phone number after a conspiracy theorist doxxed her online. "Given the people I want to investigate, I'll probably see more of this," she reflects.
A CENTRAL NODE IN A GROWING NETWORK

Eliot Higgins
Netherlands
Global

OSINT / DIGITAL FORENSICS | JOURNALISM | JUSTICE & ACCOUNTABILITY | MEDIA & REPORTING | TRAINING

Bellingcat emerged in July of 2014 from the online community that had formed around Eliot's blog, in particular around his work tracking the downing of flight MH17.

"I launched Bellingcat wanting to give people a place to publish articles about what they were doing using open source, and also to create resources for people to learn how to do it themselves." Today, Bellingcat is a fully fledged organisation with 18 core staff across research, business and administration, a management team, a supervisory board, and "with proper policies" to quote Eliot. It has recently been registered in the Netherlands as a Public Benefit Corporation. Though it is no longer a volunteer dominated organisation, Bellingcat has retained the volunteer community as a key element: a network of open source investigators spread the use of open source methods while further developing tools and methodologies, maintaining a focus on justice and accountability.

Identify, verify, amplify

Bellingcat is not in the business of fact checking so much as "fact finding". Still, disinformation is inherent to their activities, in that online open source investigation has a strong verification component. "You're often dealing with debunking one side through open sources," says Eliot. Their work revolves around three steps. First, they identify information online related to a topic of interest or importance (subjects include corruption, corporate misconduct, racial equality, far right movements, etc., but it is important that interests be led by the team). Next they verify that information using open source techniques, drawing strategically on volunteers and on an engaged social media community who can help with precise aspects like geolocation and identification. Finally they amplify that story, through a report, video, or podcast, or even as courtroom evidence. Bellingcat is increasingly exploring the role of open source evidence in legal processes; they have worked already with the ICC and the United Nations.

Much of what Bellingcat does could be considered capacity building. They offer training to journalists and fact checkers as well as to activists, NGOs and lawyers. These trainings also provide a source of revenue (currently 30 percent of their funding). Eliot sees Bellingcat as "a central node in the network that makes up the online open source investigation community", a universe which spans human rights organisations, major media outlets, and individual Twitter users. Collaboration is key. "Often when we come across a project or something to investigate, and if it goes beyond the scope of what Bellingcat does or if it is something that can benefit from a collaboration, we'll build coalitions of groups to work on topics." They recognize this can be strange for media organisations who are accustomed to scoops and exclusives, so they're strategic in bringing together non-competitive media and in building trust.

Though many organisations look to Bellingcat as an example, Eliot readily admits that "we're figuring this stuff out as we go". Beyond OSINT, the team is focused on the editorial and production side; Bellingcat is training their researchers in journalistic writing and building up a production company (which would be another source of income). They're also leaning into their volunteer community (through a volunteer app and a Patreon), and trying to increase their impact with international governance stakeholders, for instance through developing standards for open source evidence in the area of justice and accountability.

Black Lives Matter

Along with Forensic Architecture, Bellingcat geolocated and verified over a thousand incidents of police violence during the Black Lives Matter protests in the United States. The project analyses them according to multiple categories, and presents the data in an interactive cartographic platform.
“OUR GOAL IS TO DEFUND DISINFORMATION”

The Global Disinformation Index is responding to the growth of digital disinformation sustained through advertising revenue.

Clare Melford
UK, Germany
Global

BRAND SAFETY  TECHNOLOGY DEVELOPMENT  RESEARCH & INTELLIGENCE ANALYSIS  POLICY / EXPERT ADVISORY ROLE

The Global Disinformation Index is a not-for-profit organisation founded in 2018. Registered in the United Kingdom, they have a virtual team of 15 experts based around the world. “We focus on the advertising networks that place ads on websites regardless of whether the advertiser knows or wants their ads to end up there,” Clare explains. GDI’s core contribution is to provide the advertising industry with disinformation risk ratings to assess whether or not ads should be placed on certain sites, but they support other stakeholders as well, including through their media market risk rating reports.

GDI views its effort as aligned with “Brand Safety” efforts, except that the organisation is focused on disinformation, an area that was previously ignored. They brought with them a degree of insights and intelligence that was previously lacking. “Brand safety usually involves keyword blocking, static lists... Until GDI came along, there has never been a box for highly disinformative, toxic, adversarial narratives,” says Clare. GDI makes use of a comprehensive framework, both human and AI powered (humans can’t assess the whole internet at “colossal speed and scale”, while AI isn’t nuanced enough to differentiate high production, high traffic media outlets that publish disinformation). This combined, innovative approach provides a disinformation risk score for websites which advertisers can use in real time.

GDI has already seen impact, for example, in Google’s decision to defund all Coronavirus conspiracies. But the change isn’t happening fast enough, and it isn’t systemic. “They’ve [Google] known about anti-vax conspiracies for years, but the virus was 7 or 8 months old before they made that decision. What about all the other anti-science conspiracy theories out there affecting our ability to think critically? Flat earth conspiracies, climate change denial...” Clare reflects. Moreover, other tech companies providing ad services must take the same actions for a whole-of-industry response, not just Google.

A young organisation, GDI needs to ensure their own sustainability, primarily by building out new commercial products and services to fund its not-for-profit work. Looking forward, they hope to solidify “a coalition of the willing” against lucrative disinformation, aligning with the corporate responsibility agenda. They also want to join forces with other sectors to expose other distortions that provide a funding lifeline to disinformation. “Payment systems, merchandising, e-commerce… there’s a whole ecosystem and that is not well enough known,” Clare concludes.
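GDI’s core product, as described above, is a per-site disinformation risk score that advertisers can consult in real time before an ad is placed. The report does not describe GDI’s actual scoring methodology or scale, so the sketch below is purely illustrative: the domains, scores, and threshold are invented, and it only shows the general shape of a pre-bid brand-safety check against such a rating.

```python
# Illustrative sketch only: GDI's real scoring pipeline is not public.
# It shows the kind of real-time check an ad buyer could run against a
# per-domain disinformation risk rating. All names and numbers are invented.

RISK_SCORES = {               # hypothetical ratings on a 0-100 risk scale
    "reliable-news.example": 12,
    "conspiracy-hub.example": 87,
}
DEFAULT_RISK = 50             # unknown domains get a cautious neutral score
MAX_ACCEPTABLE_RISK = 40      # threshold chosen by the advertiser

def should_place_ad(domain: str) -> bool:
    """Return True if the domain's disinformation risk is acceptable."""
    return RISK_SCORES.get(domain, DEFAULT_RISK) <= MAX_ACCEPTABLE_RISK

print(should_place_ad("reliable-news.example"))   # True
print(should_place_ad("conspiracy-hub.example"))  # False
```

Treating unknown domains as moderately risky, rather than safe by default, mirrors the cautious stance a brand-safety filter would take; the exact default is a design choice, not something the report specifies.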
KNOW YOUR ALGORITHM

Claudio Agosti
Netherlands
Netherlands, Italy, Argentina

Tracking Exposed is a non-profit, free software project that aims to analyze evidence of algorithmic personalization. It was founded in 2016 by Claudio Agosti, a self-taught hacker and developer, currently researcher at the University of Amsterdam.

RESEARCH  ACADEMIA  DATA ANALYSIS  TECHNOLOGY DEVELOPMENT  ADVOCACY  USER EMPOWERMENT

In 2018, the University of Amsterdam’s Department of Media Studies expanded on Claudio’s work, creating the project ALEX - Algorithms Exposed, Investigating Automated Personalization and Filtering for Research and Activism. Through his work, Claudio empowers end users to understand how aggressively the algorithm is mediating information for them. The project has been applied to four platforms so far. “For Facebook and YouTube, it’s about information quality. For Amazon and Pornhub, it’s about explaining how algorithms exist in other places and can still have an impact on you.”

Claudio mostly works with researchers in the Netherlands and in Italy, though he has also carried out work related to Argentina. The academic label is important for the legitimacy of the project, and also for finding collaborators and growing the project. The work on Pornhub was carried out by a researcher he met while teaching at the Digital Methods Summer School, who wanted to expose heteronormativity on the platform.

Tracking Exposed gathers data through crowdsourcing, but getting access to wider audiences and groups is challenging. To the extent that outreach depends on marketing and visibility, this can exceed the capacity of a researcher. For a small programming project like Claudio’s, recruitment requires compromises and calculations: investment in onboarding, potential damage, and security risks.

Claudio is working to make the tool simpler, with more visual results. He would like users to be able to play with and control their own algorithms. He’s also interested in pursuing strategic litigation using his findings as evidence of platform rights violations. He has done election-related monitoring in the past, and he has a project coming up to work with a polling company. Elections are increasingly emerging as a business case, he observes; offering this high-expertise service could be a form of sustainability for Tracking Exposed. However, advertising composes a small portion of our informative experience, and so we should remain focused rather on the role of our personalized algorithms, Claudio notes: “We should have control of our algorithm because that is the tool responsible for all the content selected.”
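Tracking Exposed’s core idea is to collect evidence of how differently an algorithm treats different users. One simple way to make “how aggressively the algorithm is mediating information” concrete is to compare the recommendation lists gathered from two profiles and measure their overlap. The sketch below is not Tracking Exposed’s code or data format; the video IDs are invented, and Jaccard similarity is just one plausible overlap metric.

```python
# Illustrative sketch, not Tracking Exposed's real code or data format.
# It quantifies algorithmic personalization by comparing the recommendation
# lists collected from two user profiles: low overlap = heavy personalization.

def jaccard(a, b):
    """Overlap of two recommendation sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

# Hypothetical video IDs recommended to two different users for the same query
profile_a = ["v1", "v2", "v3", "v4"]
profile_b = ["v3", "v4", "v5", "v6"]

similarity = jaccard(profile_a, profile_b)
print(f"overlap: {similarity:.2f}")  # overlap: 0.33
```

Collected at scale through crowdsourced browser extensions, comparisons like this are the kind of evidence the project assembles to show users how personalized their feeds really are.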
NEWS LITERACY FOR ALL

Juliane von Reppert-Bismarck
Belgium
Belgium, Austria, Germany

Juliane founded Lie Detectors to respond to two sides of a problem she identified in the information ecosystem.

JOURNALISM  EDUCATION  MEDIA LITERACY  POLICY / EXPERT ADVISORY ROLE

On the one hand, she was dismayed by increasing distrust in professional journalism, increasing polarisation, and the blurring of facts and opinions - a problem to which children are particularly vulnerable. And on the other hand, she noticed “the succumbing of journalism to the promise of click bait”, the response of an industry in crisis forced to take short cuts.

Lie Detectors brings the two sides together, placing working journalists into schools to deliver dynamic trainings. The organisation currently has a core team of 14 staff and coordinators, along with a network of 200 journalists and a growing ecosystem of educators and schools in Belgium, Germany and Austria. The approach is circular: Lie Detectors trains journalists, who then train children and educators in the classroom; meanwhile that classroom experience provides feedback for journalists and newsrooms, as well as insights for Lie Detectors’ policy development and advocacy work.

Sessions are free and journalists participate as volunteers with compensation, which means that trainings are delivered in diverse school environments and from journalists with diverse experiences. (Incidentally, it is possible because Lie Detectors has that rarest of gifts in this ecosystem: long-term flexible funding.) From a research and methodological perspective, this diversity is crucial. The organisation also works closely with questionnaires and feedback forms to identify developments in mis- and disinformation from the perspective of the children. Juliane explains that conspiracy theories, which appear to be a growing challenge for students, will require particular policy recommendations and particular training for educators: “Conspiracy theories are the ultimate deep fake, because they are layers upon layers upon layers that you have to unravel.”

“This is not something we should be feeling proprietorial about”

The service Lie Detectors is providing is in demand, both on the side of schools and from journalists (they currently have a wait list). “This is a method and a structure. It’s so simple, it makes perfect sense… We want to amplify and facilitate organisations doing similar work.” They increasingly hold seminars that enable teachers to deliver these trainings themselves.
Thinking critically

Formerly a journalist herself, Juliane understood the need to “differentiate ourselves from disinformation, and also admit that we don’t always get it right.” She explains that “admission of fallibility and rebuilding of trust was important from the beginning. That’s why we have to work with a particular kind of journalist, capable of speaking critically about their work.”

EVERYBODY KNOWS DISINFORMATION IS A PROBLEM, BUT PEOPLE NEED TO UNDERSTAND THE CAUSES AS WELL AS THE SYMPTOMS. THAT BECOMES MORE VISIBLE WHEN YOU ARE ABLE TO MAKE A COMPARISON BETWEEN COUNTRIES, LANGUAGES, CULTURES, SCHOOLS.
“ALGORITHMS NOT DESIGNED TO HAVE QUALITY INFORMATION IN THEIR OBJECTIVE FUNCTIONS WILL NATURALLY FAVOUR DISINFORMATION”

AlgoTransparency
Guillaume Chaslot
France
Global

RESEARCH  ARTIFICIAL INTELLIGENCE  PROGRAMMING  ADVOCACY

Guillaume Chaslot is a computer programmer with a PhD in artificial intelligence, the founder of the consulting firm IntuitiveAI and of the nonprofit AlgoTransparency. He is currently a Mozilla Fellow. During his three years at Google, he worked beside YouTube engineers on their recommendation system. He observed how the algorithm, which optimized for watch time, had some dangerous side effects.

Guillaume became particularly aware of the “snowball effect” boosting conspiracy theories on the platform. “I realized the algorithm was promoting disinformation, very often more disinformation than truth. I looked at topics which were clearly disinformation, like flat earth theories, and realized that the algorithm was producing more flat earth than round earth videos.” Guillaume had identified a fundamental design flaw playing out at a massive scale. “We can only do so much fact checking. If the algorithm decides 70% of the views, it’s a losing battle.”

Unable to convince his colleagues to intervene - even to recognize the problem - he left the company and began AlgoTransparency in 2017, with the objective of exposing what content recommendation algorithms are showing people on different platforms. They currently look at Facebook, Google Autocomplete, Twitter Trending, and of course YouTube, where they monitor over 800 top information channels. The project runs on small grants and is now supported by Guillaume’s Mozilla fellowship.

At the moment, AlgoTransparency works primarily with journalists. The functioning of the tool is a bit complicated, and it takes time to explain how it works. Guillaume would like the tool to be useful to more people and broader in what it surveys. In general, he still sees a huge need for transparency into algorithms. “People either don’t know how to do it because they weren’t insiders, or are afraid because they don’t want to face big tech companies, or discouraged because there isn’t a real business in it,” he says. Access to data is still an issue, and it is insufficient for platforms to be able to decide what information they share publicly. What’s needed is external pressure.

Ideally, Guillaume would like to build out a team and expand monitoring across a range of platforms, then be able to work with fact checkers, analysts, and journalists to make this data digestible for different stakeholders: individuals, researchers, and the employees at these tech companies themselves - many of whom don’t want to look at their own problem. “Instead of small investigations, I want to look at the general issue. I want to make the data available and enable everyone to use it as they want.”

Inspiring change

AlgoTransparency has had enormous support and impact. Guillaume himself has achieved media recognition as a whistle blower of sorts (he recently appeared in the film “The Social Dilemma”), as the policy discussion has increasingly turned towards the need for algorithmic transparency. The impact on platforms is more difficult to understand. “YouTube changed their algorithm 30 times to address the exact issues I was talking about”, Guillaume notes, referring to their actions in 2017 on the amplification of terrorist content. But they did not take action regarding disinformation like the flat earth content that he had brought to their attention.

YOUTUBE KEY PRODUCT AND POLICY LAUNCHES TO RAISE AUTHORITATIVE VOICES AND REDUCE THE SPREAD OF BORDERLINE CONTENT SINCE 2015

• Raise authoritative content
• Reduce spread of borderline content

2015
July 27: Improved ranking system to reduce the visibility of junk comments

2016
September 14: Launched YouTube Player for Publishers for the news industry

2017
July 20: Redirected users seeking violent extremist propaganda-related content to playlists that debunk its mythology
August 18: Launched Top News shelf in search results and Breaking News shelf on the homepage
September 14: Started to surface authoritative content in search results

2018
February 2: Launched information panels alongside videos from government or public-funded publishers to provide context
July 9: $25m investment to trusted journalistic organisations to improve the news experience on YouTube; launched information panels providing context on topics prone to misinformation and conspiracy theories; launched Top News shelf on the YouTube homepage
October 16: Launched information panels to provide context on US election candidates, and later on EU Parliament election candidates too

2019
January 25: Reduced recommendations of borderline content and videos that could misinform people in harmful ways in the US
March 7: Launched information panels in India to provide fact checking in YouTube search
June 3: Improved machine learning classifier to better apply protections for content featuring minors and risky situations across more videos; further reduced recommendations of borderline content in the US
June 5: Raised authoritative content in the Watch Next panel for people engaging with borderline content in the US
July 8: Started reducing recommendations of English-language borderline videos outside the US
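The design flaw Guillaume describes — an objective function with no quality term — can be illustrated with a toy ranking. This is not YouTube’s actual system, and every number below is invented; it only shows why optimizing for predicted watch time alone surfaces the most engaging video rather than the most accurate one.

```python
# Toy model, not YouTube's recommendation system. It illustrates the point in
# the quote above: an objective built only on predicted watch time will rank
# an engaging conspiracy video above a sober factual one. Numbers are invented.

videos = [
    # (title, predicted watch minutes, information quality 0-1)
    ("Flat earth 'proof' compilation", 14.0, 0.05),
    ("How we know the Earth is round", 6.0, 0.95),
]

# Objective 1: maximize watch time only (the flaw Guillaume observed)
by_watch_time = max(videos, key=lambda v: v[1])

# Objective 2: a hypothetical quality-weighted objective flips the ranking
by_quality_weighted = max(videos, key=lambda v: v[1] * v[2])

print(by_watch_time[0])        # the conspiracy video wins on watch time alone
print(by_quality_weighted[0])  # the factual video wins once quality is weighed
```

The quality-weighted objective here is only a stand-in for the idea in the section’s title quote: whatever term the objective function lacks, the algorithm will not optimize for.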
First Draft
Dr Claire Wardle, Marie Bohner
UK, US, Australia
Global

ADDRESSING THE INFORMATION ECOSYSTEM DISORDER
First Draft was founded in 2015 as a nonprofit coalition of nine founding partners with the mission of protecting communities from harmful misinformation. That original coalition has expanded greatly since, and now includes international partnerships with newsrooms, universities, companies (recently they have begun work with Spotify), institutions such as the WHO or UNICEF, and other NGOs.

With offices in three time zones (London, New York, and Sydney), and a team of 52 employees, First Draft is monitoring around the world and around the clock. First Draft is addressing an “information ecosystem disorder”, says Marie. This work is guided by their Information Disorder Framework, which Claire Wardle, co-founder and US director, first laid down in a 2017 report for the Council of Europe with co-author Hossein Derakhshan. This framework helped put First Draft on the map, but it is one of many approaches the organisation has pioneered since its founding. CrossCheck, a collaborative approach to reporting around elections, has been used in the US, UK, France, Germany, Brazil, Nigeria and Spain, and inspired similar initiatives in many other countries. “Elections are our DNA”, Marie explains. First Draft has developed a strong editorial department, which produces daily and weekly briefings from their global monitoring, and they contribute regularly to the research discussion. Through their global Partner Network of newsrooms, fact checkers, human rights organisations and technology platforms, they have been working to share knowledge and set standards in the areas of collaboration, training and research.

INFORMATION DISORDER  HEALTH  JOURNALISM  POLICY / EXPERT ADVISORY ROLE  RESEARCH  GLOBAL COLLABORATIVE NETWORK
ACADEMIA  CAPACITY BUILDING

Rooted in strong local partnerships and collaborations, First Draft takes a global approach in order to gain a cross-border view of information disorder. “This approach aims to overcome individual biases, to compare stories, and to share similar problems,” Marie explains.

The Deceptive Seven: common types of misinformation and disinformation
• Satire or parody: no intention to cause harm, but has potential to fool
• Misleading content: misleading use of information to frame an issue or individual
• Imposter content: when genuine sources are impersonated
• Fabricated content: new content that is 100% false, made to deceive and do harm
• False connection: when headlines, visuals or captions don’t support the content
• False context: when genuine content is shared with false contextual information
• Manipulated content: when genuine information or imagery is manipulated to deceive

The organisation is growing quickly, but rapid expansion during the pandemic has been a challenge. While they have had plenty of work to do around elections - particularly the US election - they still lack the long-term, flexible finance that guarantees an organisation sustainability and peace of mind. “It’s uncomfortable being 100% dependent on funding, so we are developing diverse revenue streams to ensure we can continue supporting those relied upon for accurate information in the years to come”. Meanwhile, there is no lack of work to be done. While continuing their focus around democracy and health, particularly vaccines, they are already looking ahead to other global topics that are vulnerable to disinformation, such as the looming economic downturn and the climate crisis. There is more awareness and acknowledgement of the threats and risks of online disinformation, an urgent need to empower societies with knowledge and skills, and a need for collaborative efforts to increase. After 5 years building experience and expertise, First Draft is needed now more than ever before.

WHILE THE MEDIA TEND TO FOCUS ALL ON THE SAME THING, WE TRY TO CONCENTRATE ON WHAT PEOPLE ARE LOOKING FOR ANSWERS TO

Recently, First Draft has been exploring what they identify as data deficits, areas where a lack of relevant and readily accessible information may lead to misinformation. They are trying to identify people’s questions, for example using Google Trends, or else through direct conversation with their audiences. To better understand misinformation and data deficits in closed spaces, they apply crowd-sourcing strategies, such as tip lines, where members of the public flag content for examination.

Capacity building

First Draft is building capacity across all imaginable stakeholders.
In addition to their series of trainings for journalists (from personal and cyber security to the ethics of investigations, etc.), their Local News Fellowship Program around the US 2020 election equipped local and regional journalists in social media monitoring. They pioneered information crisis simulations to help build resilience, and have a growing number of resources targeted at more general audiences as well.
“MACHINE LEARNING OFTEN GETS IT WRONG, I WANTED TO GO A STEP BACKWARDS”

Account Analysis
Luca Hammer
Germany
Global

Luca began the project behind Account Analysis one weekend in 2017, in response to a growing debate about bots.

RESEARCH  DATA ANALYSIS  OSINT  TECHNOLOGY DEVELOPMENT

“Journalists were talking about social bots who influence elections and debate online.” While he knew a handful of existing tools to automatically identify bots on Twitter, usually by giving accounts a score, it was difficult to understand how those scores were produced. The research community tends to agree that most of the tools used to identify bots are flawed. “It’s nearly impossible to reliably identify bots from the outside; platforms can do this better because they have access to more data points.”

His approach has been to build a tool that wouldn’t deliver a score, but rather allow users to evaluate for themselves. Account Analysis lets users study Twitter accounts by looking at criteria like how many retweets they post, which hashtags they use, and which websites they link to most often.

Luca is less interested in bots nowadays. “I’ve discovered bot-nets on Twitter and published about them, but I think their impact is low. The platforms have great tools and try to stop spamming.” He’s more focused on platform manipulation. “It doesn’t matter if it’s one person or several people, I’m interested in what they do and what they try to achieve.” Account Analysis helps him study this because it lets him understand how those accounts work. He’s also interested in false information sharing in big public channels on Telegram. In the future, he aims to make more tools on more platforms accessible to more people. He also has plans to add features, including tools to be able to analyse debates, not simply accounts.
“For researchers to fully understand what’s going on, I’m convinced that you have to see the data yourself.” Luca believes it isn’t enough to just “get the numbers” of how often contents are shared. In order to understand influence and impact, researchers need to be able to see the data directly. This requires more communication between platforms and the researcher and programmer communities. Twitter’s API is by far the most accessible for researchers: anything public on the platform is public on the API. Telegram, which Luca is beginning to focus on more closely, is similarly unrestricted. Facebook, whose API is notoriously difficult to use, has a less clear-cut user-facing definition of private and public space, which contributes to the challenge. Still, all platforms could do more to make their APIs accessible for research.

Keeping it free

Account Analysis has both a free and a paid version (for 15 euros a month, the tool essentially works more quickly). He developed this ‘freemium model’ after finding the business case for marketers, who use the tool to evaluate the accounts of influencers and politicians. This business case covers his server costs and allows him to continue to provide the tool for free. His primary motivation is to serve the OSINT and journalist community. He also offers workshops for journalists.
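The criteria Luca describes — retweet share, most-used hashtags, most-linked websites — amount to simple descriptive statistics over an account’s recent tweets, presented raw so the analyst can judge for themselves. The sketch below is not Account Analysis’s actual code, and the minimal tweet records are invented; it only illustrates the kind of summary the tool surfaces instead of a single bot score.

```python
# Illustrative sketch of the descriptive statistics Account Analysis surfaces,
# not the tool's actual code. Rather than an opaque bot score, it reports raw
# criteria: retweet share, top hashtags, most-linked domains. Data is invented.
from collections import Counter
from urllib.parse import urlparse

tweets = [  # hypothetical minimal tweet records
    {"is_retweet": True,  "hashtags": ["election"], "urls": ["https://example-news.test/a"]},
    {"is_retweet": True,  "hashtags": ["election"], "urls": []},
    {"is_retweet": False, "hashtags": ["vote"],     "urls": ["https://example-news.test/b"]},
]

retweet_share = sum(t["is_retweet"] for t in tweets) / len(tweets)
top_hashtags = Counter(h for t in tweets for h in t["hashtags"]).most_common(3)
top_domains = Counter(urlparse(u).netloc for t in tweets for u in t["urls"]).most_common(3)

print(f"retweet share: {retweet_share:.0%}")
print("top hashtags:", top_hashtags)
print("top domains:", top_domains)
```

Presenting counts like these, instead of collapsing them into one number, is exactly the “step backwards” from machine-learning scoring that the section title refers to: the evaluation is left to the human reading the output.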
2020, A TURNING POINT IN OUR RESPONSE TO DISINFORMATION?

This research was conducted at a moment when the disinformation challenge has never seemed higher. From the Covid-19 health crisis and parallel ‘infodemic’ to elections in the US and Belarus, 2020 has been a tumultuous year for our information ecosystems. It has also been a significant year for regulatory response to disinformation. The European Democracy Action Plan, a roadmap to enhance democratic integrity and resilience across the European Union, was released in December. It was followed by the Digital Services Act and the Digital Markets Act, regulatory packages to clarify the role and responsibilities of online platforms and increase accountability for online disinformation.

Regulation of online platforms will reduce opacity and better equip anti-disinformation actors like those interviewed here, but regulation is not a silver bullet. As a diffuse and rapidly evolving set of challenges, disinformation requires a broader response. Key to this response is a thriving, decentralized civil society ecosystem. Disinformation is a horizontal challenge with which more and more actors find themselves confronted. A decentralized architecture can catalyze multiplier effects and build capacity and resilience among those currently at the periphery of the disinformation threat (climate activists, health professionals, etc.).

Our aim in this project has been to try to understand this growing civil society network - their struggles and their successes - in order to better support their activities and fortify this ecosystem as a whole. Far from a comprehensive survey, this research can only scratch the surface of this vast landscape of actors. And yet even with this partial view or snapshot, we have been able to highlight trends and shared challenges, and to pinpoint areas where collaboration is possible and where more support is necessary. It is our hope that these findings will further empower these actors and those like them, and provide evidence for the competent authorities to act.