Media and online platforms meeting on misinformation related to noncommunicable diseases and risk factors: 5 February 2021, online event

ABSTRACT
Disinformation and so-called fake news are a growing concern, as more and more individuals
obtain their information from digital venues such as search engines or social media platforms.
While generally increased access to a wider range of information on health issues can be seen
as positive, the spread of misinformation – or more acutely, of disinformation – is problematic, as
inaccurate information can lead to consequences such as harmful lifestyle or dietary choices,
self-medication, the abandonment of medical treatment or incorrect diagnoses.
In February 2021, WHO convened a meeting with media stakeholders in an effort to collect more
information on misinformation concerning NCDs and to develop a catalogue of policy initiatives,
building on the first NCD misinformation meeting that was held in February 2020 with civil society
representatives.
A third and final meeting was held with WHO European Region Member States. The expected
outcome of these meetings is the development of a new toolkit of policy recommendations that
can be used by Member States and other stakeholders to tackle the negative impacts of NCD
misinformation.
Document number: WHO/EURO:2023-6889-46655-67838
© World Health Organization 2023
Some rights reserved. This work is available under the Creative Commons Attribution-
NonCommercial-ShareAlike 3.0 IGO licence (CC BY-NC-SA 3.0 IGO;
https://creativecommons.org/licenses/by-nc-sa/3.0/igo).
Under the terms of this licence, you may copy, redistribute and adapt the work for noncommercial
purposes, provided the work is appropriately cited, as indicated below. In any use of this work,
there should be no suggestion that WHO endorses any specific organization, products or services.
The use of the WHO logo is not permitted. If you adapt the work, then you must license your work
under the same or equivalent Creative Commons licence. If you create a translation of this work,
you should add the following disclaimer along with the suggested citation: “This translation was
not created by the World Health Organization (WHO). WHO is not responsible for the content or
accuracy of this translation. The original English edition shall be the binding and authentic edition:
Media and online platforms meeting on misinformation related to noncommunicable diseases
and risk factors: 5 February 2021, online event. Copenhagen: WHO Regional Office for Europe; 2023”.
Any mediation relating to disputes arising under the licence shall be conducted in accordance
with the mediation rules of the World Intellectual Property Organization.
Suggested citation. Media and online platforms meeting on misinformation related to
noncommunicable diseases and risk factors: 5 February 2021, online event. Copenhagen: WHO
Regional Office for Europe; 2023. Licence: CC BY-NC-SA 3.0 IGO.
Cataloguing-in-Publication (CIP) data. CIP data are available at http://apps.who.int/iris.
Sales, rights and licensing. To purchase WHO publications, see http://apps.who.int/bookorders.
To submit requests for commercial use and queries on rights and licensing, see
http://www.who.int/about/licensing.
Third-party materials. If you wish to reuse material from this work that is attributed to a third
party, such as tables, figures or images, it is your responsibility to determine whether permission
is needed for that reuse and to obtain permission from the copyright holder. The risk of claims
resulting from infringement of any third-party-owned component in the work rests solely with the
user.
General disclaimers. The designations employed and the presentation of the material in this
publication do not imply the expression of any opinion whatsoever on the part of WHO concerning
the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation
of its frontiers or boundaries. Dotted and dashed lines on maps represent approximate border
lines for which there may not yet be full agreement. The mention of specific companies or of
certain manufacturers’ products does not imply that they are endorsed or recommended by WHO
in preference to others of a similar nature that are not mentioned. Errors and omissions excepted,
the names of proprietary products are distinguished by initial capital letters.
All reasonable precautions have been taken by WHO to verify the information contained in this
publication. However, the published material is being distributed without warranty of any kind,
either expressed or implied. The responsibility for the interpretation and use of the material lies
with the reader. In no event shall WHO be liable for damages arising from its use.
This publication contains the report Media and online platforms meeting on misinformation
related to noncommunicable diseases and risk factors: 5 February 2021, online event and does not
necessarily represent the decisions or policies of WHO.
CONTENTS

Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
 The rise of disinformation and misinformation . . . . . . . . . . . . . . . . . . . . . 1
 Rationale for the WHO meeting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Part 1. Challenges . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
 Business model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
 Cognitive dissonance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
 Epistemology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
 Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
Part 2. Case studies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
 Google . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
 The BBC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
 The NYT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
 Twitter . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Part 3. Solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
 Specific interventions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
 Convening stakeholders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
 Next steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Annexes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
 Annex 1. List of participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
 Annex 2. Agenda . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

BACKGROUND

    The rise of disinformation and misinformation
    Disinformation and so-called fake news are a growing concern, as more and
    more individuals obtain their information from digital venues such as search
    engines or social media platforms. While generally increased access to a wider
    range of information on health issues can be seen as positive, the spread
    of misinformation – or more acutely, of disinformation – is problematic, as
    inaccurate information can lead to consequences such as harmful lifestyle or
    dietary choices, self-medication, the abandonment of medical treatment or
    incorrect diagnoses.
    It is important to distinguish misinformation from disinformation. While the former
    might simply be described as the spread of false or inaccurate information,
    disinformation refers to the spreading of false information deliberately (and
    often covertly) in order to influence public opinion or to obscure the truth (Box 1).

    Box 1. Definitions

    Disinformation is information that is created and shared with the explicit
    purpose of causing harm (1). WHO also uses the definition from Merriam-
    Webster: “the proliferation of false information deliberately and often covertly in
    order to influence public opinion or to obscure the truth” (2).

    Misinformation is information that is inadvertently false and is shared without
    intent to cause harm. Considering the difficulty in distinguishing between
    intentional and unintentional purposes, the term misinformation is often used
    to mean any false information, regardless of intent to cause harm.

    Fake news comprises false information transmitted in the form of “news”, often
    by sources attempting to pass off as online newspapers. The term has become
    highly politicized, most recently and notably being used to dismiss statements
    that the recipient does not like or agree with.

    Conspiracy theories are explanations of significant events as secret plots
    concocted by powerful and malevolent institutions, groups, and/or people.

    Misinformation and disinformation are not new or unique to the technological
    age; however, the Internet has led to a step-change in the speed and scale at
    which misinformation can spread across communities and around the world.
    Furthermore, the scale of the Internet materially impacts the quality of content
    and the trustworthiness with which it is perceived. An example can be seen in
    information cascades, where high numbers of reposts serve as a normative
    cue, indicating that the content is legitimate and worthy of further sharing (3).

According to a survey by the Organisation for Economic Co-operation and
Development, half of all European Union (EU) residents sought health information
online in 2017, a figure that has almost doubled since 2008 (4). This points towards
a growing trend whereby the young and increasingly digitally literate seek early
diagnoses, lifestyle counselling or dietary advice through Internet searches and
self-help tools and also rely on Internet articles as reliable sources of health
information.
In 2019 WHO issued a statement on the Role of Social Media Platforms in Health
Information (5) stating that “misinformation about vaccines is as contagious
and dangerous as the diseases it helps to spread”. Likewise, in February 2020
WHO warned that the COVID-19 pandemic had been followed by an equally
dangerous so-called infodemic – an overabundance of information, some
accurate and some not – that continues to make it hard for people to find
trustworthy sources and reliable guidance when they need it. This infodemic
has had severe consequences for human health and is part of a bigger trend
of health disinformation. The harmful impact of such misinformation and
disinformation practices concerning noncommunicable diseases (NCDs) has
not been properly studied by national and international scientific authorities,
even though this is an area in which behaviour plays a determining role.

Health misinformation is a serious problem today that is all around us in our
daily lives. Myths about risk factors and treatments can spread faster than ever.
– Carina Ferreira-Borges, a.i. Head of the WHO European Office for Prevention
and Control of Noncommunicable Diseases

Alongside infodemics, which are grounded in crises, misinformation is also
disseminated in the forms of conspiracies and daily misinformation. Conspiracies
lack the urgent element of infodemics and largely constitute narratives about
powerful hidden forces, based on widely debunked beliefs. In contrast, daily
misinformation is unprompted, ordinary, not focused on any particular event
and often grounded in erroneous beliefs. Daily misinformation is the usual
home of NCD misinformation: it lacks the sense of urgency of a crisis and
the steady interest of a conspiracy. Alcohol provides a good example of NCD
misinformation: it is easy to find articles, blogs and authentic-looking “research”
summaries suggesting that alcohol is good for you. Nutrition misinformation is
also widely prevalent online, with social media influencers playing a significant
role in disseminating unscientific dietary advice.
Misinformation can cause fear, anxiety and anger, as well as undermine public
health policies. It discredits institutions and can lead to disease, disability and
death. In a sobering recent example, hundreds of Iranians died after being
exposed to misinformation claiming that alcohol cures COVID-19 (6). Given that
NCDs are responsible for approximately 90% of all deaths in the WHO European
Region (7), tackling NCD misinformation is a major public health priority.

Rationale for the WHO meeting
    WHO convened this second meeting with media stakeholders, in an effort to
    collect more information on misinformation concerning NCDs and to develop a
    catalogue of policy initiatives, building on the first NCD misinformation meeting
    that was held with civil society representatives. Media stakeholders have a vital
    role to play in sharing their perspectives and data on challenges and potential
    solutions; sharing data on how misinformation spreads; and in participating in
    efforts to counter disinformation.
    The specific aims of the media meeting were to:
       •   identify barriers, challenges and possible ways to fight health disinformation
           through media initiatives;
       •   discuss the role of social media platforms in combating health
           disinformation, including their role in preventing NCD myths and
           conspiracy theories from spreading;
       •   discuss best practices against health disinformation, including fact-
           checking tools and algorithmic decision-making processes;
       •   discuss how WHO and other public health communities could contribute
           to address this challenge;
       •   assess the role of traditional media in fighting health disinformation and
            showcase successful initiatives already in place;
       •   discuss the impact of digital literacy on the reduction of online NCD-
           related disinformation; and
       •   showcase success stories and innovative practices in the media sector.
    A third and final meeting was held with WHO European Region Member States.
    The expected outcome of these meetings is the development of a new toolkit
    of policy recommendations that can be used by Member States and other
    stakeholders to tackle the negative impacts of NCD misinformation.
    We need to tackle this issue at multiple different levels. Cooperation with media
    and social media companies is absolutely key and we are grateful for their
    participation here today.
    – From the speech by Nino Berdzuli, Director of the Division of Country Health

    Programmes, WHO Regional Office for Europe, read by Carina Ferreira-Borges

PART 1. CHALLENGES

Throughout the meeting, participants shared a number of challenges faced
in countering misinformation and its spread in traditional and social media.
All of these were covered in the first meeting report; however, several
recurring themes merit further discussion.
Business model
James Williams, a Research Fellow at the Oxford Uehiro Centre for Practical
Ethics, reminded participants that we ought to be careful not to implicitly adopt
a so-called message effects model of persuasion (i.e. a knowledge deficit
model). We need to ensure that actions also consider the wider attentional
environment; motivations for sharing content; industry incentives; business
models; and design goals and values.
There was a shared acknowledgement that countering misinformation in a
comprehensive and sustainable manner requires going beyond measures
that target dangerous content to engage with underlying business models
that profit from the spread of misinformation. In the words of Francisco
Goiana-da-Silva, “fake news has a high virality potential” and misinformation
is often presented more appealingly than scientific information. Media platforms
generate revenue by exposing users to large volumes of content, and as long
as misinformation is appealing, platforms have a financial incentive to direct
users towards it, or at least not to hide or demote it.
Websites that produce the original content are financed by advertising. We
need to tackle this.
– Paulo Pena, Journalist, Investigate Europe

Earlier this year, Apple’s boss Tim Cook challenged Facebook about prioritizing
harmful content “simply because of their high rates of engagement” (8). As well
as directing users towards engaging but often misleading content, the status
quo business model can also drive the creation of informational bubbles: closed
communities of discourse. Our civil society participants argued that search
engines can similarly reinforce echo chambers by presenting highly tailored
results; however, Clement Wolf, our Google representative, disputed this claim,
stating that personalization of search results is limited to local components.
However, the fact that web services and broader messaging can be targeted
with pinpoint accuracy does erode the accountability of large platforms, as
they are able to provide different messages to different people.

Cognitive dissonance
Miguel Poiares Maduro, Chairman of the European Digital Media Observatory,
discussed how the broader technological transformation has eroded confidence
in the people and institutions that have been the traditional editors of the public
space. Journalists and politicians have lost their monopoly, as anyone with an
Internet connection can find unlimited information of varying quality on any
given topic. Unfortunately, people commonly conflate having information
about a topic with knowing something about the field, as evidenced by the rise
of so-called armchair epidemiologists during the COVID-19 pandemic. When
    governments are opaque about the data that they are using to inform policy,
    this can lead to cognitive and epistemological dissonance. Rather than openly
    incorporating scientific data into policy-making processes, we have witnessed
    politicians appealing to science with the main aim of boosting perceived
    legitimacy.
    Social media has had a corrosive effect on trust in traditional media.
    – Marc Lavallee, Executive Director/Head of R&D, The New York Times

    A further structural problem is the way that public deliberation has changed.
    The mismatch between the space where political participation occurs (the
    state) and the space where policy decisions take place (increasingly at the
    supranational level) can lead to cognitive and epistemological dissonance.
    As a result, one major aspect of addressing NCD misinformation has to be re-
    establishing trust in the traditional editors of the public space and promoting their
    use of high-quality scientific evidence. This job has been made much harder by
    recent political shifts, probably best crystallized in the Trump presidency’s “fake
    news”, “alternative facts” and interpretation of data through a tribal/identity-
    based prism.
    Epistemology
    The general framing and discourse of the WHO meeting was heavily based on
    a truth paradigm. This epistemological paradigm asserts that there is one true,
    objectively discoverable reality that is best understood using scientific methods.
    The post-modern age (“your truth may be different to my truth, but equally
    valid”) and the trashing of scientific norms, institutions and experts amid the
    rise of populism have made it increasingly difficult to separate high-quality
    information from misinformation.
    Misinformation has sometimes been peddled by political leaders themselves.
    This leads people to believe there are no reference points to truth anymore and
    that all statements have the same credibility. This creates a vicious cycle of
    misinformation.
    – Miguel Poiares Maduro, Chairman, European Digital Media Observatory

    In the present age, who has the legitimate authority to arbitrate reality? And who
    has the administrative power for this task? Media platforms play increasingly
    interventionist roles to weed out dangerous misinformation, such as adding
    pre-bunking warnings to misleading Tweets; however, this forces platforms into
    a position where they have to determine what purported version of the truth
    they stand behind and will side with on a range of topics. Often this process is
    informed by consultation with independent experts, but arbitration is ultimately
    an internal affair with no external accountability, unless dealing with content
    that breaks the law.

A related challenge is that for many aspects of NCDs and NCD risk factors,
the truth is still a matter for debate; for example, much of our understanding
of nutrition science is based on low-level observational evidence rather than
clinical trials.
While a large majority of adults believe that major Internet companies have
an obligation to identify misinformation that appears on their platforms (9),
identifying and censoring misinformation can easily become politicized and
curtail freedom of expression. It is important to note that free speech is very
different to free publication of speech, but there are significant epistemological,
legal, political and moral complexities in developing processes to transparently
and objectively identify and demote misinformation. Kate Saunders, Senior
Policy Advisor at the United Kingdom’s BBC, explained that part of the appeal
of media literacy training is that it sidesteps these issues.

Complexity
For many NCD issues, much scientific research remains to be done, leading to
an information void where disinformation can flourish. At the other end of the
spectrum, some areas have received so much scientific enquiry that it can be
hard to synthesize all of the information to come to a conclusion; for example,
what is the best way to lose weight? Humans have a range of heuristic tools to
help us to make sense of the world, and tend to favour simple conclusions with
good face validity. Often the scientific truth of a matter is much more nuanced
and complex, and so is impossible to encapsulate into a catchy soundbite. This
creates two challenges: presenting complex scientific consensus effectively,
and countering simple-but-effective misinformation. As Nils Fietje said, a good
place to start is “being open and honest about acknowledging complexity”.
Health information is, most of the time, incredibly complex, and very often
there is no clear correct answer to a question. I would argue that a really
important challenge is not just to combat misinformation in the population, but
to figure out how to communicate about complexity and uncertainty.
– Nils Fietje, Chair of the Behavioural and Cultural Insights Initiative, WHO
Regional Office for Europe

A first objective would be to focus on helping people to feel comfortable with
uncertainty, rather than viewing uncertainty as a cue that information should
not be trusted. There is a role for public health institutions, scientists and
governments to play here in working with the media to provide them with clear
explanations of research findings, so that journalists and content creators stand
the best chance of communicating complex issues with clarity. One message
that needs to be continually repeated is that new research findings are part
of an evolving larger picture, rather than supplanting all that came before
and presenting a new definitive conclusion. The rapidly changing research
landscape during the COVID-19 pandemic has probably helped to advance
this cause.

Misinformation is compelling because it presents information as a firm
    conclusion, whereas accurate information is often full of caveats. That won’t
    change because that is the nature of science, but we can help make people
    comfortable with such uncertainty.
    – Vanessa Boudewyns, Senior Scientist in the Public Sphere Program, Center
    for Communication Science at RTI International

    PART 2. CASE STUDIES

    Four industry representatives shared their organizations’ experiences in
    countering health misinformation. They used COVID-19 as a touchstone issue
    and presented the strategies they have developed that could be applied to the
    field of NCDs. Google, the New York Times (NYT), the BBC and Twitter all provided
    insight, covering a range of both traditional and big-technology media giants.

    Google
    Clement Wolf, Google’s Global Public Policy Lead for Information Integrity,
    presented the company’s five approaches to combating health misinformation,
    with reference to the COVID-19 pandemic.
    Raise authoritative information from public health authorities
    Google has designed its ranking system to elevate authoritative information
    from expert sources for health topics. This has involved partnering with creators
    to elevate authoritative health voices.
    It has also used the Google Ad Grants programme to promote public service
    announcements from governments and health authorities during the pandemic.
    Reduce borderline information
    In order to reduce the prominence of low-quality information, the Google search
    and YouTube ranking systems have been designed to reduce the spread of
    borderline information. This has reduced the watch time of low-quality content
    by 70% on YouTube.
    Remove content harmful to users and society
    Long-standing company policies prohibit harmful and misleading medical
    or health-related content. Google now employs in-house clinicians who help
    to identify harmful content, and partners with external health organizations.
    Actions are graded according to the harmfulness of the content.

Incentivize creation of high-quality information experiences
Google policies do not allow adverts that potentially profit from or exploit a
sensitive event with significant social, cultural or political impact, such as a
public health emergency. The platform has also created policies which prohibit
monetization of COVID-19 misinformation, as well as pranks and challenges.
Supporting quality reporting and research on vaccine misinformation
Google Trends enables analysis of people’s search activity and the team has
created a dedicated COVID-19 Trends portal (10). Google financially supports a
number of fact-checking organizations and has provided training or resources
to around 10 000 reporters around the world. The company also funds research
on combating misinformation. This
naturally carries the inherent risks of any industry-led research but is nonetheless
welcomed in an under-researched field.

The BBC
A very high proportion of the United Kingdom population use the BBC, showing
that traditional platforms still have an important role to play in providing
accurate news information. The BBC employs a range of tactics for countering
misinformation. Kate Saunders provided an overview of five approaches.
Editorial policies
The BBC has its own standards, processes, policies, frameworks and checklists
that journalists use to check the reliability of data and stories, and whether
presenting the information serves the public interest. The organization is also
subject to independent regulation, with clear sanctions for breaches.
Debunking
The BBC has a dedicated Reality Check team that debunks fake news and
misinformation. The team use a checklist to ensure that this activity does not
inadvertently amplify misinformation.
Anti-disinformation unit
There is a dedicated anti-disinformation unit that was established to share
expertise and information across the corporation.
Trusted News Initiative
The BBC was a founding member of this industry collaboration of major
news and global technology organizations. Members alert each other when
disinformation is circulating that may cause serious harm.

Media literacy initiatives
    The BBC Young Reporter is a partnership with schools to encourage children to
    think critically about how news stories are produced. The organization has also
    created BBC iReporter (11), an online interactive game to help young people.
    Finally, Project Origin (12) is an initiative to verify the original source of content.
    This allows end users to see whether content has been altered or manipulated,
    which can help to identify deep fakes. In this work the BBC has partnered with
    Microsoft, the NYT and the Canadian Broadcasting Corporation to build a
    registry for platforms to perform two-factor authentication for media.

    The NYT
    Marc Lavallee is head of R&D at the NYT and a team member of the News
    Provenance Project (NPP). He presented the work of the NYT in accrediting
    information that travels around the Internet so that users can quickly and
    easily assess where it has come from. The NPP stemmed from the observation
    that legitimate media is commonly used in misleading contexts (e.g. recycled,
    unsourced or modified). The paper takes the view that “knowing the origin
    and authenticity of information is a human right and the cornerstone of re-
    establishing trust on the internet”. The NPP also believe that content publishers
    play a critical role in the information ecosystem.
    Partly due to the historical development of journalistic standards, all content
    from mainstream publishers contains the “what”, “where”, “when” and the name
    of the accountable journalist(s). Marc argued that while it might be impossible
    to accredit every single post on the Internet, if 90% of content came with
    accreditation it might make users question the remaining 10%.
    NYT qualitative research suggests that people judge the reliability of posted
    content by looking at the source; verification by a network of major news
    organizations and the presence of the original caption are the strongest markers
    of veracity in social media news posts. Interestingly, study participants scoffed
    at the notion that their friends or family should be judges of credible sources of
    information.
    Users can be segmented into four groups using two axes: those who are more
    or less trusting of mainstream media; and those that are more or less aware
    of veracity cues like source and date. Those who are less aware are likely to
    take content at face value, and those with low levels of trust are likely to be
    sceptical about all media institutions. Marc provided two figures: one showing
    this segmentation with trust and awareness levels (Fig. 1) and one showing
    strategies that might help users in the four segments (Fig. 2).

Fig. 1. Segmentation of users along trust and awareness axes

More context-aware, media-sceptic reader (less media trust, more aware of context; e.g. hyper-partisan news junkies): “If I don’t understand something, I can’t use it to prove things for myself… I need more information”.

More context-aware, media-trusting reader (more media trust, more aware of context; e.g. techy NYT subscribers): “I trust [those media orgs]. It would make me trust the post even more if I saw it was marked verified by them”.

Less context-aware, media-sceptic reader (less media trust, less aware of context; e.g. emotional, reactive “local Twitter”): “The more local it is the more credible; I would just take their word for it”.

Less context-aware, media-trusting reader (more media trust, less aware of context; e.g. people with broadcast media habits who are new to social media): “Sometimes I can’t see what other people see, I just look at photos like a little girl”.

Interestingly, there was an age divide, with older users having higher levels of
trust and a lower awareness of cues. This is likely because the older generation
grew up with generally trustworthy major broadcasters providing news, whereas
younger users face a cacophony of competing sources online. Different
approaches are required to help users in each of the four groups to appraise
content. The most important group to focus on is the “trusting gullible users”
in the bottom right quadrant of Fig. 1. This group needs content that feels
authentic, with clear cues to avoid false and misleading information (Fig. 2).

Fig. 2. Strategies for users in each of the four groups

• More context-aware, media-skeptic reader – needs hard-earned trust: seeks
  to call out bias in mainstream media; uses motivated reasoning to confirm
  ideological anti-mainstream-media beliefs, especially in politics;
  fundamentally needs more trust in mainstream media institutions.
• More context-aware, media-trusting reader – needs more context: alert to
  cues of journalistic rigor; digitally savvy in distinguishing true from fake
  information, where possible; trusts journalism to help them do the job;
  wants to be informed about issues with as much content as possible.
• Less context-aware, media-skeptic reader – needs more trust and clear
  credibility cues: feels marginalized and uncritically accepts hot takes as
  truths; needs clearer cues to identify false and misleading content; wants
  news that feels local and authentic.
• Less context-aware, media-trusting reader – needs clear credibility cues:
  trusts mainstream media but does not pause to judge trustworthiness online;
  needs clearer cues to identify false and misleading content.
Based on this research, the NPP team has developed a number of insights for
     credentialing visual content.

        •   Assess visuals for source information at the time they are uploaded. This
            ensures that there is clarity about the source from the very first time that
            the content is viewed.
        •   Ensure that prompts induce a more critical mindset. Instead of flagging
            items as false, prompt users to “check for yourself – what does this photo
            show?” This is about introducing speedbumps rather than stop signs, as
            an unintended consequence of the latter can be making people curious,
            leading to the further propagation of misinformation.
        •   Highlight information that users can interpret for themselves.
        •   Provide multiple visual perspectives (i.e. include multiple photos of an
            event to help users build a stronger sense of what happened).
        •   Source editorial history from multiple publishers for a wider perspective:
            give access to the workflow from sourcing to publication.
        •   Use provenance to emphasize what is known, without discrediting all
            photos that lack provenance information. This is to avoid disadvantaging
            local and small-scale news groups that do not have the resources to do
            the same.
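These insights can be made concrete with a small sketch. Everything below (the record fields, the function names) is invented for illustration and is not the NPP’s actual design; real efforts such as the Content Authenticity Initiative similarly bind source metadata to content, though via signed manifests embedded in the media file.

```python
import hashlib

def make_provenance_record(image_bytes, publisher, caption, captured_when, captured_where):
    """Build a hypothetical provenance record for a photo at upload time.

    The content hash lets any later viewer verify that the pixels they see
    are the ones the publisher originally credentialed; the editorial fields
    preserve the "what", "where" and "when" context.
    """
    return {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "publisher": publisher,
        "original_caption": caption,
        "captured_when": captured_when,
        "captured_where": captured_where,
    }

def matches(record, image_bytes):
    """Check that an image encountered later is the one the record describes."""
    return record["sha256"] == hashlib.sha256(image_bytes).hexdigest()

photo = b"\x89PNG...raw image bytes..."
record = make_provenance_record(photo, "Example Daily News",
                                "Flood defences on the river, 3 May",
                                "2021-05-03", "Exampleville")
assert matches(record, photo)             # untouched image: provenance holds
assert not matches(record, photo + b"x")  # modified image: context is missing
```

Note the asymmetry the last bullet asks for: a failed match only says that provenance is unknown, not that the photo is false, so small publishers without such records are not automatically discredited.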

     Twitter
     Nick Pickles, Senior Director of Public Policy Strategy and Development, presented
     Twitter’s approach to tackling misinformation with a focus on COVID-19 vaccines.
     Twitter’s mission in this arena is to protect the public conversation, reduce the
     spread of harmful misinformation, elevate sources of helpful information and
     support the needs of customers and communities.
     Twitter’s policy approach can be broken down into goals and tools. Goals include
     addressing directly harmful information through targeted removal; limiting
     advertising to pre-approved advertisers; providing context and authoritative
     voices; and protecting robust debate and conflict about issues of public
     importance where the facts may not yet be clear. The platform has a range of
     tools at its disposal:
        •   content removal
        •   annotations, labelling and human-curated content
   •   filtering and de-amplifying
        •   disabled engagements (likes and retweets)
   •   nudges (asking users “would you like to read this before sharing?”)
        •   pre-bunks (e.g. “this content contains disputed information”).

Nick underlined that most of the content and exchange that occurs on the
platform is healthy conversation. As such, a graded approach is required
when moderating content. While the small segment of harmful false claims
needs to be removed, the larger segment of decontextualized information or
misleading or debated claims should be annotated and restricted rather than
de-platformed (Fig. 3).
Fig. 3. The graded approach to moderating Twitter content

• Healthy conversation: discussion and debate, personal accounts and
  anecdotes, emerging science
• Annotate & restrict: decontextualized information, misleading or debated
  claims
• Remove: harmful, false claims, networked bad actors

For COVID-19, three criteria are being used:
 1. Is the content advancing a claim of fact regarding COVID-19?
 2. Is the claim demonstrably false or misleading?
 3. Would belief in this information, as presented, lead to harm?
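As a purely illustrative sketch (the function and its labels are invented here, not Twitter’s implementation), the three criteria can be read as a short-circuiting triage that maps content to one of the graded responses in Fig. 3:

```python
def triage(advances_claim: bool, false_or_misleading: bool, belief_causes_harm: bool) -> str:
    """Apply the three COVID-19 criteria in order; all three must hold
    before the strongest intervention (removal) is considered."""
    if not advances_claim:
        return "healthy conversation"   # opinion, anecdote, emerging science
    if not false_or_misleading:
        return "healthy conversation"   # a claim of fact, but not a false one
    if not belief_causes_harm:
        return "annotate & restrict"    # misleading, but not directly harmful
    return "remove"                     # harmful, demonstrably false claim

assert triage(True, True, True) == "remove"
assert triage(True, True, False) == "annotate & restrict"
assert triage(True, False, False) == "healthy conversation"
```

The ordering matters: the harm question is only reached for claims that are already demonstrably false or misleading, which keeps robust debate about unsettled facts out of the removal pathway.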

Twitter has also proactively elevated authoritative voices by working with WHO
and other global organizations. It aims to use partner organizations that are
geographically local to users. The platform has also provided advertising grants
for nongovernmental organizations and free access to all COVID-19 tweets for
researchers. Among other things, Twitter data have been used to assess the
effectiveness of different public health strategies.
Nick also presented some advice for public health messaging on the platform.
Users need emotionally engaging content
Scientific information is often dry and quantitative; however, in the words of
Matthew d’Ancona, “truth needs an emotional delivery framework”. Content
needs to be presented in a way that users find compelling and will actually
digest: they need to be emotionally engaged in the content.
Understand your audience
Ask who would be a credible voice for the relevant audience and why. This may
not be a staid public health official (Fig. 4). Segment your audience and then
tailor your communications to meet the different groups where they are.

Timing matters
     A simple way to reach a different audience is to tweet while your issue is being
     discussed in the public sphere. As Rory Sutherland said, “brands do not have
     target markets, they have target moments”.
     Be authentic and engaging
     High production standards and polished graphics do not always equate to
     more engagement. Users are looking for authenticity.
     Call to action
     It’s important not to be vague. Be clear and direct about what you want people
     to do.
     Educate people
     The majority of users come with an open mind to learn new things. Q&As are a
helpful way of capitalizing on this. Tweets that contain practical information are
     more likely to be passed on.
Include videos or photographs
     Content that comes with a photograph or video is much more likely to be shared.
Move away from a one-platform approach
Different platforms require different content to be effective. Content has to be
     tailored based on what users are searching for.

PART 3. SOLUTIONS

Specific interventions
During the course of the meeting, a number of specific interventions and
approaches were discussed that could potentially be used to address NCD
misinformation.
Algorithms
A recurring theme was the need to amend algorithms that funnel users towards
harmful misinformation. Participants suggested demoting or blocking content
that misinforms; however, this is difficult to police. An alternative approach
is modifying the algorithms themselves so that they direct users towards
scientific-based content, or content from trustworthy sources. Participants
noted that private actors should not be the arbiters of truth, but elected public
authorities do not have the capacity to decide whether individual posts should
be removed. Regulating algorithms may be a much more efficient way of setting
and upholding standards. Note that this still requires transparent and objective
criteria that distinguish between “truth”, or trustworthiness, and misinformation.
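One way to picture regulating the algorithm rather than individual posts is a ranking function that takes a transparent, externally auditable source-trust score as one of its inputs. The sketch below is hypothetical: the domains, scores and weighting are invented for illustration and do not describe any platform’s actual ranking.

```python
# Hypothetical feed ranking that blends engagement with a source-trust
# score supplied by an external, auditable registry (all values invented).
SOURCE_TRUST = {"who.int": 1.0, "example-news.org": 0.8, "anon-blog.example": 0.2}

def rank(posts, trust_weight=0.7):
    """Order posts by a blend of engagement and source trust.

    trust_weight is the regulator-visible dial: at 0 the feed is pure
    engagement (revenue-focused); at 1 it is pure source trust.
    """
    def score(post):
        trust = SOURCE_TRUST.get(post["source"], 0.5)  # unknown sources: neutral
        return trust_weight * trust + (1 - trust_weight) * post["engagement"]
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "source": "anon-blog.example", "engagement": 0.9},
    {"id": 2, "source": "who.int", "engagement": 0.3},
]
assert [p["id"] for p in rank(posts)] == [2, 1]                    # trust wins
assert [p["id"] for p in rank(posts, trust_weight=0.0)] == [1, 2]  # engagement wins
```

The point of the sketch is that a regulator need not adjudicate individual posts: it is enough to require that the trust registry and the weighting be transparent and contestable.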
Twitter already lets users “turn the algorithm off”,1 and it is likely that in the near
future users will be able to select their own algorithms to use when browsing
content on different platforms, imported from third-party providers. While this
ameliorates the problem of scientifically agnostic, revenue-focused proprietary
algorithms (designed to maximize advertisement exposure), it is possible that
people may choose to plug in new algorithms that further restrict their exposure
to opposing or scientific views and subsequently reinforce their prejudices
or erroneous beliefs. Despite commonly espoused concerns about echo
chambers, Nick Pickles noted that social media users are actually exposed to
more contrasting views than people who consume traditional forms of media.
Miguel Poiares Maduro suggested that it should be possible for the user to
make a choice from alternative algorithms to govern the organization of the
content they receive in social media. Nick Pickles shared Twitter’s openness to
work towards such a possibility.
Develop agnostic solutions
Kremlin Wickramasinghe, WHO European Office for the Prevention and Control
of Noncommunicable Diseases (NCD Office), asked the media representatives
whether there was value in developing approaches specifically targeted at NCD
misinformation. Google’s Clement Wolf argued that misinformation should be
tackled as nonspecifically as possible, treating it as a part of the much broader
class containing all health misinformation. The rationale is that it is hard to predict
what NCD misinformation will surface in the future; 15% of all daily searches have
never been seen before, so future-proofing requires broad-based and agnostic
solutions that apply to all health-related searches/content.

1. For more information on how the Twitter algorithm works, see this (unofficial) overview:
Hughes J. How the Twitter Algorithm Works [2023 GUIDE] [webpage]. Hootsuite
(https://blog.hootsuite.com/twitter-algorithm/). Accessed on 25 May 2022.

Media provenance
     Marc Lavallee argued that taking media provenance to scale requires two
     different approaches. From the bottom up, publishers already understand
     that search engine optimization (SEO) is crucial to their business and want to
     use schemes which encourage web traffic to their sites. Small publishers and
     content creators need cheap, user-friendly plug-ins that help them to do the
     heavy lifting of certifying provenance. From the top down, scaling provenance
     accreditation is going to be based on regulators and platforms insisting on its
     use. Legislation is most likely in states where governments adopt the NYT view
     that “knowing the origin and authenticity of information is a human right”.
     Synthesizing complex scientific evidence
     Relating to the observation that much NCD research is complex, several
     journalists expressed their desire for public health agencies to produce clear
     summaries of complex topics to help content producers to represent current
     scientific knowledge accurately. The United Kingdom’s Science Media Centre
     (13) provides a good example of an organization that executes this function.
     WHO might be able to play a role in this space.
     The WHO can help journalists by providing rapid, efficient and accessible
     information about what is really going on.
     – Luisa Meireles, Director, Agência Lusa (Portuguese National News Agency)

     Registering users
     There was some discussion concerning the potential for requiring users to
     register on platforms using identity documents in order to reduce fake accounts
     and increase transparency and accountability. However, it was noted that
     anonymity is crucial for freedom of speech and whistleblowing. Paulo Pena, a
     journalist from Investigate Europe, argued that social media companies should
     take action to vet and filter users in order to demote content from misleading
     accounts.
     Data sharing and intellectual property
     At times during both the civil society and the media meetings, public health
     representatives expressed a desire for media platforms to share data on the
     size of the problem, the workings of proprietary algorithms and the (confidential)
     business plans that collectively underpin the spread of misinformation. A less
     contentious request was for industry to engage with the development of
     formalized communication avenues between industry, governments and civil
     society.
     To tackle the problem effectively we need to access industry data.
     – Francisco de Abreu Duarte, WHO Consultant,
     European University Institute Department of Law

Increase the resilience of the ecosystem
Although the fundamental commercial architecture of media platforms has to
be the focus of our efforts, participants agreed that there is value to be gained
by improving the media literacy of end users. This can take the form of critical
appraisal training and improving the transparency of sources.
As well as the NYT media provenance work, other existing projects in this sphere
include the Content Authenticity Initiative, a collaboration between Adobe,
Twitter and the NYT that helps creative tools like Photoshop retain metadata
and establish forward provenance, so that publishers and platforms can build a
chain of custody and trust; and the Partnership on AI, with over 100 members,
which focuses on detection methods for deep fakes.
Convening stakeholders
Specific interventions to tackle NCD misinformation can be divided into three
broad levels of action: government, industry and civil society. At each level
there are multiple discrete actions. WHO has developed a simple framework
to illustrate the different levels of action for addressing NCD misinformation
and where these currently sit within the three broad actor groups (Fig. 5). For
example, Member States have taken two different approaches to removing
disinformation: France and Germany have tried to tackle illegal content, taking
down content that breaks the law. The alternative approach is to use fake
news laws to tackle harmful content. This is much harder to define. The Russian
Federation has legislated to take down “socially significant false information”.
Media organizations have used fact-checking, monitoring and promotion of
news literacy and media literacy training kits. Civil society engages with another
set of interventions.
We are all working very hard to tackle the issue, but we are working independently.
– Francisco de Abreu Duarte, WHO Consultant,
European University Institute Department of Law

While the first meeting covered a number of different approaches that can be
used to tackle misinformation, a major issue is the lack of cooperation between
governments, media companies and civil society: interventions currently exist
in insulated silos, and the value that would come from sharing information,
resources and approaches is yet to be harvested.

Fig. 5. Three levels of action for tackling NCD misinformation

Governmental level
 • United Nations: interagency cooperation
 • European Union: rapid alert systems; European Digital Media Observatory
 • Member States: content laws; anti-fake news laws; media/news literacy

Media/social level
 • Media: fact-checking tools; news literacy
 • Social media: content labelling; certification; redirecting users towards
   reliable information

Civil society level
 • Think tanks: forums of discussion; academic writing
 • NGOs: counter-misinformation campaigns; news literacy
 • Civil society: independent fact-checkers; online fact-checking tools

     WHO aims to develop a toolkit that can be used to break down silos and foster
     collaborative and synergistic action to address NCD disinformation. The toolkit
     will use a continuous improvement approach to collecting data, working with
     stakeholders from across the spectrum to identify learning and collaboratively
     promote effective policies to address misinformation. Fig. 6 presents an overview
     of the process.
     Disconnected activities can only take us so far.
     – João Marecos, Consultant, WHO Regional Office for Europe

     Fig. 6. Overview of the draft WHO toolkit

     The toolkit follows a continuous cycle: collect data → learn with case
     studies → engage stakeholders → detect synergies → promote cooperation
     policies → collect data.

NEXT STEPS

Recognizing the significance of the current problem, participants welcomed
WHO’s initiative in this area. They urged the WHO team to adopt a multifaceted
approach that considers the whole spectrum of factors which influence the
scale and spread of misinformation, targeting multiple different actors.
This is not an easy task, but a framework that does not attempt to tackle
misinformation holistically is doomed to fail. At the third and the final meeting,
Member States will be invited to share their perspectives and requirements for
an effective and implementable framework.
This is one of the most important issues in the world today in the area of health
and WHO is well positioned to convene the different actors.
– James Williams, Research Fellow, Oxford Uehiro Centre for Practical Ethics

This is a broad and complex issue. We need a toolkit to target multiple levels
and actors.
– Alice Echtermann, Journalist and team lead, CORRECTIV.Faktencheck

There remain a number of unresolved issues. The first is that, in the words of
Nick Pickles, “the vast majority of content does not fall into definitely true or
definitely false. People have opinions and science changes”. It is impossible to
tackle misinformation without developing robust and transparent processes for
reliably identifying it in the first place. Furthermore, as companies like Google
grade levels of harm, robust multi-tiered systems need to be developed. Deeper
philosophical answers also need attention: what should take precedence – the
precautionary principle or freedom of expression?
There seems to be consensus that arbitrating reality should not fall to private
companies, but public authorities are unlikely to find the task much easier. For
now, platforms are taking the view that the best way to intervene is by providing
context, rather than establishing the veracity of every piece of information.
These approaches help to address media literacy and the perceptions of users.
We need to think about the underlying beliefs that drive disinformation, rather
than focusing on individual misinformation posts. We need the framework to go
to the foundation, including how people understand and trust science.
– Vanessa Boudewyns, Senior Scientist in the Public Sphere Program, Center
for Communication Science at RTI International

A final issue that the meeting raised pertained to the appropriate scope of actions
to counter NCD misinformation. Marc Lavallee and Miguel Poiares Maduro both
argued that tackling misinformation is predicated on broader actions to re-
establish trust in the Internet itself, traditional editors of the public discourse
and traditional sources of authority. Perceptions of the trustworthiness and
legitimacy of traditional actors have (justifiably) waned in recent years. To what
extent should the WHO framework engender actions that strengthen these
actors? While a number of interventions exist to bolster content producers,
     such as media regulation, editorial standards, accountability mechanisms and
     objective arbitration criteria, it is less clear that WHO has a role to play in boosting
     trust in politicians, especially those who lack a democratic mandate. Ultimately,
     it will be up to Member States to decide where the limits of the framework lie.
     At the third and final meeting, WHO European Member States will provide their
     perspectives. The WHO team will refine the NCD disinformation toolkit, with an
     emphasis on developing pathways for intersectoral collaboration.

REFERENCES 1

 1.   Wang Y, McKee M, Torbica A, Stuckler D. Systematic literature review on the spread
      of health-related misinformation on social media. Soc Sci Med. 2019;240:112552.
      doi:10.1016/j.socscimed.2019.112552.
 2.   Disinformation. In: Merriam-Webster dictionary [online database]. Springfield (MA): Merriam-
      Webster; 2022 (https://www.merriam-webster.com/dictionary/disinformation).
 3.   Information cascades and the spread of “fake news”? Ithaca (NY): Cornell University; 2018
      (https://blogs.cornell.edu/info2040/2018/11/27/information-cascades-and-the-spread-of-
      fake-news/).
 4.   Health at a glance: Europe 2018: state of health in the EU cycle. Paris: Organisation for
      Economic Co-operation and Development; 2020 (https://www.oecd-ilibrary.org/social-
      issues-migration-health/health-at-a-glance-europe-2018_health_glance_eur-2018-en).
 5.   WHO Director-General statement on the role of social media platforms in health information.
      Geneva: World Health Organization; 2019 (https://www.who.int/news/item/28-08-2019-
      who-director-general-statement-on-the-role-of-social-media-platforms-in-health-
      information).
 6.   Aghababaeian H, Hamdanieh L, Ostadtaghizadeh A. Alcohol intake in an attempt
      to fight COVID-19: a medical myth in Iran. Alcohol. 2020;88:29–32. doi:10.1016/j.alcohol.2020.07.006.
 7.   Bennett JE, Stevens GA, Mathers CD, Bonita R, Rehm J, Kruk ME et al. NCD countdown 2030:
      worldwide trends in noncommunicable disease mortality and progress towards Sustainable
      Development Goal target 3.4. Lancet. 2018;392(10152):1072–88.
      doi: 10.1016/S0140-6736(18)31992-5.
 8.   Cook J. Tim Cook hits out at social media apps that “prioritize conspiracies” as war
      with Facebook heats up. The Telegraph. 28 January 2021 (https://www.telegraph.co.uk/
      technology/2021/01/28/mark-zuckerberg-accuses-apple-killing-competition/).
 9.   Americans’ views of misinformation in the news and how to counteract it. Miami (FL): John
      S and James L Knight Foundation; 2018 (https://knightfoundation.org/reports/americans-
      views-of-misinformation-in-the-news-and-how-to-counteract-it/).
 10. Coronavirus search trends. In: Google Trends [website]. Mountain View (CA): Google; 2022 (https://
      trends.google.com/trends/story/US_cu_4Rjdh3ABAABMHM_en?fbclid=IwAR159CKSid1b3M-
      eGfwz-_9uN_PkhVKvpDAFTlSZsf4Gpd8krLRG8tiJ0Io).
 11. BBC iReporter [website]. London: British Broadcasting Corporation; 2018
      (https://www.bbc.co.uk/news/resources/idt-8760dd58-84f9-4c98-ade2-590562670096).
 12. Project Origin [website]. London: Incorporated Society of British Advertisers; 2022
      (https://www.originproject.info/).
 13.  Science Media Centre [website]. London: Science Media Centre; 2022
      (https://www.sciencemediacentre.org/).

1.    All references were accessed on 25 May 2022.

ANNEXES

     Annex 1. List of participants

     Alice Echtermann
     Journalist
     Team lead, CORRECTIV.Faktencheck

     Ashan Pathirana
     Registrar in Community Medicine
     Health Promotion Bureau

     Catarina Carvalho
     Director
     A Mensagem

     Clement Wolf
     Global Public Policy Lead, Information Integrity
     Google

     James Williams
     Research Fellow
     Oxford Uehiro Centre for Practical Ethics, University of Oxford

     Jennifer McDonald
     Twitter

     Kate Saunders
     Senior Policy Advisor at BBC

     L. Suzanne Suggs
     Professor of Social Marketing
     University of Lugano (USI)

     Louise Coghlin
     Freelance writer and editor
     Health and Life Sciences

     Luisa Meireles
     Director
     Lusa
     Portuguese National News Agency

Marc Lavallee
Head of R&D
The New York Times
Team member of the News Provenance Project

Miguel Poiares Maduro
President
European Digital Media Observatory

Nick Pickles
Senior Director
Public Policy Strategy and Development
Twitter

Paulo Pena
Journalist
Investigate Europe

Ruth Delbaere
Legal and Advocacy Officer
Avaaz

Silvia Caneva
Twitter

Vanessa Boudewyns
Senior Scientist
Science in the Public Sphere Program
Center for Communication Science
RTI International

Vera Novais
Journalist
Observador

World Health Organization

     Carina Ferreira Borges
     Acting Head
     WHO European Office for the Prevention and Control of Noncommunicable Diseases

     Dodkhudo Tuychiev
     Programme assistant
     WHO European Office for the Prevention and Control of Noncommunicable
     Diseases
     WHO Regional Office for Europe

     Eric Carlin
     Consultant
     WHO European Office for the Prevention and Control of Noncommunicable
     Diseases
     WHO Regional Office for Europe

     Francisco de Abreu Duarte
     Consultant
     WHO European Office for the Prevention and Control of Noncommunicable
     Diseases
     WHO Regional Office for Europe

     Francisco Goiana-da-Silva
     Consultant
     WHO European Office for the Prevention and Control of Noncommunicable
     Diseases
     WHO Regional Office for Europe

     João Marecos
     Consultant
     WHO European Office for the Prevention and Control of Noncommunicable
     Diseases
     WHO Regional Office for Europe

     Kremlin Wickramasinghe
     a.i. Programme Manager
     Nutrition, Physical Activity and Obesity
     WHO European Office for the Prevention and Control of Noncommunicable Diseases

Luke Allen
Meeting rapporteur

Maria Neufeld
Consultant
WHO European Office for the Prevention and Control of Noncommunicable
Diseases
WHO Regional Office for Europe

Mohamed Hamad
Consultant
WHO European Office for the Prevention and Control of Noncommunicable
Diseases
WHO Regional Office for Europe

Nils Fietje
Research Officer
Behavioural and Cultural Insights Initiative

Nino Berdzuli
Director
Division of Country Health Programmes
WHO Regional Office for Europe

Olga Oleinik
Consultant
WHO European Office for the Prevention and Control of Noncommunicable
Diseases
WHO Regional Office for Europe

ANNEX 2. Agenda

     15:00–15:15 Welcome remarks and setting the stage
            Nino Berdzuli
             Director of the Division of Country Health Programmes, WHO Regional
            Office for Europe

            Carina Ferreira-Borges
             a.i. Head of the WHO European Office for the Prevention and Control of
            Noncommunicable Diseases

            Recap of the 9 December 2020 meeting
             Session I. The triple entente: bringing stakeholders together to address
             NCD-related health misinformation

     15:15–15:35 Draft strategy presentation
            Chair: Representative of the Behavioural and Cultural Insights Initiative,
             WHO Regional Office for Europe

            João Marecos
             Consultant, WHO European Office for the Prevention and Control of
             Noncommunicable Diseases, WHO Regional Office for Europe

            Francisco de Abreu Duarte
             Consultant, WHO European Office for the Prevention and Control of
             Noncommunicable Diseases, WHO Regional Office for Europe

     15:35–16:15 Plenary discussion
            Moderator: Francisco Goiana-da-Silva
             Consultant, WHO European Office for the Prevention and Control of
             Noncommunicable Diseases, WHO Regional Office for Europe
            Session II. The present situation

         •   Identify barriers, challenges and possible ways to fight NCD health
             disinformation through media/social media initiatives
         •   Discuss the spread of “fake news” related to NCDs on the different
             media/social media venues
