Subscriptions and external links help drive resentful users to alternative and extremist YouTube videos†
Annie Y. Chen1, Brendan Nyhan2, Jason Reifler3, Ronald E. Robertson4,5, and Christo Wilson5

1 CUNY Institute for State & Local Governance
2 Dartmouth College
3 University of Exeter
4 Stanford University
5 Northeastern University

                                                       Abstract
       Do online platforms facilitate the consumption of potentially harmful content? Despite widespread
       concerns that YouTube’s algorithms send people down “rabbit holes” with recommendations
       to extremist videos, little systematic evidence exists to support this conjecture. Using paired
       behavioral and survey data provided by participants recruited from a representative sample
       (n=1,181), we show that exposure to alternative and extremist channel videos on YouTube is
       heavily concentrated among a small group of people with high prior levels of gender and racial
       resentment. These viewers typically subscribe to these channels (causing YouTube to recom-
       mend their videos more often) and often follow external links to them. Contrary to the “rabbit
       holes” narrative, non-subscribers are rarely recommended videos from alternative and extremist
       channels and seldom follow such recommendations when offered.

   † We are grateful to the Russell Sage Foundation, Anti-Defamation League, Carnegie Corporation of New York, and the National Science Foundation for financial support, to Samantha Luks at YouGov for survey assistance, to Kasey Rhee for research assistance, to Andy Guess for helping design this project in its initial stages, and to Tanushree Mitra, Joseph B. Phillips, David Rothschild, Gianluca Stringhini, and Savvas Zannettou for comments and feedback. We also thank Virgílio A.F. Almeida, Stephen Ansolabehere, Manoel Horta Ribeiro, Aaron Sankin, Brian Schaffner, Robert West, and Anna Zaitsev for sharing their data with us or making it publicly available. This research utilized equipment funded by NSF grant IIS-1910064. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funders. All conclusions and errors are our own.
What role do technology platforms play in exposing people to dubious and hateful information and
enabling its spread? Concerns have grown in recent years that online communication is exacerbat-
ing the human tendency to engage in preferential exposure to congenial information (1–3). Such
concerns are particularly acute on social media, where people may be especially likely to view con-
tent about topics such as politics and health that is false, extremist, or otherwise potentially harmful.
The use of algorithmic recommendations and platform affordances such as following and subscrib-
ing features may enable this process, helping people to find potentially harmful content and helping
content creators build and monetize an audience for it.
   These concerns are particularly acute for YouTube, the most widely used social media platform
in the U.S. (4). Critics highlight the popularity of extreme and harmful content such as videos by
white nationalists on YouTube, which they often attribute to the recommendation system that the
company itself says is responsible for 70 percent of user watch time (5). Many fear that these al-
gorithmic recommendations are an engine for radicalization. For instance, the sociologist Zeynep
Tufekci wrote that the YouTube recommendation system “may be one of the most powerful radicalizing instruments of the 21st century” (6). These claims seem to be supported by reporting that
features descriptions of recommendations to potentially harmful videos and accounts of people
whose lives were upended by content they encountered online (7–9).
   In response to these critiques, YouTube announced changes in 2019 to “reduce the spread of
content that comes close to—but doesn’t quite cross the line of—violating our Community Guide-
lines” (10). It subsequently claimed that these interventions resulted in a 50% drop in watch time
from recommendations for “borderline content and harmful misinformation” and a 70% decline in
watch time from non-subscribed recommendations (11, 12).
   Questions remain, however, about the size and composition of the audience for potentially harm-
ful videos on YouTube and the manner in which people reach those videos. To date, research inves-
tigating YouTube has lagged behind studies of its social media counterparts. Studies show that sites
like Twitter and Facebook can amplify tendencies toward extreme opinions or spread false informa-
tion (13, 14), though the extent of these effects and the prevalence of exposure is often overstated
(15–17). YouTube may operate differently, though, given its focus on video and the central role of
its recommendation system (18).
   YouTube’s 2019 changes do appear to have affected the propagation of some of the worst con-
tent on the platform, reducing both recommendations to conspiratorial content on the platform and
sharing of YouTube conspiracy videos on Twitter and Reddit (19, 20). In particular, subsequent
research has found relatively little support for “rabbit holes.” Though watching videos promoting
misinformation can lead to more recommendations of similar videos on some topics (21), random
walk simulations find people would very rarely reach extreme content if they followed YouTube
recommendations (22). Another study using 2018–2019 data similarly finds that YouTube recom-
mendations tend to direct people away from rather than toward the most extreme videos (23).
   However, the studies described above rely on bots and scraping; they cannot observe the rec-
ommendations seen by real humans under naturalistic circumstances. Conversely, browsing data
has documented the existence of a sizeable audience of dedicated far-right news consumers on
YouTube who often reach extremist videos via external links (24), but these data lack information
on the recommendations shown to users.
   Unlike both of these approaches, we study browser activity data from a sample weighted to
be representative of the US adult population that includes the actual recommendations shown to
participants. Our sample consists of 1,181 participants recruited from a sample of 4,000 YouGov
panelists, including oversamples of two groups whom we identified as especially likely to be exposed
to potentially harmful video content: (1) people who previously expressed high levels of gender
and/or racial resentment and (2) those who indicated they used YouTube frequently. Participants
voluntarily agreed to install a custom browser extension in Chrome or Firefox that monitored their
web browsing behavior and to use that browser while the extension was active for at least two days.
The study was conducted from July 21–December 31, 2020; respondents were enrolled in data
collection for a median of 133 days. (See Methods below for further details on measurement. We
provide descriptive statistics on study participants and their browser activity data availability and
aggregate consumption patterns in Online Appendix A.)

This research design advances our understanding of exposure to alternative and extremist videos
(categories we define below in Methods) on YouTube in several important respects. First, we collect
data from real people instead of relying on automated bots, allowing us to measure how these
videos are encountered and watched by humans. Second, our browser extension captures the exact
videos that YouTube recommends, providing the most precise estimate to date of the extent to
which real-world algorithmic recommendations on the platform push people toward potentially
harmful content. In particular, we construct a specific definition of what constitutes a “rabbit hole”
and measure its prevalence. Finally, we leverage survey data from our participants to examine the
association between demographic and attitudinal variables, especially gender and racial resentment,
and YouTube video watching behavior.
   We report the following key findings. Though almost all participants use YouTube, videos from
alternative and extremist channels are overwhelmingly watched by a small minority of people with
high levels of gender and racial resentment. Even within this group, total viewership is concentrated
among a few superconsumers who watch YouTube at high volumes. Viewers often reach these
videos via external links and/or are subscribers to the channels in question. By contrast, we rarely
observe recommendations to alternative or extremist channel videos being shown to, or followed
by, non-subscribers.
   We thus find little support in post-2019 data for prevailing narratives that YouTube’s algorithmic
recommendations send unsuspecting members of the public down “rabbit holes” of extremism.
The greater threat, our results suggest, is the way that social media platforms and the affordances
they offer enable audiences of resentful people to easily and repeatedly access potentially harmful
content.

Methods

Study participants

Study participants completed a public opinion survey and installed a browser extension that recorded
their browser activity (n=1,181). Specifically, we contracted with the survey company YouGov to
conduct a public opinion survey with 4,000 respondents from three distinct populations: a nation-
ally representative sample of 2,000 respondents who previously took part in the 2018 Cooperative
Congressional Election Survey (CCES) when it was fielded by YouGov; an oversample of 1,000
respondents who expressed high levels of racial resentment (25), hostile sexism (26), and denial of
institutional racism (27) in their responses to the 2018 CCES; and an oversample of 1,000 respon-
dents who did not take part in the 2018 CCES but indicated that they use YouTube “several times per
day” or “almost constantly” in their survey response. (The prior measures of racial resentment and
hostile sexism, which were collected as part of the 2018 CCES for 3,000 of our 4,000 respondents,
are also used as independent variables in our analysis; see below for details on question wording.)
   While completing the survey, participants who used an eligible browser (Chrome or Firefox)
were offered the opportunity to download a browser extension that would record their browser ac-
tivity in exchange for additional compensation. A total of 1,181 respondents did so (778 from the
nationally representative sample, 97 from the high resentment oversample, and 306 from the high
YouTube user oversample).
   All analyses we report use survey weights to approximate a nationally representative sample,
including the oversamples. These weights were created by YouGov to account for the fact that, in
addition to a national sample, we have also specifically recruited participants who fall into one of
two groups: (1) those who previously expressed gender and/or racial resentment, or (2) those who
are frequent YouTube users. When we apply these weights to all three samples, the total sample is
weighted to be nationally representative. (Additional details about respondent demographics and
other characteristics are provided in Online Appendix A.)

Ethics and privacy

Our study methods were approved by the Institutional Review Boards (IRBs) at the authors’ respec-
tive institutions (REDACTED FOR PEER REVIEW).
   All participants were asked to consent to data collection before completing our survey and again
when they installed our browser extension. Participants were fully informed about the data collected
by our extension when they were invited to install it and this information was provided a second
time during installation of the extension. The extension did not collect any data until consent was
provided and participants were free to opt out at any time by uninstalling our extension. The ex-
tension automatically uninstalled itself from participants’ browsers at the end of the study period.
(See Online Appendix B for the full text of our informed consent notices.)
   To protect participants’ security and privacy, we adopted a number of best practices. Our par-
ticipants are indexed by pseudonymous identifiers. Our browser extension used TLS to encrypt
collected data while it was in transit. All participant data is stored on servers that are physically
secured by key cards. We use standard remote access tools like SSH to access participant data
securely.
   Data and code that allow for the replication of all results in this article will be posted on a publicly available repository such as Dataverse upon publication of this manuscript. All analysis code will also be posted. However, raw behavioral data cannot be posted publicly to protect the privacy of respondents.

Data collection and measurement

The browser extension passively logged user pageviews, including the full URL and a timestamp,
and collected HTML snapshots when users viewed YouTube videos, allowing us to examine the
video recommendations that participants received. This combination of passive monitoring and
HTML snapshots provides us with the ability to measure not just what respondents clicked on but
what YouTube showed them prior to that action.
   To account for duplicate data, we dropped additional pageviews of the same URL within one
second of the prior pageview on the assumption that the user refreshed the page (28).
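This deduplication rule can be sketched as follows. The field names and the choice to compare each pageview against the previously retained one are illustrative assumptions, not the authors' actual code.

```python
from datetime import datetime, timedelta

def deduplicate(pageviews):
    """Drop repeat pageviews of the same URL logged within one second
    of the previously retained pageview, treating them as refreshes."""
    kept = []
    for view in sorted(pageviews, key=lambda v: v["timestamp"]):
        if (kept and view["url"] == kept[-1]["url"]
                and view["timestamp"] - kept[-1]["timestamp"] <= timedelta(seconds=1)):
            continue  # likely a page refresh; drop it
        kept.append(view)
    return kept
```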
   Our analysis focuses on browser activity data rather than browser history. While browser history
provides a clear record of each time a URL is opened in a browser, it does not account for changes
in the active browser tab. For example, if someone opens web page A in a tab, then opens web page
B in another tab, and then switches their browser tab back to A, browser history will not register
this shift in attention, making it difficult to obtain accurate estimates of time spent on a given web
page. Our passive monitoring records all changes in the active tab, allowing us to overcome this
issue. (In Online Appendix A, we validate our browser activity data against browser history data
from the extension.)
   In this article, we refer to YouTube “views,” “consumption,” and “exposure.” These terms refer
to videos that appear in the browser activity data described above. As with any passive behavioral
data, we cannot verify that every user saw the content that appeared on their device in every instance.
   We measured the amount of time a user spent on a given web page by calculating the difference
between the timestamp of the page in question and the next one they viewed. This measure is
imperfect because we do not have a measure of active viewing. Though some participants might
rewind and rewatch videos more than once, we are more concerned about our measure overstating
watch time due to users leaving their browser idling. We therefore refine this measure by capping
the time spent measure at the length of the video in question (obtained from the YouTube API).
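A minimal sketch of this capped time-spent measure, assuming each pageview is a record with a URL and timestamp and that video lengths (in seconds) have been fetched separately; the data structures are illustrative.

```python
from datetime import datetime, timedelta

def time_spent(pageviews, video_lengths):
    """Seconds spent on each page, estimated as the gap to the next
    pageview and capped at the video's length (when known) so that an
    idle browser does not inflate watch time."""
    durations = []
    for current, nxt in zip(pageviews, pageviews[1:]):
        seconds = (nxt["timestamp"] - current["timestamp"]).total_seconds()
        cap = video_lengths.get(current["url"])  # None for non-video pages
        if cap is not None:
            seconds = min(seconds, cap)
        durations.append((current["url"], seconds))
    return durations
```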
   We measure which channels users subscribed to by looking at the HTML snapshots of the
videos they watched. Specifically, we parsed the subscribe button from each HTML snapshot:
“Subscribe” indicates that the participant was not subscribed to the video channel at the time the
video was watched and “Subscribed” indicates that they were already subscribed. Because we must
use this indirect method to infer channel subscriptions, we do not know the full set of channels to
which participants subscribe. In particular, not all recommended videos in our dataset were viewed
by participants. As a result, we could not determine the subscription status for all recommended
videos.
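The inference step can be sketched as below; the markup in the example is a simplified stand-in for YouTube's actual page structure, which is considerably more complex.

```python
import re

def subscription_status(html_snapshot):
    """Infer subscription status from the subscribe button text in an
    HTML snapshot: "Subscribed" means the participant already subscribes
    to the channel, "Subscribe" means they do not."""
    match = re.search(
        r'<button[^>]*class="[^"]*subscribe-button[^"]*"[^>]*>\s*(Subscribed?)\s*</button>',
        html_snapshot)
    if match is None:
        return None  # button not found; subscription status unknown
    return "subscribed" if match.group(1) == "Subscribed" else "not_subscribed"
```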
   We denote the web page that a participant viewed online immediately prior to viewing a YouTube
video as the “referrer.” We are unable to measure HTTP Referer headers using our browser
extension, so instead we rely on browser activity data to identify referrers to YouTube videos. Using
prior browsing history is a common proxy used to analyze people’s behavior on the web (29, 30).
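A sketch of this proxy, assuming an ordered list of pageviews per participant; the URL pattern used to detect video pages is an illustrative assumption.

```python
def referrers(pageviews):
    """Pair each YouTube video pageview with its "referrer": the page
    viewed immediately before it in the browsing record. This is a
    proxy, since the extension cannot read HTTP Referer headers."""
    pairs = []
    for prev, current in zip(pageviews, pageviews[1:]):
        if "youtube.com/watch" in current["url"]:
            pairs.append((current["url"], prev["url"]))
    return pairs
```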

Channel definitions and measurement

We construct a typology of YouTube channel types identified in previous research. We classify
videos as coming from an alternative channel, an extremist channel, a mainstream media channel,
or some other type of channel (“other”).
   In our typology, alternative channels tend to advocate “reactionary” positions and typically
claim to espouse marginalized viewpoints despite the channel owners primarily identifying as White
and/or male. This list combines Lewis’ Alternative Influence Network (31), the Intellectual Dark
Web and Alt-lite channels from Ribeiro et al. (22), and channels classified by Ledwich and Zaitsev
(23) as Men’s Rights Activists or Anti-Social Justice Warriors. Example alternative channels in our
typology include those hosted by Steven Crowder, Tim Pool, Laura Loomer, and Candace Owens.
   Our list of extremist channels consists of those labelled as white identitarian by Ledwich and
Zaitsev (23), white supremacist by Charles (32), extremist or hateful by the Center on Extremism
at the Anti-Defamation League, and those compiled by journalist Aaron Sankin from lists curated
by the Southern Poverty Law Center, the Canadian Anti-Hate Network, the Counter Extremism
Project, and the white supremacist website Stormfront (33). Example extremist channels include
those hosted by Stefan Molyneux, David Duke, Mike Cernovich, and Faith J. Goldy.
   In total, our alternative and extremist channel lists consist of 322 and 290 channels, respectively.
Of the 302 alternative and 213 extremist channels that were still available on YouTube as of January
2021 (i.e., they had not been taken down by the owner or by YouTube), videos from 208 alternative
and 55 extremist channels were viewed by at least one participant in our sample. We are not making
these lists publicly available to avoid directing attention to potentially harmful channels. We are,
however, willing to privately share them with researchers and journalists upon request.
   To create our list of mainstream media channels, we collected news channels from Buntain et al.
(20) (65 mainstream news sources), Ledwich et al. (23) (75 mainstream media channels), Stocking
et al. (34) (81 news channels), Ribeiro et al. (22) (68 popular media channels), Eady et al. (35)
(219 national news domains), and Zannettou et al. (36) (45 news domains). We manually found
the corresponding YouTube channels via YouTube search when authors only provided websites (22,
35, 37). In cases where news organizations have multiple YouTube channels (e.g., Fox News and
Fox Business), all YouTube channels under the parent organization were included. Any channels
appearing in fewer than three of these sources were omitted. Finally, we also included channels that
were featured on YouTube’s News hub from February 10, 2021 to March 5, 2021.
   The resulting list of mainstream media channels was then checked to identify those that meet
all of the following criteria:

   1. They must publish credible information, which we define as having a NewsGuard score
      greater than 60 (https://www.newsguardtech.com) and not being associated with
      any “black” or “orange” fake news websites listed in Grinberg et al. (38).

   2. They must meet at least one criterion for mainstream media recognition or distribution, which
      we define as having national print circulation, having a cable TV network, being part of
      the White House press pool, or having won or been nominated for a prestigious journalism
      award (e.g., Pulitzer Prize, Peabody Award, Emmy, George Polk Award, or Online Journalism
      Award).

   3. They must be a US-based organization with national news coverage.

   Our final mainstream media list consists of 127 YouTube channels. We placed all YouTube
channels in our dataset that did not fall into one of these three categories (alternative, extremist, or
mainstream news) into a residual category that we call “other.”

Survey measures of racial resentment and hostile sexism

We measure anti-Black animus with a standard four-item scale intended to measure racial resent-
ment (25). For example, respondents were asked whether they agree or disagree with the statement
“It’s really a matter of some people just not trying hard enough: if blacks would only try harder they
could be just as well off as whites.” Responses are provided on a five-point agree/disagree scale and
coded such that higher numbers represent more resentful attitudes. Respondents’ racial resentment
score is the average of these four questions. Responses to these questions are taken from respondent
answers to the 2018 Cooperative Congressional Election Survey (as noted above, participants were
largely recruited from the pool of previous CCES respondents).
   We operationalized hostile sexism using two items from a larger scale that was also asked on the
2018 Cooperative Congressional Election Survey (CCES) (26). For example, one of the questions
asks if respondents agree or disagree with the statement “When women lose to men in a fair com-
petition, they typically complain about being discriminated against.” Responses are provided on a
five-point agree/disagree scale and coded such that higher numbers represent more hostile attitudes.
   All other question wording is provided in the survey codebook in Online Appendix C. Racial
resentment and hostile sexism measures were also included in our 2020 survey; responses showed
a high degree of persistence over time (r = .92 for racial resentment, r = .79 for hostile sexism).

Results

Exposure levels

Though 91% of participants visited YouTube, the vast majority of participants did not view any
alternative or extremist channel videos. Just 15% of the sample for whom we have browser activity
data (n=1,181) viewed any video from an alternative channel and only 6% viewed any video from
an extremist channel. By comparison, 44% viewed at least one video from a mainstream media
channel. (See Methods for how channel types, view history, and watch time were defined.)
   The audience for alternative and extremist channels is heavily skewed toward people who subscribe to the channel in question or one like it, which we determine by inspecting whether the
subscription button is activated when a participant views a video from that channel (see Methods

Figure 1: Distribution of video views by subscription status and channel type

[Stacked bar chart; y-axis: percentage of views. Total view counts by channel type (subscribed to current channel / subscribed to another channel of the same type / not subscribed): Alternative (2.7% of all views): 22,441 / 4,930 / 3,758; Extremist (0.4%): 4,198 / 411 / 547; Mainstream media (4.6%): 21,986 / 8,473 / 23,318; Other (92.3%): 347,796 / 611,225 / 72,749.]

Percentage of views for videos from each type of channel that come from people who are subscribed to that channel (yellow), who subscribe to one or more different channels of the same type but not the channel currently being viewed (green), and who do not subscribe to any channel of that type (blue). Each estimate includes the corresponding 95% confidence interval. Total view counts are displayed at the bottom of each bar. Total views for videos of that type as a percentage of all views are displayed under the channel labels.

for more details). Among the set of people who saw at least one extremist channel video, for in-
stance, 52% saw at least one video from an extremist channel they subscribe to during the study
period. Similarly, 39% of all people who saw at least one alternative channel video viewed at least
one video from a channel to which they subscribed.
   Figure 1 illustrates this point by disaggregating video views according to both channel type and
subscription status. We observe that 72% of views for videos from alternative channels and 81%
of views for videos from extremist channels come from subscribers to the channel in question. If
we instead define subscribers to include all people who subscribe to at least one channel of the type
in question, the proportion of views from subscribers increases to 88% for alternative channels and
89% for extremist channels.
   These patterns for alternative and extremist channels are distinct from mainstream media channels, which receive 41% of their views from people who do not subscribe to any channel in the

Figure 2: Concentration of exposure to alternative and extremist channels

[Line chart; x-axis: percentage of users, y-axis: percentage of total exposure (minutes); separate curves for alternative, extremist, mainstream media, and other channels. An inset repeats the curves on a log10 x-axis with the annotation "1.7% of users account for 80% of time spent viewing alternative channel videos."]

Weighted empirical cumulative distribution function (eCDF) showing the percentage of participants responsible for a given level of total observed video viewership of alternative and extremist channels on YouTube (in minutes). Inset graph shows the same data using a log scale for the weighted eCDF.

category. (This difference in viewership patterns is likely the result of the greater number of rec-
ommendations given to mainstream media videos [see Figure 8 below], which accordingly receive
more views from non-subscribers.)
                             Among the participants who viewed at least one video from an alternative or extremist channel,
the time spent watching them was relatively low: 26 minutes per week for alternative channel videos
(62 minutes per week for subscribers to one or more alternative channels [6%] versus 0.2 minutes
per week for non-subscribers [9%]) and 8 minutes for extremist channel videos (15 minutes per
week for subscribers [3%] versus 0.04 minutes per week for non-subscribers [3%]). The comparison
statistics are 12 minutes per week for mainstream media channel videos and 214 minutes per week
for videos from other channels.
Figure 3: YouTube video diets of alternative and extremist superconsumers

[Two bar charts of minutes per week on YouTube videos, one bar per individual, segmented by channel type (alternative, extremist, mainstream media, other). Panel A: Alternative channel superconsumers (n = 17). Panel B: Extremist channel superconsumers (n = 9).]

Total YouTube behavior of alternative (panel A) and extremist (panel B) superconsumers measured in minutes per week of video watch time. Each bar represents one individual and the height of the bar represents total view time of YouTube videos by channel type. The 17 alternative superconsumers are ordered left to right by time spent on videos from alternative channels (orange portions of bars); the nine extremist superconsumers in the right panel are ordered left to right by time spent on videos from extremist channels (red portions of the bars). Red icons under bars in the left panel represent individuals who are also extremist superconsumers; orange icons under bars in the right panel represent individuals who are also alternative content superconsumers.

   Mirroring patterns observed for Twitter and untrustworthy websites (29, 38), viewership of potentially harmful videos on YouTube is heavily concentrated among a few participants. As Figure 2
indicates, 1.7% of participants (17 people) account for 79% of total time spent on videos from
alternative channels. This imbalance is even more severe for extremist channels, where 0.6% of
participants (9 people) were responsible for 80% of total time spent on these videos. Skew is simi-
lar when we examine view counts (Figure A13) rather than time spent on videos—1.9% and 1.1% of
participants were responsible for 80% of alternative and extremist channel viewership, respectively.
We observe a similar pattern of concentration for mainstream media consumption—just 3.8% of
participants (40 people) account for 80% of the total views.
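One way to compute such a concentration statistic is to sort participants by watch time and count how many are needed to reach 80% of the total. A minimal sketch on made-up numbers (not the study's data):

```python
def concentration_share(minutes, target=0.80):
    """Smallest fraction of individuals (sorted by watch time, descending)
    whose combined minutes first reach `target` of the total."""
    ordered = sorted(minutes, reverse=True)
    total = sum(ordered)
    running = 0.0
    for n, m in enumerate(ordered, start=1):
        running += m
        if running >= target * total:
            return n / len(ordered)
    return 1.0

# Toy example: one heavy viewer accounts for 80% of all watch time.
watch_minutes = [800, 100, 50, 30, 20]
print(concentration_share(watch_minutes))  # 0.2
```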
                                       We examine the behavior of these “superconsumers” in more detail in Figure 3, which separately
presents watch time totals for the people responsible for 80% of the viewership of videos from alter-
native and extremist channels in our sample. We note two facts about superconsumers. First, they
often watch a great deal of YouTube. Alternative channel superconsumers spend a median of 29
hours each week watching YouTube, while the median time that extremist channel superconsumers
spend watching is 16 hours per week. By comparison, the median time per week across all partic-
ipants is 0.2 hours. Second, there is substantial overlap between the two sets of superconsumers,
who number 26 in total (2% of all participants). Five of the nine superconsumers of extremist
channel videos (56%) are also among the seventeen superconsumers of alternative channel videos.
Conversely, five of the seventeen superconsumers of alternative channel videos (29%) are among
the nine superconsumers of extremist channel videos. Figures A3 and A4 show the YouTube video
diets by channel type for individuals who viewed any alternative or extremist channel video during
the study.

Correlates of exposure

We next evaluate demographic and attitudinal factors that are potentially correlated with time spent
watching videos from alternative, extremist, and mainstream media channels. We focus specifically on hostile sexism, racial resentment, and negative feelings toward Jews (31, 37). Because alternative and extremist channels often target women, racial and ethnic minorities, and Jews, negative attitudes toward these outgroups may make people vulnerable to the messages those channels offer. We therefore estimate the statistical models reported below on the subset of 851 respondents for whom prior scale measures of hostile sexism and racial resentment are available from the 2018 Cooperative Congressional Election Study.
and measurement, including the wording for these scales, are provided in Methods below; feelings
toward Jews are measured using a feeling thermometer.)
We estimate models measuring the association between the average time per week that respondents spent on videos from alternative, extremist, or mainstream media channels and the measures listed above as well as relevant demographic characteristics: age, sex (male/not male), race (white/non-white), and indicators for different levels of education above high school (some college/bachelor's/post-grad). Results of the quasipoisson models we estimate, which account for the skew in video watch time, are shown in Figure 4. (See Figure A6 for equivalent results for the number of views of videos from alternative and extremist channels.)

Figure 4: Predictors of video watch time

[Coefficient plot; quasipoisson coefficients for minutes/week on alternative / extremist / mainstream media channel videos: hostile sexism 1.71* / 1.60* / 0.00; racial resentment 0.19 / 0.09 / −0.42; feelings toward Jews −0.01 / 0.00 / 0.00; age 0.03 / 0.05* / 0.04*; male 1.01 / 0.74 / 0.85; non-white −0.79 / −1.30 / 1.50; some college 0.72 / 0.50 / 1.60*; bachelors 1.98* / 1.79* / 2.43*; post-grad −0.52 / −1.99 / 2.62*.]

Quasipoisson regression coefficients for correlates of the amount of time respondents spent on videos from alternative, extremist, and mainstream media channels in minutes per week. Figure includes 95% confidence intervals calculated from robust, survey-weighted standard errors. Stars indicate coefficients that are significant at the p < .05 level. See Table A2 for regression table.

    The results indicate that prior levels of hostile sexism are significantly associated with time
spent on videos from alternative channels and time spent on videos from extremist channels but
not time spent watching mainstream media channels. This relationship, which is consistent with
the commenter overlap observed between men’s rights/anti-feminist channels and alt-right channels
on YouTube (39), is not observed for prior levels of racial resentment when controlling for hostile
sexism. However, racial resentment is positively associated with time spent on videos from alter-
native channels when entered into statistical models separately (see Table A6). Finally, we find no

Figure 5: Hostile sexism as predictor of alternative and extremist channel viewing

[Two panels showing expected minutes per week on channel videos across the hostile sexism scale (1 to 5): alternative channel videos (left, y-axis 0 to 2000) and extremist channel videos (right, y-axis 0 to 500).]
Predictions are estimated from the models in Figure 4 holding other covariates at their median (continuous variables)
and modal (categorical variables) values. Colored bands represent 95% robust confidence intervals.

association between feelings toward Jews and viewership of any of these types of channels.
                                                Figure 5 illustrates the relationship between prior levels of hostile sexism and time spent per
week watching videos from alternative or extremist channels using the model results described
above. When hostile sexism is at its minimum value of 1, expected levels are 0.4 minutes per
week spent watching alternative channel videos and 0.08 minutes for extremist channel videos.
These predicted values increase to 383 and 51 minutes, respectively, when hostile sexism is at its
maximum value of 5 (with the greatest marginal increases as hostile sexism reaches its highest
levels).
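These predictions follow from the log link: each one-point increase in hostile sexism multiplies expected watch time by exp(1.71). As a rough consistency check on the reported values (the small discrepancy reflects rounding of the published coefficient):

```python
import math

beta = 1.71            # reported hostile sexism coefficient (log scale)
minutes_at_min = 0.4   # reported prediction at the scale minimum of 1

# A log link implies multiplicative effects: moving from 1 to 5 on the
# scale multiplies expected minutes by exp(beta * 4).
pred_at_max = minutes_at_min * math.exp(beta * (5 - 1))
print(round(pred_at_max, 1))  # ~374, near the reported 383
```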

Internal and external referrers

We next analyze the process by which people come to watch alternative and extremist videos on
YouTube. We denote the page that people viewed immediately prior to a video being opened (within
an existing browser tab or within a new tab) as the “referrer” and broadly distinguish between
two different types of referrers: “on-platform” referrers consisting of various types of pages on
YouTube (a channel page, the YouTube homepage, a YouTube search page, or another video) and
“off-platform” referrers that are not part of the YouTube domain such as search engines, webmail
sites, mainstream social media sites (e.g., Facebook, Twitter, Reddit), or alternative social media
sites (e.g., Parler, Gab, 4chan). The complete list of external referrers in each category can be found
in Table A9. Details on how we identify referrers are provided in Methods below. We consider
YouTube’s recommendations directly in the section below.
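The referrer taxonomy described above can be sketched as a simple URL classifier. The domain lists here are abbreviated stand-ins for the paper's Table A9, and the YouTube path rules are illustrative assumptions rather than the study's exact logic:

```python
# Illustrative referrer classifier; domain lists and path rules are
# simplified assumptions, not the paper's full Table A9.
from urllib.parse import urlparse

MAINSTREAM_SOCIAL = {"facebook.com", "twitter.com", "reddit.com"}
ALTERNATIVE_SOCIAL = {"parler.com", "gab.com", "4chan.org"}
SEARCH_ENGINES = {"google.com", "bing.com", "duckduckgo.com"}

def classify_referrer(url: str) -> str:
    parsed = urlparse(url)
    host = parsed.netloc.lower().removeprefix("www.")
    if host.endswith("youtube.com"):
        if parsed.path.startswith("/watch"):
            return "on-platform: video"
        if parsed.path.startswith("/results"):
            return "on-platform: search"
        if parsed.path in ("", "/"):
            return "on-platform: homepage"
        return "on-platform: channel/other"
    if host in MAINSTREAM_SOCIAL:
        return "off-platform: mainstream social"
    if host in ALTERNATIVE_SOCIAL:
        return "off-platform: alternative social"
    if host in SEARCH_ENGINES:
        return "off-platform: search engine"
    return "off-platform: other"

print(classify_referrer("https://www.youtube.com/watch?v=abc"))  # on-platform: video
print(classify_referrer("https://www.reddit.com/r/videos"))      # off-platform: mainstream social
```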
   We find that off-platform referrers are responsible for approximately half of all views of alter-
native and extremist channel videos, a finding that is broadly consistent with YouTube’s statement
that “borderline content gets most of its views from other platforms that link to YouTube” (40).
As we show in Figure 6, 49% and 51% of referrers to alternative and extremist channel videos,
respectively, were off-platform sources compared to 41% and 44%, respectively, for videos from
mainstream media channels and other channels. With respect to on-platform referrers, we observe homophily across the video types: 18% of referrers to alternative videos are other alternative videos, 14% of referrers to extremist videos are other extremist videos, and 26% of referrers to mainstream media videos are other mainstream media videos. Interestingly, 5% of referrals to extremist videos come from alternative videos, but only 0.7% of referrals to alternative videos come from extremist videos, suggesting that among our participants it is rare to move from more radical to less radical videos. Lastly, we observe that alternative, extremist, and mainstream media videos all receive roughly equal referrals from videos in other channels (13–16%) and other on-platform sources (16–19%).
   Figure 7 instead reports the proportion of views to each type of YouTube channel video (alterna-
tive, extremist, mainstream media, and other) from each type of referrer. This analysis allows us to
determine which types of referrers are unusually (un)common across channel types. On-platform,
we note that the YouTube homepage, YouTube search, and other YouTube videos are relatively less
frequent sources of referrals to alternative and extremist channel videos than videos from main-
stream media channels and other channels. In contrast, channel pages are a more common referral
Figure 6: Pages viewed immediately prior to YouTube videos by channel type

[Stacked bars showing the estimated percentage from each type of preceding link (0 to 60%) for alternative channel videos, extremist channel videos, mainstream media videos, and other channel videos; segments are colored by type of preceding link (alternative channels, extremist channels, mainstream media, other channels, non-video on-platform, off-platform).]

Proportion of each type of URL recorded immediately before viewing a YouTube video of a given channel type.
Observations where the preceding link was not a YouTube video are shown in the “non-video, on-platform” and
“off-platform” bars. (The “non-video, on-platform” category combines YouTube channel pages, the YouTube homepage, and YouTube search.)

source to alternative and extremist channel videos. This finding highlights that participants arrive
at alternative and extreme videos from a variety of referrers, not just YouTube recommendations.
                                             Among off-platform referrers, social media platforms stand out as playing an especially impor-
tant role in referring people to alternative and extremist channel videos. Participants are dispropor-
tionately more likely to reach alternative channel videos via mainstream social media sites and to
reach extremist channel videos via alternative social media sites compared with videos from other
types of channels. For instance, about 1 in every 6 (17%) extremist channel video views were pre-
ceded by a visit to an alternative social media site. This difference may be the result of the content
moderation policies of mainstream social media platforms, which are more likely to moderate ex-
tremist posts promoting such videos than platforms like Gab and 4chan that attract extremist users
due to their lax content policies.

Figure 7: Relative frequency of referrals to YouTube videos by channel and referrer type

[Panel A (on-platform referrers): estimated percentage from preceding domain (0 to 60%) for other on-platform, YouTube channel, YouTube homepage, YouTube search, and YouTube video referrers. Panel B (off-platform referrers): alternative social, mainstream social, other off-platform, search engine, and webmail referrers. Bars in each panel distinguish domains leading to alternative, extremist, mainstream media, and other channel videos.]

Proportion of referrals to YouTube videos of each channel type by referrer type. Other on-platform referrals such as YouTube playlists and personal user pages were grouped into a separate category. Similarly, off-platform domains that do not fit into any of the labelled categories in panel B are grouped together. A list of all domains included in each group can be found in Online Appendix A.

Recommendations and YouTube “rabbit holes”

Critics of YouTube have emphasized the role of its algorithmic recommendations in leading people
to potentially harmful content. We therefore first measure which types of videos YouTube recommended to participants and how often those recommendations were followed. Next, we specifically
count how often people follow recommendations to more extreme channels to which they don’t
subscribe in a manner that is consistent with the “rabbit hole” narrative. Finally, we disaggregate
YouTube recommendations and following behavior based on subscription status. In general, we
find that recommendations to alternative and extremist channel videos are rare and largely shown
to and followed by people who already subscribe to those channels.
   We first disaggregate the recommendations shown to participants by the type of video on which
the recommendation appears. As Figure 8 shows, recommendations to alternative and extremist
channel videos are vanishingly rare, especially while watching videos from mainstream media or
other types of channels, which together make up 97% of views in our sample. Recommendations
to alternative and extremist channel videos are much more common, however, when people are
already viewing videos from alternative and extremist channels, which make up 2.6% and 0.4% of
views, respectively. Among recommendations shown while viewing an alternative channel video, 34.6% point to another alternative channel video; the corresponding share for extremist channel videos is 25.5%.
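These within-platform shares amount to a row-normalized transition matrix from the channel type being watched to the channel type recommended. A sketch on made-up counts (not the study's data):

```python
# Row-normalized transition shares from (watching type) to (recommended
# type). The pair counts below are invented for illustration.
from collections import Counter, defaultdict

pairs = (
    [("alternative", "alternative")] * 35 + [("alternative", "other")] * 65
    + [("extremist", "extremist")] * 25 + [("extremist", "alternative")] * 10
    + [("extremist", "other")] * 65
)

counts = Counter(pairs)
row_totals = defaultdict(int)
for (src, _), c in counts.items():
    row_totals[src] += c

# Share of recommendations of each type, given the type being watched.
share = {pair: c / row_totals[pair[0]] for pair, c in counts.items()}
print(share[("alternative", "alternative")])  # 0.35
```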
   Figure 9 provides corresponding statistics for the proportion of recommendations followed by
channel type. Given the interest people show by watching an alternative or extremist channel video,
it is not surprising that the proportion of followed recommendations leading to other videos of that type is even more skewed. Among people who were watching alternative channel videos,
45.7% of recommendations followed were to alternative or extremist channel videos. Correspond-
ingly, 61.1% of recommendations followed from extremist channel videos were to other extremist
channel videos or to alternative channel videos.
   Next, we more directly test how often YouTube video recommendations create “rabbit holes” in
which people are shown more extreme content than they would otherwise encounter. Specifically,
we define three conditions that must be met to constitute a “rabbit hole” sequence of recommen-
dations and exposure and report the number of views, sessions, and users that meet these criteria
when sequentially applied:

Figure 8: Recommendation frequency by type of channel being watched

A) Percentage of total recommendations shown: alternative (1.4%), extremist (0.2%), mainstream media (5.8%), other (92.6%).

B) Recommendations shown when watching alternative channels, extremist channels, mainstream media, and other channels, with tiles colored by the channel type recommended (alternative channels, extremist channels, mainstream media, other channels).

The number of colored tiles shown is proportional to the share of recommendations to each type of video when watching videos from alternative, extremist, mainstream media, or other channels.

                                  Figure 9: Recommendation follows by video channel type

A) Percentage of total recommendations followed: alternative (0.7%), extremist (0.2%), mainstream media (6.4%), other (92.8%).

B) Recommendations followed when watching alternative channels, extremist channels, mainstream media, and other channels, with tiles colored by the channel type of the video followed (alternative channels, extremist channels, mainstream media, other channels).

The number of colored tiles shown is proportional to the share of recommendations followed to each type of video when watching videos from alternative, extremist, mainstream media, or other channels.

1. A participant followed a recommendation to an alternative or extremist channel video: 794
      instances (0.16% of all video visits) among 65 users (6.05% of all users);

   2. The recommendation that the participant followed moved them to a more extreme channel
      type (i.e., {mainstream media, other} → {alternative} or {mainstream media, other, alternative} → {extremist}): 376 instances (0.08% of all video visits) among 53 users (4.94% of all
      users);

   3. The participant does not subscribe to the channel of the recommended video: 108 instances
      (0.02% of all video visits) among 41 users (3.82% of all users).

   We find little evidence for the typical “rabbit hole” story that the recommendation algorithm
frequently leads people to extreme content. Sequentially applying these rules leaves us with only
108 instances in which a YouTube visit met all three criteria, which represents 0.022% of all video
visits — 97 for recommendations to alternative channel videos (0.020% of all video visits) among
37 users (3.445% of users) and 11 for extremist channel videos (0.002% of all video visits) among
9 users (0.838% of users). (We provide qualitative accounts of three such sequences in Online Ap-
pendix A as well as an analysis showing no trend toward greater exposure to alternative or extremist
channel videos in longer YouTube sessions.)
   Moreover, some of these 108 cases are participants who followed a recommendation to a video
from a category in which they already subscribe to one or more other channels (e.g., a person who
subscribes to extremist channel A and follows a recommendation to extremist channel B). When
we exclude cases of this kind, the set of qualifying “rabbit hole” events declines to just 60 cases
(0.012% of all video visits) among only 30 users (2.793% of users).
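The three sequential criteria above can be expressed as successive filters over a log of followed recommendations. Field names here are hypothetical, not the study's actual schema:

```python
# Sketch of the three sequential "rabbit hole" filters. The log entries
# and field names are invented for illustration.
EXTREMENESS = {"mainstream media": 0, "other": 0, "alternative": 1, "extremist": 2}

def rabbit_hole_events(visits):
    """Return visits meeting all three criteria, applied in sequence."""
    # 1. Followed a recommendation to an alternative/extremist channel video.
    step1 = [v for v in visits
             if v["via_recommendation"] and v["to_type"] in ("alternative", "extremist")]
    # 2. The move was toward a more extreme channel type.
    step2 = [v for v in step1
             if EXTREMENESS[v["to_type"]] > EXTREMENESS[v["from_type"]]]
    # 3. The participant does not subscribe to the destination channel.
    return [v for v in step2 if not v["subscribed_to_destination"]]

log = [
    {"via_recommendation": True,  "from_type": "other",       "to_type": "alternative",
     "subscribed_to_destination": False},   # qualifies
    {"via_recommendation": True,  "from_type": "alternative", "to_type": "extremist",
     "subscribed_to_destination": True},    # fails criterion 3
    {"via_recommendation": False, "from_type": "other",       "to_type": "extremist",
     "subscribed_to_destination": False},   # fails criterion 1
]
print(len(rabbit_hole_events(log)))  # 1
```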
   Contrary to the “rabbit hole” narrative, recommendations to videos from alternative and ex-
tremist channels are instead most frequently shown to channel subscribers — the same group that
is most likely to follow those recommendations. As Figure 10 demonstrates, people who subscribe
to at least one alternative channel received 55.8% of all alternative channel video recommenda-
tions and represented 71% of the cases in which a participant followed a recommendation to an

Figure 10: YouTube recommendations by subscription status and channel type

[Bars showing the percent of recommendations shown and followed that went to channel subscribers, by channel type: alternative channels (2.7% of views; 114,805 shown, 642 followed), extremist channels (0.4%; 17,360 shown, 174 followed), mainstream media (4.6%; 481,976 shown, 5,752 followed), other channels (92.3%; 7,688,996 shown, 84,674 followed).]

The percentage of recommendations shown and followed that went to people who subscribe to one or more channels of each type (including 95% confidence intervals for both, though these are sometimes not visible due to the sample size of the recommendations shown data). The percentage of views of each type of video is shown in parentheses under the labels.

alternative channel video. This skew was even wider for extremist channel videos—subscribers to
one or more extremist channels saw 66.1% of recommendations to videos from extremist channels
and made up 82.8% of the cases in which respondents followed a recommendation to watch such
a video. These figures far exceed those observed for mainstream media channels or other types of
channels.

Discussion

Using data on web browsing, we provide behavioral measures of exposure to videos from alterna-
tive and extremist channels on YouTube. Our results indicate that exposure to these videos after
YouTube’s algorithmic changes in 2019 is relatively uncommon and heavily concentrated in a small
minority of participants who previously expressed high levels of hostile sexism and racial resent-
ment. These participants frequently subscribe to the channels in question and reach the videos
that they produce via external links. By contrast, we find relatively little evidence of people falling
into so-called algorithmic “rabbit holes.” Recommendations to videos from alternative and extrem-
ist channels on YouTube are very rare when respondents are watching other kinds of content and
concentrated among subscribers to the channels in question.
   Our findings imply that the process by which people are exposed to potentially harmful con-
tent on platforms like YouTube may have been misunderstood. Though we cannot rule out every
possible account of how YouTube’s algorithms might help expose people to dubious content (e.g.,
that such radicalization took place prior to our study period), our results provide few examples of
the patterns of behavior described in simple “rabbit hole” narratives. Future research expressing
concerns about online radicalization should offer more precise definitions of what a “rabbit hole”
is and the timescale over which it might be observed.
   By contrast, our results make clear that YouTube continues to provide a platform for alternative
and extreme content to be distributed to vulnerable audiences. In some ways, this outcome may be
even more worrisome. People who view videos from alternative and extremist channels typically
already hold extreme views on race and gender and often follow external links to these types of con-
tent. The subscription functionality that YouTube offers helps resentful audiences to follow content
from alternative and extremist channels and drives recommendations to more of their videos.
   Of course, it is important to note several limitations of the study. First, though our browser
extension sample is large and diverse and we weight our results to approximate national bench-
marks, it is not fully representative and does not capture YouTube consumption on other browsers
or on mobile devices. Any outside study of a platform also faces challenges in recruiting large
numbers of heavy consumers of fringe content. Second, these results only cover U.S. users; they
should be replicated outside the U.S. in contexts including Europe and the global South. Third,
YouTube is constantly changing its features and algorithms, and its user and creator populations
evolve over time as well. Findings from 2020 may not mirror what would have been observed
in prior years—in particular, it is possible that YouTube algorithms recommended alternative and
extremist channel videos more frequently prior to the changes made in 2019. Fourth, our results
depend on channel-level classifications from scholars and subject matter experts; further research
should examine whether the patterns we observe are robust to alternate measures at the channel and
(if possible) video level. Finally, our measures of views, referrals, and subscriptions contain some
degree of error due to technical and ethical limitations. In particular, as with any passive behavioral
data, we cannot verify that every user paid attention to the content that appeared on their device in
every instance.
   Ultimately, these findings underscore the need to apply the tools of behavioral science to mea-
sure exposure to extremist content across social media platforms and to determine how these plat-
forms may reinforce (or hinder) those patterns of behavior individually and collectively. As our
findings from YouTube suggest, these problems often center on the way social media platforms en-
able the distribution of potentially harmful content to vulnerable audiences rather than algorithmic
exposure itself.
