DISINFORMATION AND ENVIRONMENTAL ADVOCACY
This report was written by Melissa Ryan with assistance from Michael Khoo and Kevynn Gomez of Friends of the Earth U.S.
Table of Contents

Acknowledgments
Executive Summary
Case Study: Russian Troll Farm Spreads Enviro Disinformation
The Basics: Misinformation, Disinformation and Malinformation
Who does this and why? "Bad actors" and state-sponsored attacks
Strategies and Tactics Used
  Brigading
  Disinformation Flooding
  Malinformation
  Organized Online Harassment and Doxxing
  Conspiracy Mongering
  Gaming the Refs and Conservative Bias
  Further Reading
Social Media is Weaponized
  What is weaponization, and am I doing it too?
  How platforms are gamed
  Facebook
  Twitter
  YouTube
  Instagram
  Crowdfunding
  Further Reading
CALL OUT BOX: Russia and Disinformation
  Further Reading
Culture Wars
  Further Reading
  Alexandria Ocasio-Cortez and Green New Deal Conspiracies
  Further Reading/Action Items
The Pro-Trump Media Ecosystem
  Case Study: The 4chan to Fox News to Trump Pipeline
  Further Reading/Action Items
  Consultant hyper-partisan news networks (and email leads)
  Case Study: ACORN in 2009 vs. Sierra Club in 2018
Communication Strategies for Nonprofits to Adapt
  Assessing Threats
  Train Your Staff
  Make a Plan
  Digital Security
  The Importance of Solidarity
  Action Items
  Pressuring the Tech Platforms
Additional Resources
  Crisis Plan Templates
  External Organizations
Acknowledgments

We would like to thank the following organizations for their commitment to this topic and their support: Defenders of Wildlife, Environmental Defense Fund, Greenpeace, National Audubon Society, Natural Resources Defense Council, National Wildlife Federation, Ocean Conservancy, PAI, Partnership Project, Sierra Club, The Trust for Public Land and Union of Concerned Scientists.

"Disinformation and Environmental Advocacy" cites research by: Data and Society; Equality Labs; First Draft News; HOPE not Hate; Jan-Willem van Prooijen, André P. M. Krouwel and Thomas V. Pollet, University of Amsterdam; Dr. Kate Starbird, University of Washington; Media Matters; New Knowledge AI; Oxford Internet Institute; Pen America; Pew Research Center; Renee DiResta; Securing Democracy Alliance; Stop Online Violence Against Women; Women's Media Center; and Yochai Benkler, Robert Faris and Hal Roberts, Harvard University.

We are grateful for their work and coverage of these important topics.
Executive Summary

Over the last few decades, disinformation in the environmental space has featured companies like Exxon quietly funding global warming denial or pesticide companies covering up health effects. In 2019, disinformation often looks like high-profile political figures like Sebastian Gorka purposely and continually claiming the Democratic Party will take away Americans' hamburgers as part of a communist Green New Deal agenda, and using right-wing social media echo chambers to turn those lies into truth. The former represents insidious corporate power; the latter, just how blatant and weaponized disinformation has become.

While environmental groups continue to use technology like social media and email well, disinformation-spreading trolls and other bad actors are increasingly many steps ahead, using these same tools to further divide and derail conversations and reduce the power of activists. Multiple extensive studies have documented how Americans were manipulated during the 2016 election by bad actors using social media. Disinformation is already a systemic societal problem, not merely a threat of the future, and if the history of power and control is any guide, environmental advocacy will be next.

"Disinformation and Environmental Advocacy" focuses on common disinformation tactics and how these connect (or in some instances, may one day connect) to the environmental community. We discuss studies, social media as an easily exploitable world of tools, right-wing trolls and their Trump connections, as well as the future of disinformation. We also take a look at a current example of manipulation by messaging and technology: the Green New Deal. Further, we offer practical strategies and tactics for organizations to prepare themselves and to be ready for an attack if and when it arrives.

Findings in this report are straightforward. Disinformation is a danger already at work. This report highlights the existence of disinformation in the environmental community. From Russia's Internet Research Agency to harassment of Alexandria Ocasio-Cortez, we point to a few core examples of what environmental disinformation is evolving into in 2019. Disinformation can confuse and deceive your members and targets, misrepresent your message and damage your brand. It's an issue already plaguing other movements — notably immigration reform, racial justice and reproductive freedom.

And as environmental topics, concerns and policies increasingly become linchpins in American society, we believe damaging disinformation incidents will grow in size and damage. Climate change, pollution and the shift to 2020 election policy are all ways environmental topics will continue to gain attention — and therefore become more at risk from manipulating actors.
In the world of disinformation, cultural values, ideas and images are the primary fodder for bad actors. Gun rights, abortion and immigration have all been exploited by disinformation actors precisely because they are easy visual and cultural targets. There is reason to believe that as disinformation spreads, it will begin to take on topics like environmental policy by using cultural ideas, not policy facts, as a way to bridge the gap.

Protecting sacred lands is an issue already under attack, but when the activists are indigenous people of color, the disinformation takes on a racial tone. When a young member of Congress fights for environmental rights, the disinformation begins to take on a clear undercurrent of misogyny, racism and classism.

These types of disinformation campaigns are, at their core, a cover by which to attack people of color, women, immigrants and other marginalized communities — they are a way to uphold white supremacist ideals. Disinformation campaigns through these bigoted filters are about something deeper: a struggle for power by far-right, white, patriarchal actors demanding to control the cultural landscape.

Disinformation is therefore much more than simply damage to one's brand; it is an attempt to manipulate society into supporting oppressive beliefs.

It is our goal to bring these problems to light before they severely hit the environmental community. Our hope is that we may come together to identify disinformation threats, prepare, and of course, fight back. This problem is too large, and too novel, to handle alone. Through skills building and solidarity, the environmental community can take a stand against these toxic messages that will attempt to disrupt environmental campaigns.

CASE STUDY: Russian Troll Farm Spreads Enviro Disinformation

Russian trolls used climate change in an attempt to divide Americans in 2016 as part of their broader effort to influence American political discourse. The Russian organization Internet Research Agency exploited ideological differences through modern disinformation tactics in the U.S., exacerbating hot-button topics like oil pipelines and the validity of climate change.

The IRA notoriously influenced the 2016 U.S. presidential election, bombarding American social media with content supporting Trump and smearing Democratic candidate Hillary Clinton, and last spring Special Counsel Robert S. Mueller even indicted the troll farm and Russians working for it. Yet the IRA still exists; its disinformation campaign hasn't ended.

Several federal government committees have investigated the IRA. The House Committee on Science, Space and Technology's investigation focused solely on the IRA's disinformation on energy topics. It found that the IRA infiltrated U.S. social media sites to manipulate Americans' opinions on the energy sector — such as pipelines, climate change and fracking. The end goal? To further sociopolitical divisions in the U.S., and maybe even control which industries we support and which we protest.

From 2015 to 2017, over 4,000 IRA-linked accounts were thought to exist across Twitter, Facebook and Instagram. Russian accounts pushed content in a variety of areas, not just energy and environment; the IRA troll farm worked on elections and racial issues as well. IRA actors fanned the flames of controversial issues with the intention of dividing Americans and influencing their beliefs. IRA accounts were dishonest, pretending to be American activists and social media users. These are all standard tactics of disinformation.

The accounts had blatantly partisan names like "Born Liberal," "_americafirst_" and "Native_Americans_United_." Their Instagram posts (featured) consisted of memes, i.e., photos with text added in, often with highly emotional language. Some posts were published on both Instagram and Facebook. These accounts targeted both sides of the aisle: some accounts posted content that was critical of the Dakota Access Pipeline and supportive of activists protesting it; others criticized environmental activists while supporting the energy industry. This was an attempt to create a divisive, false dichotomy pitting ideological groups against each other.

[Image: Instagram posts from accounts "_americafirst_" (3,076 likes, 57 comments, posted May 8, 2017) and "stand_for_freedom" (1,774 likes, 17 comments, posted April 17, 2017)]

The Science, Space and Technology Committee believes the goal of these accounts was to influence the American energy industry by persuading social media users, playing upon ideological differences and exacerbating disagreement and tension. "The main focus of the Russian efforts centered on disruption of pipeline development or the advancement of climate change policies targeting fossil fuels," the report states. In some instances, the accounts directed viewers to links for protests and petitions — persuading the viewers to act on their newfound political stances, thereby influencing American policy or the economy.

Despite journalists' coverage of the disinformation campaigns and the committees' investigations, the accounts still reached tens of thousands and sometimes hundreds of thousands of Americans. Buzzfeed News states that one account, Native_Americans_United_, had 33,000 followers. Facebook account "Stop A.I. (Stop All Immigrants)" had over six million likes.

[Image: the same post published to the Facebook page "Born Liberal" (5 shares, 9 likes, 0 comments, posted May 11, 2017) and to the Instagram account "bornliberal" (1,794 likes, 96 comments, posted May 11, 2017)]

These Russian actors were largely successful: they received millions of likes, comments and support from Americans and were even able to influence Americans to attend protests and events. Many of these accounts have since been blocked — though others are likely being created anew.

This underscores a disinformation campaign by bad actors feeding off already-controversial environmental topics. The IRA's infiltration of American social media is a prime example of disinformation, not just at large, but also within the niche environmental space. Instead of being a problem from the past, disinformation on such hot-button environmental topics will likely increase, especially as the United States heads toward a 2020 election year where climate and "green topics" remain a focal point.

Research and news reports on the IRA's expansive disinformation run deep, but the underlying message here is clear: Russians have manipulated Americans on topics of environmental import — and will try again if given the chance.

The Basics: Misinformation, Disinformation and Malinformation

Misinformation, defined by the Oxford English Dictionary as "false or inaccurate information, especially that which is deliberately intended to deceive," is a helpful term when describing false information seen online. "Fake news," the popular term for misinformation that emerged after the 2016 election, is a problematic term. Whereas most of us would consider fake news to be false news, Donald Trump's base uses the term to describe news they believe is politically biased against them.

The organization First Draft News organizes what they term "information disorder" into three types: misinformation, disinformation and malinformation.
Information disorder (Source: firstdraftnews.com)

Mis-Information (false, without intent to harm): False Connection; Misleading Content
Dis-Information (false, with intent to harm): False Context; Imposter Content; Manipulated Content; Fabricated Content
Mal-Information (genuine, with intent to harm): (Some) Leaks; (Some) Harassment; (Some) Hate Speech
Whereas misinformation can be accidental, disinformation is always intentional. Malinformation doesn't receive as much attention in the progressive movement as the first two, which is a vulnerability.

Most progressive organizations under attack will face a combination of mis-, dis- and malinformation. And malinformation will almost certainly be a major part of the attack, as it was with the DNC hacking or ACORN attacks. Malinformation is particularly tricky in that the information released is likely to include at least some information that is true.

First Draft News also includes some hate speech and harassment in their definition of malinformation, but for this manual, we'll use a narrower definition and treat harassment and hate speech as separate tactics.

Further Reading
• Data & Society: Dead Reckoning: Navigating Content Moderation After Fake News
• First Draft News: Information disorder: Toward an interdisciplinary framework for research and policy making
• Pacific Standard: How Trump Weaponized 'Fake News' For His Own Political Ends

Who does this and why? "Bad actors" and state-sponsored attacks

The first two questions to address are: who spreads disinformation, and why? Disinformation often comes across as a mass of individuals with a particular opinion. But research shows that these people are often backed by institutions — the Russian government in the case of the IRA, or a formal grouping of racists in the case of right-wing extremism.

One way to explain the use of disinformation is to view it as a tool of power: the power of one group or institution to control another. Racist disinformation is an attempt to maintain power over an evolving multicultural America. Similarly, misogynistic trolling and revenge porn are tools used to fight the rising status of women in America.

As environmental groups challenge the corporate polluters that influence the White House or the U.S. Congress, we can reasonably expect they will utilize these same technological tools to fight back to maintain their power.
Strategies and Tactics Used

Tech platforms and tools might change rapidly, but the strategies and tactics deployed by bad actors waging information warfare have been fairly consistent over the past several years. Environmental groups need to become familiar with these tactics and be able to recognize them as they develop and expand.

BRIGADING

Know Your Meme defines brigading as "the practice of mobilizing a campaign within an online community to promote or undermine a targeted page, user or belief en masse through the user-voting system." #ReleaseTheMemo is probably the best-known example of this. Using Twitter, trolls were able to turn a political stunt from Rep. Devin Nunes into a front-page news story, keeping "release the memo" trending on Twitter and garnering days of media attention as a result.

During the net neutrality fight, a debate that had been occupied by nongovernmental organization activists on one side and telecom interests on the other was suddenly and repeatedly flooded by new "actors." In the public comments to the Federal Communications Commission, hundreds of thousands of fake names were submitted by the pro-telecom side. At an earlier FCC vote, the Twitter feed of #netneutrality was flooded by suspicious accounts posting unintelligible and erroneous information.

DISINFORMATION FLOODING

The goal of disinformation flooding isn't necessarily to change anyone's views, but to create confusion and potentially blunt a campaign's message. Reporters covering these events are forced into wild goose chases to debunk misinformation, and social media users have trouble telling truth from fiction. The disinformation that percolates from these floods often morphs into conspiracy theories that remain in the online ether for years.

After any crisis breaking news event, disinformation spreads online like wildfire. During the Parkland mass shooting in 2018, posters on the message boards 4chan and 8chan spread false information about the shooter being linked to a white supremacist militia, the most widely reported of the multiple hoaxes about the massacre found online. After the Sandy Hook shooting, right-wing websites like Infowars spread conspiracies that the victims were "crisis actors" and that the shooting was a liberal plot for gun control.

Dr. Kate Starbird, a professor at the University of Washington, has done research on what she refers to as alternative narratives. She writes: "Over time, we noted that a similar kind of rumor kept showing up, over and over again, after each of the man-made crisis events — a conspiracy theory or 'alternative narrative' of the event that claimed it either didn't happen or that it was perpetrated by someone other than the current suspects." Starbird also highlights the role that botnets play in disseminating alternative narratives.

MALINFORMATION

As defined by First Draft News, malinformation is when "genuine information is shared to cause harm, often by moving private information into the public sphere." The most well-known examples are the leaked Democratic National Committee and John Podesta emails in 2016. The success of malinformation relies on traditional media covering the released information on the basis that the info is true and now in the public record. The leaked information can result in weeks of negative news stories for the target.

Malinformation can also be a potent tool for sowing division. Leaked documents often contain gossip or discussion of others. It can also be used to add fuel to conspiracy theories online.
ORGANIZED ONLINE HARASSMENT AND DOXXING

Organized online harassment is a staple tactic of bad actors. Online harassment is an array of tactics including trolling, cyberstalking, doxxing, swatting and revenge porn. Most campaigns of organized harassment employ multiple attacks against their targets.

Women and people of color are disproportionately the targets of online harassment. The Women's Media Center Speech Project states that "women, the majority of the targets of some of the most severe forms of online assault — rape videos, extortion, doxing with the intent to harm — experience abuse in multi-dimensional ways and to greater effect." A Pew Research study on online harassment found that "although this harassment can take many forms, some minority groups more frequently encounter harassment that carries racial overtones. This is particularly true for black Americans, a quarter of whom say they have been targeted online due to their race or ethnicity, compared with 10% of Hispanics and 3% of whites."

Anyone is a potential target for online harassment — but those who threaten power structures or are attempting to change culture are the most at risk.

CONSPIRACY MONGERING

Conspiracy theories have long been a fixture of extremists and right-wing media, but have become mainstreamed in the era of Donald Trump, himself a conspiracy theorist. Extremist actors have continually used conspiracy theories as a way to galvanize Trump supporters online.

Conspiracy theory communities online can also serve as a recruitment tactic for trolls. Research suggests that people with extremist political views (left or right) are also likely to believe in conspiracy theories. The QAnon conspiracy theory, currently popular among Trump supporters, is the most prominent current example of this.

GAMING THE REFS AND CONSERVATIVE BIAS

Bad actors have adapted the frame that media is biased against conservatives for the social media era, and the mainstream conservative movement is aiding them.

Right-wing politicians, pundits and campaigns continually claim that Facebook and other tech platforms censor conservative content online. Donald Trump's campaign manager, Brad Parscale, frequently makes this argument.

Media Matters conducted an extensive six-month study into alleged conservative censorship on Facebook and found no evidence that conservative content is being censored on the platform or that it is not reaching a large audience.

Republicans continue to harp on the conservative censorship conspiracy theory as a way to rally their base, and in doing so, they give cover to disinfo disseminators who are constantly fighting suspension and expulsion from the platforms.

Further Reading
• PEN America: Defining "Online Harassment": A Glossary of Terms
• Women's Media Center Speech Project Online Harassment Report
• Dr. Kate Starbird: Information Wars: A Window into the Alternative Media Ecosystem
• Media Matters study debunking claims of conservative bias on Facebook
Social Media is Weaponized

The most important thing to understand about disinformation is that it's deliberate. Social media platforms that gained popularity by innocently sharing photos between friends and family have been weaponized against us. Hostile actors work overtime to amplify their message and make their movement seem larger than it is. They also run targeted harassment campaigns with the goal of silencing others, especially influencers and thought leaders.

These hostile actors — who have always existed in political debates — now amplify their reach surreptitiously with technology. Bots are strategically programmed to blitz social media with links to right-wing content at key times. This often happened during the 2016 election at key moments for Trump. The bots' end products were millions of Twitter and Facebook posts carrying links to stories on conservative internet sites such as Breitbart and InfoWars, as well as on the Kremlin-backed RT News and Sputnik.

Twitter botnets are both foreign and domestic. They can be deployed both to amplify content and to harass those they disagree with. The result of that amplification is that reporters and media see the trending topics and report what people are saying on social media, further amplifying the message and normalizing extreme ideas.
What is weaponization, and am I doing it too?

Communicators and digital organizers often ask how to define the difference between legitimate online organizing and weaponization. It's a good question, especially as trolls continue to adopt traditional organizing and marketing tactics for their own purposes. The following is a proposed framework for determining the difference:

• False amplification. Attempting to make your view or movement seem larger than it is with fake accounts, bots or algorithmic manipulation.
• Spreading disinformation. Knowingly spreading and amplifying false information online.
• Online harassment. This includes bullying, doxxing or similar methods with the shared goal of intimidating others into silence.
• Fanning the flames. Trolling specifically for the purpose of inciting outrage rather than pushing a particular political viewpoint.

Most environmental groups do not do this, but it's an important question to periodically ask, as solutions are complicated.

For example, technology researcher Camille François recently posed the question to progressive activists of how to regulate bots. Should the focus be that they're automated? No — your website uses automation. Should the focus be that such use is not transparent? No — most organizations' Twitter accounts post using scheduling software too. Or, that they tell lies? Again, no: you don't want the state adjudicating here. Bots, like other weaponized tools, cannot be easily regulated — though conversations on their inherent risks and impact, like those Camille started, are necessary.

How platforms are gamed

This next section walks through the most popular social media platforms and how they're exploited to spread disinformation.

FACEBOOK

Facebook's role in the spread of disinformation is multi-faceted. Trolls utilize a network of real and fake profiles, pages and closed groups to create, workshop and disseminate disinformation and other toxic content. Right-wing meme pages outperform all other political content on Facebook. Facebook has been slow to respond to this weaponization, taking down some pages and closed groups but allowing others to remain.

Disinformation also spreads through Facebook ads. Facebook launched a public database of paid ads deemed "political" that ran on the platform. A review of the database found that the platform, in violation of its own policies, allowed ads featuring fake stories and conspiracy theories. Facebook has announced numerous initiatives to combat this, but a former journalist who participated in Facebook's third-party fact-checking program called these efforts doomed, even as Facebook ignores calls for specific reforms from technologists working with communities of color.

TWITTER

Twitter is used to amplify messages knowing that a built-in audience of influencers, including legacy media, will see and absorb them. Memes and messages propagate on Twitter through the use of trolls, botnets and cyborg accounts, over-amplifying them to an audience with influence in media, politics and tech. Over-amplification can include the brigading of the trending topics section.

Twitter is also the tech platform of choice for online harassment campaigns. Extremists have organized countless harassment campaigns against women, people of color, journalists and verified users, to name a few. These campaigns are designed to silence influential voices and demobilize them from the public conversation online entirely. Twitter's CEO Jack Dorsey has admitted there is a link between Twitter harassment campaigns and those targeted often being put in real physical danger as a result.

YOUTUBE

YouTube functions as a hub for alternative media outlets, a network to mainstream extremist ideas, a conduit for conspiracy theories to spread and a source of income for several prominent extremist figures.

Right-wing media companies have built their businesses using YouTube's algorithm, which autoplays and suggests similar content to users. The three main players are Rebel Media, PragerU and the recently merged CRTV and The Blaze. They monetize using YouTube's ad programs, allowing them to sustain these activities over time.

Beyond YouTube's algorithm, right-wing figures use influencer marketing techniques to grow their audiences. A report on YouTube from Data and Society's Rebecca Lewis "presents data from approximately 65 political influencers across 81 channels to identify the 'Alternative Influence Network (AIN)'; an alternative media system that adopts the techniques of brand influencers to build audiences and 'sell' them political ideology."

YouTube has been prominent in the spread of conspiracies, from anti-vaxxers to Sandy Hook truthers. Recently, YouTube announced that it would no longer recommend conspiracy videos to users. But it remains to be seen how effective this change will be on the spread of conspiracies, as Facebook's current efforts to fight "fake news" are failing. Expect conspiracymongers to work overtime attempting to find creative ways to game the algorithms.

INSTAGRAM

Despite being owned by Facebook, Instagram often flies under the radar. Per a report on Russian propaganda from New Knowledge, Instagram was perhaps the most effective tool for IRA trolls looking to spread propaganda as part of Russia's efforts to influence Americans. Trolls also sell products using Instagram's ad program. Instagram also continues to be a platform for notable disinformers and extremists such as Milo Yiannopoulos and Alex Jones, despite the fact that other platforms — including Facebook — have banned or suspended them.

CROWDFUNDING

Digital ad platforms (e.g., search ads, display ads or social media ads), payment processors (e.g., credit card processors or services like PayPal) and crowdfunding platforms (e.g., Kickstarter, Patreon and GoFundMe, to name a few) are an often overlooked piece of the tech platform puzzle. YouTube ran ads for hundreds of brands on conspiracy and extremist channels. Bad actors crowdfund their activities on a variety of platforms. Disinformation sites use PayPal to raise money from their misinformation, and far-right personalities such as Alex Jones aggressively crowdfund from their audiences to support them.

Further Reading
• Wired: How Bots, Twitter, and Hackers Pushed Trump to the Finish Line
• Media Matters: How the Facebook right-wing propaganda machine works
• Media Matters: Under Facebook's new algorithm, conservative meme pages are outperforming all political news pages
• MisinfoCon: Cyborgs: Where is the line between man and misinformation machine?
• Data & Society: Alternative Influence: Broadcasting the Reactionary Right on YouTube
• Storyz: Google Adsense allows ads on extremist sites
• Media Matters: Multiple fake news sites are using Paypal to raise money, violating its terms of service
Russia and Disinformation

When most Americans think of disinformation from foreign actors, Russia's attempts to influence the 2016 election immediately come to mind. Russia's attempts to manipulate public opinion in the U.S. began in 2014 and continue to this day. A 2018 report from New Knowledge, commissioned by the Senate Intelligence Committee, illustrates how Russia gamed our social media platforms to pit Americans against one another. Russian operatives, working from the now notorious Internet Research Agency, preyed on American cultural vulnerabilities and racial tensions to fan the flames — first to sow division, then in support of Donald Trump. African-American voters were targeted with content meant to stoke racial tensions and with voter suppression ads.

Russia also targeted Americans with malinformation, namely the DNC hacked emails and the Podesta email leaks. A Russian state-sponsored hacking collective known as Fancy Bear was responsible for both of these attacks, which the American political press covered incessantly and gleefully in the weeks leading up to the 2016 election.

Disinformation attempts from Russia are ongoing, but Russia is not the only foreign state actor attempting to manipulate Americans through mis-, dis- and malinformation. Recent news shows that Facebook and Twitter's removal of what they term "inauthentic content" targeting Americans included content originating from Iran and Venezuela as well as Russia.

Disinformation and counter-terrorism expert Clint Watts has predicted that state-sponsored disinformation will become a common tactic if left unchecked, and that other entities seeking power will adopt Russian-style tactics. Watts says, "In the future, Russia and other authoritarians will continue their manipulation, but it will be ordinary candidates and their campaigns, lobbyists, and corporate backers that seek to exploit the manipulative advantages available on social media."

Russian disinformation is just the tip of the iceberg. Other entities and individuals can learn and deploy the strategies and tactics used by Russia and other state actors. Tech platforms are continually behind the curve in fighting the weaponization of their platforms. And troll armies for hire will almost certainly be utilized by state actors and corporations looking to spread disinformation for their own purposes.

Further Reading
• Oxford Internet Institute: Computational Propaganda Worldwide
• New Knowledge: The Disinformation Report
• Stop Online Violence Against Women: Facebook Ads that Targeted Voters Centered on Black American Culture with Voter Suppression as the End Game
Culture Wars

Whether state-sponsored or domestic, trolls that traffic in disinformation are largely uninterested in pushing any specific legislative or policy agenda. In the Trump era, we've seen this with every policy fight from health care repeal to the deeply unpopular tax bill.

That's because cultural content is far more effective as an information warfare tactic. In a recent Media Matters study of memes and images from the most popular right-wing Facebook pages, the content that consistently went viral was either anti-immigration, pro-gun, pro-Trump, veteran clickbait or race baiting. (I suspect that if the study were done again in 2019, anti-choice memes would be added to that list.) The only policy position that went viral with the right online was calls for voter suppression laws, and most of that content was explicit about whose right to vote they wanted to suppress.

A recent study from the Oxford Internet Institute Computational Propaganda Project offers some insight into this. They did a yearlong study mapping out the discourse around climate change on Twitter and Facebook and found that:

1. most of the content and commentary shared on both platforms espouses the scientific consensus;
2. the greatest share of content on Twitter (33%) and Facebook (49%) comes from professional news sources;
3. businesses drive a lot of the conversation on Twitter, while civil society content gets more traction on Facebook;
4. audiovisual content like YouTube videos plays an important part in polarizing and conspiracy content;
5. on Facebook, accounts promoting skepticism seem significantly less integrated with the broader community than consensus accounts; and
6. there is little evidence of automated tweeting.

The emphasis on cultural rather than policy fights might explain why the environmental movement has until now been largely insulated from disinformation campaigns on the scale that other progressive movements have been forced to deal with.

It's tempting to read these findings and breathe a sigh of relief, but there are signs that the far right is attempting to tie the environmental movement to cultural issues. In the next section, we'll cover emerging cultural issues that the right is exploiting around the Green New Deal and its champion, Rep. Alexandria Ocasio-Cortez.
Alexandria Ocasio-Cortez and Green New Deal Conspiracies

New York's Rep. Alexandria Ocasio-Cortez and her Green New Deal have rapidly become the GOP's new punching bag. Ocasio-Cortez has made waves for her unapologetically progressive politics, and this has led to a wave of lies, attacks, trolls and memes from the right wing.

In an interesting new combination, it appears AOC is being targeted for being a socialist, a woman, a person of color and an environmentalist. This confluence might be creating a dynamic where the furthest-right echo chamber is now opposing environmentalism as a way to express its intrinsic racism and misogyny.

Ocasio-Cortez's Twitter account shows classic forms of disinformation at work. Harassment, the spread of false and misleading information, false amplification using bots and even imposter content all appear consistently. Dozens and sometimes hundreds of negative comments follow virtually every post she makes. Users (or bots) share doctored photos and memes to make her look bad.

Attacks on AOC use memes and bullying messages that attempt to discredit her professionalism or intelligence, or simply make fun of her appearance. They connect Ocasio-Cortez with communism or Marxism, make her seem ditzy and unknowledgeable, or deem her guilty of corruption or illegal acts such as money laundering.
Mainstream news outlets sometimes spread anti-AOC disinformation, such as the now-debunked Fox News claim that Ocasio-Cortez intended to get rid of farting cows. One common troll post in AOC's comments is a right-wing conspiracy video called "The Brains Behind AOC Alexandria Ocasio-Cortez" that claims she is a paid actor and a "puppet congresswoman." Right-wing news outlets such as Long Room, Infowars and Canada Free Press give the video free publicity too.

In keeping with disinformation tactics like imposter content, bad actors or bots created a number of fake look-alike accounts of Alexandria Ocasio-Cortez. With names like "Alexandria Occasional Cortex" and "Alexandria Occasional Cortex fake," the accounts comment, post and like content just as anybody else would, using their accounts as a way to mock the congresswoman and potentially confuse users who don't look too closely at Twitter handles or account info.

Further Reading/Action Items
• Oxford Internet Institute: Climate Change Consensus and Skepticism: Mapping Climate Change Dialogue on Twitter and Facebook
• Quartz: Linguistic data analysis of 3 billion Reddit comments shows the alt-right is getting stronger
• Book: Troll Nation
The Pro-Trump Media Ecosystem

The current state of political media is built around supporting Trump. Just as far-right movements across the globe have coalesced around Trump, so have the activities of trolls spreading disinformation. This ecosystem impacts the traditional media in ways we haven't seen previously, and public relations agencies (and their corporate and nonprofit clients) are forced to adapt their strategies as well. Understanding this new ecosystem is key.

Here's how the network breaks down:

Message board communities, including but not limited to 4chan and 8chan, are a breeding ground for white supremacist and misogynist messages and memes, disinformation, conspiracy theories and organized harassment campaigns.

Narrative builders such as false news sites, YouTubers, podcasters and fringe right-wing media sites such as Breitbart amplify these messages to a larger audience.

Dissemination networks operating on popular platforms, mostly Facebook and Twitter, spread messages and ideas to a much larger audience through a combination of automation, false accounts and private groups for organizing.

Traditional right-wing media outlets like Fox News and talk radio broadcast a (usually) sanitized version of this content to their massive audiences once critical mass is reached online.

CASE STUDY: The 4chan to Fox News to Trump Pipeline

Pro-Trump trolls often take advantage of the traditional media's propensity to mine social media for stories. The thinking seems to be: if it's trending on social media, viewers will care about it. There are several examples of disinformation and extremist content making its way from 4chan to Fox News (where Trump often further amplifies Fox News programming via his Twitter account). A good example of how this happens can be seen in a debunked myth that there's a white genocide movement in South Africa.
As Vox reports, the white genocide myth actually began as corporate misinformation:

AfriForum is a lobbying group based in South Africa that advocates for the rights and interests of a segment of the country's white minority known as Afrikaners. Formed in 2006, one of its biggest lobbying efforts in recent years has been aimed at convincing the international community that there is a widespread campaign of race-based killings targeting white farmers in South Africa.

In addition to a huge digital media presence, its leaders travel around the world to meet with prominent politicians and other influential people and promote this horrifying story.

There's just one problem: The story isn't real. There is no evidence of a widespread campaign of violence and murder targeting white farmers in South Africa.

AfriForum used the pro-Trump online ecosystem to spread this untrue story. It became a popular topic on the chan message boards, the subreddit for Donald Trump fans, and with far-right personalities such as Richard Spencer, Lauren Southern and Ann Coulter. Breitbart and Russia Today picked up the story. Fox News' Tucker Carlson covered it on his primetime TV show. Not long after, Trump tweeted demanding that his secretary of state, Mike Pompeo, open an investigation.

The South African myth shows not just how the pro-Trump network functions, but how easily it can be gamed by hostile actors, including corporations, who understand information warfare. Again, given that much of the disinformation the environmental movement currently deals with comes from corporate actors, it's helpful to have a working knowledge of this ecosystem.

Charlie Warzel's article outlining how pro-Trump media respond to a crisis can deepen your understanding of the ecosystem. For a deep dive into how this ecosystem works within the larger political media landscape, I'd recommend the book Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics.

Further Reading/Action Items
• New York Times: YouTube, the Great Radicalizer
• Buzzfeed: How The Pro-Trump Media Responds To A Crisis In Just 4 Steps
• Book: Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics
Consultant hyper-partisan news networks (and email leads)

Republican candidates and consultants have embraced hyper-partisan news sites (largely modeled on false news sites) as a political strategy. GOP consultant Dan Backer has turned fake news into a moneymaker for his pro-Trump super PACs by using such sites to drive email sign-ups and donations. Reilly O'Neal, the CEO of Mustard Seed Media, which consulted on failed Alabama Senate candidate Roy Moore, purchased Big League Politics, a pro-Trump news site known for entertaining conspiracy theories. It seems likely that the site will cover the Republican candidates O'Neal is working with favorably, considering his background as a political consultant. A recent study of Republican candidates in 2018 found that they were promoting fake news sites run by Liftable Media via their personal Facebook pages. Three of Liftable Media's websites have spun anti-Islam conspiracy theories.

Harris Media was caught running anti-Semitic independent expenditure Facebook ads in Florida. William Gheen, the head of an anti-immigrant political action committee, controls multiple Facebook pages that have repeatedly linked to hyper-partisan and fake news content from a handful of sites. Last election cycle, several GOP candidates, including Rick Scott and Eric Brakey, paid to use Roger Stone's email list.

The line between campaign tactics and trolling has blurred to the point where it's difficult to tell where legitimate campaigning ends and trolling begins. Environmental groups can and should continue to call out traditional media when they fail to report political consultant trolling or fail to disclose hyper-partisan news sites consultants run that might cover their candidates favorably.

CASE STUDY: ACORN in 2009 vs. Sierra Club in 2018

The dismantling of ACORN in 2008 and 2009 is an instructive example of what bad actors spreading disinformation are capable of.

ACORN conspiracies and disinformation about alleged voter fraud began percolating in 2008 and were validated by losing presidential candidate John McCain and his vice presidential nominee Sarah Palin through the end of the 2008 presidential campaign.

Legacy media added fuel to the fire. According to an article in the Huffington Post, "U.S. newspapers published over 1,700 stories on ACORN during October, more than 10 times the average monthly total so far that year. More than three-fourths of these stories were devoted to allegations of 'voter fraud.'"

In 2009, right-wing activist James O'Keefe released, via Breitbart, a series of selectively edited undercover videos filmed in ACORN offices. O'Keefe claimed the videos showed ACORN workers partaking in a variety of illegal activities, all of which would be proven untrue. ACORN was exonerated after criminal investigations in New York and California, but by then it was too late. Congress, despite holding a Democratic majority, had already voted to defund the organization.
Thanks to a troll campaign that included conspiracy mongering, harassment of staffers, disinformation and malinformation, amplified by legacy media every step of the way, ACORN was no more.

In 2018, the Sierra Club caught and exposed an infiltrator from O'Keefe's organization, Project Veritas. They were able to control the narrative and prevent the release of any potentially damaging information by proactively telling their own story in the Sierra Club magazine:

[Veritas staff person] White also allegedly approached the campaign of Jon Tester, incumbent Democratic senator from Montana, presenting himself as the fundraising coordinator for the Sierra Club's Angeles Chapter. According to ThinkProgress, "He was so persistent in his requests for time with Tester's campaign staff that the campaign contacted the Sierra Club's national office to ask about him—a short time before the Sierra Club received a tip about his infiltration." That tip came via Windsor, who says it came from a second anonymous source: "I reached out to the Sierra Club and said, 'Hey, I got this tip; see if this is the guy.'"

It was indeed, and White was quickly removed from his volunteer position. (The Club is now implementing new security procedures as a result.) "We assume he was recording everything," says Watland, although he knows of no sensitive conversations that the faux receptionist might have been involved in. California law prohibits most secret recordings. While targeting the Sierra Club would seem to be in line with Project Veritas's modus operandi, Watland and Windsor believe that this time he was using his involvement as a cover for a further infiltration of key political campaigns. "I think what is happening here is that they're doing a bank shot," Windsor says. "They're building up cover to go after candidates. It shows a rising level of sophistication in how they approach their targets."

The Sierra Club's proactive strategy also allowed them to frame any recordings Project Veritas might release. From a ThinkProgress article:

The Sierra Club doesn't believe White obtained anything embarrassing about the organization or gained access to sensitive information or databases. But Kash emphasized the group is concerned he may have secretly videotaped or recorded conversations with individuals who thought they were having a private conversation. "That's Project Veritas's MO," she said.

ACORN had no template to help them prepare for O'Keefe's malinformation attack, but the Sierra Club's handling of a Project Veritas infiltrator shows how far the progressive movement has come since O'Keefe first appeared on the scene 11 years ago.

That doesn't mean we're out of the woods yet. The most effective troll campaigns isolate the target from its community, and ACORN's final blow came from former allies. Bad actors in 2020 will attempt to sow similar divisions within the broader progressive movement. Legacy media organizations can be counted on to aid their efforts via amplification. The environmental movement can help organizations facing attack by standing together in solidarity against mis-, dis- and malinformation and by creating an information-sharing network so that groups see each other's attacks.
Communication Strategies For Nonprofits to Adapt

Benjamin Franklin, talking about fire prevention, said "an ounce of prevention is worth a pound of cure." The same is true for preventing attacks on the environmental movement and inoculating it against them. As we've laid out, the tools that trolls use might change, but the strategies and tactics for the most part do not. Environmental organizations have the opportunity to get ahead of attacks by assuming they're inevitable, assessing their organization's own vulnerability, gaming out potential threats and creating a decision tree for response ahead of time.

Now that we've outlined the potential problems facing the environmental movement, what can be done to combat them? This next section will walk through crafting an effective comms and digital strategy with an emphasis on preparation. We'll cover assessing threats, training staff and creating a crisis communications plan, as well as some suggestions for collective action.
Assessing Threats

At this point, you might be wondering if there is a way to assess whether an organization should ignore or respond to a trolling or disinformation attack. If the attack hasn't spread beyond the dark corners of the internet, any organizational response risks amplifying it and reaching mainstream audiences that would otherwise have no idea the attack exists.

Advocacy organizations face a special challenge here. Advocacy groups are always seeking ways to keep their membership engaged, and an attack can be an excellent opportunity for fundraising and list building. The risk lies in elevating information that might not have spread otherwise.

Data and Society's Oxygen of Amplification report is a helpful resource for determining a course of action. The report is written with reporters and newsrooms in mind, but I find the criteria for response work just as well for advocacy organizations.

From the report:

1. Determine if the story reaches the tipping point (drawing from Claire Wardle's definition, that it extends beyond the interests of the community being discussed)
2. Determine if there would be a public health takeaway (i.e., something worth learning) from the debunking; for example, explanations that identify and analyze manipulators' rhetorical strategies, including their use of humor
3. Determine if there is a political or social action point (i.e., something worth doing) related to the falsehood itself; for example, editorials that provide media literacy strategies for recognizing and resisting networked manipulation campaigns
4. Determine if the risk of entrenching/rewarding the falsehood in some stories is worth dislodging the falsehood in others

If the answer to each of these questions is no, then the story isn't worth reporting at that time. If a story ultimately passes the tipping point and does become appropriate to report (because of clear risks to public safety, because of the underlying media systems the story unearths), reporters should be especially careful to follow established best reporting practices, with particular attention paid to the origins of the information, as well as its broader context, both of which should be discussed transparently in the article itself. Whenever possible, experts in the particular subject area should be recruited to write or consult on editorial pushback, to ensure the clearest and most informed refutations possible.

Environmental organizations will be well served to use the same criteria when under attack. While no evaluation system is perfect, answering these questions before enacting a plan will help your organization avoid spreading damaging dis- or malinformation to a larger audience than would otherwise have seen it.
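For teams that want to write these criteria down as the decision tree recommended earlier, the four questions reduce to a simple checklist. The sketch below is illustrative only; the report prescribes no code, and the function and field names are our own assumptions:

```python
# Illustrative sketch (not from the report): the four Oxygen of Amplification
# questions as a pre-crisis decision aid. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class Assessment:
    passes_tipping_point: bool   # Q1: has it spread beyond the originating community?
    public_takeaway: bool        # Q2: is there something worth learning from a debunk?
    action_point: bool           # Q3: is there something worth doing about the falsehood?
    dislodging_worth_risk: bool  # Q4: is dislodging it worth the risk of entrenching it?

def should_respond(a: Assessment) -> bool:
    """If the answer to every question is no, don't respond yet."""
    return any([a.passes_tipping_point, a.public_takeaway,
                a.action_point, a.dislodging_worth_risk])
```

Agreeing on a structure like this before a crisis means the debate during one is about the answers, not the questions.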
Train Your Staff

It is essential to train staff to recognize potential threats, to understand the organization's processes when threats occur and to enact the crisis comms plan. There is a limited window of time in which a response can be effective, and the more prepared staff are, the better they'll be able to act with a clear head about the situation at hand. This manual and the accompanying training module can serve as resources for new staff and a refresher course for veteran staffers.

Make a Plan

When your organization is under attack, timing is everything. You have a limited window when you can push back and keep dis- and malinformation from spreading and sticking. It's important to make a plan ahead of time in case this happens. You can't plan for every variable, but you can answer some broad and basic questions ahead of time:

• Who are your messengers, online and off, who can help stop the spread, tell your story and show support?
• Who are the friendly media outlets and reporters you'll contact to tell your story?
• Who will reach out to Facebook, Twitter and Google if content takedowns are necessary due to disinformation?
• Who are your trusted decision makers and/or outside advisers in a crisis?
• Is a streamlined approval process for press statements and digital content necessary? If so, what does that process look like?

We've included a worksheet as part of this manual to help your organization create a crisis plan of its own.

Outside of crisis planning, fostering a culture of awareness is essential. Social media monitoring needs to be someone's clear and stated responsibility. The amount of monitoring needed varies by organization (and available resources) but can be as simple as having staff search Twitter, Facebook and YouTube each day or as resource-intensive as purchasing a social listening tool.
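Even the simple end of that monitoring spectrum can be made systematic. The hypothetical sketch below shows one way a staffer might triage a day's worth of collected posts (gathered by hand or exported from a listening tool) against keyword lists. Everything here, the helper, the organization name and the marker terms, is an illustrative assumption, not tooling from this manual:

```python
# Hypothetical monitoring sketch: flag posts that mention the organization
# alongside phrases seen in past attack campaigns. All names are illustrative.
ORG_TERMS = {"sierra club"}                    # your organization's names/handles
ATTACK_MARKERS = {"soros", "hoax", "exposed",  # phrases your team has seen in
                  "corrupt", "fraud"}          # previous campaigns against you

def flag_posts(posts):
    """Return the posts that pair an org mention with an attack marker."""
    flagged = []
    for post in posts:
        text = post.lower()
        if any(t in text for t in ORG_TERMS) and any(m in text for m in ATTACK_MARKERS):
            flagged.append(post)
    return flagged
```

A filter this crude will miss plenty, but run daily it gives the staffer responsible for monitoring a consistent starting point instead of an ad hoc scroll.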
Digital Security

An organizational commitment to digital security, for both the organization's assets and staff's personal accounts, is critical. Such work shouldn't fall only to your organization's IT department or security teams. Everyone from campaigners, organizers, communications and membership outreach to finance and administration staff could be targeted.

Staffers are oftentimes targeted not on their own account but because of their work for the organization, making dedication to staff's own security a workplace obligation. Time should be deliberately carved out for taking these matters seriously: organizations can educate staffers through workshops, meetings with IT, standardized rules such as password protection, and even ensuring staffers understand the implications behind such threats.

Organizations should keep in mind that some employees may be targeted more than others. Women, people of color and community activists are oftentimes more vulnerable, and organizations would do well to protect them accordingly and, in turn, protect the information and materials these staff members have access to. The more prominent and successful your campaigns or campaigners, the more they could become a target of your opposition.

As many organizations have learned the hard way, one simple mistake from an unknowing staffer can become an organizational nightmare. The following guides can help you come up with a plan for your organization as a whole and for individual staff professional and personal accounts:

• Organizing for Action: 10 things you and your grassroots organization can do to improve your digital security
• Equality Labs: Anti-Doxing Guide For Activists Facing Attacks From The Alt-Right

The Importance of Solidarity

Trolls look to sow division. They win by dividing us against one another. If one environmental organization falls, the others are more vulnerable to future attacks because the strategy has worked. Presenting a united front against attacks is essential. Solidarity with one another makes it more difficult for the attacks to succeed.

Sharing information and best practices will also aid the environmental movement in improving its response to attacks. The Sierra Club was able to employ best practices learned from several groups that have been infiltrated by Project Veritas since ACORN. The sooner others learn about a tactic that trolls are using, the sooner they can inoculate their organizations against a similar line of attack.

Action Items
• Read the Oxygen of Amplification report
• Using the provided worksheet, create a preemptive crisis communications plan for your organization