TROLL PATROL INDIA
Exposing Online Abuse Faced by Women Politicians in India
Amnesty International India is part of the Amnesty International global human rights movement. Amnesty International India seeks to protect and promote the human rights of everyone in India. Our vision is for every person in India to enjoy all the rights enshrined in the Universal Declaration of Human Rights, other international human rights standards and the Constitution of India. We are independent of any government, political ideology, economic interest or religion, and are funded mainly by contributions from individual supporters.

First published in 2020 by Indians For Amnesty International Trust
#235, Ground Floor, 13th Cross, Indira Nagar, 2nd Stage, Bengaluru – 560038, Karnataka, India

© Indians For Amnesty International Trust (Amnesty International India)

Original language: English

Except where otherwise noted, content in this document is licensed under a Creative Commons (attribution, non-commercial, no derivatives, international 4.0) licence. https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode

Where material is attributed to a copyright owner other than Amnesty International India, this material is not subject to the Creative Commons licence.

Illustrations: Bakarmax
CONTENTS

EXECUTIVE SUMMARY
METHODOLOGY
DECODERS: THE SPIRIT OF VOLUNTEERISM ENABLED THE CROWDSOURCED RESEARCH
FINDINGS
TWITTER: HUMAN RIGHTS RESPONSIBILITIES
TROLL PATROL INDIA: RECOMMENDATIONS
ANNEXURE 1: ENRICHMENT
ANNEXURE 2: LANGUAGES, LANGUAGE DETECTION AND FLAGGING
ANNEXURE 3: AMNESTY DECODERS TOOL AND SCREENSHOTS
ANNEXURE 4: WEIGHTING
ANNEXURE 5: AGREEMENT ANALYSIS
ANNEXURE 6: LETTER TO TWITTER
ANNEXURE 7: TWITTER'S RESPONSE
EXECUTIVE SUMMARY

India is one of the largest and fastest growing audience markets globally for Twitter, a social media platform.1 Touted as a 'safe place for free expression', Twitter was envisioned to be a space where marginalised populations, including women, Dalits and religious minorities, would have an equal opportunity to make their voices heard. Over the years, the social media platform has evolved into an indispensable tool for political engagement, campaigning and activism, but has the vision translated into reality? Many women do not believe so. Every day, women on Twitter face a barrage of abuse: from racist and sexist attacks to rape and death threats.

Using crowd-sourced research and data science, Indians for Amnesty International Trust,2 in collaboration with Amnesty International – International Secretariat (AI-IS), measured the scale and nature of online abuse faced by women politicians in India during the 2019 General Elections of India. The study found that abuse experienced by Indian women politicians was high, suggesting that Twitter is failing in its responsibility to respect women's rights online. The study supports the notion that for many women, the social media platform has turned into a 'battlefield'.3

We studied tweets mentioning 95 Indian women politicians in the three-month period of March – May 2019, in the lead-up to, during and shortly after the 2019 General Elections in India. Of the total volume of 7 million tweets mentioning these politicians, we sampled 114,716 for the purpose of analysis through our Troll Patrol India project.4

Engaging 1,912 volunteers, known as 'Decoders', from 82 countries, the tweets were analysed to create a labelled set of 'problematic' or 'abusive' content. The Decoders were shown a tweet with the username obscured, mentioning one of the women in our study. They were then asked simple questions about whether the tweet was problematic or abusive, and if so, whether it revealed sexist or misogynistic, religious, casteist, racist or homophobic abuse, or physical or sexual threats. Each tweet was analysed by multiple people. The Decoders were given a tutorial, definitions and examples of 'problematic' and 'abusive' content, as well as an online forum where they could discuss the tweets with each other and with our researchers. The labelling of the Decoders was analysed between July 2019 and November 2019.5

PROBLEMATIC CONTENT
Tweets that contain hurtful or hostile content, especially if repeated to an individual on multiple occasions, but do not necessarily meet the threshold of abuse. Problematic tweets can reinforce negative or harmful stereotypes against a group of individuals (e.g. negative stereotypes about a race or people who follow a certain religion). We believe that such tweets may still have the effect of silencing an individual or groups of individuals. However, we do acknowledge that problematic tweets may be protected expression and would not necessarily be subject to removal from the platform.

ABUSIVE CONTENT
Tweets that promote violence against or threaten people based on their race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease. Examples include physical or sexual threats, wishes for physical harm or death, references to violent events, behaviour that incites fear, or repeated slurs, epithets, racist and sexist tropes, or other content that degrades someone.

"When you are on social media, you face trolls, threats, abuses and challenges 100% of the time. Their purpose is to silence you. It makes you want to cry. They talk about your personal life, your looks, and your family."
– Alka Lamba, Member, Indian National Congress

1. Colin Crowell, Setting the record straight on Twitter India and impartiality, 8 Feb. 2019, Twitter, https://blog.twitter.com/en_in/topics/events/2019/impartiality.html
2. Hereinafter referred to as 'Amnesty International India'
3. Interview with Shazia Ilmi, Member, Bharatiya Janata Party, on 25 November 2019 in New Delhi, India
4. For the purpose of the study, the term 'women politicians' includes party members who may or may not have held elected office
5. The crowd-sourced research mobilised as many as 1,912 Decoders from 82 countries, with 57.3% (1,095 Decoders) hailing from India, across 26 states. About 1,750+ hours were contributed cumulatively by the Decoders. The research analysed tweets in 9 languages: Bengali, English, Gujarati, Hindi, Kannada, Malayalam, Marathi, Tamil and Telugu.
WHAT DID WE FIND?

1. 1 in every 7 tweets that mentioned women politicians in India was 'problematic' or 'abusive'

13.8% of the tweets that mentioned the 95 women politicians in the study were problematic (10.5%) or abusive (3.3%). This amounted to 1 million problematic or abusive mentions of the 95 women between March and May 2019, or over 10,000 problematic or abusive tweets every day across all women in the sample.

2. Indian women politicians experienced substantially higher abuse than their UK and USA counterparts

A similar study conducted by Amnesty International in the UK and USA in 2018, measuring the online abuse faced by 323 women politicians, found that 7.1% of tweets mentioning those politicians were 'problematic' or 'abusive'.6 This study, using a similar methodology but focusing on a shorter period around elections, found that Indian women politicians experienced 13.8% problematic or abusive tweets, which is substantially higher.

3. Women politicians prominent on Twitter are targeted more

It has long been assumed that being more visible on social media as a woman can lead to more abuse. This study confirms that there is a clear correlation between the number of mentions and the proportion of abusive content received by women.

We tested this by grouping the politicians by mentions. When we split the group into the top 10 most-mentioned politicians and the others, we found that the top 10 had a mean of 14.8% problematic or abusive content versus 10.8% for the others. This meant that while the top 10 received 74.1% of all mentions, they received 79.9% of problematic or abusive mentions. (An illustrative sketch of this grouping appears after the findings below.)

4. 1 in every 5 problematic or abusive tweets was sexist or misogynistic

Decoders were directed to label the problematic or abusive content as containing sexism and/or misogyny, ethnic or religious slur, racism, casteism, homophobia or transphobia, sexual threats, physical threats or 'other'. 'Other' was used when tweets containing problematic or abusive content did not fit the categories provided. Notably, the most common selection was the 'other' category (74.1%). This indicated that most problematic or abusive content did not fit neatly into any of the suggested categories. Nevertheless, of all the tweets that were labelled as problematic or abusive, over 1 in 5 answers (19.9%) showed sexism or misogyny.

A deeper look at the tweets that demonstrated sexism or misogyny showed that sexism was experienced by women independent of political ideology or affiliation, religion, caste, race, age, marital status, and election outcome.

5. Muslim women received 94.1% more ethnic or religious slurs than women from other religions

Women politicians who are, or are perceived as, Muslim received significantly more abuse when compared to women from other religions. They received 55.5% more problematic or abusive content. 26.4% of the problematic or abusive content experienced by them contained ethnic or religious slurs, nearly double the proportion for women who are, or are perceived as, Hindu (13.7%).

6. Women politicians belonging to marginalised castes received 59% more caste-based abuse compared to women from other castes

Women from marginalised castes received 59% more caste-based abuse than women from general castes. In cases where problematic or abusive content was identified, women belonging to marginalised castes, such as Scheduled Castes, Scheduled Tribes and Other Backward Classes (8.6%), received more caste-based slurs than those belonging to general (5.4%) or unknown/undeclared castes (7.2%).7

Notably, a prominent woman politician from a marginalised caste received significantly more caste-based slurs than others. This indicates that caste identity is, more often than not, a key element of problematic or abusive content for women belonging to marginalised castes.

7. Women politicians from political parties other than the Bharatiya Janata Party experienced more abuse

While women politicians across all political parties experienced sexist abuse, their overall experience was divided along party lines.

As compared to women politicians associated with the Bharatiya Janata Party (BJP), currently the ruling party in India, women politicians from 'other parties'8 experienced 56.7% more problematic or abusive content. Women politicians associated with the Indian National Congress (INC) also received 45.3% more problematic or abusive content than those from the BJP.
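The grouping behind finding 3 can be reproduced in outline with a few lines of code. The sketch below is illustrative only: the per-politician mention counts and abuse shares are invented toy numbers, and the column names are assumptions rather than the project's actual dataset.

```python
# Illustrative sketch only: the DataFrame below is hypothetical toy data,
# not the project's dataset, and the column names are assumptions.
import pandas as pd

df = pd.DataFrame({
    "politician": [f"p{i}" for i in range(1, 15)],
    "mentions": [90000, 80000, 70000, 65000, 60000, 55000, 50000,
                 45000, 40000, 35000, 9000, 6000, 3000, 1000],
    "abusive_share": [0.16, 0.15, 0.15, 0.14, 0.15, 0.14, 0.15,
                      0.14, 0.15, 0.15, 0.11, 0.10, 0.11, 0.10],
})

# Split into the 10 most-mentioned politicians and the rest, then compare
# the mean share of problematic or abusive content in each group.
df = df.sort_values("mentions", ascending=False)
top10, rest = df.head(10), df.iloc[10:]
print(top10["abusive_share"].mean(), rest["abusive_share"].mean())

# Share of all mentions vs share of all problematic or abusive mentions
# captured by the top 10 (the report compares 74.1% with 79.9%).
abusive_mentions = df["mentions"] * df["abusive_share"]
print(top10["mentions"].sum() / df["mentions"].sum())
print(abusive_mentions.loc[top10.index].sum() / abusive_mentions.sum())
```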
Online abuse against women on this scale should not and does not have to exist on social media platforms. Companies like Twitter have a responsibility to respect human rights, which means ensuring that women and marginalised groups using the platform are able to express themselves freely and without discrimination.

In the recent past, Twitter has admitted that it has created an unsafe space for women by perpetuating harassment and abuse. While Twitter has guidelines in place to identify abuse and hate, and has also improved its policies and reporting process over the years, the findings suggest that these policies are not sufficient to address the toxicity that women face online.

Amnesty International, through its Toxic Twitter study (2018)9 and Troll Patrol study (2018)10, has repeatedly asked Twitter to make available meaningful and comprehensive data regarding the scale and nature of abuse on its platform, as well as how it is addressing that abuse.

Based on the findings of this study, we recommend further steps for Twitter to ensure that its policies are transparent and uniform, and are based on human rights standards and gender-sensitive due diligence. Considering India's linguistic diversity, Twitter should ensure coverage not only of India's main languages, but also of regional languages, with due focus on mixed-language tweets where native scripts are used alongside Latin scripts. Further, it should ensure that discrimination and abuse on the basis of gender, caste, religion, ethnicity, gender identity, sexual orientation and other identifying factors do not prevent users from exercising their right to freedom of expression equally on the platform. Importantly, it must constantly and transparently evaluate and measure whether it is effectively tackling online violence against women.

Online abuse has the power to belittle, demean, intimidate and eventually silence women. Twitter must reaffirm its commitment to providing a 'safe space' to women and marginalised communities. Until then, the silencing effect of abuse on the platform will continue to stand in the way of women's right to expression and equality.

"People should know what women in politics endure, what they have to put up with and how unequal it becomes for them. It is such a tough battlefield, so to speak. Really I do believe that Twitter is my workplace. But if my workplace were to be a battlefield, all the time, would I be able to contribute, to the cause that I represent, easily and with fairness, if I am constantly being attacked for being a woman?"
– Shazia Ilmi, Member, Bharatiya Janata Party

6. Amnesty International, Troll Patrol Findings, https://decoders.amnesty.org/projects/troll-patrol/findings
7. Marginalised castes, in this study, included politicians belonging to Scheduled Castes, Scheduled Tribes and Other Backward Classes. The Scheduled Castes (SCs) and Scheduled Tribes (STs) are officially designated groups of historically disadvantaged people in India. Articles 341 and 342 of the Constitution of India define who would be Scheduled Castes and Scheduled Tribes with respect to any State or Union Territory. For the purpose of the study, the caste identity of the women was taken from Lok Dhaba, an initiative of Ashoka University. 'Unknown/undeclared caste' refers to those who either did not publicly declare their caste and/or whom Amnesty International India was unable to identify with a particular caste group.
8. Other parties included the Aam Aadmi Party, Apna Dal, All India Anna Dravida Munnetra Kazhagam, All India Trinamool Congress, Bahujan Samaj Party, Communist Party of India (Marxist), Communist Party of India (Marxist-Leninist), Dravida Munnetra Kazhagam, Jammu & Kashmir National Conference, Jammu & Kashmir Peoples Democratic Party, Jharkhand Mukti Morcha, Rashtriya Janata Dal, Shiromani Akali Dal, Shiv Sena, Samajwadi Party, Telangana Rashtra Samiti and Yuvajana Sramika Rythu Congress Party. Our analysis could not be further disaggregated by 'other' parties as we did not have statistically relevant samples for each party (having a sample of between 1 and 4 members per party). Further, we observed that, except for some notable cases, many women from these parties did not have an active Twitter account.
9. Amnesty International, Toxic Twitter, Index No. ACT 30/8070/2018
10. Amnesty International, Troll Patrol Findings, https://decoders.amnesty.org/projects/troll-patrol/findings
Methodology

The Troll Patrol India project is a joint effort by human rights researchers, technical experts and thousands of online volunteers to build a large crowdsourced dataset of online abuse against women politicians in India. This chapter explores the methodology adopted to measure the scale and nature of online abuse faced by women politicians in India: whom we studied, when we studied them and how we studied them. The women politicians represented a variety of political views spanning the ideological spectrum. Using data science, we were able to provide a quantitative analysis of the scale of online abuse against women politicians in India.
Troll Patrol India is a crowdsourcing effort to demonstrate the scale and nature of online abuse against Indian women politicians on Twitter in the context of the 2019 General Elections of India. Following the methodology developed together with Element AI for Amnesty International's 2018 Troll Patrol study, we built a database of over 114,716 tweets mentioning 95 women politicians from India and asked digital volunteers, known as 'Decoders', to identify problematic and abusive content in those tweets. An impressive 1,912 Decoders from 82 countries analysed the tweets to create a labelled dataset of problematic or abusive content. The Decoders were shown a tweet with the username obscured, mentioning one of the women in our study, and were then asked simple questions about whether the tweet was problematic or abusive, and if so, whether it revealed misogynistic, casteist or racist abuse, or other types of violent threats. Each tweet was analysed by multiple Decoders.

The Decoders were shown a video tutorial and given definitions and examples of problematic and abusive content, and they were encouraged to engage on an online forum where they could discuss the tweets with each other and with Amnesty International's and Amnesty International India's researchers.

In total the Decoders labelled 142,474 tweets, totalling 474,383 individual answers. This large number allowed us to have both a dataset of 114,716 tweets sampled uniformly at random, and a smaller dataset of 27,758 tweets sampled through a more experimental procedure. The experimental dataset was made possible by the huge amount of interest in the campaign. It was a learning opportunity for us to further our experience of using advanced analysis tools on real-world data and to prepare the terrain for future campaigns. In the interest of simplicity and methodological clarity, however, all the statistics in the rest of this report were computed on the basis of the tweets sampled uniformly at random.

Using the subset of 114,716 tweets sampled uniformly at random and annotated by the Decoders, we extrapolated the abuse analysis to the full 7 million tweets that mentioned the Indian women politicians selected for our study. The results published in this study are based on this.

The tweets were deployed through the Troll Patrol India page11 hosted on Amnesty Decoders,12 the micro-tasking platform based on Hive (Labs, 2014) and Discourse (Discourse). This platform by Amnesty International engages Decoders (mostly existing members and supporters) in human rights research. The Troll Patrol India microsite was launched for decoding on 15 May 2019 and was open till 8 August 2019. Great effort was put into designing a user-friendly and interactive interface, which could be accessed by Decoders through computers and mobile phones.13

STUDIED POPULATION: WOMEN POLITICIANS ON TWITTER

IDENTIFYING INDIAN WOMEN POLITICIANS

The sample of women politicians was identified prior to the submission of nomination papers by the women contesting in the 2019 General Elections. The criteria for sample selection were as follows:

• Members of Parliament in the two most recently elected Lower Houses of Parliament (15th and 16th Lok Sabha)
• Members of Parliament in the two most recently elected Upper Houses of Parliament (Rajya Sabha)
• Members of the Legislative Assemblies of the States and Union Territories as of February 2019
• Party office bearers and spokeswomen for all the national and regional political parties
• Members from reserved constituencies (seats reserved for specific groups in the Indian Parliament at national and state level)
• Current and former chief ministers (elected heads of government) for all states and union territories

The Twitter accounts of these women were identified. If an account was not "verified" (a Twitter badge that recognises the account to be authentic),14 the team investigated to ensure that the account was genuine, using party websites and other such sources. Our research resulted in 101 Twitter handles. Of these, some had very little Twitter activity: 95 politicians had at least one tweet mention in the final dataset and 82 had 10 or more tweet mentions in the sample set given to Decoders.

11. Troll Patrol India, Amnesty International India, https://decoders.amnesty.org/projects/troll-patrol-india
12. Troll Patrol, Amnesty International, https://decoders.amnesty.org/projects/troll-patrol
13. See Annexure 3, Amnesty Decoders Tool and Screenshots
14. Note that Twitter verification has been mostly on hold since 2018. See, About Verified Accounts, Twitter, https://help.twitter.com/en/managing-your-account/about-twitter-verified-accounts
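To make the final filtering step concrete, the minimal sketch below narrows a set of identified handles down by Twitter activity, mirroring the move from 101 handles to the 95 with at least one mention and the 82 with ten or more. The handle names and counts are hypothetical, not the project's actual data.

```python
# Minimal sketch, assuming a hypothetical mapping of identified handles to the
# number of sampled tweets that mention them. Values are illustrative only.
mention_counts = {
    "@politician_a": 15800,
    "@politician_b": 42,
    "@politician_c": 7,
    "@politician_d": 0,
}

# Analogue of the 95 handles with at least one mention in the final dataset.
in_dataset = {h: n for h, n in mention_counts.items() if n >= 1}

# Analogue of the 82 handles with 10 or more mentions in the set given to Decoders.
well_represented = {h: n for h, n in mention_counts.items() if n >= 10}

print(len(in_dataset), len(well_represented))
```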
PROFILE OF THE POLITICIANS

We researched characteristics of these politicians such as party affiliation, political inclination, their official role prior to the elections, current post, whether they contested the election, whether they won the election, their religion, caste, age, sexual orientation, gender identity, marital status, their Twitter followers and their Twitter mentions. For this classification, we used official sources such as ethnic diversity studies and information in the public domain such as public profiles, articles written by or about these women, official websites including party websites, Parliament websites, nomination papers, Lok Dhaba and Wikipedia pages. It is important to note that this is an approximate classification for the purpose of this analysis, utilising research available in the public domain, and is not necessarily a reflection of how each politician identifies herself.

SELECTING A TIMELINE

We aimed to study tweets directed at women politicians during the three-month period of March-May 2019 - that is, in the lead-up to, during, and shortly after the 2019 General Elections of India. Therefore, tweets mentioning women politicians over these three months were sampled. The General Elections ran from 11 April 2019 to 19 May 2019, and the results were announced on 23 May 2019.

OBTAINING AND PROCESSING TWITTER DATA

OBTAINING SAMPLE TWEETS

The women in this study were mentioned in over 7 million tweets between March and May 2019. Crimson Hexagon (now part of Brandwatch) was used to obtain a subsample of these tweets. A random sample of 10,000 tweets was extracted per day as per the Twitter API (Application Program Interface) terms. The sample included tweets mentioning at least one of the women politicians' Twitter handles and excluded retweets and tweets that were deleted before the date of extraction. Although the API allows 10,000 tweets per day, the actual number was lower due to the exclusion of deleted or private tweets and retweets. We also removed all completely content-free tweets (those that were just a list of @mentions and no other content); such tweets amounted to 1.7% of our initial sample.

Out of the data obtained from Crimson Hexagon, we sampled 142,474 tweets for the Troll Patrol India project. The sample that was labelled by Decoders included 114,716 tweets sampled uniformly at random, and a smaller dataset of 27,758 tweets sampled through a more experimental procedure.15 All the statistics in the rest of this study were computed on the basis of the 114,716 tweets sampled uniformly at random.

HANDLING TWEET LANGUAGES

Considering India's linguistic diversity, this study also looked at tweets in languages other than English. Based on an initial sample, Bengali, English, Gujarati, Hindi, Kannada, Malayalam, Marathi, Tamil and Telugu were found to be the most commonly used languages in the tweets mentioning the women politicians. Keeping this in mind, at the time of registration, Decoders were given the option of choosing one or more of these nine languages for decoding. We could not obtain the language classification from Twitter itself through the third-party service used to access the data. We instead detected languages using an automated language detection tool (Google Inc.). The challenges in this approach included very short content for some tweets, no access to any user data to estimate language, the ability of users to use both Latin and native scripts on Twitter, and the usage of 'Hinglish' – a mix of Hindi and English – or other mixtures of languages. Despite its shortcomings, the classification was useful in engaging Decoders in their own language and in providing a rough analysis of abuse in different languages. Importantly, using the available tools, the Decoders could provide feedback on language mischaracterisation.16
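As a rough illustration of this pre-processing, the sketch below filters out @mention-only tweets and assigns a language code. It is a minimal sketch only: it uses the open-source langdetect library (a port of Google's language-detection algorithm) as a stand-in for the automated tool the project used, and the tweet texts and helper names are hypothetical.

```python
# Minimal sketch, assuming the langdetect library as a stand-in for the
# automated language detection tool the project used. Tweet texts and helper
# names are hypothetical.
import re
from langdetect import detect, DetectorFactory

DetectorFactory.seed = 0  # make language detection deterministic across runs

STUDY_LANGUAGES = {"bn", "en", "gu", "hi", "kn", "ml", "mr", "ta", "te"}
MENTIONS_ONLY = re.compile(r"^(\s*@\w+\s*)+$")

def is_content_free(text: str) -> bool:
    """True for tweets that are just a list of @mentions and nothing else."""
    return bool(MENTIONS_ONLY.match(text))

def detect_language(text: str) -> str:
    """Best-effort language code; short or mixed-script tweets are often
    misclassified, a limitation the report acknowledges."""
    try:
        code = detect(text)
    except Exception:  # very short or empty text can raise
        return "unknown"
    return code if code in STUDY_LANGUAGES else "other"

tweets = ["@someone @another", "यह एक उदाहरण ट्वीट है", "An example tweet"]
kept = [t for t in tweets if not is_content_free(t)]
print([(t, detect_language(t)) for t in kept])
```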
DECODING - ANALYSIS OF ABUSE BY VOLUNTEERS

The tweets obtained through random sampling were analysed through a process called 'Decoding'. Amnesty Decoders is an innovative platform for volunteers around the world to help Amnesty researchers in analysing large volumes of pictures, documents and social media messages through their computers and phones. These Decoders were trained in identifying and categorising problematic and abusive content. After a video tutorial, Decoders were asked to select their language(s) of preference. After being shown a tweet from an obscured username, which mentioned one of the women in our study, they were asked multiple-choice questions:

1. "Does the tweet contain problematic or abusive content?" (No, Problematic, Abusive).

2. Unless their answer was No, the follow-up question was "What type of problematic or abusive content does it contain?" The Decoders could select one or more responses from a list, namely: sexism or misogyny, ethnic or religious slur, racism, casteism, homophobia or transphobia, physical threat or sexual threat, or a miscellaneous 'other' category for tweets that could not be easily categorised.

3. The Decoders had to further categorise the medium of abuse - whether in the form of text, image or video.

The Decoders could, at any point in time, access the definitions and examples of problematic and abusive content.17
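One way to picture the answers this flow produces is the small sketch below. It is illustrative only: the field names and checks are assumptions based on the three questions above, not the schema the project actually stored.

```python
# Sketch only: one way to represent a single Decoder's answer to the three
# questions above. The field names and checks are assumptions based on the
# report's wording, not the schema the project actually used.
from dataclasses import dataclass, field
from typing import List

LABELS = ("No", "Problematic", "Abusive")
TYPES = ("sexism or misogyny", "ethnic or religious slur", "racism",
         "casteism", "homophobia or transphobia", "physical threat",
         "sexual threat", "other")
MEDIA = ("text", "image", "video")

@dataclass
class DecoderAnswer:
    tweet_id: str
    decoder_id: str
    label: str                                       # Q1: No / Problematic / Abusive
    types: List[str] = field(default_factory=list)   # Q2: only if label is not "No"
    media: List[str] = field(default_factory=list)   # Q3: medium of the abuse

    def __post_init__(self):
        assert self.label in LABELS
        assert all(t in TYPES for t in self.types)
        assert all(m in MEDIA for m in self.media)
        if self.label == "No":
            assert not self.types and not self.media

answer = DecoderAnswer("tweet-1", "decoder-42", "Abusive",
                       ["sexism or misogyny"], ["text"])
print(answer)
```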
EXPERT ANALYSES

Apart from engaging the Amnesty Decoders, we also had 796 tweets assessed in-house by Amnesty International India's staff familiar with the parameters of 'problematic' and 'abusive' content. Each tweet was analysed by three experts, thus enabling the validation of responses. For each of the nine languages, three different experts looked at all of the tweets in that particular language, in order to allow for an agreement analysis between the experts. People fluent in a particular language decoded tweets of that language. For example, 261 tweets in English were decoded by three experts fluent in English, while 38 tweets in Marathi were decoded by another three experts fluent in Marathi.18

GENERATED ESTIMATES OF PROBLEMATIC AND ABUSIVE CONTENT

AGREEMENT ANALYSIS

The Amnesty team investigated whether Decoders generally agreed on the classification of a tweet as problematic or abusive. Due to the subjective nature of the analysis, we did not expect perfect alignment between individuals. We also compared the agreement amongst the crowd and the experts. We expected "expert" Amnesty International India staff familiar with the issues to have higher agreement than the "crowd" of Decoders. Agreement amongst raters of tweets was quantified using two measures: Fleiss' kappa and the intra-class correlation coefficient.19

There was fair agreement among experts on whether tweets contained problematic, abusive or neutral content, and good agreement when the ordinal nature of the classes was considered (that is, 'Problematic' and 'Abusive' answers were closer together than 'Abusive' and 'No'). When grouping together the 'problematic' and 'abusive' classes, expert agreement was moderate. Expert agreement with respect to the type of 'problematic' or 'abusive' content was also fair. As expected, agreement among Decoders was lower than that among experts. Agreement among Decoders was fair when the ordinal nature of the classification was taken into consideration. Decoder agreement was also fair when grouping together the 'problematic' and 'abusive' classes, and when evaluating the type of content for tweets classified as such. The analysis showed that our crowd results were better than random chance.20

15. See Annexure 1, Enrichment
16. For more information about handling languages, see Annexure 2
17. For definitions and examples of problematic and abusive tweets, see 'Decoding Tweets - Definitions' in this chapter
18. For results of the expert decoding agreement analysis, see Annexure 5.
19. Fleiss' kappa (Fleiss, 1971) and the intra-class correlation coefficient (Shrout, 1979). https://psycnet.apa.org/buy/1972-05083-001
20. See the full agreement analysis and methodology, including definitions of fair and moderate agreement, in Annexure 5.
21. Amnesty International, Troll Patrol Findings, https://decoders.amnesty.org/projects/troll-patrol/findings
22. Laure Delisle et al, Troll Patrol Methodology Note, https://decoders.azureedge.net/data-viz/images/Troll%20Patrol%20-%20Methodology.pdf
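As a rough illustration of the first of these measures, the sketch below computes Fleiss' kappa on a toy rating matrix using statsmodels, including the variant in which 'Problematic' and 'Abusive' are grouped together. The ratings are hypothetical; the project's actual agreement figures, and the intra-class correlation results, are reported in Annexure 5.

```python
# Sketch only: Fleiss' kappa on a toy rating matrix, using statsmodels.
# The ratings below are hypothetical; the intra-class correlation is omitted.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Rows = tweets, columns = raters; 0 = No, 1 = Problematic, 2 = Abusive.
ratings = np.array([
    [0, 0, 0],
    [1, 1, 2],
    [2, 2, 2],
    [0, 1, 0],
    [1, 1, 1],
    [2, 1, 2],
])

# aggregate_raters turns raw ratings into a tweets x categories count table.
table, _ = aggregate_raters(ratings)
print("Fleiss' kappa:", fleiss_kappa(table, method="fleiss"))

# Grouping 'Problematic' and 'Abusive' together, as in the report's secondary
# analysis, reduces the task to a binary classification.
table_binary, _ = aggregate_raters((ratings > 0).astype(int))
print("Fleiss' kappa (binary):", fleiss_kappa(table_binary, method="fleiss"))
```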
AGGREGATED PROPORTIONS ACROSS TWEETS

For the descriptive statistics given in the findings of this study, we used the same methodology developed by Amnesty International and Element AI for the 2018 Troll Patrol study21, aggregating Decoders' annotations ("votes") across all tweets in the set, as opposed to aggregating the majority vote.22 Our method of treating each vote as a measure meant that the following two scenarios on 10 tweets would lead to the same aggregated estimate of 30% abuse (simplified as "3 abusive tweets"):

• 10 tweets, each labelled as abusive by 30% of its raters and as neutral by the remaining 70% of its raters.

• 10 tweets, 3 of which were labelled as abusive by 100% of their raters, while the other 7 were labelled as neutral by 100% of their raters.

This was justified by the agreement analysis discussed in the previous section and in Annexure 5. Unlike a visual classification task with a clear and objective ground truth, raters were often divided, especially when the abuse in a tweet was subtle or contextual. The aggregation of proportions accounts for this variability of opinions and reflects the most accurate picture, without resorting to either of the extreme solutions: a majority vote, which would dismiss the minority's perception of abuse, or considering a tweet abusive if at least one of its raters voted as such, which would overestimate the overall perception of abuse.

WEIGHTING

The answers of the Decoders were re-weighted to account for sample mismatch with the pool of tweets (the true distribution in the "world set"). Weighting factors used for both responses and tweets included: 1) response weighting, accounting for tweets that were analysed fewer or more times than expected (e.g. if one tweet had two out of six annotations as Abusive, and another had one out of three annotations as Abusive, they contributed in the same way); 2) weighting by tweet collection 'batch', which accounted for the uniform sampling per day (e.g. if one date had 20,000 tweets mentioning the women and a second date had 10,000, the first date should have had twice as many tweets in the sample); and 3) weighting by tweet language distribution, accounting for a difference in completion between languages for the last batch of data.23
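To make the aggregation and weighting rules concrete, the sketch below contrasts the vote-aggregation estimate with a majority vote on hypothetical data, applying a simple per-day 'batch' weight. The labels, counts and weights are invented for illustration, and the real pipeline applied further adjustments (see Annexure 4).

```python
# Sketch only: contrasts the vote-aggregation rule with a majority vote on
# hypothetical data, with a simple per-day 'batch' weight.
import numpy as np

# tweet_id -> (labels given by its raters, total relevant tweets on that day)
annotations = {
    "t1": (["Abusive", "No", "No", "No", "No", "Abusive"], 20000),  # 2 of 6
    "t2": (["Problematic", "No", "No"], 10000),                      # 1 of 3
    "t3": (["No", "No", "No"], 10000),
}

def flagged_fraction(labels):
    """Share of votes that marked the tweet as problematic or abusive."""
    return sum(1 for label in labels if label != "No") / len(labels)

# Response weighting: each tweet contributes its *fraction* of flagged votes,
# so t1 (2 of 6) and t2 (1 of 3) contribute identically.
fractions = np.array([flagged_fraction(labels) for labels, _ in annotations.values()])

# Batch weighting: a tweet sampled from a 20,000-tweet day stands for twice as
# many real tweets as one sampled from a 10,000-tweet day.
weights = np.array([volume for _, volume in annotations.values()], dtype=float)

vote_aggregation = np.average(fractions, weights=weights)
majority_vote = np.average((fractions > 0.5).astype(float), weights=weights)
print(f"vote aggregation: {vote_aggregation:.3f}  majority vote: {majority_vote:.3f}")
```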
ACCOUNTED FOR SAMPLING UNCERTAINTY: BOOTSTRAPPING

Every statistical analysis based on a sample must evaluate the robustness of its findings against the randomness induced by its sampling mechanism. In this study, one major but controllable source of uncertainty stemmed from the limitation in data collection: a maximum of 10,000 tweets per day, sampled by the provider uniformly at random from all the tweets of that day mentioning the politicians identified. This is but a small portion of the much larger volume of such tweets from each day.

In an ideal world with unlimited resources, we would have measured the robustness of our findings by carrying out the research project many times over, including the resource-intensive steps of data collection and crowdsourced labelling, and comparing the results. Given the unfeasibility of this approach, we instead obtained measures of the accuracy of the results through the statistical method of bootstrapping.

Bootstrapping involved taking 100 random resamples with replacement from the weighted set of 114,716 labelled tweets. Each random resample was of the same size as the set of labelled tweets, and resampling with replacement implied that a specific tweet may have ended up multiple times in a single resample whereas another tweet may not have ended up in that resample at all.

The frequencies of tweets with problematic or abusive content, as well as the frequencies of types of problematic or abusive content, presented in this report were calculated as the medians across the 100 resamples. The uncertainty associated with these frequencies, and conversely the robustness of our findings, was conveyed through 95% confidence intervals, calculated as the range between the 2.5% and 97.5% quantiles across the 100 resamples. The 95% confidence intervals implied, roughly speaking, a 95% level of confidence that the intervals contained the true frequencies over all of the more than 7 million relevant tweets.

In short, bootstrapping entailed getting multiple estimates of the frequencies, each on a random resample with replacement from the actual set of labelled tweets, and aggregating over the resamples to gauge the accuracy of our findings. The rationale for employing bootstrapping was that repeated sampling from the empirical distribution of the labelled tweets was the closest approximation to the ideal but unfeasible approach of repeatedly sampling from the real-world distribution of the over 7 million tweets by re-running the research project.
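The procedure reduces to a few lines of code. The following is a minimal sketch under simplified assumptions (toy per-tweet abuse fractions and equal weights); it is not the project's code, but it follows the same steps: 100 resamples with replacement, the median as the point estimate, and the 2.5% and 97.5% quantiles as the 95% confidence interval.

```python
# Minimal bootstrap sketch under simplified assumptions: toy per-tweet abuse
# fractions and equal weights stand in for the weighted labelled dataset.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the weighted set of labelled tweets: one abuse fraction per tweet.
fractions = rng.beta(1, 6, size=5000)
weights = np.ones_like(fractions)

n = len(fractions)
resample_estimates = []
for _ in range(100):
    idx = rng.integers(0, n, size=n)  # resample with replacement, same size
    resample_estimates.append(np.average(fractions[idx], weights=weights[idx]))

point = np.median(resample_estimates)
low, high = np.quantile(resample_estimates, [0.025, 0.975])
print(f"estimate: {point:.3f}  95% CI: [{low:.3f}, {high:.3f}]")
```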
LIMITATIONS OF THE STUDY

Crude method used for language detection: Due to limited resources, we used a very basic method for language detection, namely Google's language auto-detection. This method resulted in the language of some tweets being wrongly identified, which could have degraded Decoders' experience, as they may have received tweets in a language they had not selected.

English as the preferred language for decoding: A high proportion of the Decoders selected English as their primary language, possibly limiting the diversity of the study. This could be a result of recruitment efforts that were primarily in English and of the limitations in our outreach as an organisation. This also resulted in English tweets being exhausted quickly, and in less engagement with the tweets in other languages.

Exclusion of first-time politicians: Women politicians who contested elections for the first time were excluded from this study, since at the time of creating the sample of politicians, the information on their candidature or nomination was either not final or not available online. Some of these women experienced online abuse during their campaign and after the elections, and the study would have benefitted from their inclusion.

Tweet sampling and de-identifying: Decoders analysed random tweets with Twitter handles blurred. We chose to de-identify the tweets to protect individuals sending and receiving problematic and abusive content. This meant that Decoders could not tell if a woman politician was being targeted herself or merely included as one of a list of names. The tweets were also shown without the corresponding thread or conversation, due to limited access to Twitter data. This meant that Decoders could not use the rest of a conversation to judge a tweet.

Perception of Decoders: The study relied on the perception of the Decoder, which is influenced by their experiences and prejudices. What may be considered problematic or abusive by one person may not be considered the same by another. The project addressed this by showing the same tweet to many Decoders, which facilitated "validation" of responses, but it is still possible that "atypical" opinions have influenced our findings.

Timeline: The findings of this study are restricted to the time period during which the tweets were sampled, i.e. between 1 March 2019 and 31 May 2019. While the elections were held from 11 April 2019 to 23 May 2019, the period between March and May witnessed extensive election campaigning. A longer period of tweet sampling would have enabled the study to capture the abuse experienced by women politicians over a longer period of time. It would also have shed light on the patterns of abuse during normal and peak times, depending on political events and campaigning around the year.

23. See Annexure 4, Weighting
thatThe messagesmessages @twitterhandles showthat andwomen. show against violence violence @mentions have been blurred since the purpose here is to highlight the against women. The @twitterhandles nature of abuse The @twitterhandles and @mentionsandhave and been not @mentions the have blurred persons been since blurred who since the purpose theare here is toabused, purpose nor here isthe highlight to the persons highlight the who abuse. nature nature of abuse and of notabuse and notwho the persons the persons whonor are abused, arethe abused, norwho persons the persons abuse. who abuse. 14 TROLL PATROL INDIA – EXPOSING THE ONLINE ABUSE FACED BY WOMEN POLITICIANS IN INDIA ***** ***** ***** PROBLEMATIC PROBLEMATIC PROBLEMATIC DECODING TWEETS - DEFINITIONS @kavita_krishnan @kavita_krishnan @VanzaraDg @VanzaraDg @NewsHtn @seemi_pasha @NewsHtn @seemi_pasha I havea more I have never seen never ugly seenwoman a more ugly woman thanpig than you. A wild you. A wild would be pig would prettier be prettier than your than yourdirty disgustingly disgustingly dirtyconstant face. Maybe face. Maybe constant rejection rejection @kavita_krishnan from urhas community @VanzaraDg @NewsHtn @seemi_pasha I have never seen a more ugly woman Note you. from ur community caused u has for All: Thepig caused Tweet to eat someusmelly to eat Muslim examples some smelly will comeass. Muslim ass. withdisgustingly Trigger Warning. HaveMaybe not picked the really bad than A wild would be prettier than your dirty face. constant rejection ones.urIf any of you feel ***** hasthat any uone of the following smelly abuses Muslimshould ass. not be considered because it is ***** from community caused to eat some too graphic, let me know. Will use 1-2 examples each. ABUSIVE TWEETS ABUSIVE TWEETS ***** NoteTweet Note for All: The for All: The Tweet examples willexamples come with will comeWarning. Trigger with Trigger Warning. Have Have not picked thenot picked really badthe really bad ones. If any ofones. If any you feel of any that you one feel of that theany one of the following following abuses shouldabuses not beshould not bebecause considered considered it is because it is WARNING: The examples below contain‘Problematic’- Problematic tweets contain hurtful or hostile content, toome too graphic, let graphic, know. let meuse Will know. Will use 1-2 1-2 examples each.examples each. explicit and threatening ABUSIVE TWEETS messages that show violence @ kavita_krishnan @ VanzaraDg against especially women. if it were repeated to an individual on multiple or cumulative WARNING:WARNING: The examples The examples belowexplicit below contain containand explicit and threatening threatening @The NewsHtn messagesmessages that showthat @ seemi_pasha @twitterhandles show violence violence against I have andwomen. @mentions against have been blurredoccasions, women. buthere since the purpose notis toas intense highlight the as an abusive tweet. It can reinforce never seen a more ugly woman than nature of abuse and The @twitterhandles andhave been not @mentions the have persons been blurred who since theare is toabused, purpose nor here isthe to the persons highlight the who abuse. The @twitterhandles nature nature of abuse and and @mentions you. of A persons notabuse the wild and notpig would the persons who are blurred be who abused, since are nor the purpose prettier abused, the than nor persons here the persons who highlight abuse. who abuse. negative or harmful stereotypes against a group of individuals (e.g. 
your disgustingly dirty face.***** Maybe ***** constant rejection from ur community negative stereotypes about a race or people who follow a certain religion). has caused u to eat some smelly ***** ***** ***** SEXIST ABUSES PROBLEMATIC SEXIST ABUSES PROBLEMATIC Such tweets may have the effect of silencing an individual or groups of Muslim PROBLEMATIC ass. @ kavita_krishnan @ VanzaraDg @blurred name@kavita_krishnan फिर पुname @blurred @kavita_krishnan राने काम @VanzaraDg फिरमेंपुलग @NewsHtn रानेजाइये @VanzaraDg काम में लग जाइये @NewsHtn @seemi_pasha I have never seen a more ugly woman individuals. @seemi_pasha I have never seen a more ugly woman than than you. A name @blurred wild @ you. pig @blurred NewsHtn Aक्लास wild be pigथर्ड @kavita_krishnan थर्ड would name औरत prettier@ would seemi_pasha क्लास तूthan अपनाऔरत तूthan be prettier भूअपना @VanzaraDg पास्ट your your ल गई disgustingly Iतूdirty पास्ट haveभूface. लराजनीफत गई disgustingly कै@NewsHtn से तूdirty Maybe कमें ै से राजनीफत face. ऊपर Maybe constantआयी मेंहै ऊपर ये आयी constant @seemi_pasha rejection I हैhave ये never seen a more ugly woman rejection from urhas community u has caused someusmelly to eat Muslim some smelly Muslim ass. from सबकोurपता हैसबकोnever community पता है seen caused than you. toaeat more A wild pigugly would woman ass. than be prettier than your disgustingly dirty face. Maybe constant rejection you. A wild pig would be prettier *****than *****Muslim ass. from ur community***** has caused u to eat some smelly your disgustingly dirty face. Maybe constant ABUSIVE ABUSIVE TWEETS TWEETS rejection from ur community ***** has caused u to eat some smelly SEXIST ABUSES Muslim ass. ‘Abusive’ – Abusive content violates Twitter’s own rules and includes tweets name फिर पुराने काम में लग जाइये ABUSIVE TWEETS @blurred @blurred name थर्ड क्लास औरत तू अपना पास्ट भूल that गई तू promote कैसे राजनीफतviolence against में ऊपर आयी है ये or threaten people based on their race, सबको पता है ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease. ***** ***** Examples include: physical or sexual threats, wishes for the physical harm or SEXIST ABUSES SEXIST ABUSES death, reference to violent events, behaviour that incites fear or repeated slurs, फिर पुname @blurred @blurred name फिरमेंपुलग राने काम रानेजाइये काम में लग जाइये थर्ड क्लास @blurred @blurred name name थर्ड क्लास औरत तू अपनाऔरत पास्टतू भूअपना ल गईपास्ट भूलराजनीफत तू कैसे गई तू कमें ै सेऊपर राजनीफत आयीमेंहैऊपर ये आयी है ये epithets, racist and sexist tropes, or other content that degrades someone. सबको पता हैसबको पता है ***** SEXIST ABUSES @blurred name फिर पुराने काम में लग जाइये @blurred name थर्ड क्लास औरत तू अपना पास्ट भूल गई तू कैसे राजनीफत में ऊपर आयी है ये सबको पता है
@is MehboobaMufti You should really think to @ ones. If any ofones. If any you feel of any that you one feel of that theany one of the following following abuses shouldabuses not beshould not bebecause considered considered is because ones. Ifitany ofones. Ifitany you feel isthat of any you one feel of that theany one of the following following abuses shouldabuses not beshould not bebecause considered considered is because ones. Ifitany ofones. Ifitany you feel that of any you one feel of that theany following one of the abuses following shouldabuses not beshould considered not bebecause considered it is because it is toome graphic, The @twitterhandles and @mentions have been tooblurred since take proper thelet purposeeducation here in school is1-2 instead toexamples highlight of each.the L too graphic, let The let know. meuse Will know. @twitterhandlesWill use 1-2 1-2 examples examples each. and each. @mentions have been too blurred toome graphic,since let graphic, the know. let meuse purpose Will know. Will use 1-2 1-2here examples is toeach. examples highlight each. the graphic, lettoome The graphic, know. @twitterhandles Will meuse know. 1-2 examples Will use and each. @mentions have been blurred since the Madarsa jehadi aunty. It’s looking clear cut ( naturewho nature of abuse and not the persons who are abused, nor the persons of abuse abuse.and not the persons who are abused, nor the persons who nature of photoshoped abuse. abuse and youand not theissue are making persons of who are abused, nor the per WARNING:WARNING: The examples The examples belowexplicit below contain containand explicit and WARNING: threatening threatening WARNING: The examples The examples TROLL belowPATROL below contain contain explicitINDIA – EXPOSING explicit and THE ONLINE andWARNING: threatening threatening ABUSE Thethis.. WARNING: FACED Use examples Theyour BY WOMEN common examples below POLITICIANS sense contain belowPattharBaaz explicit INexplicit containand INDIA 15 threatening threatening and messagesmessages that showthat show against violence violencewomen. against women. messagesmessages that showthat show against violence violencewomen. against women. thataurat messagesmessages showthatviolence show against violencewomen. against women. The @twitterhandles The @twitterhandles and @mentionsandhave @mentions have been been blurred since blurred sincehere the purpose the purpose***** here@twitterhandles is to highlight The isthe to highlight theand @mentions The @twitterhandles andhave @mentions have been been blurred since blurred sincehere the purpose the purpose ***** here@twitterhandles The is to highlightisthe to highlight the The @twitterhandles and @mentionsandhave @mentions been blurred have been since blurred the purpose sincehere the purpose ***** is to highlight here isthe to highlight the WARNING: nature nature of abuse and of THE EXAMPLES BELOW CONTAIN notabuse and notwho the persons PROBLEMATIC the persons whonor are abused, arethe abused, norwho persons the persons PROBLEMATIC EXPLICIT AND THREATENING abuse. who nature abuse. of abusenature and of notabuse and notwho the persons PROBLEMATIC MESSAGES the persons whonor are abused, arethe abused, norwho persons the persons abuse. whonature abuse. of abusenature and of notabuse the persons and notwho the persons are abused, whonor arethe abused, persons norwho the persons abuse. who abuse. 
***** ***** @kavita_krishnan @VanzaraDg THAT @NewsHtn SHOWI have @seemi_pasha VIOLENCE @kavita_krishnan never seen a more AGAINST @VanzaraDg ugly woman ***** WOMEN. *****@NewsHtn @seemi_pasha I have never seen a more @kavita_krishnan ***** ugly woman @VanzaraDg *****@NewsHtn @seemi_pasha I have PROBLEMATIC PROBLEMATIC than you. than PROBLEMATIC A wild pig would be prettier than your disgustingly PROBLEMATIC dirty you. face. A wild Maybe @ kavita_krishnan pig would constant be prettier than yourPROBLEMATIC rejection @ VanzaraDg @ NewsHtn disgustingly dirty thanface. PROBLEMATIC @you. Maybe A wild blurred constant namepigTattika would rejection be prettier than your disgustingly dirty bacteria The @twitterhandles and @mentions have been blurred since the purpose here is to highlight @kavita_krishnan from ur community @kavita_krishnan @VanzaraDg @VanzaraDg has caused @NewsHtn @NewsHtn @seemi_pasha I haveu never @seemi_pashato eat seensome I have neversmelly a more seen ugly Muslim a more woman @kavita_krishnan ass.@from ugly@kavita_krishnan woman ur@VanzaraDg seemi_pasha @VanzaraDg community @NewsHtn I have has seen @NewsHtn never @seemi_pasha caused u tougly @seemi_pasha a more I have never eat I haveasome seen smelly never ugly more seen Muslim a more woman ugly @kavita_krishnanwomanass. frombola @kavita_krishnan @VanzaraDg ur@VanzaraDg community @NewsHtn re has caused @seemi_pasha @NewsHtn I have u @seemi_pasha to eat never I haveasome seen more smelly never ugly seen woman Muslim a more ass.@ ugly woman thanpig you. A wild be pig would be prettier than yourdirty disgustingly dirtyconstant face. Maybe thanconstant rejection thanpig you. A wild pig would be constant rejection h than you. A wild would prettier than your disgustingly face. Maybe rejection you. A wild would woman be than prettier than you. A prettier your than yourdirty wilddisgustingly pig would disgustingly dirtyconstant face. Maybe be prettier face. Maybe than constant rejection rejection you. A wild thanpig you. would A wild be pig prettier would than be prettier your disgustingly than yourdirty disgustingly face. Maybe dirtyconstant face. Maybe rejection from urhas from ur community community caused u has caused to eat someusmelly to eat Muslim some smelly ass. Muslim ass. ***** from urhas community caused u has caused someusmelly to eat Muslim some smelly ass. Muslim ass. ***** from ur community from urhas community caused u has to eat caused someusmelly to eat Muslim some smelly ass. Muslim ass. ***** e the nature of abuse and ***** ***** not the persons who are abused, nor the persons who abuse. from ur community than your disgustingly dirty face. Maybe *****has constant rejection from ur community to eat ***** ***** h ***** caused u to eat some smelly Muslim ass. r ABUSIVE TWEETS ABUSIVE ABUSIVE TWEETS TWEETS ABUSIVE TWEETS ABUSIVE TWEETS ABUSIVE TWEETS ABUSIVEABUSIVE ABUSIVE TWEETS TWEETS TWEETS Note @ for All:xxThe ssssxxxxx Are Tweet gaddariexamples ke had hotiwill come with Trigger Warning. NoteHave not The for All: @ MehboobaMuftipicked You the examples Tweet really should reallybad think Note willtocome with Trigger Warning. forArvindvijay_ Have @ All: not The @ EconomicTimes picked @Tweet @the examples really bad LambaAlka MamataOfficial Putwill this come with Trigger Warning. ChunaavkeBaad ones. 
ha If anykaofv you ek Randi imanfeel hotathat ha any one of the following abuses should ones.not beeducation If any take proper considered of you feel because that in school it isof of the following abuses should any one instead ones. not Ifidiot any bedunga Laura stupid, considered of you anti feeldangerous that because any lady one it is of the following abuses should https://t.co/DTJYlhohvY national, too graphic, let me know. Will use 1-2 examples each. Madarsa jehadi aunty. It’s looking clear cut either(note: in jailvideo hasher or send been removed These to pakistan. from platform) too graphic, let me know. Will use 1-2 examples each. too graphic, let me know. Will use 1-2 examples each. photoshoped and you are making issue of shameless people are harming the integrity of this.. Use your common sense PattharBaaz the nation. ***** ***** aurat ***** ***** ***** ***** WARNING: The examples below contain explicit WARNING: and threatening The examples below contain explicit WARNING: and threatening The examples below contain explic SEXIST ABUSES SEXIST ABUSES messages NoteTweet for All: The Tweet that show violence against women.SEXIST ABUSES SEXISTforABUSESSEXIST ABUSES Note for All: The examples willexamples come will with comeWarning. Trigger with Trigger HaveWarning. Have not picked thenot picked SEXIST Note really bad the All:really forABUSES Note The messages bad for Tweet All: examples The Tweet that willexamples come with willshow Trigger violence comeWarning. with Trigger HaveWarning. against not picked Have Note thenot really picked bad women. All:really the Note The messages for Tweet bad All: examples The Tweet that willexamples come with willshow Trigger violence comeWarning. with Trigger HaveWarning. against not picked Have thenot really picked women. badthe really bad ones. If any of you feel that any one of the following abuses should not be considered ones. If any of@blurred you feel that any one फिरमेंofपुलग the following रानेजाइये काम में लग abuses जाइये *****it is @blurred name फिर पुराने काम में लग जाइये ones. If anybecause should not be considered because ofones. Ifitany you feel isthat of any you one feel of that theany following one of the abuses following shouldabuses not beshould considered ***** not bebecause considered ones. Ifitany is because ofones. you feel Ifitany isthat of any you one feel of that theany following one of the abuses following shouldabuses not beshould considered ***** not bebecause considered it is because it is फिर पुname रानेme काम फिर पुpurpose रWill ानेmeकाम मेंhere लग @जाइये फिर पुname रWill ानेname काम फिरमेंपुलग रWill ानेजाइये काम में लग जाइयेeach. @blurred name toome too graphic, let Thename know. @twitterhandles SEXISM OR MISOGYNY graphic, let know. examples Will use थर्ड1-2 क्लास Will use 1-2 औरत and each. @mentions examples each. गई have beentoo blurred graphic, since @blurred name let toome the ETHNIC OR RELIGIOUS SLUR @ kavita_krishnan येThe graphic, know. @twitterhandles let use know. 1-2 examples Will is to VanzaraDg use 1-2 highlight and each. @ NewsHtn @mentions examples the each. गई have been @blurred blurred too name graphic,since @blurred let toome@know. blurred the The graphic, RACISM purpose Tattika @twitterhandles let meuseknow. 1-2here bacteria isपास्ट examples use to and 1-2 highlight each. 
@mentions examples the have been blurred since the @blurred name थर्ड क्लास @blurred औरत तू अपना पास्टतू भूअपना ल गईपास्ट भूलराजनीफत तू कैसे तू कमें ै सेऊपर राजनीफत आयी मेंहैऊपर @blurred ये आयी है थर्ड @blurred name क्लास name @ seemi_pasha औरतथर्ड क्लास haveऔरत तूI अपना पास्टतूseen never भूअपना ल गई पास्ट तू कैसे a more भू लराजनीफत ugly तू कमें ै सेऊपर राजनीफत आयी मेंहैऊपर @blurred ये आयीname है थर्ड @blurred ये क्लास name औरत थर्ड क्लास तू अपना औरत तू भूअपना ल गईपास्ट तू कैसे भूलराजनीफत गई तू कमेंै सेऊपर राजनीफत आयीमेंहैऊपर ये आयी है ये सबको पता हैसबको natureपता है of abuse and not the persons who are abused, nor the persons who abuse. nature of abuse and not the persons who are abused, nor the bola personsre nature of whoabuse abuse. and not the persons who are abused, nor the perso सबको पता है सबको पता है woman than you. A wild pig would be prettier सबको पता है सबको पता है Insulting SEXIST WARNING: or ABUSES The abusive examples below content containand explicit andWARNING: threatening Discriminatory, Thethan WARNING: SEXIST yourThe examples ABUSES offensive disgustingly examples below dirty face. contain below or Maybe insulting explicit contain andexplicit threatening andWARNING: threatening Discriminatory, TheSEXIST WARNING: examples ABUSES offensive orexplicit insulting WARNING: The examples below contain explicit threatening @The examples below TOIIndiaNews contain @below contain priyankagandhi and uexplicit don’tthreatening and threatening directed messagesmessages that showthat at women show against violence based violence women. on their against women. messages contentthatconstant messages showdirectedrejection that violence show from at aur community woman against violence women. has women. againstbased on messages content messages that show directed that have violence show brain toat a against violence decide womanwomen. from wherbased against u women. should fight @blurred The @twitterhandles name and @mentionsफिर have पुरbeen ाने blurred काम में लग since जाइये the purpose ***** here@twitterhandles The isthe to highlight the @blurred The @twitterhandles and @mentions name andhave @mentions फिर caused u to eat some smelly Muslim ass. been blurred पु have beenर ाने काम since blurred the purpose में लग sincehere जाइये the purpose ***** is to highlight The here@twitterhandles isthe to highlight @blurred The @twitterhandles election? name been फिर Brainless पुbeen female, रानेshe काम looks मेंsohere लगpurpose is जाइये ***** The @twitterhandles gender, and @mentions including have been blurred content since the purpose intended here is to highlight her religious beliefs and थर्ड or क्लास ethnicity, onthe and her @mentions race,andhave @mentions including blurred have since contentblurred the purpose since the to highlight here isthe to highlight the nature of abusenature of notabuse and@blurred and notwho the persons the persons name थर्डwhoक्लास are abused, arethe nor औरत abused, persons तू अपना norwho the persons abuse. who पास्ट nature भूल गई abuse.of abuse तू क nature and of ै@blurred not सेtheराजनीफत abuse persons and notwhothe मेंpersons name ऊपर are abused, आयी whonor arethe abused, येऔरत हैpersons theतूpersons norwho अपना abuse. whoपास्ट abuse.भू nature of ल गईandतूof abuse nature क not ै the abuseसेhorrible @blurred राजनीफत persons she and notwho में ऊपर thinks the persons nameare थर्ड आयी क्लास herself abused, who nor arethe हैऔरत Indira abused, येnor तूabuse. Gandhi’s persons who अपना the persons पास्ट भूल गई तू कै who abuse. 
SEXIST ABUSES

@blurred name फिर पुराने काम में लग जाइये ***** (Translation: "Go back to your old work *****")

@blurred name थर्ड क्लास औरत तू अपना पास्ट भूल गई तू कैसे राजनीति में ऊपर आयी है ये सबको पता है (Translation: "Third-class woman, you have forgotten your past. Everyone knows how you rose to the top in politics.")

PROBLEMATIC

@kavita_krishnan @VanzaraDg @NewsHtn @seemi_pasha I have never seen a more ugly woman than you. A wild pig would be prettier than your disgustingly dirty face. Maybe constant rejection from ur community has caused u to eat some smelly Muslim ass.

@blurred name Tattika bacteria bola re *****

ABUSIVE TWEETS

@blurred name अब तुझे अपनी जात और औकात का आभास जनता करा देगी. (Translation: "Now the public will make you realise your caste and your true worth.")

@MehboobaMufti You should really think to take proper education in school instead of Madarsa jehadi aunty. It's looking clear cut photoshoped and you are making issue of this.. Use your common sense PattharBaaz ***** aurat

@Arvindvijay_ @LambaAlka ChunaavkeBaad Laura dunga https://t.co/DTJYlhohvY (note: the video has been removed from the platform)

@EconomicTimes @MamataOfficial Put this stupid, idiot anti national, dangerous lady either in jail or send her to pakistan. These shameless people are harming the integrity of the nation.

@TOIIndiaNews @priyankagandhi u don't have brain to decide from wher u should fight election? Brainless female, she looks so horrible she thinks herself Indira Gandhi's replica my foot. She looks like transgender

SEXISM OR MISOGYNY
Insulting or abusive content directed at women based on their gender, including content intended to shame, intimidate or degrade them. It can include profanity, threats, slurs and insulting epithets.

ETHNIC OR RELIGIOUS SLUR
Discriminatory, offensive or insulting content directed at a woman based on her religious beliefs or ethnicity, including content that aims to attack, harm, belittle, humiliate or undermine her or her community.

RACISM
Discriminatory, offensive or insulting content directed at a woman based on her race, including content that aims to attack, harm, belittle, humiliate or undermine her.

CASTE SLUR
Discriminatory, offensive or insulting content directed at a woman based on her caste, including content that aims to attack, harm, belittle, humiliate or undermine her or her community.

PHYSICAL THREATS
Direct or indirect threats of physical violence, or wishes for serious physical harm, death or disease.

SEXUAL THREATS
Direct or indirect threats of sexual violence, or wishes for rape or other forms of sexual assault.

HOMOPHOBIA OR TRANSPHOBIA
Discriminatory, offensive or insulting content directed at a woman based on her sexual orientation, gender identity or gender expression. This includes negative comments towards bisexual, homosexual and transgender people.

OTHER
Some tweets that are problematic and/or abusive will fall under the 'other' category: for example, statements that target a user's disability, be it physical or mental, or content that attacks a woman's nationality, health status, legal status, employment, etc.
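To make the categories above concrete, here is a minimal sketch, with hypothetical class and field names, of how a single Decoder's judgement of one anonymised tweet could be recorded against this taxonomy. It is an illustration only, not the project's actual tool or data schema.

from dataclasses import dataclass, field
from enum import Enum

class AbuseType(Enum):
    SEXISM_OR_MISOGYNY = "sexism or misogyny"
    ETHNIC_OR_RELIGIOUS_SLUR = "ethnic or religious slur"
    RACISM = "racism"
    CASTE_SLUR = "caste slur"
    PHYSICAL_THREAT = "physical threat"
    SEXUAL_THREAT = "sexual threat"
    HOMOPHOBIA_OR_TRANSPHOBIA = "homophobia or transphobia"
    OTHER = "other"

@dataclass
class DecoderLabel:
    # One volunteer's judgement of one anonymised tweet.
    tweet_id: str
    is_problematic: bool = False
    is_abusive: bool = False
    abuse_types: set = field(default_factory=set)

# Hypothetical example: a tweet judged abusive and flagged under two categories.
label = DecoderLabel(
    tweet_id="example-001",
    is_abusive=True,
    abuse_types={AbuseType.SEXISM_OR_MISOGYNY, AbuseType.PHYSICAL_THREAT},
)
print(label.is_abusive, sorted(t.value for t in label.abuse_types))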