Case study: content analysis of fringe platforms
To complement the literature review, Hate and Extremism Insights Aotearoa (HEIA) conducted an in-depth content analysis of New Zealand-based posts on three fringe online spaces (4chan, Telegram and Gab). The number of misogynistic posts fluctuated over time, with peaks that coincided with significant offline events such as the 2022 Wellington Parliament protests.
Methodology
Data collection and sample creation
HEIA used a dataset of over 600,000 posts from New Zealand-based posters between 2019 and 2022. More posts were available for analysis from early 2021 onwards: data collection began in late 2021, and because platforms often delete older content, fewer earlier posts could be retrieved.
Posts containing misogynistic content
HEIA compiled a list of misogynistic terms, including common pejorative terms of abuse for women. Of the full dataset, 38,472 posts contained at least one of these terms.
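For illustration only, a keyword filter of this kind could be implemented along the following lines. The term list, post fields and data format shown here are hypothetical placeholders, not HEIA's actual lexicon or pipeline.

```python
import re

# Hypothetical placeholder tokens -- not HEIA's actual term list.
MISOGYNISTIC_TERMS = ["term_a", "term_b", "term_c"]

# One case-insensitive pattern with word boundaries, so matches are
# whole words rather than substrings of longer words.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in MISOGYNISTIC_TERMS) + r")\b",
    flags=re.IGNORECASE,
)

def contains_misogynistic_term(text: str) -> bool:
    """Return True if the post text contains at least one listed term."""
    return bool(PATTERN.search(text))

def filter_posts(posts: list[dict]) -> list[dict]:
    """Keep posts (dicts assumed to have a 'text' field) that match any term."""
    return [p for p in posts if contains_misogynistic_term(p.get("text", ""))]
```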
Categorisation of ‘highly misogynistic’ posts
HEIA conducted a second round of analysis to identify highly misogynistic posts within this sample, applying three large language models to score each post for negative emotion, toxicity, and sexually explicit content. Posts meeting a high threshold on all three measures were selected, reducing the dataset to 8,569 posts.
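A minimal sketch of this second-stage filter is shown below, assuming each post has already been scored between 0 and 1 on the three measures. The 0.8 cut-off and field names are illustrative assumptions; the report does not state the exact thresholds or models HEIA used.

```python
# Each scored post is assumed to carry three model outputs in [0, 1].
HIGH_THRESHOLD = 0.8  # illustrative value, not HEIA's actual cut-off

def is_highly_misogynistic(post: dict) -> bool:
    """Keep only posts that clear the threshold on all three measures."""
    return (
        post["negative_emotion"] >= HIGH_THRESHOLD
        and post["toxicity"] >= HIGH_THRESHOLD
        and post["sexually_explicit"] >= HIGH_THRESHOLD
    )

def select_high_threshold_posts(scored_posts: list[dict]) -> list[dict]:
    """Reduce the keyword-matched sample to the highly misogynistic subset."""
    return [p for p in scored_posts if is_highly_misogynistic(p)]
```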
Inter-coder reliability
Initially, three HEIA coders independently categorised the same 50 posts. To ensure consistency, the research team then met to discuss these 50 posts and reach consensus. This helped to cement the coding parameters and set the high threshold: a post categorised as misogynistic by one coder but not the others was unlikely to be included. The team defined categories of misogynistic abuse based on the research contained in this report.
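The consensus rule described above can be checked programmatically, for example by keeping only posts on which all three coders assigned the same label and reporting a simple agreement rate. The data layout and label names below are assumptions for illustration, not HEIA's coding scheme.

```python
# codings maps post_id -> the three coders' labels, e.g.
# {"post_1": ["misogynistic", "misogynistic", "not_misogynistic"], ...}

def unanimous_posts(codings: dict[str, list[str]]) -> dict[str, str]:
    """Return posts where all three coders assigned the same label."""
    return {
        post_id: labels[0]
        for post_id, labels in codings.items()
        if len(set(labels)) == 1
    }

def percent_agreement(codings: dict[str, list[str]]) -> float:
    """Proportion of posts with unanimous agreement (a simple check, not a kappa statistic)."""
    if not codings:
        return 0.0
    return len(unanimous_posts(codings)) / len(codings)
```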
Categorisation of ‘extreme misogynistic’ posts
The research team manually reviewed all 8,569 posts. Posts not meeting a very high threshold of offensive and threatening content were removed, and the remaining 3,109 posts were categorised into 14 distinct categories of misogynistic abuse. These posts are the most misogynistic, toxic, and abusive in the dataset. As such, they represent the ‘worst of the worst’ found in some of the most extreme online forums.
Privacy and ethics
HEIA takes all practical steps to ensure the privacy of research subjects is maintained. All data is anonymised, with personally identifying information removed. The metadata collected from platforms is limited to the minimum level necessary to conduct the research. HEIA does not identify or track the activities of individuals or groups, restricts data collection in line with the policies of the platforms, and only accesses publicly available data. HEIA does not use fake accounts to access closed groups/pages. All data is encrypted and stored on secure AWS servers.
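As a rough sketch of the kind of anonymisation described above (not HEIA's actual procedure), author identifiers could be replaced with salted hashes and obvious handles or email addresses redacted from post text before storage. The field names, regexes and salt handling here are illustrative assumptions.

```python
import hashlib
import re

SALT = "replace-with-a-secret-salt"  # illustrative; a real salt must be kept secret

# Illustrative patterns for obvious identifiers; real PII removal would
# need a more thorough, platform-specific approach.
EMAIL_RE = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
HANDLE_RE = re.compile(r"@\w+")

def pseudonymise(identifier: str) -> str:
    """Replace a username or ID with a truncated salted hash."""
    return hashlib.sha256((SALT + identifier).encode("utf-8")).hexdigest()[:16]

def anonymise_post(post: dict) -> dict:
    """Return a copy of the post with the author pseudonymised, obvious PII
    redacted, and only minimal metadata retained."""
    text = EMAIL_RE.sub("[email]", post.get("text", ""))  # redact emails first
    text = HANDLE_RE.sub("[handle]", text)                # then @handles
    return {
        "author": pseudonymise(post.get("author", "")),
        "text": text,
        "timestamp": post.get("timestamp"),
        "platform": post.get("platform"),
    }
```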
This research method was approved by the University of Auckland Human Participants Ethics Committee on 9 August 2021 for three years (Reference Number UAHPEC22980).
Limitations
We were unable to access data from mainstream platforms such as Meta for the purposes of this content analysis, despite requests. The analysis is therefore based on data from fringe platforms and there are limitations on the extent to which the findings can be generalised.
Key findings
HEIA observed a fluctuating trend in misogynistic posts and term usage
The number of extreme misogynistic posts varied over time, with noticeable increases that aligned with significant real-world events. For example, there was a peak in extreme misogynistic posts around March 2021, which coincided with the announcement of a change to Alert Level 3 lockdown in Auckland.[1] Another significant peak occurred in February and March 2022, coinciding with the Wellington Parliament protests.[2]
Additional spikes were observed from July to October 2022, but these sharply declined by November 2022. During this period, several significant events occurred, making it difficult to associate these events with specific spikes in extreme misogynistic content on these platforms without further investigation and analysis. These offline events included the detection of the first case of mpox (previously referred to as Monkeypox) in Auckland (9 July); the reopening of the New Zealand borders to all travellers (31 July); the end of New Zealand's COVID-19 Protection Framework, resulting in the removal of most pandemic-related restrictions (12 September); and the confirmation of the first community transmission of mpox in New Zealand (6 October).
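For illustration, the kind of trend described here can be surfaced by aggregating post counts by month and flagging months that sit well above a baseline, which can then be compared against a timeline of offline events. The aggregation rule and the 1.5× median threshold below are assumptions, not the analysis HEIA performed.

```python
from collections import Counter
from datetime import datetime
from statistics import median

def monthly_counts(timestamps: list[datetime]) -> dict[str, int]:
    """Count posts per calendar month, keyed as 'YYYY-MM'."""
    return dict(Counter(ts.strftime("%Y-%m") for ts in timestamps))

def peak_months(counts: dict[str, int], factor: float = 1.5) -> list[str]:
    """Flag months whose count exceeds the median by the given factor (illustrative rule)."""
    if not counts:
        return []
    baseline = median(counts.values())
    return sorted(month for month, count in counts.items() if count > factor * baseline)
```

Flagged months would still need to be cross-referenced manually against dated offline events, since, as noted above, attributing a given spike to a specific event requires further investigation.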
HEIA found diverse forms of misogynistic abuse
While general misogynistic abuse represents the majority of posts, substantial proportions belong to categories involving threats, for example threats of rape or violence towards specific individuals, trans people or women in general. Mentions of rape targeting women were often paired with content such as racist, nationalist and transphobic comments.
HEIA also observed that some posts discussing female politicians and other contentious issues such as abortion, false rape allegations, trans issues, and hate speech regulation exhibited an overtly abusive and derogatory tone, asserting misogynistic concepts of gender roles.
HEIA found that misogyny and other ideologies go together
Within the extreme misogynistic dataset, HEIA identified multiple other ideologies, including racist, anti-government, anti-immigrant, anti-mandate and anti-LGBTQI+ ideologies.
It’s important to note two things here. Firstly, there were relatively few posts of this type within the dataset. Nevertheless, this finding remains significant as it underscores the convergence of multiple potentially harmful ideologies within these online spaces.
Secondly, HEIA's focus on extreme misogynistic content targeting women meant that some topics were included while others were excluded. For example, the analysis captured posts relating to abortion, transphobia, rape allegations, and ‘white genocide’, all of which were intertwined with misogyny. However, comments regarding men’s rights (i.e. perceived discrimination against men) weren’t categorised as misogynistic, as they rarely targeted women.
What does this mean?
In interpreting these findings, it's important to recognise that the rise in the use of derogatory terms and the severity of misogynistic posts point to an alarming persistence of online misogyny. While the analysis focused on specific platforms, the findings have broader implications. They align with results from other studies that highlight similar spikes in problematic content around the same periods of time[3][4][5] and show how extremist content from fringe platforms spills over into mainstream social media.[6][7][8] This convergence highlights the need to address these behaviours across various online spaces.[8]
HEIA’s observations regarding how offline events such as the 2022 Wellington Parliament protests can act as a trigger for hateful content in general are significant, and they extend to extreme misogynistic hate. This correlation aligns with findings published in PLoS ONE,[9] which highlight how real-world events, such as protests and elections, often prompt increases in various forms of online hate speech.[9][10] For example, race-related hate speech spiked by 250% following the murder of George Floyd and the subsequent Black Lives Matter protests, alongside a more general wave of online hate.
Furthermore, a 2022 Brookings Institution study on gender-based online violence found spikes after pile-ons by prominent media figures. The study, which used Perspective, an analysis tool created by Jigsaw and Google, revealed that when a prominent male media personality targeted a female journalist, there was a significant surge in abusive speech directed at her. This increase in harmful content often persisted for several days before gradually subsiding.
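For readers unfamiliar with Perspective, a minimal request to its comment-scoring endpoint looks roughly like the sketch below, requesting only the TOXICITY attribute. The API key is a placeholder, and the exact attributes and settings used in the Brookings study are not specified in this report.

```python
import requests

API_KEY = "YOUR_PERSPECTIVE_API_KEY"  # placeholder; obtain a key via Google Cloud
URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
    f"?key={API_KEY}"
)

def toxicity_score(text: str) -> float:
    """Return Perspective's TOXICITY probability (0 to 1) for a piece of text."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```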
HEIA’s third key observation underscores the connection between misogyny and other ideologies, aligning with findings from various studies.[11][12][13][14] For instance, the ADL highlights the strong link between misogyny and white supremacy.[11] A profound aversion to women serves as a common thread among many violent extremist ideologies such as white supremacy, particularly within the extreme far-right movement and its lesser-known counterparts like incels.[15][16] Misogyny has, in fact, been described as a gateway and an early warning system for violent extremism.[11][17]
Helplines
We understand that this research could be confronting or upsetting for some readers. If you or someone you know needs to talk:
- Free call or text 1737 any time for support from a trained counsellor.
- Free call Safe to Talk 0800 044 334 or text 4334 anytime for support about sexual harm.
- Free call OutLine Aotearoa 0800 688 5463 any evening to talk to trained volunteers from Aotearoa's rainbow communities.
References
1. Radio New Zealand (RNZ). (2021, March 1). Covid-19: What happened in New Zealand on 1 March, 2021. RNZ.
2. Wikipedia contributors. (2023, August 16). 2022 Wellington protest. Wikipedia, The Free Encyclopaedia. Retrieved August 22, 2023, from https://en.wikipedia.org/w/index.php?title=2022_Wellington_protest&oldid=1170743170 (current version: https://en.wikipedia.org/wiki/2022_Wellington_protest)
3. Hannah, K., Hattotuwa, S., & Taylor, K. (2022). The murmuration of information disorders: Aotearoa New Zealand’s mis- and disinformation ecologies and the parliament protest. Pacific Journalism Review, 28(1/2), 138–161.
4. Bruns, A., Harrington, S., & Hurcombe, E. (2021). Coronavirus conspiracy theories: Tracing misinformation trajectories from the fringes to the mainstream. In Communicating COVID-19: Interdisciplinary Perspectives (pp. 229–249).
5. Roose, K. (2017, December 11). The alt-right created a parallel internet. It’s an unholy mess. The New York Times.
6. Scott, M. (2020, November 13). After US elections, extremists use fringe social networks to push fraud claims, violence. POLITICO. Accessed August 17, 2023.
7. Jigsaw. (2023, July 15). 6 insights into how hate & violent extremism are evolving online. Jigsaw. Accessed August 17, 2023.
8. Forberg, P. L. (2022). From the fringe to the fore: An algorithmic ethnography of the far-right conspiracy theory group QAnon. Journal of Contemporary Ethnography, 51(3), 291–317.
9. Lupu, Y., Sear, R., Velásquez, N., Leahy, R., Restrepo, N. J., Goldberg, B., & Johnson, N. F. (2023). Offline events and online hate. PLoS ONE, 18(1), e0278511.
10. Geddes, L. (2023, January 25). Real-world events trigger online hate toward unrelated groups, study finds. The Guardian. Accessed August 17, 2023.
11. ADL. (2018). When women are the enemy: The intersection of misogyny and white supremacy. Anti-Defamation League Center on Extremism.
12. Bitar, S. (2015). Sexual violence as a weapon of war: The case of ISIS in Syria and Iraq [Unpublished master’s thesis]. Uppsala University.
13. Phelan, A., White, J., Wallner, C., & Paterson, J. (2023, February 20). Introduction guide to understanding misogyny and the far-right. Centre for Research and Evidence on Security Threats (CREST). Accessed August 22, 2023.
14. Phelan, A., White, J., Wallner, C., & Paterson, J. (2023). Misogyny and masculinity: Toward a typology of gendered narratives amongst the far-right. Monash University. https://doi.org/10.26180/23293055.v1
15. Criezis, M. (2023, May 16). The Allen, Texas mass shooting: An examination of misogyny, anti-Asian racism, and internalised racism. Global Network on Extremism & Technology (GNET). Accessed August 22, 2023.
16. Hunter, K., & Jouenne, E. (2021). All women belong in the kitchen, and other dangerous tropes: Online misogyny as a national security threat. Journal of Advanced Military Studies, 12(1), 57–85.
17. Cannon, M. (2022, January 24). Assessing misogyny as a ‘gateway drug’ into violent extremism. Global Network on Extremism & Technology (GNET). Accessed August 11, 2023.