Section: 2.3 Misogyny and violent extremism

Case study: content analysis of fringe platforms

To complement the literature review, Hate and Extremism Insights Aotearoa (HEIA) conducted an in-depth content analysis of New Zealand-based posts on three fringe online spaces (4chan, Telegram and Gab). Numbers of misogynistic posts fluctuated over time and included peaks that coincided with significant offline events, such as the 2022 Wellington Parliament protests.

Methodology

Data collection and sample creation

HEIA used a dataset of over 600,000 posts from New Zealand-based posters between 2019 and 2022. More posts were available for analysis from early 2021 onwards, because data collection began in late 2021 and platforms often delete older content.

Posts containing misogynistic content

A list of misogynistic terms, including common pejorative terms of abuse for women, was compiled. Of the full dataset, 38,472 posts contained at least one of the misogynistic terms.
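
As a rough illustration of this keyword-filtering step, the sketch below keeps only posts whose text matches at least one term from a list. The placeholder terms, field names and whole-word matching rule are assumptions made for illustration; HEIA's actual term list and matching rules are not published in this report.

    import re

    # Placeholder terms only; HEIA's actual term list is not published.
    MISOGYNISTIC_TERMS = ["term_a", "term_b", "term_c"]

    # One case-insensitive pattern that matches any listed term as a whole word.
    PATTERN = re.compile(
        r"\b(" + "|".join(re.escape(t) for t in MISOGYNISTIC_TERMS) + r")\b",
        re.IGNORECASE,
    )

    def contains_misogynistic_term(text: str) -> bool:
        """Return True if the post text contains at least one listed term."""
        return bool(PATTERN.search(text))

    def filter_posts(posts: list[dict]) -> list[dict]:
        """Keep posts (assumed to have a 'text' field) that match the term list."""
        return [p for p in posts if contains_misogynistic_term(p.get("text", ""))]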

Categorisation of ‘highly misogynistic’ posts

HEIA conducted a second round of analysis to identify highly misogynistic posts within this sample. HEIA applied three large language models to measure levels of negative emotion, toxicity, and sexually explicit content. Posts meeting a high threshold on all three measures were selected, reducing the dataset to 8,569 posts.
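
The report does not name the models or the exact cut-offs used. The sketch below simply shows the shape of such a filter, assuming each classifier returns a score between 0 and 1 and that a post must exceed a high threshold on all three dimensions; the threshold value and function names are illustrative assumptions.

    from dataclasses import dataclass
    from typing import Callable

    # Hypothetical cut-off; the report does not state HEIA's actual thresholds.
    HIGH_THRESHOLD = 0.8

    @dataclass
    class Scorers:
        # Each scorer is assumed to map post text to a score in [0, 1].
        negative_emotion: Callable[[str], float]
        toxicity: Callable[[str], float]
        sexually_explicit: Callable[[str], float]

    def is_highly_misogynistic(text: str, scorers: Scorers) -> bool:
        """Keep a post only if it meets the threshold on all three dimensions."""
        return (
            scorers.negative_emotion(text) >= HIGH_THRESHOLD
            and scorers.toxicity(text) >= HIGH_THRESHOLD
            and scorers.sexually_explicit(text) >= HIGH_THRESHOLD
        )

    def select_highly_misogynistic(posts: list[dict], scorers: Scorers) -> list[dict]:
        """Filter posts (assumed 'text' field) down to the high-scoring ones."""
        return [p for p in posts if is_highly_misogynistic(p.get("text", ""), scorers)]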

Inter-coder reliability

Initially, three HEIA coders independently categorised 50 posts. To ensure consistency, the research team then met to discuss these 50 posts and reach consensus on how they had been categorised. This helped to cement the parameters and set a high threshold: a post categorised as misogynistic by one coder but not by the others was unlikely to be included. The team defined categories of misogynistic abuse based on the research contained in this report.
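
The consensus rule described above, where a post flagged by only one coder was unlikely to be included, can be approximated with a strict agreement check like the sketch below. The data shapes and the simple percent-agreement measure are illustrative assumptions, not HEIA's actual reliability procedure.

    from collections import Counter

    def consensus_label(labels: list[str]) -> str | None:
        """Return a label only when every coder agrees; otherwise flag for discussion.

        labels holds one category per coder for a single post, e.g.
        ["misogynistic", "misogynistic", "not_misogynistic"].
        """
        label, votes = Counter(labels).most_common(1)[0]
        return label if votes == len(labels) else None

    def percent_agreement(coded_posts: list[list[str]]) -> float:
        """Share of posts on which all coders assigned the same label."""
        if not coded_posts:
            return 0.0
        agreed = sum(1 for labels in coded_posts if consensus_label(labels) is not None)
        return agreed / len(coded_posts)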

Categorisation of ‘extreme misogynistic’ posts

The research team manually reviewed all 8,569 posts. Posts not meeting a very high threshold of offensive and threatening content were removed, and the remaining 3,109 posts were categorised into 14 distinct categories of misogynistic abuse. These posts are the most misogynistic, toxic, and abusive in the dataset. As such, they represent the ‘worst of the worst’ found in some of the most extreme online forums.

Privacy and ethics

HEIA takes all practical steps to ensure the privacy of research subjects is maintained. All data is anonymised, with personally identifying information removed. The metadata collected from platforms is limited to the minimum level necessary to conduct the research. HEIA does not identify or track the activities of individuals or groups, restricts data collection in line with the policies of the platforms, and only accesses publicly available data. HEIA does not use fake accounts to access closed groups/pages. All data is encrypted and stored on secure AWS servers.
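
As a minimal sketch of what anonymisation of this kind can look like, the example below strips common identifying strings and keeps only a few metadata fields. The regular expressions and field names are illustrative assumptions; the report does not describe HEIA's pipeline at this level of detail.

    import re

    # Illustrative patterns only; a production pipeline would need broader coverage.
    USERNAME = re.compile(r"@\w+")
    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
    PHONE = re.compile(r"\b(?:\+?64|0)[\s-]?\d[\d\s-]{6,}\b")  # rough NZ-style numbers

    def anonymise_text(text: str) -> str:
        """Replace common personally identifying strings with placeholders."""
        text = USERNAME.sub("[USER]", text)
        text = EMAIL.sub("[EMAIL]", text)
        text = PHONE.sub("[PHONE]", text)
        return text

    def minimal_record(post: dict) -> dict:
        """Keep only the fields needed for analysis (assumed schema)."""
        return {
            "platform": post.get("platform"),
            "timestamp": post.get("timestamp"),
            "text": anonymise_text(post.get("text", "")),
        }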

This research method was approved by the University of Auckland Human Participants Ethics Committee on 9 August 2021 for three years (Reference Number UAHPEC22980).

Limitations

We were unable to access data from mainstream platforms such as Meta for the purposes of this content analysis, despite requests. The analysis is therefore based on data from fringe platforms and there are limitations on the extent to which the findings can be generalised.

Key findings

HEIA observed a fluctuating trend in misogynistic posts and usage

Numbers of extreme misogynistic posts varied over time, with noticeable increases that aligned with important real-world events. For example, there was a peak in extreme misogynistic posts around March 2021, which coincided with the announcement of a change to Alert Level 3 lockdown in Auckland.[1] Another significant peak occurred in February and March 2022, which coincided with the Wellington Parliament protests.[2]

Additional spikes were observed from July to October 2022, but these sharply declined by November 2022. During this period, several significant events occurred, making it difficult to associate these events with specific spikes in extreme misogynistic content on these platforms without further investigation and analysis. These offline events included:

  • the detection of the first case of mpox (previously referred to as Monkeypox) in Auckland (9 July)
  • the reopening of the New Zealand borders to all travellers (31 July)
  • the end of New Zealand's COVID-19 Protection Framework, resulting in the removal of most pandemic-related restrictions (12 September)
  • the confirmation of the first community transmission of mpox in New Zealand (6 October).
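
Trends like those described above can be derived by counting extreme posts per calendar month and looking for peaks, as in the sketch below. The ISO-8601 timestamp format and field names are assumptions made for illustration, not HEIA's actual analysis code.

    from collections import Counter
    from datetime import datetime

    def monthly_counts(posts: list[dict]) -> dict[str, int]:
        """Count posts per calendar month, assuming ISO-8601 'timestamp' values."""
        counts: Counter[str] = Counter()
        for post in posts:
            month = datetime.fromisoformat(post["timestamp"]).strftime("%Y-%m")
            counts[month] += 1
        return dict(sorted(counts.items()))

    def peak_months(counts: dict[str, int], top_n: int = 3) -> list[tuple[str, int]]:
        """Return the months with the highest post counts."""
        return sorted(counts.items(), key=lambda item: item[1], reverse=True)[:top_n]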

HEIA found diverse forms of misogynistic abuse

While general misogynistic abuse made up the majority of posts, a substantial proportion fell into categories involving threats, for example threats of rape or violence towards specific individuals, trans people or women in general. Mentions of rape targeting women were often paired with other content such as racist, nationalist and transphobic comments.

HEIA also observed that some posts discussing female politicians and other contentious issues such as abortion, false rape allegations, trans issues, and hate speech regulation exhibited an overtly abusive and derogatory tone, asserting misogynistic concepts of gender roles.

HEIA found that misogyny and other ideologies go together

Within the extreme misogynistic dataset, HEIA identified multiple other ideologies, including racist, anti-government, anti-immigrant, anti-mandate and anti-LGBTQI+ ideologies.

It’s important to note two things here. Firstly, there were relatively few posts of this type within the dataset. Nevertheless, this finding remains significant as it underscores the convergence of multiple potentially harmful ideologies within these online spaces.

Secondly, HEIA's focus on extreme misogynistic content targeting women led to some topics being included while others were excluded. For example, this analysis captured posts relating to abortion, transphobia, rape allegations, and white genocide, all of which were intertwined with misogyny. However, comments regarding men’s rights (i.e. perceived discrimination against men) weren’t categorised as misogynistic as they rarely targeted women.

What does this mean?

In understanding these findings, it’s important to recognise that the rise in the use of derogatory terms and the severity of misogynistic posts point to an alarming persistence of online misogyny. While the analysis focused on specific platforms, the findings have broader implications. They align with results from other studies that highlight similar spikes in problematic content around the same periods of time[3][4][5] and show how extremist content from fringe platforms spills over into mainstream social media.[6][7][8] This convergence highlights the need to address these behaviours across various online spaces.[8]

HEIA’s observations regarding how offline events such as the 2022 Wellington Parliament protests can act as a trigger for hateful content in general are significant and extend to extreme misogynistic hate. This correlation aligns with findings published in PLoS ONE,[9] which highlight how real-world events, such as protests and elections, often prompt increases in various forms of online hate speech.[9][10] For example, race-related hate speech spiked by 250% following the murder of George Floyd and the Black Lives Matter protests, in addition to a more general wave of online hate.

Furthermore, a 2022 Brookings Institution study on gender-based online violence found spikes after pile-ons by prominent media figures. The study, which used Perspective, an analysis tool created by Jigsaw and Google, revealed that when a prominent male media personality targeted a female journalist, there was a significant surge in abusive speech directed at that journalist. This increase in harmful content often persisted for several days before gradually subsiding.
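
For readers unfamiliar with Perspective, the sketch below shows how a single comment might be scored for toxicity through its public REST endpoint. The endpoint, request shape and TOXICITY attribute follow Perspective's documented API, but the API key is a placeholder and this is not the Brookings study's actual code.

    import requests

    # Placeholder key; a real key must be requested through Google Cloud.
    API_KEY = "YOUR_PERSPECTIVE_API_KEY"
    URL = (
        "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
        f"?key={API_KEY}"
    )

    def toxicity_score(text: str) -> float:
        """Return Perspective's TOXICITY probability (0 to 1) for one comment."""
        body = {
            "comment": {"text": text},
            "languages": ["en"],
            "requestedAttributes": {"TOXICITY": {}},
        }
        response = requests.post(URL, json=body, timeout=30)
        response.raise_for_status()
        return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]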

HEIA’s third key observation underscores the connection between misogyny and other ideologies, aligning with findings from various studies.[11][12][13][14] For instance, the ADL highlights the strong link between misogyny and white supremacy.[11] A profound aversion to women serves as a common thread among many violent extremist ideologies such as white supremacy, particularly within the extreme far-right movement and its lesser-known counterparts like incels.[15][16] Misogyny has, in fact, been described as a gateway to, and an early warning system for, violent extremism.[11][17]

Helplines

We understand that this research could be confronting or upsetting for some readers. If you or someone you know needs to talk:

  • Free call or text 1737 any time for support from a trained counsellor.
  • Free call Safe to Talk 0800 044 334 or text 4334 anytime for support about sexual harm.
  • Free call OutLine Aotearoa 0800 688 5463 any evening to talk to trained volunteers from Aotearoa's rainbow communities.