TikTok's algorithm leads users from transphobic videos to far-right rabbit holes

After interacting exclusively with transphobic videos on TikTok, Media Matters was fed extremist content by the company’s “For You” page

TikTok’s “For You” page (FYP) recommendation algorithm appears to be leading users down far-right rabbit holes. By analyzing and coding over 400 recommended videos served after interacting solely with transphobic content, Media Matters traced how the algorithm quickly began populating our research account’s FYP with hateful and far-right content.

TikTok has long been scrutinized for its dangerous algorithm, viral misinformation, and hateful video recommendations, and this new research demonstrates just how quickly the company’s recommendation algorithm can radicalize a user’s FYP.

Transphobia is deeply intertwined with other kinds of far-right extremism, and TikTok’s algorithm only reinforces this connection. Our research suggests that transphobia can be a gateway prejudice, leading to further far-right radicalization. 

To assess this phenomenon, Media Matters created a new TikTok account and engaged only with content we identified as transphobic: content from accounts that had posted multiple videos degrading trans people, insisting that there are “only two genders,” or mocking the trans experience. We coded approximately the first 450 videos fed to our FYP. Even though we interacted solely with transphobic content, we found that our FYP was increasingly populated with videos promoting various far-right views and talking points.

That content did include additional transphobic videos, even though such content violates TikTok’s “hateful behavior” community guidelines, which state that the platform does not permit “content that attacks, threatens, incites violence against, or otherwise dehumanizes an individual or group on the basis of” attributes including gender and gender identity.

Key Findings

  • After we interacted with anti-trans content, TikTok’s recommendation algorithm populated our FYP with more transphobic and homophobic videos, as well as other far-right, hateful, and violent content.
  • Exclusive interaction with anti-trans content spurred TikTok to recommend misogynistic content, racist and white supremacist content, anti-vaccine videos, antisemitic content, ableist narratives, conspiracy theories, hate symbols, and videos including general calls to violence.
  • Of the 360 total recommended videos included in our analysis, 103 contained anti-trans and/or homophobic narratives, 42 were misogynistic, 29 contained racist narratives or white supremacist messaging, and 14 endorsed violence.

Trends in TikTok's "For You" page (FYP) narratives following engagement with transphobic content

  • Transphobia and homophobia

    Of the 360 videos analyzed, 103 (28.6%) contained homophobic and/or anti-trans content. The transphobic and homophobic content fed to our FYP began with repetitive “jokes” berating transgender and LGBTQ people, frequently involving a popular meme saying that these individuals are “fatherless.” As we watched and interacted with more videos, the content fed to our FYP became increasingly violent. In one example, a video with over 200,000 views featured a video game gunman shooting and killing characters at a gay pride celebration, with the text: “Don’t mind me. Just doing god’s work rn.” The comments section was filled with users praising these violent actions. The most-liked comment on the video read: “Beautiful now do it in real life.”

    [Screenshot of the homophobic TikTok video]

  • Coded calls to violence

    Many of TikTok’s popular videos combine viral audio with user-created video and text. Several videos fed to our FYP employed this memetic format to convey anti-LGBTQ views, showing how TikTok users exploit the platform’s unique combination of audio, video, and text to create hateful, and sometimes violent, content. While the text alone might not explicitly violate TikTok’s community guidelines, in context these videos appear to be both hateful and dangerous.

    For instance, one video depicted the user pretending to cry at the text on his screen, which read: “50% of transgenders take their own lives. This fills me with so much sadness.” However, the accompanying audio was the line “We’re halfway there” from Bon Jovi’s “Livin’ on a Prayer,” implying that the user wanted the rate of suicides to be even higher.

    [Screenshot of the transphobic TikTok video using the Bon Jovi audio]

    Another video fed to our FYP amassed over 166,000 views and featured a person shaking their head with the accompanying text: “I can’t believe people used to get killed because they were gay.” Although the text alone may not appear problematic, in combination with the video’s audio, the line “Wish we could turn back time, to the good old days” from the Twenty One Pilots song “Stressed Out,” the meaning shifts to imply the user wants to return to a time when rates of violence against LGBTQ people were even higher.

    tiktok_"I can't believe people used to get killed because they are gay"

    In another example, a user followed the same pattern, pairing the Twenty One Pilots audio with overlaid text reading: “I can’t believe that trans people used to be put in mental institutions.” Again, this juxtaposition of image, text, and audio imparts a more sinister meaning than is initially apparent.

    [Screenshot of the transphobic TikTok video using the Twenty One Pilots audio]

  • TedPilled

    Several of the videos fed to our FYP featured audio called “Teddy,” which says, “The industrial revolution and its consequences have been a disaster for the human race,” the first line of “Unabomber” Ted Kaczynski’s manifesto, “Industrial Society and Its Future.” Kaczynski killed three people and injured 23 others with homemade bombs between 1978 and 1995. The audio, which has already been used in over 1,200 unique videos, has become popular on right-wing TikTok, and particularly in anti-LGBTQ content, judging by the frequency with which it emerged during our study. In the videos we reviewed, it was frequently paired with montages of screenshots of LGBTQ people livestreaming on TikTok. These videos not only use an audio that pays homage to a terrorist, but they also promote the harassment of LGBTQ TikTok users.

    [Screenshots of TikTok videos using the “Teddy” audio]

  • Prominent far-right figures

    In 27 of the 360 videos fed to our FYP, users praised prominent right-wing or far-right figures, such as former President Donald Trump, U.K. Prime Minister Boris Johnson, far-right media personality Steven Crowder, far-right and white nationalist political commentator Nick Fuentes, the Daily Wire’s Ben Shapiro, former leader of the British Union of Fascists Oswald Mosley, and Paul Nicholas Miller (a.k.a. GypsyCrusader, a far-right internet personality known for cosplaying as the Joker and promoting white supremacist views on the online chat service Omegle).

  • Hate symbols and dog whistles

    Our account was fed three videos containing far-right dog whistles and seven videos containing hate symbols, including videos and profile pictures featuring black suns, skull masks, and a rising sun flag. Users also included antisemitic dog whistles in their usernames, such as “jewlover_14” and “1400.8000,” which reference the “14 words” and Hitler (H being the eighth letter of the alphabet).

    [Screenshots of TikTok videos containing hate symbols]

  • Racism and white supremacy

    Racist and white supremacist content was not immediately fed to our FYP, but it was increasingly recommended as we reviewed and engaged with more transphobic content. One viral video with 5.2 million views displayed screenshots of Google search results for “three Black teenagers” vs. “three white teenagers.” When these search results first went viral in 2016, they were shared to illustrate the ways that algorithms reinforce racism; the TikTok users sharing them presented the findings differently. The most-liked comments on the video instead perpetuated the hateful and racist narrative that Black people are criminals. The top comment, with 72,200 likes, said, “Well that’s not Google’s fault.”

    [Screenshot of the racist TikTok video about Google search results]

    One user with a black sun in their profile picture responded to an educational video about the racist roots of classical music. The video begins with the educational TikTok declaring “Classical music is racist” but then cuts to the user playing classical music on his computer alongside the hashtags #racist #racism #sigma #chad #alphamale.

    [Screenshot of the TikTok video responding to the classical music clip]

    Twenty-nine of the 360 videos reviewed contained racist themes, including overtly white supremacist content. For instance, one video with nearly 900,000 views from the account “wh1t3prideworldwide” (which uses numbers in place of letters to evade the platform’s ban on white supremacist rhetoric) said, “I love being wh1te… look at my peoples land and what we built #proud #white #colonizer #whiteproud.”

    [Screenshot of the “I love being wh1te” TikTok video]

  • Far-right ideologies

    Within the first 100 videos, the algorithm began to show us content promoting far-right talking points and figures. For example, one video on our FYP was from an account called “lithuanian_nationalistas,” which celebrated Paul Miller, a far-right extremist currently facing up to 30 years in federal prison. Of the 360 total videos reviewed, 27 overtly promoted far-right political ideologies. These videos included celebrations of fascist historical figures such as Oswald Mosley, edits intended to praise Polish nationalism, the far-right slogan “Reject modernity, embrace tradition,” and repeated criticism of “degeneracy” (often referring to LGBTQ people).

    [Screenshots of far-right TikTok videos]

Conclusion

This research illustrates that on TikTok, engaging with anti-trans content alone is enough to turn an individual’s FYP into a cesspool of extremism, hatred, and violence. In many instances, videos containing harmful or violative content expressed more than one form of dangerous rhetoric. For example, videos containing racist rhetoric or tropes frequently also included transphobia, homophobia, misogyny, and/or antisemitism. One video, for instance, celebrated the idea of kids failing their “transgender people of colour history test.” The intersectional nature of these videos means creators can incorporate various strains of bigotry and hate and thus introduce viewers to new, dangerous ideologies.

Other social media companies have struggled with their platforms radicalizing users into the far right, but TikTok’s rapid supply of content appears to expose users to even more hateful content in a fraction of the time it would take on YouTube. When a user opens TikTok, a video immediately starts playing; even the seemingly simple requirement of clicking a post or video on sites like Reddit or YouTube is largely removed from TikTok’s user interface. While nearly 400 may sound like a large number of videos, a user watching videos at an average of 20 seconds each could consume 400 videos in just over two hours. A user could feasibly download the app at breakfast and be fed overtly white supremacist and neo-Nazi content before lunch.
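
To make the viewing-time claim concrete, here is a quick back-of-the-envelope check, a minimal sketch in Python; the 20-second average watch time is the assumption stated above, not a measured figure:

```python
# Back-of-the-envelope check of the viewing-time claim above.
# Assumption (from the text): an average watch time of 20 seconds per video.
videos = 400
seconds_per_video = 20

total_seconds = videos * seconds_per_video      # 8,000 seconds
hours, remainder = divmod(total_seconds, 3600)  # 2 hours, with 800 seconds left over
minutes = remainder // 60                       # 13 minutes

print(f"{videos} videos at {seconds_per_video}s each = {hours}h {minutes}m")
# Output: 400 videos at 20s each = 2h 13m, i.e., "just over two hours"
```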

Methodology

Media Matters created a new TikTok account on a device used solely for this research and engaged only with transphobic content. We initially followed 14 known transphobic accounts and then began watching videos on the platform.

As we were fed videos on our FYP, we engaged only with transphobic videos, followed the creators of this content, and catalogued the hateful narratives subsequently fed to our FYP. In total, we watched and evaluated 498 videos: 64 were sponsored advertisements rather than user-driven content, 35 were removed before we could double-code them, and 39 were viewed while creating the account or while assessing the nature of FYP narratives throughout the review. The remaining 360 videos were evaluated using the following definitions:

  • “Transphobic accounts”: accounts that have posted multiple videos which degrade trans people, insist that there are “only two genders,” and/or mock the trans experience. These videos or accounts also often contain other anti-LGBTQ rhetoric.
  • “Engaging with”: liking videos, following creators, saving sounds, and watching videos in full.

This set of 360 videos was then sorted into 30 different categories. Most relevant for this study were categories related to transphobic and homophobic content, misogyny, violence, the far right, hate symbols, racism, nationalism, white supremacy, dog whistles, and antisemitism. To be assigned a final code, a majority of the three researchers had to agree on the code after a blind review.
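
As a minimal illustration of the quantitative rules above, the Python sketch below checks that the sample counts reconcile and applies the two-of-three majority rule to a set of blind codes. The function name, category labels, and data layout are hypothetical; this is not Media Matters’ actual coding tooling.

```python
from collections import Counter

# Sanity check of the sample accounting described in the methodology:
# 498 total videos minus 64 sponsored, 35 removed, and 39 setup/review
# videos leaves the 360 coded videos analyzed in this study.
assert 498 - 64 - 35 - 39 == 360

def final_code(blind_codes: list[str]) -> str | None:
    """Return the code a majority (at least 2 of 3) of researchers assigned, else None."""
    code, count = Counter(blind_codes).most_common(1)[0]
    return code if count >= 2 else None

# Hypothetical examples; the labels are illustrative, not the study's category names.
print(final_code(["transphobic", "transphobic", "far-right"]))  # -> transphobic
print(final_code(["misogynistic", "racist", "other"]))          # -> None (no majority)
```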

We defined “transphobic” posts as those that attack or mock trans people (including nonbinary people), say there are “only two genders,” or mock or deface the trans flag.

We defined “homophobic” posts as those that attack or mock lesbian, gay, or bisexual people or mock or deface the pride flag.

We defined “violent” posts as those that depict violent imagery or celebrate, encourage, or advocate for past or future acts of violence.

We defined “prominent far-right figures/prominent far-right talking points” posts as those that feature prominent far-right figures such as Ben Shapiro, Alex Jones, Paul Joseph Watson, Steven Crowder, Paul Nicholas Miller, and Nick Fuentes, or content which includes common far-right talking points such as “reject modernity, embrace tradition,” criticism of so-called “degeneracy,” or celebration of “western civilization.”

We defined “hate symbol” posts as those which include symbols of hate such as the swastika, sonnenrad, or rising sun flag.

We defined “misogynistic” posts as those which attack, degrade, or mock women or support anti-feminism or the men’s rights movement.

We defined “dog whistle” posts as those which contain white supremacist and neo-Nazi dog whistles, such as references to the “14 words” or the number “88” used as a veiled “Heil Hitler,” in the video, caption, or username.

We defined “racist/white supremacist” posts as those which attack, degrade, or mock a particular race or ethnic group or support white supremacy by using white supremacist talking points such as “white pride” or “white lives matter.”

We defined “antisemitic” posts as those which attack, degrade, or mock Jewish people, deny the Holocaust, or promote antisemitic conspiracy theories.

We defined “COVID-19 or vaccine misinformation” posts as those which spread false information about the pandemic or the vaccines or advocate against life-saving protective measures.

We defined “sponsored” posts as any video that is fed directly to the FYP by TikTok and is labeled as sponsored content.

We defined “other” content as any that did not fall into any of the aforementioned categories.