These personal experiences tell us two things: TikTok suppresses content from creators in marginalized communities, and those communities have found and developed ways to resist the algorithm's negative effects.

Of course, if TikTok's algorithm had simply appeared one day, each of these instances of suppression could be written off as randomness or coincidence. But the algorithm did not simply appear one day. It was written by humans, and humans have biases. Those biases translated, whether consciously or unconsciously, from their minds into the code they wrote, resulting in the censorship of marginalized communities.

Thomas Mullaney introduces the collaborative book Your Computer Is On Fire by reminding readers,

“Every single thing that ‘happens online,’ ‘virtually,’ and ‘autonomously’ happens offline first – and often involves human beings whose labor is kept deliberately invisible. Everything is IRL. Nothing is virtual” (Mullaney 6).

It would be naive to think that a social media platform created by flawed humans would be flawless. We often imagine that increasing our presence online will bring us closer to a utopia where we can escape the human divisions that separate us. In reality, we are flooding the internet with our own prejudices and bigotry. We are coding injustice and oppression into the fabric of the internet.

Furthermore, TikTok's stated justification for removing or muting content is its Community Guidelines. In a study comparing the community guidelines of the most popular social media platforms, TikTok ranked in the middle, with 48 topics covered. Its guidelines address content concerning human trafficking, hate speech, and violence in order to shield viewers from encountering it. However, if TikTok's goal is to protect its users, and if promoting diversity is truly the priority it claims, then the platform has failed miserably; by censoring content from creators in marginalized communities, it has actively gone against its own word and its own guidelines.

One of the areas that TikTok's Community Guidelines address is "Hateful Behavior." According to the website, TikTok claims to protect users from attacks based on certain attributes, including:

  • Race 
  • Ethnicity
  • National origin 
  • Religion
  • Caste 
  • Sexual orientation
  • Sex
  • Gender
  • Gender identity
  • Serious disease
  • Disability
  • Immigration status

Based on the personal testimonies of TikTok users and creators, we have answered the question of whether TikTok suppresses content from minority communities. Participants in both studies make it clear that TikTok favors majority creators and makes it harder for creators of color and those in the LGBTQ+ community to reach viewers.

According to TikTok's own guidelines, these videos should be protected in order to promote diversity and inclusion: they come from creators with attributes explicitly listed as protected in the Community Guidelines. Whether silencing can be considered a form of hate speech is debatable, but my point here is that TikTok still directly contradicts its own Community Guidelines by failing to foster a diverse and inclusive space. It uses those same guidelines to take down videos and de-platform the very creators it claims to protect. By its own standards, TikTok should be held accountable for its acts of suppression.