Scary "Safe" TikTok Changes: Who's Being Silenced?

  • Writer: McKenna Cupidro
  • Jun 10
  • 3 min read


TikTok’s algorithm is evolving toward an SEO-based ranking system.



Recent updates to TikTok’s algorithm, pointed out by Metricool’s blog, show a clear shift. Content is now being prioritized based on search terms, not just watch time or engagement. This marks a move toward an SEO-style content discovery model, which rewards creators who align their content with what users are actively searching for.


In theory, this might sound like a good thing: it could help users find exactly what they’re looking for. But in practice?


Let’s break it down:


  • TikTok is now prioritizing content based on search terms, not just watch time or engagement.

  • It pushes videos that align with trending or “safe” topics more than controversial or niche ones.

  • This algorithmic shift limits the reach of advocacy, political critique, or non-mainstream voices, especially if they don’t match monetizable or brand-safe themes.


What this means:


The platform now promotes videos on popular or “safe” topics over controversial or niche subjects. This change reduces the visibility of advocacy, political critique, and non-mainstream voices, particularly those that don’t fit brand-friendly or monetizable themes. Creators who address systemic injustice, marginalized identities, or political accountability may see their content buried or restricted under the new updates, no matter how carefully they express their ideas or how accurate and well-intentioned they are.


So, is that what we really see?


Please check out this Instagram post from Vote in Or Out, which highlights the alarm running through its comment section.


[Screenshot: Instagram post about TikTok concerns]


What we see in the screenshot and video doesn’t match those new guidelines now, does it?

  • Creators and users (like @voteinorout) are experiencing suppressed visibility or outright blocks on harmless words.

  • Some users are reporting reduced engagement, even when their content is non-harmful, often affecting liberal, progressive, or activist accounts.


What we are seeing is a form of algorithmic silencing. It fits into a larger pattern of platform governance under political influence, where tech companies align with or respond to government pressure, in this case the Trump administration’s agenda.

Both the algorithm article and the comments offer a live example of platform governance in action. TikTok and Instagram are shifting from engagement-first platforms to policy-influenced ecosystems. We’re watching what happens when government pressure and algorithmic power collide, especially in a political climate where leadership is becoming increasingly unapologetic.




A Pattern of Political Influence in Platform Governance


This issue isn't isolated. It aligns with a broader trend in which tech platforms respond quietly to government pressure, especially under the Trump administration’s renewed influence.




Here’s what that pattern looks like:


  • Targeting dissenting or marginalized voices, including liberal, LGBTQ+, disabled, and BIPOC creators

  • Promoting “anti-woke” rhetoric under the guise of "neutrality" or "fairness"

  • Introducing moderation tools and AI filters that flag content aligned with social justice movements while allowing inflammatory or bigoted content to remain visible

  • Encouraging tech companies to enforce vague standards of “bias” or “safety” that disproportionately impact progressive voices


Why This Matters


If it wasn’t obvious already, this raises major questions:


  • Who decides what’s “harmful” or “inappropriate”?

  • How are moderation tools being influenced politically?

  • Is shadowbanning a hidden form of censorship, filtering which beliefs reach us?


This isn’t just about TikTok being banned or possible AI bots on Instagram. It’s about the erosion of free expression in digital spaces under the pressure of politics, profit, and a false sense of neutrality. TikTok and Instagram are no longer just engagement-driven platforms.


They’re becoming gatekeepers of discourse, selectively amplifying what fits a certain narrative: a narrative, I remind you, shaped by power, not people. What we’re watching unfold across algorithms, flagging systems, and “invisible” blocks is not just censorship.


It’s a quiet recalibration of truth: a subtle adjustment of what we’re allowed to see and believe, unfolding so gradually that most of us never notice our perceptions being reshaped.


And if we don’t call it out now? Our political landscape will only be further divided.






