Roblox is in hot water again, and this time, it’s over user-made games referencing the assassination of political commentator Charlie Kirk.
Over the past 24 hours, Roblox moderators have been scrambling to remove a wave of “experiences” (Roblox’s term for user-created games) that depicted or referenced Kirk’s killing. These weren’t just simple text memes: some included playable simulations of the shooting, avatars designed to look like Kirk, and other shock-value content that violated Roblox’s rules on real-world violence and exploitation of tragedies.
Roblox confirmed it had removed more than 100 of these experiences in a single day, beginning late on September 11, 2025.
“We removed dozens of in-game experiences in the last 24 hours after seeing violative content around Charlie Kirk,” a Roblox spokesperson said in a statement.
The removals happened fast, but not before the issue got political. Rep. Anna Paulina Luna posted on X on September 11, tagging Roblox and threatening to push Apple to remove the Roblox app from the App Store, and even to contact the FCC to take down Roblox’s site, if the content wasn’t dealt with immediately.
A few hours later, she posted an update saying Roblox (and other platforms like TikTok) had “fully cooperated” and removed the games.
The situation also lit up X overnight. Some users blasted Roblox for failing to catch the content before it went viral, saying kids were being exposed to violent, real-world tragedy roleplay. Screenshots of Kirk avatars in unrelated games spread quickly, with some calling for Roblox to be “canceled.”
Others pushed back, arguing that Roblox’s automated moderation simply can’t predict or instantly catch content tied to breaking news. The platform has millions of active developers publishing new games every day, which means something shocking can pop up and be discovered by players before moderators even know it exists.
This isn’t the first time Roblox has faced backlash over moderation slip-ups. The company has spent the past few years tightening its safety systems, including stricter chat filters for under-13 players, but critics say moments like this show there’s still a gap between Roblox’s promises and what players actually see in-game.
And honestly, they have a point. Roblox deserves credit for pulling the content fast once it was flagged, but the fact that players could stumble into assassination roleplay on what’s supposed to be a kid-friendly platform is alarming.
This is one of those situations where both sides are kind of right. Roblox’s moderation team was fast once the issue blew up, but for parents and players, the damage was already done. The idea of kids loading into a game and seeing something tied to a real-world murder is… unsettling, to say the least.
It raises a bigger question: can a platform with this much user-generated content ever truly be “safe” in real-time? Or are these occasional flare-ups just the price of giving millions of people the freedom to create?
Either way, Roblox is going to have to keep proving to parents that it’s on top of situations like this, because one high-profile incident can undo a lot of trust, fast.