How safe is 'Roblox' for kids? Platform brings in real-time AI moderation to make gameplay safer
31 Mar, 2026 / 10:48 AM / ROBLOX

Khaleej Times: Around 5,000 servers are being shut down daily on Roblox as part of a new AI moderation system that analyses entire gameplay scenes in real time. The rollout marks a significant shift in how the platform detects and responds to violations.

The system, called RM3 (Real-time multimodal moderation), moves beyond traditional moderation tools that focus on individual elements like chat messages or in-game assets. Instead, it evaluates a combination of signals: what players are saying, how avatars are behaving, and what’s happening within a scene, all, as the name suggests, in real time.

For years, Roblox has relied heavily on communication-based moderation, such as filtering text and flagging conversations that break its rules. But on a user-generated platform built on dynamic 3D environments, that approach has limitations.

“Moderating 3D environments has always been a challenge,” says Matt Kaufman in a chat with Khaleej Times. “The content is dynamic. People move things around and behaviour changes.”

RM3 builds on that idea by layering multiple forms of real-time analysis. The system periodically captures what players see on screen and runs those snapshots through AI models trained to detect violations. At the same time, it evaluates avatar movement patterns and in-game interactions to identify behaviour that deviates from expected gameplay.
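Roblox has not published RM3’s internals, but the kind of signal fusion described above can be sketched roughly as follows. Everything here, the signal names, weights, and threshold, is hypothetical, purely to illustrate combining per-modality scores into one decision:

```python
from dataclasses import dataclass

@dataclass
class ServerSignals:
    """Hypothetical per-server signals sampled in real time."""
    chat_score: float        # 0-1, from a text-violation classifier
    scene_score: float       # 0-1, from vision models run on screen snapshots
    movement_anomaly: float  # 0-1, deviation from expected avatar behaviour

def violation_score(s: ServerSignals, weights=(0.4, 0.4, 0.2)) -> float:
    """Fuse the per-modality scores into a single weighted score."""
    wc, ws, wm = weights
    return wc * s.chat_score + ws * s.scene_score + wm * s.movement_anomaly

def should_flag(s: ServerSignals, threshold: float = 0.7) -> bool:
    """Flag a server instance once the fused score crosses a threshold."""
    return violation_score(s) >= threshold
```

The point of the sketch is that no single modality decides on its own: a mildly suspicious chat log plus anomalous avatar movement can together cross the threshold even when neither would alone.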

“Normal gameplay follows certain patterns,” Kaufman explains. “When something deviates from that, it can signal behaviour that shouldn’t be happening.”

Large Roblox experiences often run across thousands of servers simultaneously. While most of these operate as intended, issues can arise in isolated instances. Previously, moderation could impact an entire experience, even if only a small portion of it was problematic.

“We’ve identified cases where most of a game is fine, but a small number of servers are problematic,” Kaufman says.

With RM3, Roblox can now target those instances directly by shutting down specific servers where violations occur, while allowing the rest of the experience to continue uninterrupted. The approach reduces disruption for players while maintaining stricter enforcement where needed.
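The per-server targeting described here is, in essence, a partitioning step: act only on the flagged instances and leave the rest of the experience alone. A minimal sketch under that assumption (the function and data shapes are illustrative, not Roblox’s API):

```python
def targeted_shutdown(servers: dict[str, bool]) -> tuple[set[str], set[str]]:
    """Given {server_id: flagged}, shut down only the flagged instances.

    Returns (shut_down, still_running); the unflagged servers keep
    serving players uninterrupted.
    """
    shut_down = {sid for sid, flagged in servers.items() if flagged}
    still_running = set(servers) - shut_down
    return shut_down, still_running

# Example: one bad instance out of three.
shut, running = targeted_shutdown({"srv-1": False, "srv-2": True, "srv-3": False})
```

Here only `srv-2` is taken down, which mirrors the article’s point: enforcement lands on the problematic slice, not on the whole experience.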

Balancing automation with oversight
Introducing real-time, context-aware moderation also raises questions around accuracy and overreach.

To address this, Roblox tested RM3 in what it calls “shadow mode” before deploying it. The system would flag servers without taking action, allowing human reviewers to assess whether those flags were justified.

“We look at both sides,” Kaufman says. “Did the system take something down correctly? And did it miss something it should have caught?”

This evaluation process focuses on both false positives and missed violations, using additional data such as screenshots and abuse reports to refine the system. According to Kaufman, this remains an ongoing process as the algorithm continues to evolve.
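The two sides Kaufman describes map onto standard classifier-evaluation terms: false positives (servers taken down incorrectly) and false negatives (missed violations), with precision and recall as the summary metrics. A hedged sketch of what a shadow-mode comparison against human reviewer labels could look like (all names are hypothetical):

```python
def shadow_mode_report(flags: dict[str, bool], labels: dict[str, bool]) -> dict:
    """Compare automated flags against human reviewer judgments.

    flags:  {server_id: system flagged it}
    labels: {server_id: reviewer confirmed a violation}
    """
    tp = sum(1 for s in flags if flags[s] and labels[s])
    fp = sum(1 for s in flags if flags[s] and not labels[s])      # wrong takedowns
    fn = sum(1 for s in flags if not flags[s] and labels[s])      # missed violations
    return {
        "false_positives": fp,
        "missed_violations": fn,
        "precision": tp / (tp + fp) if tp + fp else 1.0,
        "recall": tp / (tp + fn) if tp + fn else 1.0,
    }
```

Running in shadow mode means the flags are recorded but never acted on, so a report like this can be computed risk-free before enforcement is switched on.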

The rollout of RM3 comes as platforms like Roblox face increasing scrutiny over user safety, particularly given the scale of its younger audience.

Kaufman positions the system as part of a broader effort to maintain a controlled and consistent environment on the platform.

“We want Roblox to be a safe and stable place for people to come together,” he says. “We can’t control what happens outside the platform, but we can control what happens on Roblox.”

While RM3 was not designed specifically for younger users, Kaufman describes it as “age-agnostic,” built to enforce community standards across all users. In practice, however, systems like this are expected to play a key role in protecting younger audiences, who make up a significant portion of the platform.

What more is Roblox doing?
Beyond real-time moderation, Roblox is continuing to invest in other safety measures, including age verification and parental tools.

The company’s global age verification rollout now covers roughly 45 per cent of its daily users, giving Roblox greater confidence in how it manages interactions between different age groups.

“When you don’t know how old users are, it makes certain decisions difficult,” Kaufman says. “As we get more confidence in that, it opens up more opportunities from a safety perspective.”

Parental engagement is another area of focus. Rather than relying on a single control or setting, Roblox is working to make it easier for parents to understand and manage their children’s activity on the platform.

“It’s not about one setting,” Kaufman says. “It’s about having that dialogue.”

As a platform driven by user-generated content, Roblox places responsibility on both itself and its developer community; Kaufman calls it a “shared responsibility”. Developers are expected to follow platform guidelines and make adjustments if their experiences begin attracting problematic behaviour, even if they are technically compliant.

At the same time, Roblox continues to provide tools to help creators monitor and manage their own environments.

Kaufman also affirms that there is no single challenge he would call out as the most important one. Communication moderation, age verification, and systems like RM3 all work together. “They’re equally important,” he says. “And we continue to invest in all of them.”