Roblox, Discord, OpenAI, and Google are launching a nonprofit organization called ROOST, or Robust Open Online Safety Tools, which hopes to "build scalable, interoperable safety infrastructure suited for the AI era."
The organization plans to provide free, open-source safety tools to public and private organizations for use on their own platforms, with a particular focus on child safety. The press release notes that ROOST specifically plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Partner companies are providing funding for the tools as well as the technical expertise to build them.
ROOST's operating theory is that access to generative AI is rapidly changing the online landscape, making the need for "reliable and accessible safety infrastructure" all the more urgent. Rather than expecting a small company or organization to build its own safety tools from scratch, ROOST wants to provide them for free.
Child online safety has been the issue du jour since the Children and Teens' Online Privacy Protection Act (COPPA) and the Kids Online Safety Act (KOSA) began making their way through Congress, though both failed to pass in the House. At least some of the companies participating in ROOST, notably Google and OpenAI, have also already pledged to keep their AI tools from being used to generate CSAM.
The child safety issue is especially pressing for Roblox. As of 2020, two-thirds of all US children aged nine to 12 play Roblox, and the platform has historically struggled with child safety. Bloomberg Businessweek reported in 2024 that the company had a "pedophile problem," which prompted some policy changes and new restrictions on children's DMs. ROOST won't make all of these problems disappear, but it should make them easier to tackle for any other organization or company that finds itself in Roblox's position.