Privacy experts who spoke to WIRED described Rumble, Quora, and WeChat as unusual suspects but declined to speculate on the rationale for including them in the investigation. Josh Golin, executive director of the nonprofit Fairplay, which advocates for children's digital safety, said the concerns are not always clear-cut. Few advocacy groups worried about Pinterest, he said, until the case of a British teenager who died of self-harm after being exposed to sensitive content on the platform.
Paxton's press release last month called his new investigation "an important step toward ensuring that social media and AI companies comply with our laws designed to protect children from being exploited and harmed."
The US Congress has never passed a comprehensive privacy law, and it has not significantly updated online safety rules for children in a quarter of a century. That has left state lawmakers and regulators playing an outsized role.
Paxton's investigation focuses on compliance with Texas' Securing Children Online through Parental Empowerment, or SCOPE, Act, which took effect in September. It applies to any website or app that offers chat or social networking features and registers users under the age of 18, making it broader than federal law, which covers only services aimed at users under 13.
The SCOPE Act requires services to ask users' ages and to give parents or guardians control over children's account settings and data. Companies are also barred from selling information collected about minors without parental permission. In October, Paxton sued TikTok for allegedly violating the law by providing inadequate parental controls and disclosing data without consent. TikTok has denied the allegations.
The investigation announced last month also covers the Texas Data Privacy and Security Act, or TDPSA, which took effect in July and requires parental consent before processing data about users under 13. Paxton's office asked the companies under investigation to detail their compliance with both the SCOPE Act and the TDPSA, according to a legal demand obtained through a public records request.
In total, the companies must answer eight questions by next week, including how many Texas minors they count as users and how they prevent minors from registering with false dates of birth. They must also turn over lists of the parties to whom minors' data is sold or shared. Whether any company has met the demand could not be determined.
Tech industry lobbying groups are challenging the constitutionality of the SCOPE Act in court. In August, they won an initial, partial victory when a federal judge in Austin, Texas, ruled that provisions requiring companies to take steps to prevent minors from seeing content related to self-harm and abuse were too vague.
But even an outright victory would not end tech companies' troubles. Ariel Fox Johnson, an attorney and principal at the consulting firm Digital Smarts Law & Policy, said states including Maryland and New York are expected to begin enforcing similar laws later this year. And state attorneys general can still pursue narrower cases under their tried-and-true laws against deceptive business practices. "What we see is that information is often shared, sold, or disclosed in ways that families do not expect or understand," Johnson said. "As more and more laws are introduced that create firm requirements, it seems increasingly clear that not everyone is complying."