California wants AI chatbots to warn users that they are not people.


Chatbots may be good at passing for human, but they will have to drop the act if they operate in California. A new bill proposed by state Senator Steve Padilla would require chatbots that interact with children to issue occasional reminders that they are, in fact, machines and not real people.

The bill, SB 243, was introduced as part of an effort to regulate the safeguards that companies operating chatbots must put in place to protect children. Among its provisions, the bill would prohibit companies from offering users "rewards" designed to increase engagement, require companies to report to the state's health care oversight agency how often minors show signs of suicidal ideation, and mandate periodic reminders that the chatbot is AI-generated and not human.

That last provision is especially relevant right now, given the evidence of how risky these systems can be for kids. Last year, a 14-year-old took his own life after developing an emotional connection with a chatbot he accessed through Character.AI, a service for creating chatbots modeled on different pop culture characters. The child's parents have sued Character.AI over the death, accusing the platform of being "unreasonably dangerous" and lacking sufficient safety guardrails despite being marketed to children.

Researchers at the University of Cambridge have found that children are more likely than adults to view AI chatbots as trustworthy, even seeing them as quasi-human. That can put children at significant risk when chatbots respond to their prompts without any protections in place. For instance, researchers managed to get Snapchat's built-in AI to give a hypothetical 13-year-old user advice on how to lie to her parents about meeting up with a 30-year-old and losing her virginity.

There may be benefits for children who feel free to share their feelings with a bot, if it helps them express themselves somewhere they feel safe. But the risk of isolation is real. A little reminder that there is no person on the other side of the conversation could be useful, and intervening in the addictive cycles that tech platforms are so adept at trapping children in, through repeated dopamine hits, is a good starting point. The failure to make those kinds of interventions when social media first took hold is part of how we got here in the first place.

But these protections don't address the root causes that lead children to seek out the support of chatbots in the first place. There is a severe lack of resources that facilitate real-life relationships for kids. Classrooms are overcrowded and underfunded, after-school programs are being cut back, "third places" keep disappearing, and there is a shortage of child psychologists to help kids process everything they're dealing with. It's good to remind kids that chatbots aren't real, but it would be better to put them in situations where they don't feel like they need to talk to the bots in the first place.

