Meta has admitted to CNBC that Instagram is experiencing an error that is flooding users' accounts with videos that would not normally surface through its algorithms. "We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended," the company told the news organization. "We apologize for the mistake." Users have taken to social media to ask whether others have also recently been flooded with Reels containing violent and sexual themes. One user on Reddit said their Reels page was filled with school shootings and murders.
Others said they were getting back-to-back videos showing shock content, beheadings, castration, nudity, uncensored porn and outright rape. Some said they continued to see similar videos even after enabling their sensitive content controls. Social media algorithms are designed to show you videos and other content similar to what you usually watch, read, like or interact with. In this case, however, Instagram served graphic videos even to people who had not interacted with similar Reels, and sometimes even after a user had taken the time to tap "Not interested" on a Reel with violent or sexual content.
A Meta representative did not tell CNBC what the error was, but some of the videos people reported should never have been on Instagram in the first place, based on the company's own policies. "To protect users ... we remove the most graphic content and add warning labels to other graphic content so that people are aware it may be sensitive or disturbing before they click through," the company's policy reads. Meta's rules also state that it removes "real photographs and videos of nudity and sexual activity."