Agra Teen’s ‘Fake Suicide’ Video Triggers Meta AI Alert: Here’s How The System Detects Self-Harm Content


The girl later admitted her mistake and was counselled by the police, who warned her against posting misleading content related to self-harm.

The incident took place in Agra’s Fatehpur Sikri. (Representative image)

To gain more followers, a 17-year-old girl in Agra’s Fatehpur Sikri filmed a ‘fake’ suicide attempt and posted the video on social media. On Thursday, the clip triggered an alert from Meta’s AI monitoring system, after which police reached her home.

According to police, the clip showed the teenager drinking liquid from a bottle and then collapsing. Meta’s AI system flagged the clip as a potential suicide attempt and immediately alerted the authorities concerned.

Soon after, the social media cell alerted local police, who traced the girl’s location. They found that the video was staged and the girl was unharmed; she had drunk only water, not any poisonous substance. After she admitted her mistake, the police counselled her and warned her against posting misleading self-harm content on social media.

How Does Meta AI Work?

Meta AI and other safety technologies on Facebook and Instagram identify potential crimes and self-harm content through a combination of proactive artificial-intelligence monitoring, machine learning and human review. When these systems detect imminent risk or illegal activity, they take actions ranging from displaying helpline resources to notifying law enforcement.

Providing Resources

When a person expresses suicidal thoughts, getting them help as quickly as possible becomes critical. The suicide prevention resources available on Facebook and Instagram were developed with leading mental health organisations, with input from people who have personal experience of these issues.

With the help of machine learning, Meta has expanded its ability to identify possible suicide or self-injury content, and in several countries it uses this technology to get timely help to those in need.

The technology uses pattern-recognition signals, such as phrases in a post or concerned comments from friends, to identify possible distress.
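Meta does not publish its classifiers, which are trained machine-learning models rather than keyword lists. Still, the pattern-recognition idea can be illustrated with a toy sketch; every phrase and weight below is hypothetical and chosen purely for illustration.

```python
# Toy illustration of phrase-based distress signalling.
# All phrases and weights are hypothetical, NOT Meta's actual system.

# Hypothetical signal phrases in the post itself, with illustrative weights.
DISTRESS_PHRASES = {
    "want to end it": 0.9,
    "can't go on": 0.7,
    "goodbye everyone": 0.5,
}

# Hypothetical signals in comments from concerned viewers.
CONCERN_COMMENTS = {
    "are you ok": 0.4,
    "please don't": 0.6,
    "call someone": 0.5,
}

def risk_score(post_text: str, comments: list[str]) -> float:
    """Combine signals from the post and from viewers' comments into one score."""
    text = post_text.lower()
    score = sum(w for phrase, w in DISTRESS_PHRASES.items() if phrase in text)
    for comment in comments:
        c = comment.lower()
        score += sum(w for phrase, w in CONCERN_COMMENTS.items() if phrase in c)
    return min(score, 1.0)  # cap at 1.0 for a normalised signal

def should_flag(post_text: str, comments: list[str], threshold: float = 0.6) -> bool:
    """Flag the content for review when the combined signal crosses a threshold."""
    return risk_score(post_text, comments) >= threshold

print(should_flag("Goodbye everyone, I can't go on", ["Are you OK?? please don't"]))
# A real system would feed such features into a trained classifier instead.
```

In production such signals are inputs to a learned model, not a hand-written lookup; the sketch only shows how post text and concerned comments can jointly raise a risk signal.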

“We use artificial intelligence to prioritise the order that our team reviews reported posts, videos and live streams. This ensures that we can efficiently enforce our policies and get resources to people quickly. It also lets our reviewers prioritise and evaluate urgent posts, contacting emergency services when members of our community might be at risk of harm. Speed is critical,” according to Meta.

The content is then escalated to the Community Operations team, which decides whether it violates Meta’s policies and whether to recommend contacting local emergency responders.
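The triage Meta describes, where AI assigns priority so reviewers see the most urgent reports first and the highest-risk cases can be escalated to emergency responders, amounts to a priority queue over risk scores. A minimal sketch follows; the score ranges, threshold and decision strings are assumptions for illustration, not Meta’s actual implementation.

```python
import heapq

def build_review_queue(reports):
    """reports: list of (risk_score, post_id) pairs.
    heapq is a min-heap, so risk is negated to pop the highest risk first."""
    heap = [(-risk, post_id) for risk, post_id in reports]
    heapq.heapify(heap)
    return heap

def next_report(heap):
    """Pop the most urgent report (highest hypothetical AI-assigned risk)."""
    neg_risk, post_id = heapq.heappop(heap)
    return -neg_risk, post_id

def review(risk, post_id, escalate_threshold=0.8):
    """Hypothetical reviewer decision: very high scores trigger a
    recommendation to contact emergency responders."""
    if risk >= escalate_threshold:
        return f"{post_id}: recommend contacting local emergency responders"
    return f"{post_id}: standard policy review"

queue = build_review_queue([(0.35, "post-a"), (0.92, "post-b"), (0.61, "post-c")])
while queue:
    print(review(*next_report(queue)))
# post-b (0.92) is reviewed first and escalated; the others follow in risk order.
```

Ordering by score rather than by arrival time is the point of the quoted passage: when “speed is critical”, the queue guarantees the riskiest report is never stuck behind routine ones.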

How to Get Immediate Help

Meta’s technology for identifying possible suicide and self-injury content is integrated into Facebook and Instagram posts, as well as Facebook Live and Instagram Live.

If somebody appears to be considering self-harm during a live video, viewers can contact the person directly or report the broadcast.

Once a report is made, a member of the Community Operations team reviews it.

“In serious cases, we work with emergency services to conduct a wellness check. Thanks to Meta technology, we’ve helped first responders quickly reach people in distress,” Meta stated.

DISCLAIMER: If you or someone you know needs help, call any of these helplines: Aasra (Mumbai) 022-27546669, Sneha (Chennai) 044-24640050, Sumaitri (Delhi) 011-23389090, Cooj (Goa) 0832-2252525, Jeevan (Jamshedpur) 065-76453841, Pratheeksha (Kochi) 048-42448830, Maithri (Kochi) 0484-2540530, Roshni (Hyderabad) 040-66202000, Lifeline (Kolkata) 033-64643267.
