March 30, 2022 04:00 pm GMT

Google applies advanced AI to suicide prevention and reducing graphic search results

[Image: A smartphone showing a Google search bar.]

Google has been rolling out smarter artificial intelligence systems, and it says it is now putting them to work to help keep people safe.

In particular, the search giant shared new information Wednesday about how it is using these advanced systems for suicide and domestic violence prevention, and to make sure people don't see graphic content when that's not what they're looking for.

When people search phrases related to suicide or domestic violence, Google will surface an information box with details about how to seek help. It populates these boxes with phone numbers and other resources that it creates in partnership with local organizations and experts.

But Google found that not all search phrases related to moments of personal crisis are explicit, and many are geographically specific. For example, some places are known as suicide "hot spots." Previously, Google manually tagged searches for known hot spots so that they would surface harm reduction information boxes. Thanks to new AI tools, the search engine can now recognize that a search is related to suicide, and surface these informational boxes, without explicit human direction.

Google gives the example that in Australia, people experiencing suicidal ideation may search for "Sydney suicide hot spots." Google says its new language processing tools allow it to understand that what a person is really looking for here is jumping spots, and that they may be in need of help.
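
Google hasn't published the details of its classifier, but the general idea of inferring crisis intent from an implicit query can be illustrated with off-the-shelf tools. Below is a minimal, hypothetical sketch using Hugging Face's zero-shot classification pipeline; the model choice, candidate labels, and confidence threshold are all illustrative assumptions, not Google's actual system.

```python
# Hypothetical sketch: flagging implicit crisis queries with zero-shot
# classification. This is NOT Google's system; the model, labels, and
# threshold below are illustrative assumptions.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

queries = [
    "Sydney suicide hot spots",        # implicit crisis query
    "best picnic spots near Sydney",   # ordinary travel query
]

labels = ["suicide crisis", "travel information", "general question"]

for query in queries:
    result = classifier(query, candidate_labels=labels)
    top_label, top_score = result["labels"][0], result["scores"][0]
    # Surface a help box only when the crisis label wins with high confidence.
    if top_label == "suicide crisis" and top_score > 0.7:
        print(f"{query!r} -> show crisis help box ({top_score:.2f})")
    else:
        print(f"{query!r} -> normal results ({top_label}, {top_score:.2f})")
```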

"Not all crisis language is obvious, particularly across languages and cultures," Anne Merritt, a Google product manager who worked on the harm reduction project, said.

Similarly, long, complex searches about relationships can sometimes contain information suggesting a person is being abused. Previous systems might have had trouble identifying that crucial piece of information amid other noise. But Google says its new MUM model is more adept at understanding long queries, so it can surface domestic violence information boxes in these instances.

[Image: A smartphone screen showing an information box that will surface if MUM detects a person might be experiencing abuse. Credit: Google]

Google's other innovation is making sure users don't accidentally stumble across graphic content when that's not what they're looking for. Even when the SafeSearch setting is not enabled, Google says putting a new AI system to work on this challenge has reduced graphic search results by 30 percent.

To demonstrate the technique, Google gave the example of a person searching for a music video. Many music videos contain, well, plenty of nudity or partial nudity. So Google will opt to show videos that don't contain graphic content like nudity or violence, unless that's what a person is explicitly looking for. This might sound a bit prudish, and potentially unfair to musicians who include nudity or other graphic content as part of their art. But as the arbiter of search, Google has apparently chosen to be safe rather than sorry (lest it accidentally traumatize someone).
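
The article doesn't describe the exact mechanics, but the underlying rule it describes, filtering out results flagged as graphic unless the query explicitly asks for them, can be sketched in a few lines. Everything here (the explicit-intent terms, the result format, the "graphic" flag) is a made-up stand-in for Google's internal signals.

```python
# Hypothetical sketch of the rule described above: hide results flagged as
# graphic unless the query explicitly asks for them. The intent terms and
# the "graphic" flag are placeholders, not Google's actual signals.
EXPLICIT_TERMS = {"explicit", "uncensored", "nsfw"}

def filter_results(query: str, results: list[dict]) -> list[dict]:
    """Drop graphic results unless the query signals explicit intent."""
    wants_explicit = any(term in query.lower() for term in EXPLICIT_TERMS)
    if wants_explicit:
        return results
    return [r for r in results if not r.get("graphic", False)]

results = [
    {"title": "Music video (clean edit)", "graphic": False},
    {"title": "Music video (uncensored)", "graphic": True},
]
print(filter_results("band music video", results))             # clean edit only
print(filter_results("band music video uncensored", results))  # both results
```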

"We really want to be very clear that a user is seeking something out before we return it," Emma Higham, a product manager who works on safe searches, said.

The new AI engines powering these changes are called MUM and BERT. The newer of the two, MUM, stands for Multitask Unified Model. Google says MUM is better at understanding the intention behind a person's search, so it gives more nuanced answers than older models could provide. It is also trained in 75 languages, so it can answer questions using information from sources written in languages other than the one a person is searching in. Google will be employing MUM for its crisis prevention efforts "in the coming weeks."

Google says MUM is 1,000 times more powerful than its second newest search innovation, BERT, which stands for Bidirectional Encoder Representations from Transformers. But don't underestimate BERT: it understands words in the context of the words that surround them, not just by their standalone meanings. That's what makes it an effective tool for reducing unwanted graphic content in search results.
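
That contextual behavior is easy to see with the publicly released BERT checkpoint. In the sketch below (which uses the open bert-base-uncased model, not Google's production setup), the same word gets noticeably different vectors depending on the sentence around it.

```python
# Sketch: the same word gets different BERT embeddings in different contexts.
# Uses the public bert-base-uncased checkpoint, not Google's production models.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

v1 = word_vector("the photographer framed a beautiful shot", "shot")
v2 = word_vector("the victim was shot during the robbery", "shot")

# A lower similarity means BERT encoded the word differently in each context.
sim = torch.cosine_similarity(v1, v2, dim=0).item()
print(f"cosine similarity between the two uses of 'shot': {sim:.2f}")
```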

An information box and cleaner search results can stem only so much of the tide of stress and trauma that comes with daily life, especially on the internet these days. But that's all the more reason for Big Tech to invest in technological tools with applications like these.


Original Link: https://mashable.com/article/google-suicide-domestic-violence-ai-prevention
