Google Explains How AI Keeps Search Safe Via MUM & BERT

A new blog post from Pandu Nayak, Google Fellow and Vice President of Search, explains how Google Search uses MUM and BERT to serve safer search results.

Highlights:

  • Google is using MUM to better detect when a query indicates that a searcher is in crisis, and will begin rolling out these improvements within weeks.
  • Google uses BERT to improve its understanding of when a searcher is looking for explicit content.
  • Using BERT in this way has reduced “unexpected shocking results” for searchers by 30% in the past year, according to Google.

Google Using MUM to Better Serve Searchers in Personal Crisis

“…people in personal crises search in all kinds of ways, and it’s not always obvious to us that they’re in need. And if we can’t accurately recognize that, we can’t code our systems to show the most helpful search results,” Nayak wrote.

Using machine learning to improve its understanding of language is helping Google more accurately detect when search results should include, for example, the phone numbers of relevant crisis lines.

“MUM can better understand the intent behind people’s questions to detect when a person is in need,” Nayak explained, adding that this helps Google “more reliably show trustworthy and actionable information at the right time.”

Google plans to roll out these improvements in the coming weeks.

Google Has Reduced Shocking Search Results by 30% in the Past Year

Unexpectedly shocking search results are rarely a good experience, and at times they can be distressing or even harmful.

That’s why it’s essential for Google to accurately interpret each searcher’s intent, so that the results they see align with their expectations.

SafeSearch mode lets searchers filter out explicit results. However, there are times when explicit content is exactly what a person is looking for.

“BERT has improved our understanding of whether searches are truly seeking out explicit content, helping us vastly reduce your chances of encountering surprising search results,” Nayak wrote.

Over the past year, using BERT in this way has reduced “unexpected, shocking” results by 30%, he revealed.

According to Nayak, BERT has been “especially effective in reducing explicit content for searches related to ethnicity, sexual orientation and gender, which can disproportionately impact women and especially women of color.”
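
The underlying technique here is query intent classification with a large language model. As a rough, hedged sketch of what that could look like in practice (this is not Google’s implementation; the model checkpoint, labels, and sample queries below are illustrative assumptions), a pretrained model can be used zero-shot to judge whether a query is genuinely seeking explicit content:

```python
# Illustrative sketch only: a zero-shot classifier, built on a publicly
# available NLI model, guesses whether a query is deliberately seeking
# explicit content. The model, labels, and queries are assumptions for
# demonstration and are not Google's systems or data.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

candidate_labels = [
    "deliberately seeking explicit adult content",
    "ordinary informational query",
]

for query in ["best family beaches in florida", "uncensored adult videos"]:
    result = classifier(query, candidate_labels=candidate_labels)
    top_label, top_score = result["labels"][0], result["scores"][0]
    print(f"{query!r} -> {top_label} ({top_score:.2f})")
```

The point of the sketch is simply that a language model can weigh the whole query in context, rather than keying off individual trigger words.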

Google to Use MUM to Scale Spam Fighting in Multiple Languages

Google already uses AI to reduce spam and unhelpful results in many regions.

In the coming months, it will use MUM to scale these safety protections to languages where it has very little training data.

This is possible because, as Nayak explained, “When we train one MUM model to perform a task — like classifying the nature of a query — it learns to do it in all the languages it knows.”
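
To make that idea concrete, the sketch below shows the general pattern of cross-lingual transfer with a multilingual model. It is a simplified, hypothetical stand-in (the encoder checkpoint, queries, and labels are assumptions), not Google’s spam-fighting system:

```python
# Illustrative sketch only: train a simple "unhelpful query" classifier on
# English-labeled examples, then apply it to other languages via a public
# multilingual sentence encoder. The model, data, and labels are assumptions
# for demonstration, not Google's MUM setup.
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Multilingual encoder: maps queries in dozens of languages into one shared
# embedding space, which is what makes cross-lingual transfer possible.
encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Tiny English-only training set (1 = spammy/unhelpful, 0 = ordinary query).
train_queries = [
    "free followers instantly click here",
    "cheap pills no prescription fast",
    "how to bake sourdough bread",
    "weather in seattle this weekend",
]
train_labels = [1, 1, 0, 0]

clf = LogisticRegression().fit(encoder.encode(train_queries), train_labels)

# The same classifier now scores queries in languages it never saw labels for.
test_queries = [
    "pastillas baratas sin receta",   # Spanish
    "wie backe ich sauerteigbrot",    # German
]
print(clf.predict(encoder.encode(test_queries)))
```

Because every language shares one representation space, labels collected in one language can carry over to others, which is the property Nayak’s quote describes.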

Google assured searchers that these latest changes have been, and will continue to be, tested rigorously, including evaluation by human search quality raters.

Google promoted the blog post via its @SearchLiaison Twitter account.

Image source: Shutterstock/metamorworks