A new blog post from Pandu Nayak, Google Fellow and Vice President of Search, explains how Google Search uses MUM and BERT to serve safer search results.
- Google is using MUM to better detect when a query indicates a searcher is in crisis, and will begin rolling out these improvements within weeks.
- Google uses BERT to improve its understanding of when a searcher is looking for explicit content.
- Using BERT in this way has reduced “unexpected shocking results” for searchers by 30% in the past year, according to Google.
Google Using MUM to Better Serve Searchers in Personal Crisis
“…people in personal crises search in all kinds of ways, and it’s not always obvious to us that they’re in need. And if we can’t accurately recognize that, we can’t code our systems to show the most helpful search results,” Nayak wrote.
Using machine learning to improve its understanding of language is helping Google to more accurately detect when search results should include the phone numbers of relevant crisis lines, for example.
“MUM can better understand the intent behind people’s questions to detect when a person is in need,” Nayak explained, adding that this helps Google “more reliably show trustworthy and actionable information at the right time.”
Google plans to roll out these improvements in the coming weeks.
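As a rough illustration of the flow Nayak describes — a learned intent signal gating when crisis resources appear above ordinary results — here is a toy sketch. The scoring function, threshold, and result format are invented for illustration; Google's actual system relies on a learned language model (MUM), not keyword matching.

```python
# Hypothetical sketch of intent-gated crisis resources.
# crisis_score() is a keyword stand-in for a trained model's output.
def crisis_score(query: str) -> float:
    signals = {"help", "crisis", "hotline", "hurt myself"}
    q = query.lower()
    return 1.0 if any(s in q for s in signals) else 0.0

def search_results(query: str) -> list:
    results = [f"web result for: {query}"]
    if crisis_score(query) >= 0.5:
        # Surface trustworthy, actionable information first.
        results.insert(0, "Crisis line: 988 (Suicide & Crisis Lifeline)")
    return results
```

The point of the gate is that ordinary queries are unaffected, while queries the model scores as crisis-related get trusted resources pinned to the top.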
Google Has Reduced Shocking Search Results by 30% in the Past Year
Unexpected search results are rarely a good experience – and sometimes, they can be harmful and cause distress.
That’s why it’s essential for Google to accurately interpret each searcher’s intent, so that the results they’re shown align with their expectations.
SafeSearch mode enables searchers to filter out explicit results. However, explicit content is sometimes exactly what a person is searching for.
“BERT has improved our understanding of whether searches are truly seeking out explicit content, helping us vastly reduce your chances of encountering surprising search results,” Nayak wrote.
Over the past year, using BERT in this way has reduced “unexpected, shocking” results by 30%, he revealed.
According to Nayak, BERT has been “especially effective in reducing explicit content for searches related to ethnicity, sexual orientation and gender, which can disproportionately impact women and especially women of color.”
Google to Use MUM to Scale Spam Fighting in Multiple Languages
Google already uses AI to reduce spam and unhelpful results in many regions and languages.
In the coming months, it will put MUM to work to scale these safety measures to languages where it has very little training data.
This is possible because, as Nayak explained, “When we train one MUM model to perform a task — like classifying the nature of a query — it learns to do it in all the languages it knows.”
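The cross-lingual transfer Nayak describes can be illustrated with a toy sketch: a classifier "trained" only on English queries still works on Spanish ones, because both languages are mapped into a shared representation first. Here a tiny hand-made bilingual lexicon stands in for MUM's learned multilingual embeddings; all names and examples are invented for illustration.

```python
# Toy cross-lingual transfer: words from any language map to shared
# concept IDs, so a classifier trained in one language transfers to others.
LEXICON = {  # word -> shared concept id (stand-in for multilingual embeddings)
    "buy": "PURCHASE", "comprar": "PURCHASE",
    "cheap": "PRICE", "barato": "PRICE",
    "help": "AID", "ayuda": "AID",
}

def featurize(query: str) -> frozenset:
    return frozenset(LEXICON[w] for w in query.lower().split() if w in LEXICON)

# "Train" on English examples only.
TRAIN = [("buy cheap shoes", "commercial"), ("help me now", "support")]
MODEL = {featurize(q): label for q, label in TRAIN}

def classify(query: str) -> str:
    feats = featurize(query)
    for trained_feats, label in MODEL.items():
        if trained_feats and trained_feats <= feats:
            return label
    return "unknown"

# A Spanish query is classified without any Spanish training data,
# e.g. classify("comprar zapatos barato") matches the "commercial" pattern.
```

Because the training examples and the new queries meet in the same concept space, nothing language-specific needs retraining — which is the property that lets one MUM model apply a safety classifier across dozens of languages.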
Google assured searchers that these latest changes have been, and will continue to be, tested rigorously, including assessment by human search quality raters.
Google promoted the post via its @SearchLiaison account:
Learn more about how AI systems like BERT and MUM are helping and will help Google Search to better provide people with personal crisis support information, as well as further reduce chances of getting unexpected explicit content or spam in our results: https://t.co/TMOIgKvx9e
— Google SearchLiaison (@searchliaison) March 30, 2022
Image source: Shutterstock/metamorworks