Google is introducing a new online safety feature to help users avoid inadvertently seeing graphically violent or pornographic images while using its search engine. Announced as part of the company's Safer Internet Day event on Tuesday, the new setting, which will be enabled by default for everyone, will automatically blur explicit images that appear in search results, even for users who don't have SafeSearch enabled.
“Unless your account is supervised by a parent, school, or administrator, you will be able to change your SafeSearch setting at any time” after the feature launches in “the coming months,” Google spokesperson Charity Mhende tells The Verge, allowing users to choose whether explicit content is blurred, filtered, or shown in their search results. Parents and guardians can add supervision to the Google accounts of children below the minimum age required to manage their own accounts, which lets them monitor and block access to certain websites or apps.
SafeSearch is already the default for signed-in users under the age of 18, filtering out explicit content such as pornography, violence, and gore when using Google to search for images, videos, and websites. When the blur feature launches, it will appear as a new option within the SafeSearch menu, alongside the filter option, which additionally hides explicit text and links, and the option to disable SafeSearch entirely, which shows the most relevant results without hiding any content.
You can modify your SafeSearch filter by following Google’s instructions, but you’ll have to wait a while before the blur option is rolled out.
Update, February 7th, 1:40PM ET: Google says the blur setting can be disabled by anyone not using a supervised account, not just those over 18 years old, as the company originally told us.