This time last year, Google Images, the site most people use to find photos on the Internet, could expose children to countless graphic hardcore pornography images in less than a second, even for innocent or educational searches. Searches for basic anatomical terms did not yield scientific drawings but instead returned endless pages of hardcore pornographic images and links. Innocent phrases like “happy black teens” returned results depicting sexual abuse and torture. Tragically, we’ve even heard of school children being exposed to degrading and graphic sexual content through Google Images while doing research for school.
If someone typed in “sex” with SafeSearch turned OFF (something countless kids likely do to learn more about a subject the adults in their lives rarely discuss), they could scroll for a long time without seeing a single scientific image or drawing. Instead, they were met with thousands of hardcore pornography images, with penetration clearly visible and a focus on the genitalia. A number of the images depicted what looked like gangbangs and came directly from hardcore pornography websites.
Over the years, we have brought these concerns to Google. However, when we realized last year that the amount and extremity of the images had become much worse, we made a big push for Google to improve their policies and algorithms. In meetings, Google said there was nothing they could do about it.
So we kept bringing it up, persistently, and we shared stories and examples from many of you. This week, we are happy to announce that they have made significant improvements!
Now Google Images returns educational drawings for anatomical search terms. Even sexualized terms no longer yield hardcore pornography; they return Cosmo-like images with no nudity. And even when someone types in a graphic pornographic term, the Google Images results contain dramatically less hardcore pornography.
It is important to note that Google Images will still yield many hypersexualized images, including some with bare breasts, and search terms frequently used in porn will still yield some hardcore images as well. This is why it is EXTREMELY IMPORTANT that children be monitored whenever they are online, by an adult and preferably a parent.
Google also provides a tool called Google SafeSearch that dramatically reduces accidental stumbling into these places (here’s how to turn it on), and a third-party filter is also highly recommended (here are some we think are great).
We still have more we’re working on with Google, including further improvements to Google Images, but we are glad to report this big step forward thanks to the help of thousands of concerned citizens.
Think of how many kids searching innocently will no longer be thrown into the darkest parts of humanity in a fraction of a second.
Please consider these four actions right now:
- Sign this petition to thank Google for these improvements
- Send an email to Google to let them know they still need to do more!
- Help spread the word to families around you about the need for diligence in online safety. We have many resources here through the CESE Safeguard Alliance.
- Consider a gift of financial support. Without your generosity, we would not be able to represent your interests with global corporations like Google.
We know that this will prevent many people, particularly school children, from being unintentionally exposed to pornography.
We’re glad to report this important step forward in our ongoing quest to keep children safe online.
Thank you for being diligent and watchful on this matter and for joining with us where you can! We feel this is a huge improvement that will prevent untold numbers of kids from accidentally stumbling into the most depraved and abusive environments online through innocent searches.