A new online tool aims to give teenagers back some control, helping them take down explicit images and videos of themselves from the internet. The warning typically goes: once you send the photo, you cannot take it back.
But that warning ignores the reality that many teenagers do send explicit photos, and do not consent to those photos being shared more widely.
Called Take It Down, the tool is operated by the National Center for Missing and Exploited Children, and funded in part by Meta Platforms, the owner of Facebook and Instagram.
It marks a step to crack down on the spread of “revenge porn” images of teenagers, as well as any image shared without explicit consent.
The site lets anyone anonymously — and without uploading any actual images — create what is essentially a digital fingerprint of the image. This fingerprint (a unique set of numbers called a “hash”) then goes into a database and the tech companies that have agreed to participate in the project remove the images from their services.
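The core idea, hashing an image locally so that only the fingerprint is ever shared, can be sketched in a few lines of Python. This is a minimal illustration, not the service's actual code: the function name `fingerprint` and the choice of the SHA-256 algorithm are assumptions for demonstration, and Take It Down's real hashing scheme may differ.

```python
import hashlib

def fingerprint(image_path):
    # Hash the image bytes locally, on the user's own device.
    # Only this short hex string (the "hash") would be submitted
    # to the shared database; the image itself never leaves the machine.
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()
```

Participating companies can then scan the hashes of images on their own services against that database, removing any matches without ever having seen the reported image.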
Several platforms are participating, including Meta’s Facebook and Instagram, OnlyFans, MindGeek’s Pornhub, and Yubo.
If the image is on another site, or if it is sent on an encrypted platform such as WhatsApp, it will not be taken down.
There are some limitations, though. If someone alters the original image, for instance by cropping it, adding an emoji, or turning it into a meme, it becomes a new image and thus needs a new hash.
Images that are visually similar, such as the same photo with and without an Instagram filter, will have similar hashes, differing in just one character.
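The two behaviors come from two different kinds of hashing. An exact, cryptographic hash changes completely when even one pixel changes, which is why an edited image needs its own entry; a perceptual hash of a visually similar image stays close. Below is a toy sketch contrasting the two, using a deliberately simple "average hash" over a 4x4 grid of brightness values; this is an assumption for illustration, as real matching systems use far more robust perceptual hashes.

```python
import hashlib

def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel
    # is brighter than the image's average brightness.
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return "".join("1" if p > avg else "0" for p in flat)

def hamming(a, b):
    # Number of differing bits between two equal-length bit strings.
    return sum(x != y for x, y in zip(a, b))

# A 4x4 "image" of brightness values, and a lightly edited copy
# (one pixel brightened, as a filter or small sticker might do).
original = [[10, 200, 10, 200],
            [200, 10, 200, 10],
            [10, 200, 10, 200],
            [200, 10, 200, 10]]
edited = [row[:] for row in original]
edited[0][0] += 30

# The exact (cryptographic) hashes of the two images are
# completely different...
exact_differs = (hashlib.sha256(str(original).encode()).hexdigest()
                 != hashlib.sha256(str(edited).encode()).hexdigest())

# ...but their perceptual hashes remain close.
distance = hamming(average_hash(original), average_hash(edited))
```

A matching service comparing perceptual hashes can flag images whose Hamming distance falls below some threshold as likely copies, whereas an exact-hash lookup misses any edit at all.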
Meta’s efforts come nearly a year and a half after Antigone Davis, the company’s global safety director, was grilled by senators about the impact Meta’s apps have on younger users, following an explosive report indicating the company was aware that Facebook-owned Instagram could have a “toxic” effect on teen girls.
Although the company has rolled out a handful of new tools and protections since then, some experts say it has taken too long and more needs to be done.