Meta to integrate ‘Take It Down’ tool to help minors remove explicit images from Facebook, Instagram
A new, free tool backed by Meta and developed by the National Center for Missing & Exploited Children (NCMEC) will help remove sexually explicit images of minors from the internet. Named Take It Down, the tool allows users to anonymously report and remove “partially nude, or sexually explicit photos and videos depicting a child under 18 years old.”
Meta’s Facebook and Instagram have already signed on to integrate the tool into their platforms, joined by OnlyFans, Pornhub, MindGeek, and Yubo. While Take It Down is designed for minors to self-report images and videos of themselves, the tool can also be used by adults who appeared in such media when they were under 18. Parents and other trusted adults can make a report on behalf of a child too.
Take It Down works by assigning a unique digital fingerprint, called a hash value, to specific images and videos. Participating platforms then use these hash values to detect and remove the imagery, without the image or video ever leaving the user’s device and without anyone viewing it.
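The hash-matching flow described above can be sketched roughly in Python. Everything here is illustrative: the function names are invented, and SHA-256 stands in for the fingerprinting step, whereas real systems of this kind typically use perceptual hashes (such as PhotoDNA or PDQ) that survive resizing and re-encoding. The point is the privacy property: only the fingerprint is shared, never the media itself.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    # Hypothetical stand-in: a cryptographic hash of the raw bytes.
    # Production systems use perceptual hashing so that near-duplicates
    # (re-compressed or resized copies) still match.
    return hashlib.sha256(media_bytes).hexdigest()

def report(media_bytes: bytes, shared_hash_list: set) -> None:
    # The reporting happens on the user's device: only the hash value
    # is added to the shared list; the image/video never leaves the device.
    shared_hash_list.add(fingerprint(media_bytes))

def should_block(upload_bytes: bytes, shared_hash_list: set) -> bool:
    # A participating platform checks each incoming upload's fingerprint
    # against the shared hash list, without ever seeing the reported media.
    return fingerprint(upload_bytes) in shared_hash_list

reported_hashes: set = set()
report(b"example-image-bytes", reported_hashes)
print(should_block(b"example-image-bytes", reported_hashes))  # True
print(should_block(b"different-image-bytes", reported_hashes))  # False
```

One consequence of using an exact cryptographic hash in this sketch is that any modification to the file would evade the match, which is exactly why real deployments favor perceptual hashing.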
“Having explicit content online can be scary and very traumatizing, especially for young people,” said Gavin Portnoy, vice president of Communications & Brand at NCMEC. “The adage of ‘you can’t take back what is already out there’ is something we want to change. The past does not define the future and help is available.”
Take It Down is similar to StopNCII, a free tool designed to support victims of non-consensual intimate image (NCII) abuse, which also uses hash values to detect and remove explicit content on platforms like Facebook, Instagram, and Bumble.
Lately, Meta has been upgrading its safety and privacy standards for minors and teens on its platforms. In November, the company announced that it would automatically apply more private settings to teens and minors joining Facebook or Instagram. And last month, it updated its advertising policy on Facebook and Instagram, restricting the data advertisers can use to target teens.