The tool is for “adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent,” Meta said in a blog post on Thursday.
The new platform, which Meta developed together with the UK Revenge Porn Helpline and 50 other NGOs, aims to prevent the publication of ‘revenge porn’, rather than merely removing the sensitive material after it has already appeared online.
Concerned users are asked to submit photos or videos of themselves naked or having sex to a hashing database through the StopNCII.org (Stop Non-Consensual Intimate Images) website.
The hashes, or “digital fingerprints,” are then computed from those materials by the tool and can be used to instantly detect and block attempts by perpetrators to upload them online.
Meta said that the system had been developed “with privacy and security at every step.” Only the hashes are shared with StopNCII.org and the tech platforms participating in the project, while the explicit images and clips never leave the user’s device and remain “securely in the possession of the owner,” the company said.
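The privacy design described above can be illustrated with a minimal sketch. The snippet below uses a plain SHA-256 digest to show the principle of sharing only a fingerprint rather than the file itself; the real StopNCII system reportedly relies on perceptual hashing (so visually similar copies also match), and the function name here is a hypothetical illustration, not the actual API.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that can be shared in place of the image itself.

    Only this short string would leave the user's device; the original
    image bytes never need to be uploaded anywhere.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# The same input always produces the same fingerprint, so a platform
# holding only the digest can still recognise an attempted re-upload
# of an identical file.
local_copy = b"example image bytes"
attempted_upload = b"example image bytes"
print(fingerprint(local_copy) == fingerprint(attempted_upload))
```

Note that an exact-match digest like SHA-256 fails if the file is re-encoded or resized even slightly, which is why production systems of this kind use perceptual hashes instead.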
The new tool represents “a sea-change in the way those affected by intimate image abuse can protect themselves,” Revenge Porn Helpline manager Sophie Mortimer insisted.
But the question remains whether people will actually be willing to use it, considering Meta’s bad rap for mishandling user data.