Facebook and Instagram Build New Tool to Stop Revenge Porn

Facebook and Instagram, through their parent company Meta, have partnered with the United Kingdom-based nonprofit Revenge Porn Helpline to build a tool designed to prevent the non-consensual spread of intimate images on their platforms.

Facebook has been working on a solution to the spread of non-consensual intimate images, often referred to as “revenge porn,” for some time. In 2017, it launched a pilot program in Australia in which it solicited photos from the public in order to create “hashes” that could be used to detect similar images on its platform. This new tool builds on that system and allows those who are worried that their intimate images or videos could be shared online to take action to help stop their spread.

Facebook’s initial approach was to serve as a first-party barrier against the spread of revenge porn, but it faced pushback because human moderators reviewed the images before they were converted into unique digital fingerprints, or hashes, NBC News reports. Facebook also stored the images for a short time before deleting them.

Facebook, now Meta, has moved that system to an independent third party, Stop Non-Consensual Intimate Images (StopNCII), which was developed in consultation with 50 partners worldwide that specialize in image-based abuse and online safety. The nonprofit will act as the reporting point, and Meta’s properties will only receive the hashed data afterward, not the original offending images as was the case in the pilot program.

Those who are concerned that, for example, disgruntled ex-partners might publish their intimate images online can submit the photos through a service provided by StopNCII. During that process, StopNCII obtains the submitter’s consent and asks for confirmation that the submitter is the person in the photo. The submitted photos are then converted into hashes, which are passed on to partner companies, starting with Facebook and Instagram.

StopNCII does not have access to the original images. Instead, the images are converted into hashes in the submitter’s browser, and StopNCII only ever receives those hashes.
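The article does not say which hashing algorithm StopNCII uses, but the general technique is perceptual hashing: reducing an image to a short fingerprint that stays stable under small edits while revealing nothing about the original pixels. The Python sketch below illustrates one classic variant, the “average hash”; the function name, the 8×8 grid size, and the choice of algorithm are illustrative assumptions, not StopNCII’s actual implementation, which the article says runs in the submitter’s browser.

```python
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a simple perceptual "average hash" of an image.

    The image is shrunk to hash_size x hash_size grayscale pixels, and
    each bit of the result records whether a pixel is brighter than the
    mean. Visually similar images produce similar bit patterns, so the
    fingerprint survives resizing and mild re-encoding, while the
    original pixels cannot be recovered from it.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits  # a 64-bit integer for the default 8x8 grid
```

Because only this fingerprint leaves the device, the submitter never has to hand the actual photo to StopNCII or to Meta.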

Meta’s Facebook and Instagram are the first two platforms to be part of the program, but other companies have reportedly expressed interest in joining. Participating platforms use the hashes to look for matches among images on their services. If one is found, content moderators review the image to determine whether a content violation has occurred, and if so, they can delete all instances of the photo and block it from being uploaded again.
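The article does not describe how the partner platforms compare hashes, but with a perceptual fingerprint like the one sketched above, a natural approach is a Hamming-distance check, where a small number of differing bits indicates a likely near-duplicate. The helper below is a minimal sketch under that assumption; the function names and the threshold value are hypothetical, not details published by Meta or StopNCII.

```python
def hamming_distance(h1: int, h2: int) -> int:
    """Count the bits that differ between two fingerprints."""
    return bin(h1 ^ h2).count("1")

def flag_for_review(candidate: int, reported_hashes: list[int],
                    threshold: int = 5) -> bool:
    """Return True if an uploaded image's hash is within `threshold`
    bits of any reported hash. A True result would only queue the
    image for human review, mirroring the moderation step described
    above; it is not an automatic takedown.
    """
    return any(hamming_distance(candidate, h) <= threshold
               for h in reported_hashes)
```

A hash match alone cannot confirm a violation, which is why, as the article notes, matched content still goes to human moderators before removal.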

This process might be a more feasible method of protecting privacy than what has been previously attempted. Both Facebook and Apple have faced harsh criticism for the methods each has tried to employ to combat abusive images, with Apple the latest to pull back on plans to scan users’ photo libraries for offending images after considerable backlash.


Image credits: Header image via Meta.