Meta Is Testing a New Nudity Filter for Instagram

In a move that some will applaud and others may criticize, Instagram, owned by Meta (Facebook’s parent company), is reportedly testing a new nudity protection feature for direct messages that detects photos that may contain nudity.

The nudity protection feature isn’t live yet, though the company has confirmed to media outlets that it is in development. It was first discovered by an app developer named Alessandro Paluzzi, who has made something of a hobby of reverse engineering apps to dig up early versions of, and clues about, upcoming updates.

After Paluzzi published his discovery on Twitter, Meta confirmed to The Verge that the feature is in development and will let users filter or block unsolicited nude photos in direct messages, allowing them to “shield themselves from nude photos as well as other unwanted messages.”

According to the social media giant, the nudity filter will work much like Instagram’s existing “Hidden Words” option, which lets users filter direct message requests containing certain types of content or keywords that might trigger or offend them.

Meta also says the upcoming update will not let the company itself view the content of these potentially offensive messages, nor will the content be shared with third parties. “We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive,” said Liz Fernandez, a spokesperson for the company.

Meta hasn’t shared further details about how the technology will filter photos or what else it might do.

Image credit: Alessandro Paluzzi

That there is a problem on Instagram with rude messages, or ones containing outright sexual harassment, is hardly in doubt.