Instagram is developing a tool that can block unwanted nude images sent via direct message (DM), a spokesperson for parent company Meta has confirmed. The feature reportedly works by detecting a nude image and covering it up, then giving the recipient the option of whether to view it. More details are expected in the coming weeks, but Instagram says it will not be able to view the actual images or share them with third parties. This was confirmed by Liz Fernandez, Meta's Product Communications Director, who said the feature will help users protect themselves from nude images as well as other unwanted messages. She told The Verge, "We are working closely with experts to ensure that these new features maintain people's privacy while giving them control over the messages they receive."
News of the feature was first shared on Twitter by mobile developer Alessandro Paluzzi, who indicated that Instagram is working on nudity protection in conversations and posted a screenshot of what users might see when opening the feature. Fernandez compared it to Instagram's "Hidden Words" feature introduced last year, which allows users to automatically filter messages containing words, phrases, and emojis they do not want to see. She also confirmed that the nudity protection will be optional, letting users toggle it on and off at their discretion. The feature is still in the early stages of development, but the hope is that it will help reduce incidents of "flashing," in which an unsolicited sexual image is sent to someone's mobile device by an unknown nearby person, potentially through social media, messaging apps, or sharing functions such as AirDrop or Bluetooth.


