Receiving an unsolicited image is an unpleasant experience at the best of times, and one that technology has made all too common. At WWDC, Apple announced that iOS 17 will use an on-device machine learning model to scan both images and videos for nudity. When nudity is detected, you’ll get a pop-up telling you the system thinks the file may be inappropriate.
I wonder how much of this is a response to the practice of AirDropping inappropriate images to an unsuspecting person’s phone. One notable incident from 2022 saw a person removed from a flight after they had shared an image of themselves with other passengers. The fact that AirDropped images currently show visible previews also makes it harder for people to avoid catching an eyeful.
