Several U.S. police departments have raised concerns about “NameDrop,” a new iPhone feature introduced in the iOS 17 update. The feature, which is enabled by default, facilitates the wireless sharing of contact information and images between iPhones when they are held close together. The Middletown Division of Police in Ohio and the Watertown Police Department in Connecticut have both issued alerts via Facebook, emphasizing the potential risks the feature poses, especially to children and other vulnerable groups.

To disable the NameDrop feature, users are advised to go to their iPhone’s “Settings,” select “General,” then “AirDrop,” and finally turn off the “Bringing Devices Together” option.

The Oakland County Sheriff’s Office in Michigan and the Greenville County Sheriff’s Office in South Carolina have also issued warnings. The Greenville office noted that for the feature to work, both phones must be unlocked and held close together, and the user must accept a transfer prompt, a safeguard that children, the elderly, or other vulnerable individuals could easily overlook or accept without fully understanding it.
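To make that gating concrete, here is a minimal Swift sketch of how those three preconditions might compose. It is a hypothetical illustration only; the `Phone` type, the `nameDropCanProceed` function, and the 2 cm proximity threshold are assumptions for the sake of the example, not Apple’s actual implementation or API.

```swift
// Hypothetical sketch (not Apple's API) of the preconditions police
// described for a NameDrop transfer: both phones unlocked, held almost
// touching, and the transfer prompt explicitly accepted.

struct Phone {
    let isUnlocked: Bool
}

enum TransferPrompt {
    case accepted
    case declined
}

/// Hypothetical helper: a transfer proceeds only when every gate passes.
func nameDropCanProceed(sender: Phone,
                        receiver: Phone,
                        distanceInCentimeters: Double,
                        prompt: TransferPrompt) -> Bool {
    let bothUnlocked = sender.isUnlocked && receiver.isUnlocked
    let closeEnough = distanceInCentimeters <= 2.0  // assumed "nearly touching" threshold
    let userConsented = prompt == .accepted
    return bothUnlocked && closeEnough && userConsented
}

// A locked receiver blocks the transfer no matter how close the phones are.
print(nameDropCanProceed(sender: Phone(isUnlocked: true),
                         receiver: Phone(isUnlocked: false),
                         distanceInCentimeters: 1.0,
                         prompt: .accepted))  // false
```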

Cybersecurity expert Amir Sachs, in an interview with CBS 12, acknowledged the concerns but suggested the risk may be smaller than perceived, noting that the phones must be nearly touching for the feature to work. He explained that NameDrop presents two options, “Receive Only” and “Share”: the former lets a user accept the other party’s information without sharing any of their own.
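A short Swift sketch may help clarify the difference between the two options Sachs described. Again, the `ContactCard` type, `NameDropChoice` enum, and `exchange` function are hypothetical names chosen for illustration, not Apple’s API; the point is simply that “Receive Only” transmits nothing from the user’s own device.

```swift
// Hypothetical model of the NameDrop prompt choices, for illustration only.

struct ContactCard {
    let name: String
    let phoneNumber: String
}

enum NameDropChoice {
    case receiveOnly  // take the other party's card, send nothing
    case share        // exchange cards in both directions
}

/// Returns what each side transmits: the user's outgoing card (if any)
/// and the other party's incoming card.
func exchange(myCard: ContactCard,
              theirCard: ContactCard,
              choice: NameDropChoice) -> (sent: ContactCard?, received: ContactCard) {
    switch choice {
    case .receiveOnly:
        return (sent: nil, received: theirCard)    // nothing leaves this device
    case .share:
        return (sent: myCard, received: theirCard) // both cards are exchanged
    }
}
```

Under “Receive Only,” `sent` is always `nil`, which is why Sachs framed it as the lower-risk choice.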

While the feature raises security concerns about unintended information sharing, it is designed for convenience, allowing users to quickly exchange contact details at events such as meetings.

Additionally, there is growing apprehension about children’s safety in the digital realm, particularly the risk of image manipulation. The FBI has warned that sending images to strangers could lead to those images being manipulated and misused on various platforms. Digital safety expert Yaron Litwin recommended caution with online image sharing, suggesting the use of closed networks limited to known contacts.

In response to these concerns, a group of attorneys general from 52 U.S. states and territories has urged Congress to examine the potential misuse of AI in creating child sexual abuse material (CSAM) and to formulate laws to prosecute such offenses. Their letter highlighted the risk posed by AI-generated deepfakes of children. A report by the National Center for Missing & Exploited Children indicated a significant rise in reports of CSAM images, underscoring the urgency of addressing these digital safety issues.
