Apple plans to scan your photos for child porn: why some tech experts are concerned

WASHINGTON — Apple has unveiled new plans to scan your iPhone and other personal devices for child porn.

The company said this is an effort to find sexual predators and report them to authorities.

Apple said it isn’t scanning every photo in users’ iCloud accounts.

“I don’t like it because it does invade our privacy and my pictures are my pictures and I don’t want them shown all about,” said iPhone user Robert Phillipson.

Apple said this new feature is about flagging child porn.

The company said the software will try to match photos against a database of known images maintained by the National Center for Missing and Exploited Children.

If enough photos are flagged by the system, an Apple review team will disable the person’s account and report it to authorities.
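The process described above amounts to matching photo fingerprints against a known database and only flagging an account once a threshold is crossed. The following is a minimal sketch of that idea, not Apple’s actual system (which uses a perceptual hash called NeuralHash and cryptographic matching); all names, hash values, and the threshold here are hypothetical placeholders.

```python
# Hypothetical sketch of threshold-based hash matching.
# KNOWN_HASHES stands in for the database of known-image fingerprints;
# the real system's hashes and matching protocol are far more complex.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}  # placeholder fingerprint list
THRESHOLD = 2  # hypothetical: account is flagged only after this many matches


def count_matches(photo_hashes):
    """Count how many of a user's photo fingerprints appear in the known database."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)


def should_flag(photo_hashes):
    """Trigger human review only once the match count crosses the threshold."""
    return count_matches(photo_hashes) >= THRESHOLD


print(should_flag(["a1b2", "zzzz"]))          # one match: not flagged -> False
print(should_flag(["a1b2", "c3d4", "zzzz"]))  # two matches: flagged -> True
```

The threshold is the detail Apple leans on for its accuracy claim: a single chance collision is not enough to flag an account, only an accumulation of matches is.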

Johns Hopkins University computer science professor Matthew Green has written about Apple’s privacy method. He said this is uncharted territory.

“We don’t know how it’s going to be abused or what the privacy concerns are, because it’s the first time someone has gone this far into the actual things that are on your device. But there are concerns,” said Green.

Apple claims there’s a one-in-a-trillion chance of incorrectly flagging an account.

But Green isn’t convinced.

“At the end of the day, what Apple scans for is up to Apple, and the assumption is that this technology works perfectly and will never misfire or trigger on the wrong kind of photo. Right now, this technology is not very well tested,” said Green.

But others are willing to give this new technology a chance.

“It’s a serious crime, and if there’s a way for us to detect it, then I think that we should at least consider it,” said iPhone user Kimisha Cassidy.

Apple said people who believe their accounts were wrongly flagged can file an appeal.

The company said this process isn’t happening immediately; the new feature will arrive with the next Apple software update, scheduled for later this year.