Apple is not so keen, for now, on scanning iPhones for child abuse material

In a shocking move, Apple has delayed its plan to scan iPhones for child sexual abuse material. The company had announced the technology as part of a set of child-safety features in its next iOS update, intended to detect known abuse imagery and shield children from sexually explicit content online.

However, the plan raised privacy concerns, because it would mean scanning an iPhone user’s photos without consent or a warrant.

After backlash over people’s right to privacy, with critics arguing that such scanning runs against data protection standards such as the GDPR (General Data Protection Regulation), Apple decided not to roll out the automatic filter on all users’ phones.

Apple was planning to use its NeuralHash technology, which fingerprints images that are about to be uploaded to iCloud Photos. The resulting hashes are compared against a database of known child sexual abuse material maintained by the NCMEC (National Center for Missing & Exploited Children).

Acting on a confirmed match could have been a sound plan, because the guilty party could then be handed over to the authorities, but for now Apple is not keen on going ahead with it.
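To make the mechanism concrete, here is a minimal sketch of the match-and-threshold flow described above. It is illustrative only: NeuralHash is a proprietary perceptual hash, so the standard library’s SHA-256 stands in for it here, the database entries are placeholders, and the 30-match threshold echoes the figure from Apple’s published technical summary.

```python
import hashlib
from pathlib import Path

# Placeholder stand-in for the NCMEC-supplied database of known-CSAM
# hashes; real entries are obviously not published.
KNOWN_HASHES: set[str] = {
    "placeholder-hash-1",
    "placeholder-hash-2",
}

def image_fingerprint(path: Path) -> str:
    # NeuralHash is a perceptual hash that tolerates resizing and
    # re-encoding; SHA-256 is used here only so the sketch runs with
    # the standard library alone.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_for_review(photos: list[Path], threshold: int = 30) -> bool:
    # Apple's published design raised a human-review report only after
    # an account crossed a match threshold (roughly 30 in its summary),
    # rather than flagging on a single hit.
    matches = sum(1 for p in photos if image_fingerprint(p) in KNOWN_HASHES)
    return matches >= threshold
```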

Apple has revealed that it needs more time to review this step because a considerable number of parties are involved.

It further stated that it has taken time to be deliberate because this is an important decision.

Apple stated that it wants the technology not only to improve child safety but also to give people more control over their privacy.

Apple’s next steps remain unclear, and it is uncertain how the company will respond in the future.
