Apple criticised for system that detects child abuse; worry that authoritarian governments could expand the technology to monitor their citizens

Washington:
Apple has come under fire for announcing a new system that will detect Child Sexual Abuse Material (CSAM) on users' devices in the US. The technology will search for matches against known CSAM before an image is stored in iCloud Photos. But some worry that authoritarian governments could expand the technology and use it to monitor their citizens. WhatsApp head Will Cathcart called Apple's move 'very worrying.'

Apple said that new versions of iOS and iPadOS, due to be released later this year, will include new applications of cryptography to help limit the spread of CSAM online, while also aiming to protect user privacy. Matches will be flagged and then reviewed manually; Apple can then take steps to disable the user's account and report it to the authorities.