Saturday, November 23

As the old saying goes: If you aren’t doing anything illegal, then you have nothing to fear from surveillance.

Smartphones already act like tracking devices broadcasting the whereabouts of their owners, but Apple is about to open the door to far more advanced forms of smartphone-based voluntary surveillance by launching a new program designed to detect and report iPhone users found to possess child pornography, known by the academic acronym CSAM, which stands for Child Sexual Abuse Material. That is according to a handful of academics who were offered a sneak preview of the company's plans, then promptly spilled the beans on Twitter and in interviews with the press.

The new system, called "neuralMatch", is expected to be unveiled by Apple later this week and installed on American iPhones via a software update. According to the FT, the automated system can proactively alert a team of human reviewers if it believes CSAM is present on a user's iPhone. If the reviewers can verify the material, law enforcement will be contacted.

This is how “neuralMatch” will work, per the FT:

Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.

[…]

The system has been trained on 200,000 sex abuse images collected by the US non-profit National Center for Missing and Exploited Children.
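The FT's description amounts to comparing a per-photo hash against a database of known hashes. Apple's actual algorithm is not public in this account, so the sketch below uses a toy "average hash" over an 8x8 grayscale grid purely to illustrate the general perceptual-hashing idea; all function names and the matching threshold are hypothetical.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) to a 64-bit int.

    Toy stand-in for a perceptual hash: each bit records whether a pixel
    is brighter than the image's average brightness.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash, database, threshold=5):
    """Flag a photo if its hash is within `threshold` bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= threshold for h in database)

# A photo and a slightly re-encoded copy hash to the same (or nearby) value,
# which is the property that lets re-compressed copies still be detected.
original = [10 * i % 256 for i in range(64)]
recompressed = [min(255, p + 2) for p in original]  # tiny pixel perturbation
db = {average_hash(original)}
print(matches_database(average_hash(recompressed), db))  # True
```

Unlike a cryptographic hash, a perceptual hash is deliberately tolerant of small changes, which is why a resized or recompressed copy of a known image can still match the database.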

One academic who was offered a preview of the software explained why this could create serious privacy risks. Apple has received a great deal of positive press for its commitment to user privacy; recall its refusal to crack an iPhone belonging to one of the San Bernardino shooters. That same encryption technology has become a perennial headache for law enforcement, and last January Apple quietly abandoned plans to let users fully encrypt their iCloud backups after complaints from law enforcement.

Now, Apple has found a middle ground: it will assume responsibility for policing iPhones, at least to a degree. To accomplish this, the company is rolling out a new machine-learning tool that will scan iPhones for images whose "perceptual hashes" match those of known child pornography. But, as academics have complained, such perceptual hashes could potentially be misled.

What’s more, the tool that’s today being used to unearth child pornography could one day be abused by authoritarian governments (like the CCP). And once Apple has committed to using this type of surveillance, governments will demand it from everyone.

Continue: Zerohedge.com
