Geekzone: technology news, blogs, forums


GSManiac

415 posts

Ultimate Geek


#289023 7-Aug-2021 15:29

I've always appreciated Apple and their commitment to privacy, but after yesterday's news about the upcoming changes to iCloud Photos, they've lost my trust.

I honestly can't take anything Apple says about their privacy features seriously any more, now that Apple have a back door into photos stored in iCloud.
A sad day.

Linux
8999 posts

Uber Geek

Trusted
Lifetime subscriber

  #2756543 7-Aug-2021 15:32

Link off to this news?


jarledb
Webhead
2811 posts

Uber Geek

Moderator
ID Verified
Trusted
Lifetime subscriber

  #2756545 7-Aug-2021 15:34

Not sure what I think about it, but this is how the Guardian article about Apple's neuralMatch starts:

"Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery."

Sounds to me like it will be done on device and not by scanning iCloud.


zocster
1914 posts

Uber Geek

ID Verified
Trusted
Lifetime subscriber

  #2756547 7-Aug-2021 15:35

https://www.theregister.com/2021/08/05/apple_csam_scanning/



MaxineN
1037 posts

Uber Geek

ID Verified
Subscriber

  #2756549 7-Aug-2021 15:48

And this will currently only be available in the USA. Other countries will be considered case by case.

Quote from that Guardian article:

"Apple’s tool, called neuralMatch, will scan images before they are uploaded to the company’s iCloud Photos online storage, comparing them against a database of known child abuse imagery."

The keyword is "known", which means images from honeypots and previous child abuse cases have already been fed into this system, so it probably won't trip falsely very easily, and if it does I imagine there will be some human intervention.

Another quote from that article:

"Alongside the neuralMatch technology, Apple plans to scan users’ encrypted messages as they are sent and received using iMessage. An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children’s inboxes. That system, which is purely aimed at providing tools to “warn children and their parents when receiving or sending sexually explicit photos”, will not result in sexually explicit images being sent to Apple or reported to the authorities. But parents will be able to be notified if their child decides to send or receive sexually explicit photos."

So the way iCloud accounts and family groups work, you'd have to have an account set up as a child's account for this to even trip the warning.

Edit: This happens on the device, NOT on their servers. If CSAM is detected, an encrypted safety voucher is uploaded to their servers and compared against the known CSAM database; if a match is confirmed it will be reported and the iCloud account will be permanently banned.
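To make the "known" part concrete, here's a very loose Python sketch of what checking photos against a database of known image hashes looks like in general. This is NOT Apple's actual NeuralHash/PSI system - the hash type, function names and example values below are made up purely for illustration.

    # Loose illustration only: check a photo's hash against a set of known hashes.
    # Real systems use perceptual hashes supplied by bodies like NCMEC, not a plain
    # SHA-256 of the file bytes, and the device only sees a blinded form of the database.
    import hashlib
    from pathlib import Path

    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # placeholder entry
    }

    def hash_photo(path: Path) -> str:
        """Hash the file bytes (a stand-in for a real perceptual hash)."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def flagged_before_upload(path: Path) -> bool:
        """True if this photo matches a known hash, i.e. would be flagged on-device."""
        return hash_photo(path) in KNOWN_HASHES

The point being that the iCloud part only matches against hashes already in that database; it's the separate iMessage feature that uses a classifier to look at image content.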

 

Sure, you're right to be concerned about what else they could do with this, such as targeting anti-government material or free expression, but again, this is currently just for CSAM in the USA, which I think is as far as it should go.

I'm not trying to downplay your concerns, but at the same time maybe this isn't as evil as it first looks?





Ramblings from a mysterious lady who's into tech. Warning I may often create zingers.

 

Opinions are my own. They don't represent my employer.


antonknee
1085 posts

Uber Geek

Subscriber

  #2756558 7-Aug-2021 16:45

There's a bit of misinformation floating around about this, and the change people are most concerned about (rightly so) is the scanning of photos on one's iPhone for CSAM. Things to note:

 

  • It only applies in the US (but could be rolled out to other countries selectively)
  • It only scans photos which will be uploaded to iCloud (so if you aren't using iCloud Photos, your photos will not be scanned - but obviously that could change at any time; all you have is Apple's word)
  • The fingerprint is matched against known CSAM (so it won't capture photos of your kids in the bath)
  • There is a threshold of qualifying images; once this threshold is reached, a human review is flagged before anything is passed on to law enforcement

Personally, I'm very concerned about this functionality. Beyond Apple saying “we won’t”, there is nothing to stop this being used for other purposes. Take the example of the Chinese government requesting that Apple use this tech to look for images that match the Tank Man photo. Apple have previously complied with the demands of various governments.

 

I think this marks a distinct step away from Apple's previous approach and the effective promise they've made to their customers, and frankly I'm disappointed. This won't affect me as I don't use iCloud Photos anymore, however I am concerned about where this could lead.

 

I do not want to hear “if you have nothing to hide, you have nothing to fear” which is a commonly trotted out fallacy. Do you not close the door when you use the bathroom? 


GSManiac

415 posts

Ultimate Geek


  #2756559 7-Aug-2021 16:49

It’s all very well to say “this is to catch child predators” and of course no one is going to argue with that.
But the potential for future misuse is definitely there and that’s the concerning part of it all. It’s a very slippery slope.

What if the NZ Govt asks Apple to scan iCloud for copies of the Chch mosque shooter's manifesto, or Russia asks it to find anti-government pictures on iCloud? Feels very Minority Report to me. Suddenly the original remit for the feature is out the window. Then what next?


It only takes the smallest of backdoors to creep slowly further open. It’s very concerning.


Check out this thread on MacRumors, which I feel explains things better:

https://www.macrumors.com/2021/08/06/apple-to-consider-csam-detection-per-country/


MaxineN
1037 posts

Uber Geek

ID Verified
Subscriber

  #2756572 7-Aug-2021 17:41

GSManiac: It’s all very well to say “this is to catch child predators” and of course no one is going to argue with that.
But the potential for future misuse is definitely there and that’s the concerning part of it all. It’s a very slippery slope.

 

 

 

That's exactly my point. It's only for this, and that's where it should stop (I am biased, being a victim of sexual abuse myself, so I am all for what some folks are calling a noble mission to protect young users and children from sexual abuse). Anything further and Apple wouldn't just be backtracking on its promises, it would be creating a massive privacy risk (iCloud is not the most secure thing in the world, and it's been proven many times that Apple doesn't have the best track record with the security of its services).

 

GSManiac:
What if the NZ Govt asks Apple to scan iCloud for copies of the Chch mosque shooter's manifesto, or Russia asks it to find anti-government pictures on iCloud? Feels very Minority Report to me. Suddenly the original remit for the feature is out the window. Then what next?

It only takes the smallest of backdoors to creep slowly further open. It’s very concerning.

 

 

 

I hear ya, but the manifesto is (probably, I'm assuming) illegal to have and/or share in NZ (for GOOD reason too, Christ we don't need more people radicalized by it; this isn't freedom of expression we're talking about). Your other example, anti-government pictures/documents/messages in Russia, is 1) a political nightmare, 2) yes, an actual concern for people who want to express themselves freely (not a fan of Putin, if you've figured that out), and 3) I imagine other countries will react if innocent people are targeted for expressing and/or having anti-government content under a government that is quite frankly a human rights violator.

 

Again, as in my first sentence in this post, if Apple does anything more than just CSAM, it's a massive privacy breach and a complete backtracking on its promises, and that's where I will draw the line myself and throw in my hat, switching back to Android (reluctantly) and cancelling my Apple subs.

 

 





Ramblings from a mysterious lady who's into tech. Warning I may often create zingers.

 

Opinions are my own. They don't represent my employer.




Behodar
8308 posts

Uber Geek

Trusted
Lifetime subscriber

  #2756576 7-Aug-2021 17:47

antonknee: once this threshold is reached, a human review is flagged

 

This is the bit that irks me. If it's all "on-device" and there's no backdoor, then how can a human review it?


antonknee
1085 posts

Uber Geek

Subscriber

  #2756584 7-Aug-2021 18:47

Behodar:

antonknee: once this threshold is reached, a human review is flagged

This is the bit that irks me. If it's all "on-device" and there's no backdoor, then how can a human review it?

My understanding is that the scanning of the images, creation of the fingerprint/hash, and the comparison to the fingerprint/hash of known CSAM is all done on device, prior to the image being uploaded to iCloud (note again that this only applies to images which are going to be uploaded to iCloud). A security voucher is created and also uploaded, which indicates whether a match was found by the device. If there are more than x matches for a given user, then all the matches are reviewed by a human at Apple - which includes the security voucher and a low-res version of the image.
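If it helps, here's a rough Python sketch of that threshold idea - the match decision happens on-device, a voucher is uploaded alongside each photo, and nothing is escalated to a human until enough matches accumulate. All of the names and the threshold value are invented for illustration; Apple's actual scheme uses threshold secret sharing so the server can't read anything below the threshold.

    # Rough sketch of "threshold before human review" - not Apple's actual protocol.
    from dataclasses import dataclass, field

    MATCH_THRESHOLD = 30  # illustrative value only

    @dataclass
    class SafetyVoucher:
        image_id: str
        matched: bool           # decided on-device against the known-hash database
        low_res_preview: bytes  # in the real design, unreadable until the threshold is met

    @dataclass
    class AccountState:
        vouchers: list = field(default_factory=list)

        def receive_voucher(self, voucher: SafetyVoucher) -> None:
            # Server-side bookkeeping: store the voucher, escalate only past the threshold.
            self.vouchers.append(voucher)
            matches = [v for v in self.vouchers if v.matched]
            if len(matches) >= MATCH_THRESHOLD:
                queue_for_human_review(matches)  # hypothetical manual-review step

    def queue_for_human_review(matches):
        # Placeholder: a human checks the matches before anything goes to law enforcement.
        print(f"{len(matches)} matched vouchers - queue for manual review")

The idea being that a one-off false positive never reaches a human reviewer at all.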

 

Worth pointing out that, technically speaking, a human at Apple could presumably already look at your photos once they're uploaded to iCloud - iCloud Photos isn't end-to-end encrypted, so Apple holds the keys anyway.


antonknee
1085 posts

Uber Geek

Subscriber

  #2756585 7-Aug-2021 18:55

MaxineN:

...switching back to Android (reluctantly) and cancelling my Apple subs...

I believe that Google already does this for any content uploaded to Google Photos and Google Drive, as does Microsoft for OneDrive. Google even shares their technology and API for doing this. DropBox, Twitter, FaceBook, YouTube, Reddit, Cloudflare all make use of similar technology. I guess the main difference is that the scanning takes place on the service post-upload rather than on the device pre-upload - but in practical terms it's not materially different.

 

It's safe to assume that all cloud providers are monitoring content you've uploaded to them for CSAM and using techniques like image hashing/fuzzy hashing to determine if the material is CSAM.
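For a feel of what "fuzzy" image hashing means, the snippet below uses the open-source imagehash library (a generic perceptual hash, not PhotoDNA or NeuralHash) to compare two images by Hamming distance. The file names and the distance threshold are placeholders.

    # pip install pillow imagehash
    # Generic perceptual-hash comparison - illustrative only, not any provider's actual tooling.
    from PIL import Image
    import imagehash

    def looks_like_known_image(candidate_path: str, known_path: str, max_distance: int = 5) -> bool:
        """Compare two images by perceptual hash; a small Hamming distance means visually near-identical."""
        candidate = imagehash.phash(Image.open(candidate_path))
        known = imagehash.phash(Image.open(known_path))
        # Subtracting two ImageHash objects gives the Hamming distance between them:
        # 0 means identical hashes; small values survive resizing, recompression and minor edits.
        return (known - candidate) <= max_distance

    # Example (file names are hypothetical):
    # looks_like_known_image("new_upload.jpg", "known_flagged.jpg")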

 

 


MaxineN
1037 posts

Uber Geek

ID Verified
Subscriber

  #2756586 7-Aug-2021 19:10

antonknee:

MaxineN:

...switching back to Android (reluctantly) and cancelling my Apple subs...

I believe that Google already does this for any content uploaded to Google Photos and Google Drive, as does Microsoft for OneDrive. Google even shares their technology and API for doing this. DropBox, Twitter, FaceBook, YouTube, Reddit, Cloudflare all make use of similar technology. I guess the main difference is that the scanning takes place on the service post-upload rather than on the device pre-upload - but in practical terms it's not materially different.

It's safe to assume that all cloud providers are monitoring content you've uploaded to them for CSAM and using techniques like image hashing/fuzzy hashing to determine if the material is CSAM.

Oh don't worry. At that point I'll start rolling my own solution for photo backups and switch over to end-to-end encrypted messaging (I have the storage at home now! And Signal is pretty cool). I am well into the Google ecosystem (not by choice) and I know exactly what they do and what they can look at. If I wanted full-on privacy I'd run a Pixel with a heavily de-Googled version of Android.

The main thing that would push me to switch is this literally happening on both my device and their servers.

Like, no thanks Apple or Google.





Ramblings from a mysterious lady who's into tech. Warning I may often create zingers.

 

Opinions are my own. They don't represent my employer.


Technofreak
5440 posts

Uber Geek

Trusted

  #2756598 7-Aug-2021 19:56

MaxineN:

MaxineN:

...switching back to Android (reluctantly) and cancelling my Apple subs...

Oh don't worry. At that point I'll start rolling my own solution for photo backups and switch over to end-to-end encrypted messaging (I have the storage at home now! And Signal is pretty cool). I am well into the Google ecosystem (not by choice) and I know exactly what they do and what they can look at. If I wanted full-on privacy I'd run a Pixel with a heavily de-Googled version of Android.

The main thing that would push me to switch is this literally happening on both my device and their servers.

Like, no thanks Apple or Google.

If you really want to get away from Apple and Google you could try Sailfish OS. Linux based with Terminal functionality. It has some wrinkles but works pretty damn well overall. 





Sony Xperia XA2 running Sailfish OS. https://sailfishos.org The true independent open source mobile OS 
Samsung Galaxy Tab S6
Dell Inspiron 14z i5


GSManiac

415 posts

Ultimate Geek


  #2756605 7-Aug-2021 21:01

 

MaxineN: I hear ya, but the manifesto is (probably, I'm assuming) illegal to have and/or share in NZ (for GOOD reason too, Christ we don't need more people radicalized by it; this isn't freedom of expression we're talking about). Your other example, anti-government pictures/documents/messages in Russia, is 1) a political nightmare, 2) yes, an actual concern for people who want to express themselves freely (not a fan of Putin, if you've figured that out), and 3) I imagine other countries will react if innocent people are targeted for expressing and/or having anti-government content under a government that is quite frankly a human rights violator.

Oh I agree the manifesto should be illegal. I'm more pointing out that there is the potential for the NZ Govt to request that Apple scan iCloud for it. Today it's one thing, and tomorrow it's something else. Then suddenly the flood gates open and the genie is out of the bottle.


michaelmurfy
/dev/ttys0
10979 posts

Uber Geek

Moderator
ID Verified
Trusted
Lifetime subscriber

  #2756609 7-Aug-2021 21:32

Guys - I am seeing a tonne of misinformation on this subject, and there is also misinformation in this thread.

This is nothing new: Google, Facebook and many other companies do this. Apple are doing this on-device with known file signatures, just like these other companies. I personally don't see it as a breach of trust. There are honeypots all over the place for catching these sorts of people, and to be honest I fully support Apple doing this.

No matter what operating system or device you use, if it is connected to the internet and you don't own the full source code for everything running on said device, then effectively treat it as "compromised". I personally use Apple products because the operating system isn't owned by an ad company and privacy is still important to them; an iPhone doesn't ping the Google mothership every 30 seconds with tracking information by default, for example.

Lastly, this took me by surprise only because I thought Apple were already doing this sort of thing. But I am not freaked out, and nobody here should be either. I don't see this as a privacy breach. But I also don't want to see misinformation on this subject spreading through these forums. If your photos are stored anywhere on the internet, just assume either humans or machines could technically view those images too.

I do understand the cause for concern, but let's stay away from the "what ifs" please.





Michael Murphy | https://murfy.nz | https://keybase.io/michaelmurfy - Referral Links: Sharesies | Electric Kiwi
Are you happy with what you get from Geekzone? Please consider supporting us by making a donation.


Batman
Mad Scientist
27808 posts

Uber Geek

Trusted
Lifetime subscriber

  #2756611 7-Aug-2021 21:37

If you're on Pegasus' list then all of this would be moot anyway.





Involuntary autocorrect in operation on mobile device. Apologies in advance.

