[–] zettajon@lemmy.ml 34 points 1 year ago* (last edited 1 year ago) (9 children)

Nope, Apple sells your data just as much as Google does: https://www.insiderintelligence.com/content/apple-ad-revenues-skyrocket-amid-its-privacy-changes https://www.vox.com/recode/2022/12/22/23513061/apple-iphone-app-store-ads-privacy-antitrust#luMMel

While people noticed their new policies against 3rd-party apps, that attention masked the fact that those policies carved out an exception for first-party apps, meaning they collect (anonymous) data on you through Health, Journal, Music, etc. just like every other company. "Trusting them more" is simply a result of you and everyone else getting hit with their privacy ads recently.

Edit: "just like every other company" meant Google and Microsoft, i.e. the other big equivalent tech companies, my fault for not being specific.

[–] steal_your_face@lemmy.ml 141 points 1 year ago (1 children)

While I’m all for calling out companies for abusing your privacy, your own links show that they don’t collect as much data as google. They could (and should) be better though.

[–] khajimak@lemmy.world 84 points 1 year ago (3 children)

Nope apple is literally worse than hitler, spez, and elon musk confirmed. Tim apple fucked my wife in front of me.

[–] WarmSoda@lemm.ee 52 points 1 year ago

You lucky sonofabitch. You got to witness the ol Apple Pie with your own two eyes.

[–] Dark_Blade@lemmy.world 13 points 1 year ago

Wow, your wife must be really hot if a gay guy saw her and said ‘would’.

[–] AngrilyEatingMuffins@kbin.social 86 points 1 year ago (2 children)

Anonymous data is actually pretty different to the data everyone else collects, which literally has your name and picture

Apple’s data is useful for trends but it can’t be used to study who I am.

[–] generalpotato@lemmy.world 11 points 1 year ago (1 children)

This comment needs to be further up rather than the idiotic takes that don’t understand the difference between anonymized data collection (Apple) vs identifiable data collection (Meta/Google/most other tech).

[–] QuadratureSurfer@lemmy.world 7 points 1 year ago (2 children)

Well, then there are also the people who don't realize that there are all sorts of programs out there that will try to take that "anonymized" data and then tie it right back to a person's profile.

For example, you can anonymize GPS location data, but just because you strip away identifying information doesn't mean that you're truly anonymous. It can still be obvious where you live and where you work. And once you figure out where they live (again based on anonymous data) you can tie that information right back into their profile and continue to track them as if nothing has changed. https://www.popularmechanics.com/technology/security/a15927450/identify-individual-users-with-stravas-heatmap/
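
To make it concrete, here's a rough sketch of the idea (made-up coordinates, nothing from a real dataset or any company's actual pipeline): snap the night-time points of an "anonymous" trace to a coarse grid and the busiest cell is usually the owner's home.

```python
# Hypothetical sketch: even with names stripped, a GPS trace leaks where its
# owner sleeps. Snap overnight fixes to a ~100 m grid and the most common
# cell is almost always the home address.
from collections import Counter

def likely_home(trace):
    """trace: list of (lat, lon, hour_of_day) tuples with no identifiers."""
    overnight = [(round(lat, 3), round(lon, 3))      # ~100 m grid cells
                 for lat, lon, hour in trace
                 if hour >= 22 or hour <= 6]         # keep night-time fixes only
    if not overnight:
        return None
    cell, _ = Counter(overnight).most_common(1)[0]   # densest night-time cell
    return cell

trace = [(37.7749, -122.4194, 23),   # night
         (37.7752, -122.4196, 2),    # night
         (37.7751, -122.4193, 5),    # night
         (37.8044, -122.2712, 14)]   # daytime, e.g. at work
print(likely_home(trace))            # -> (37.775, -122.419): a probable home address
```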

[–] Yendor@sh.itjust.works 10 points 1 year ago (1 children)

That won’t work on Apple’s data - they group all the data into cohorts, so the anonymising isn’t reversible.
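
I don't know the exact mechanics they use, but cohort-style reporting generally looks something like this sketch (illustrative fields and threshold, not Apple's actual implementation): only group counts leave the pipeline, and groups too small to hide anyone are dropped entirely.

```python
# Illustrative sketch of cohort-style aggregation (not Apple's real pipeline):
# individual rows are never reported, only group counts, and any cohort
# smaller than K is suppressed so it can't point to one person.
from collections import Counter

K = 50  # assumed minimum cohort size before anything is reported

def cohort_report(rows):
    """rows: list of (region, age_band, feature_used) tuples, no user IDs."""
    counts = Counter(rows)
    return {cohort: n for cohort, n in counts.items() if n >= K}

rows = [("US-West", "25-34", "Health app")] * 120 + \
       [("US-West", "25-34", "Journal app")] * 3     # too small -> suppressed
print(cohort_report(rows))
# {('US-West', '25-34', 'Health app'): 120}  -- the 3-person cohort never leaves
```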

[–] QuadratureSurfer@lemmy.world 3 points 1 year ago

Can you explain a bit more about Apple grouping their data into cohorts? I haven't heard much about this before. For example, how would grouping data into cohorts work with GPS data?

[–] generalpotato@lemmy.world 2 points 1 year ago (1 children)

Not all anonymization techniques are created equal? I’m pretty sure this is fairly obvious at this point to anybody remotely familiar with how data collection works when it comes to privacy and device metrics.

So, how is this relevant to this conversation besides adding more FUD and misinformation?

[–] QuadratureSurfer@lemmy.world -1 points 1 year ago (1 children)

You sound like you know a lot more than everyone else on this subject so I thank you for your responses as a means to educate others.

Just a word of advice, be sure to treat others with respect rather than assuming the worst of their intentions or calling them idiots because they don't know as much as you.

My response is still relevant to the conversation as we are talking about "anonymized data". The link in my comment above proves that just because you are told your data has been "anonymized" does not truly mean that it's impossible to re-attribute it back to an individual.

So if you trust that Apple has great techniques for data anonymization, that's awesome, feel free to expand on that and explain why. Just don't go around telling others that simply having any sort of anonymization technique makes it so you don't have to worry.

[–] generalpotato@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

Thanks for the “advice”. Now, let me expand on my position.

The reason why I'm slightly annoyed by everyone's take here is:

  1. The demeanor folks here have in passing off ill-informed opinions as fact and then speculating about the details.
  2. Not looking at the actual privacy policy of a company and the history of how said company has been involved in data collection, privacy, implementation of features in that realm and their handling of customer data.
  3. Bringing up random points just to win an argument instead of conceding that they do not know what they are talking about.

Here’s a few links to put things in perspective as to what and how Apple anonymizes data and how seriously it takes privacy:

https://www.apple.com/privacy/docs/Differential_Privacy_Overview.pdf

https://www.apple.com/privacy/labels/

https://www.apple.com/privacy/control/

Read through those, look at Apple’s implementation of TouchID, FaceID and their stance on E2E encryption and tell me again why Apple isn’t serious about privacy, masking and anonymizing data, implementing differential privacy and informing users of what they collect and how users can opt-out of it.
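
If it helps, the core idea in that differential privacy paper can be sketched with the textbook randomized-response mechanism (a deliberate simplification; the noise rate below is made up and is not Apple's actual parameter): every device adds noise before anything is sent, so no individual report means much, yet the aggregate is still usable.

```python
# Minimal sketch of local differential privacy via randomized response
# (a textbook mechanism, simplified relative to any real-world design):
# each device flips coins before reporting, so no single report is trustworthy,
# but the aggregate can be corrected for the known noise rate.
import random

P_LIE = 0.25  # assumed probability a device reports the opposite answer

def noisy_report(truth: bool) -> bool:
    return truth if random.random() > P_LIE else not truth

def estimate_true_rate(reports):
    observed = sum(reports) / len(reports)
    # invert the known noise: observed = true*(1 - p) + (1 - true)*p
    return (observed - P_LIE) / (1 - 2 * P_LIE)

random.seed(0)
truths = [random.random() < 0.3 for _ in range(100_000)]   # 30% really use a feature
reports = [noisy_report(t) for t in truths]
print(round(estimate_true_rate(reports), 3))  # close to 0.30, yet any single report is deniable
```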

Edit- Further evidence and reading:

https://www.techradar.com/news/fbi-says-apples-new-encryption-launch-is-deeply-concerning

https://www.digitaltrends.com/mobile/apple-data-collection/

https://www.apple.com/privacy/docs/A_Day_in_the_Life_of_Your_Data.pdf

[–] QuadratureSurfer@lemmy.world 2 points 1 year ago (3 children)

I've been reading through the links you posted as well as looking through other sources. I agree Apple is definitely taking more care with how they anonymize data compared to companies such as Netflix or Strava.

In Netflix's case they released a bunch of "anonymized data" but in just over 2 weeks some researchers were able to de-anonymize some of the data back to particular users: https://www.cs.utexas.edu/~shmat/netflix-faq.html

I've already linked Strava's mistake with their anonymization of data in my above comment.

and tell me again why Apple isn’t serious about privacy,

I think you must have me confused with someone else, up to this point in our discussion I never said that. I do believe that Apple is serious about privacy, but that doesn't mean they are immune to mistakes. I'm sure Netflix and Strava thought the same thing.

My whole point is that you can't trust that it's impossible to de-anonymize data simply because some organization removes all of what they believe to be identifying data.

GPS data is a fairly obvious one which is why I brought it up. Just because you remove all identifying info about a GPS trace doesn't stop someone (or some program) from re-attributing that data based on the start/stop locations of those tracks.

I appreciate that Apple is taking steps and using "local differential privacy" to try to mitigate stuff like this as much as possible. However, even they admit in that document that you linked that this only makes it difficult to determine rather than making it impossible:
"Local differential privacy guarantees that it is difficult to determine whether a certain user contributed to the computation of an aggregate by adding slightly biased noise to the data that is shared with Apple." https://www.apple.com/privacy/docs/Differential_Privacy_Overview.pdf


Now for some counter evidence and reading:

Here's a brief article about how Anonymized data isn't as anonymous as you think: https://techcrunch.com/2019/07/24/researchers-spotlight-the-lie-of-anonymous-data/

And if you just want to skip to it, here's the link to the study about how anonymized data can be reversed: https://www.nature.com/articles/s41467-019-10933-3/
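
The core finding is easy to demonstrate with toy data (invented records, not the study's dataset): once a handful of "harmless" attributes pins down a unique record, that record is effectively re-identified no matter what was stripped out.

```python
# Rough sketch of the re-identification idea (toy data, invented attributes):
# if a combination of ordinary fields is unique in the dataset, the record is
# effectively named even though names were removed.
from collections import Counter

records = [
    {"zip": "94103", "birth_year": 1990, "gender": "F"},
    {"zip": "94103", "birth_year": 1990, "gender": "M"},
    {"zip": "94103", "birth_year": 1990, "gender": "M"},
    {"zip": "73301", "birth_year": 1962, "gender": "F"},   # only one match -> unique
]

def matches(records, **attrs):
    """Count how many records share this exact combination of attributes."""
    key = lambda r: tuple(r[k] for k in attrs)
    counts = Counter(key(r) for r in records)
    return counts[tuple(attrs.values())]

print(matches(records, zip="94103", birth_year=1990, gender="M"))  # 2: still ambiguous
print(matches(records, zip="73301", birth_year=1962, gender="F"))  # 1: re-identified
```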

informing users of what they collect and how users can opt-out of it.

It would be great if users could just opt-out, however Apple is currently being sued for continuing to collect analytics even on users that have opted out (or at least it appears that way, we'll have to let the lawsuit play out to see how this goes).
https://youtu.be/8JxvH80Rrcw
https://www.engadget.com/apple-phone-usage-data-not-anonymous-researchers-185334975.html
https://gizmodo.com/apple-iphone-privacy-settings-third-lawsuit-1850000531

That DigitalTrends article you linked was okay, but it was written in 2018 before Mysks's tests.

As for your TechRadar link to Apple's use of E2EE, that's great, I'm glad they are using E2EE, but that's not really relevant to our discussion about anonymizing data and risks running afoul of the #3 point you made for why you are frustrated with the majority of users in this post.

I understand it can be frustrating when people bring up random points like that, I'm assuming your comment for #3 was directed at other users on this post rather than myself. But feel free to call me out if I go too far off on a tangent.

I have tried to stick to my main point which is: just because data has been "anonymized" doesn't mean it's impossible to de-anonymize that data.

It's been a while since I've looked up information on this subject, so thank you for contributing to this discussion.

[–] PipedLinkBot@feddit.rocks 3 points 1 year ago

Here is an alternative Piped link(s): https://piped.video/8JxvH80Rrcw

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source, check me out at GitHub.

[–] generalpotato@lemmy.world 1 points 1 year ago

:-) Thanks for the detailed response. Let me take a look and get back to you.

[–] generalpotato@lemmy.world 1 points 1 year ago* (last edited 1 year ago) (1 children)

My whole point is that you can’t trust that it’s impossible to de-anonymize data simply because some organization removes all of what they believe to be identifying data.

GPS data is a fairly obvious one which is why I brought it up. Just because you remove all identifying info about a GPS trace doesn’t stop someone (or some program) from re-attributing that data based on the start/stop locations of those tracks.

Looking at all the links you’ve posted… so there have been cases and studies showing that data can be re-identified, but do we have insight into what exact data sets they were looking at? I tried looking at the Nature study, but it doesn’t say how they got the data or what exact vectors they were looking at outside of a mention of some 15 parameters such as zip code, address, etc. Data pipelines and metrics implementations vary vastly; I’m curious to see where the data set came from, what the use case was for collection, the company behind it, the engineering chops it has, etc.

If from a data collection standpoint you’re collecting “zip code” and “address”, you’ve already failed to adhere to good privacy practices, which is what I’m arguing in Apple’s case. You could easily salt and hash a str to obfuscate it, so why is it not being done? Data handling isn’t any different from a typical technical problem. There are risks and benefits associated with any implementation; the question is how well you do it and what you are doing to ensure privacy. The devil is in the details. Collecting “zip code” and “address” isn’t good practice, so no wonder the data becomes re-identifiable.
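
For what it's worth, the kind of keyed hashing I mean is something like this sketch (standard-library hashlib/hmac, made-up salt). It's not bulletproof for a low-entropy field like a 5-digit zip code, since anyone holding the salt can brute-force the ~100k possible inputs, but it at least keeps the raw value out of the pipeline.

```python
# Sketch of the "salt and hash a str" idea (illustrative salt, not any real
# company's scheme). Caveat: for a low-entropy field like a 5-digit ZIP there
# are only ~100k possible inputs, so the salt must stay secret.
import hashlib, hmac, os

SALT = os.urandom(16)            # per-dataset secret salt (illustrative)

def obfuscate(value: str) -> str:
    return hmac.new(SALT, value.encode(), hashlib.sha256).hexdigest()

print(obfuscate("94103"))        # stable token for grouping, no plain ZIP stored
print(obfuscate("94103") == obfuscate("94103"))  # True: still joinable for analytics
```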

https://youtu.be/8JxvH80Rrcw https://www.engadget.com/apple-phone-usage-data-not-anonymous-researchers-185334975.html https://gizmodo.com/apple-iphone-privacy-settings-third-lawsuit-1850000531

More FUD. Why aren’t they testing iOS 16? OK, sure, it’s sending device analytics back… but it could just be a bug? The YT video is showing typical metrics; this isn’t any different from literally any metrics call an embedded device makes. A good comparison would be an Android phone’s metrics call shown side by side. I’m sorry, I refuse to take seriously a video that says “App Store is watching you” and tries to skew my opinion before showing me the data. The data should speak for itself. I see the DSID bit in the Gizmodo article, but that’s a long shot, without any explanation of how the data is specifically identifiable.

Lastly,

As for your TechRadar link to Apple’s use of E2EE, that’s great, I’m glad they are using E2EE, but that’s not really relevant to our discussion about anonymizing data and risks running afoul of the #3 point you made for why you are frustrated with the majority of users in this post.

Privacy is fundamental to designing a data pipeline that doesn’t collect “zip code” in plain str if you want to data to be anonymized at any level. So it is absolutely relevant. :-)

Edit: To clarify, if it wasn’t clear, relying on just data anonymization and collecting everything under the sun isn’t a good way to design a data pipeline that allows for metrics collection. The goal should always be to collect as little as possible, then use masking, anonymization and other techniques to obfuscate it all. No solution is perfect, but that doesn’t mean there aren’t shitty ways of implementing things, leading to the fiascos you see on the web.

[–] Marsupial@quokk.au 6 points 1 year ago

Metadata is anonymous yet people still get fingerprinted by it.

[–] circuitfarmer@lemmy.sdf.org 67 points 1 year ago

As much as Google? Likely not. Does their carefully curated pro-privacy image actually match their practices? Also likely not.

[–] C4ptFuture@lemmy.world 41 points 1 year ago

“Just as much as Google.” LMAO. We have an expert here.

[–] SidneyGrant@sh.itjust.works 39 points 1 year ago (1 children)

I feel like with the amount of stuff done on device and not in the cloud with iPhones and other Apple products, saying that Apple sells just as much data as Google is at the very least disingenuous…

[–] webghost0101@sopuli.xyz 18 points 1 year ago

There is a massive leap between collecting data and selling your data.

I am against both, but in the digital age actually knowing who has your data is such a relief. My old email got sold to third parties a few too many times, and to this day 80% of the incoming messages are blatant, generic, America-targeted phishing.

[–] Platform27@lemmy.ml 11 points 1 year ago (1 children)

Health is on-device, and is E2EE. To my knowledge, that's always been the case. They do allow optional data-linking services, but those need to be set up by the end user. Apple should have no knowledge of this data, by default. Notes can be E2EE (with ADP), and Journal (a new iOS feature) is E2EE. Music is a paid service, with no ads, and is one of the more privacy-respecting options. Data is needed for Music to help serve the user and suggest artists/songs... it's literally one of the platform's benefits over self-hosting.

[–] zettajon@lemmy.ml 4 points 1 year ago (1 children)

None of the major players literally sell your true name and address. All mask the data, and then do stuff with it like create trends to know which ads to display to "users that search for tiktok on the app store/play store"

[–] Platform27@lemmy.ml 6 points 1 year ago* (last edited 1 year ago) (1 children)

Apple does not sell user data. By all means, look at their Privacy Policy (it's easy to read), and show me where this is mentioned. They do collect it, and use it for their own marketing platform, but they don't sell/trade it. In fact they DO anonymise the data they collect. Take a look: https://www.apple.com/privacy/docs/Differential_Privacy_Overview.pdf This is just one document, found after a quick search. They also disclose other details on their security, and other privacy (or lack thereof) aspects.

Now show me other ad agencies, not just one or two, that go to the same lengths while also providing decent documentation. I'm not saying Apple is perfect (far from it).

[–] zettajon@lemmy.ml 5 points 1 year ago (3 children)

They do collect it, and use it for their own marketing platform

Right

but they don’t sell/trade it

Then what are they collecting it for? To line their servers? It's being used to train services, and those services that have ads have those ads targeted using the data collected in the first sentence I quoted.

In fact they DO anonymise the data they collect

So does google. Again, to the broader thread audience replying to my original comment, what is the difference?

[–] JshKlsn@lemmy.ml 8 points 1 year ago

You're right. Not sure why you're downvoted.

Google would be stupid to sell your data. Instead they keep it private, and when advertisers go to Google, they tell Google to push ads to certain groups or run surveys with certain groups, and Google does so. They do not hand those advertisers your data, otherwise those advertisers would never come back. They keep the data.

[–] seukari@lemmy.world 2 points 1 year ago

I recently learned that one method for companies to get around data selling laws is to give the data away for free in order to attract certain types of advertisers, then, they sell ad slots for people with specific demographics or interests.

They don't sell the data because that is harder to do with laws restricting it, so they just use it as advertiser bait in ways that bypass the law.

Further reading: https://www.eff.org/deeplinks/2020/03/google-says-it-doesnt-sell-your-data-heres-how-company-shares-monetizes-and

[–] Rakn@discuss.tchncs.de 0 points 1 year ago* (last edited 1 year ago)

The difference is that there are actually companies out there that will sell you the raw data they collected. E.g. your name and address if they have it, your browsing history obtained through shady extensions, and so on.

So there is a difference between selling the data and hoarding it to show targeted ads.

And while both may not be cool, to me anyone with some money being able to buy my data is clearly worse. So it's helpful to distinguish there. It's not all "selling your data". You are also doing your argument a disservice by lumping it all into the same bucket.

[–] elthesensai@mastodon.social 2 points 1 year ago

@zettajon @hardypart there is nothing stating that Apple is using your data, selling your data, or even getting your data. While it did create a situation where ad dollars are going to App Store it’s still not targeted other than by search. Your own posted link says nothing about what you claimed. There are plenty of issues to bring up about Apple without the need of fabricating one.

[–] Yendor@sh.itjust.works -2 points 1 year ago (1 children)

Did you read the article you posted? Apple serves you ads; they don't sell your data. And they allow you to opt out of tracking. It's all right there in your article.

[–] JshKlsn@lemmy.ml 2 points 1 year ago

I know this is off topic, but Apple isn't innocent.

It's almost worse to think your privacy is protected when it's not, than to know it's not. At least I know Google is sending my Google Assistant sound clips to be analyzed. Sucks when you learn the person you thought you could trust is fucking you over behind your back.