Hacker News

Good writeup, thank you for sharing.

Even before the discovery phase (which is where this issue sits), the two devices apparently create a TLS tunnel with both client and server certificates, signed by Apple and containing UUIDs linked to the device and Apple ID [0].

I have no idea if/where/how these certs are used elsewhere, but this looks like another avenue for identification and tracking, even if it doesn't directly expose the phone number or email address. I'm reminded that early IMSI catchers didn't expose the MSISDN either, but enough listeners in various places seeing the same IMSI certainly helped with correlation. Does anyone know of any writeups on Apple's internal (device) CA infrastructure?

[0] Section 2.4: https://www.usenix.org/system/files/sec21-heinrich.pdf
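To make the IMSI-catcher analogy concrete, here's a toy sketch (nothing to do with Apple's actual protocol; the sightings data is made up) of why any stable identifier seen during a handshake, an IMSI then or a certificate UUID here, lets passive observers at different locations build a movement profile without ever learning the phone number or email behind it:

```python
from collections import defaultdict

# Hypothetical passive observations: (location, stable identifier seen
# in the handshake). The identifier is opaque, but it never changes.
sightings = [
    ("cafe",    "uuid-aaaa"),
    ("office",  "uuid-bbbb"),
    ("airport", "uuid-aaaa"),
    ("cafe",    "uuid-bbbb"),
    ("hotel",   "uuid-aaaa"),
]

# Group sightings by identifier: each opaque UUID becomes a track.
tracks = defaultdict(list)
for location, identifier in sightings:
    tracks[identifier].append(location)

print(dict(tracks))
# {'uuid-aaaa': ['cafe', 'airport', 'hotel'], 'uuid-bbbb': ['office', 'cafe']}
```

No MSISDN/Apple ID needed: correlating the same opaque value across vantage points is already a tracking capability.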



>Even before the discovery phase (which is where this issue sits), the two devices apparently create a TLS tunnel with both client and server certificates, signed by Apple and containing UUIDs linked to the device and Apple ID [0].

From the abstract

>We propose a novel optimized PSI-based protocol called PrivateDrop that addresses the specific challenges of offline resource-constrained operation and integrates seamlessly into the current AirDrop protocol stack

Is section 2.4 how it works today, or what they're proposing for the future?


Section 2.4 is part of the description of the current specification, whereas their PrivateDrop proposal doesn't come in until section 4. Even if a different PSI protocol is used, the exchange still (I believe) happens over the same TLS connection, so it wouldn't fully eliminate the fingerprinting.
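To sketch that layering point (a toy model, not the real AirDrop stack; all names here are made up): even if the PSI payload perfectly hides the contact identifiers, the transport beneath it keeps presenting the same device certificate, so an observer retains a stable fingerprint across exchanges:

```python
from dataclasses import dataclass

@dataclass
class HandshakeView:
    """What a (hypothetical) passive observer learns from one exchange."""
    cert_uuid: str   # stable, certificate-bound identifier
    payload: bytes   # opaque ciphertext carrying the PSI messages

def exchange(cert_uuid: str, psi_message: bytes) -> HandshakeView:
    # Stand-in for TLS encryption: the payload contents are hidden...
    ciphertext = bytes(b ^ 0x5A for b in psi_message)
    # ...but the certificate identity rides along with every session.
    return HandshakeView(cert_uuid=cert_uuid, payload=ciphertext)

view1 = exchange("uuid-aaaa", b"PSI round 1")
view2 = exchange("uuid-aaaa", b"PSI round 2")

# Payloads differ per session, yet the certificate links both exchanges.
assert view1.payload != view2.payload
assert view1.cert_uuid == view2.cert_uuid
```

So swapping the PSI scheme improves what the *peer* learns, while the certificate-level identifier is a transport-layer property a network observer could still correlate on.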



