The desire to protect privacy at all costs is a benefit for users of Apple products, but what a headache for the engineers responsible for developing new features! An article by The Information looks back at the many challenges they face, which sometimes end in failure because privacy always comes first.
In 2019, Apple considered letting users purchase content or services in apps with Siri. But the project was abandoned, partly for lack of a technical way to link the voice request to the user’s Apple ID. On this point, as on others and for similar reasons, the voice assistant lags behind Alexa.
The publication gives other examples. In 2015, engineers working on the Photos app proposed a feature that would let users browse a chronological list of recently visited places along with the photos taken there. The proposal failed: the objection was that such a feature could be exploited by authoritarian governments to track individuals.
If Apple TV+ is weaker than Netflix or Prime Video at suggesting programs suited to subscribers’ tastes, it is because the streaming service cannot analyze how they move from one piece of content to another.
The development of Raise to Speak, which since watchOS 5 has let users talk to the Apple Watch without having to mumble “Hey Siri” first, had its own share of privacy-related difficulties. To work, the feature must collect data from the accelerometer and the microphone: very sensitive information…
To overcome these difficulties, teams must be creative in accessing the data their features depend on. This is why Apple adopted the principles of “differential privacy” in 2016: each user’s data is blended, with added noise, into aggregate statistics about communities of shared tastes and habits, so that nothing can be precisely traced back to any individual. But some features require much more precise information.
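As an illustration (this is not Apple's actual implementation), the core idea behind local differential privacy can be sketched with the classic randomized-response mechanism: each device flips a coin before reporting, so no single report is trustworthy, yet the true rate of a behavior can still be estimated over a large population.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true bit with probability p, otherwise a random bit.

    Any individual report has plausible deniability, but aggregate
    statistics over many users remain estimable.
    """
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Invert the noise: E[reported rate] = p*q + (1-p)*0.5; solve for q."""
    reported = sum(reports) / len(reports)
    return (reported - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom truly exhibit the behavior.
random.seed(0)
users = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(u) for u in users]
print(round(estimate_true_rate(reports), 2))
```

The estimate comes out close to the true 30% rate, even though roughly a quarter of the individual reports are pure noise, which is the trade-off the article describes: good aggregate signal, poor individual signal.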