0519 GMT December 01, 2022
In a letter announcing his decision, sent to all European data protection regulators, Thomas le Bonniec said, “It is worrying that Apple (and undoubtedly not just Apple) keeps ignoring and violating fundamental rights and continues their massive collection of data.
“I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the EU has one of the strongest data protection laws in the world. Passing a law is not good enough: It needs to be enforced upon privacy offenders.”
Le Bonniec, 25, worked as a subcontractor for Apple in its Cork offices, transcribing user requests in English and French, until he quit in the summer of 2019 due to ethical concerns with the work.
“They do operate on a moral and legal grey area,” he told the Guardian at the time, “and they have been doing this for years on a massive scale. They should be called out in every possible way.”
Following the revelations of Le Bonniec and his colleagues, Apple promised sweeping changes to its “grading” program, which involved thousands of contractors listening to recordings made, both accidentally and deliberately, using Siri. The company apologized, brought the work in-house, and promised that it would only grade recordings from users who had explicitly opted in to the practice.
“We realize we have not been fully living up to our high ideals,” the company said in a statement in August. It eventually released a software update in late October that allowed users to opt in or out of their voice recordings being used to “improve Siri dictation”, and to delete the recordings that Apple had stored. The company also emphasized that, unlike its competitors, it never links Siri recordings to a specific Apple account.
But, Le Bonniec argues, the company never truly faced consequences for its years-long program.
“I listened to hundreds of recordings every day, from various Apple devices (e.g. iPhones, Apple Watches, or iPads). These recordings were often taken outside of any activation of Siri, i.e. without any actual intention from the user to activate it for a request. This processing was done without users being aware of it, and the recordings were gathered into datasets to correct the transcriptions made by the device,” he said.
“The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device. The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever.
“These practices are clearly at odds with the company’s ‘privacy-driven’ policies and should be urgently investigated by data protection authorities and privacy watchdogs. With the current statement, I want to bring this issue to your attention, and also offer my cooperation to provide any element substantiating these facts. Although this case has already gone public, Apple has not been subject to any kind of investigation to the best of my knowledge.”