Voice Assistant Spying on Us. Apple Pays, But Doesn't Admit Wrong

The tech giant will refund up to $20 per device to anyone in the U.S. who used Siri and swears under oath that their conversations were overheard

Spy voice assistants: conversation thieves that analyze our habits to feed the great game of "profiling," which serves to offer us tailor-made goods and services, officially for our benefit but really to fuel someone else's business. All true, all proven. And all wrapped in the hypocrisy of its architects, as demonstrated by a story that emerged in the last few hours in the country that leads in technology, and in its side effects: the United States. The news is serious in itself, but more serious still is what lies behind it. Apple has committed to paying 95 million dollars to the participants in a class action lawsuit (just one of many "class actions" underway on similar issues) filed a few years ago by a large number of users of Siri, the Apple voice assistant. But it persists in not admitting any wrongdoing.

Apple Siri and the 95 million compensation: the class action that makes noise

The action covers the period between September 17, 2014, when "Hey Siri" launched with iOS 8, and December 31. Anyone who can demonstrate before a judge (for now only in the U.S.) that they used a Siri-enabled device (a smart speaker, but also a smartphone) during this period will be able to receive a sum that is little more than symbolic: up to $20 per device, for up to five devices.

The refund procedure will not be simple. Claimants will have to document a specific episode of open privacy violation, formally swearing to the truthfulness of what they declare. In the United States, an oath attached to testimony is an act of great formal and substantive weight, carrying very harsh penalties if falsehood is proven.

Minimal compensation and serious violations of privacy

The expected compensation is slight; what lies behind it is heavy. Apple does not admit guilt. It continues to entrench itself behind its motivations, its justifications, and its "techniques" for listening to private conversations through these devices, exactly as the other protagonists of the game have done and continue to do: Google with its ecosystem of assistants, and Amazon with Alexa. The mantra of the official justifications runs essentially like this: if we listen, we do so on a sample basis, we process the audio in a strictly confidential and anonymous manner, and we use it only to refine the service's algorithms. Could something have gotten out of hand? Maybe, but the procedures are constantly revised and fraud, they insist, does not exist.

Apple claims to have agreed to the $95 million settlement in the recently closed lawsuit to keep the peace, in a spirit of cooperation for the benefit of all. But things, analyzed carefully, appear very different.

Siri and privacy violation: the report that revealed the listening of sensitive conversations

The class action was triggered by a 2019 report showing that a good number of "contractors" (collaborators external to the company) tasked with quality control regularly listened to conversations containing sensitive information. And that this information had, in a certain number of proven cases, produced targeted offers of services and products immediately after conversations intercepted by Siri.

That something, indeed more than something, was true is demonstrated by the fact that in the summer of 2019 Apple suspended the Siri "grading" program run through external collaborators, dismissing more than 300 of them, and only later restarted it. The connection between the suspension of the program and the layoffs is more than suspicious. And to confirm the underlying truth, namely that voice assistants regularly spy on us, there is an empirical test that each of us can perform.

Voice Assistants and Spying: When the Light Betrays Your Privacy

How many of us have noticed the voice assistant's listening indicator light on, perhaps just for a few moments, without having spoken any activation command? How many of us have been surprised, more than once, by a chatty smartphone that asks us, without being consulted in any way, to rephrase a question it did not understand? Miracles of algorithms which, while spying on us in secret, fortunately also make the occasional misstep that lets them be discovered.

Small consolation (so to speak) in all this: the battalion of spies, with its attendant fertile material for hackers, is far larger than the contingent represented by voice assistants. Staying within the world of cell phones and PCs, the frantic game of profiling is now evident to all of us: we accept the "cookies" of websites, or simply give consent on privacy questionnaires, and then watch offers appear on our screens curiously connected to whatever we were just browsing. And that, believe us, is only the tip of the iceberg.
