
Apple apologises for Siri privacy issues and will implement an opt-in

Last month, Apple found itself in the middle of a privacy controversy after it came to light that the tech giant had employed contractors to listen to Siri recordings obtained without users' knowledge. The practice was quickly called out by privacy advocates, leading Apple to suspend the programme indefinitely. Now, Apple has released its own statement on the situation, apologising for the misstep.

Other major tech companies also have contractors listening to recordings from digital assistants in an effort to improve voice recognition and commands. The downside is that these assistants sometimes activate when they weren't called upon and end up capturing private conversations, which can then be passed on and heard by third-party contractors.

Apple does plan to reinstate the Siri grading program, but only after iOS 13 starts rolling out with new updates and options for Siri users: “we realise we haven’t been fully living up to our high ideals, and for that we apologise. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users”.

Those updates include the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

KitGuru Says: Some human intervention is required to make improvements to digital assistants like Siri. Still, recordings should never have been made and sent without a user's express knowledge. Hopefully, now that a proper opt-in is being put into place, everyone can be better informed and make the right choice for themselves.
