
Apple apologises for Siri privacy issues and will implement an opt-in

Last month, Apple found itself in the middle of a privacy controversy after it came to light that the tech giant had employed contractors to listen to Siri recordings obtained without users' knowledge. The practice was quickly called out by privacy advocates, leading Apple to suspend the program indefinitely. Now, Apple has released its own statement on the situation, apologising for the misstep.

Other major tech companies also have contractors listening to recordings from digital assistants in an effort to improve voice recognition and commands. The downside is that these assistants sometimes activate when they haven't been called upon and end up snooping on private conversations, which can then be passed on and heard by third-party contractors.

On Apple's side, the company does plan to reinstate the Siri grading program. However, it will only do so once iOS 13 starts rolling out with new updates and options for Siri users: “we realise we haven’t been fully living up to our high ideals, and for that we apologise. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users”.

Those updates include the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

KitGuru Says: Some human intervention is required to improve digital assistants like Siri. Still, recordings should never have been made and sent off without a user's express knowledge. Hopefully, now that a proper opt-in is being put in place, everyone can be better informed and make the right choice for themselves.


