Apple also uses humans to listen to some Siri recordings
By Raymond Wong
Update: Apple has since issued the following statement: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
The original story follows below.
News alert: your Siri voice recordings may not be entirely private.
According to The Guardian, Apple hires contractors to listen to Siri recordings in order to improve the accuracy and quality of the voice assistant.
These contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or ‘grading,’” the report claims.
Not exactly a good look for a company that prides itself on taking privacy more seriously than other tech companies.
The exposé comes from an anonymous whistleblower who spoke with The Guardian, voicing concerns about how the undisclosed Siri data could potentially be misused.
Though Apple’s contractors review only a small portion of Siri recordings, and are told to report recordings only for technical problems such as accidental activations rather than for their content, the whistleblower said it’s uncomfortable to hear conversations in which people are engaging in sexual acts or drug deals.
“There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,” the whistleblower told The Guardian. “It