Apple QC workers often hear bits of private conversations in Siri recordings

Apple is paying contractors to listen to recorded Siri conversations, according to a new report from The Guardian. A former contractor revealed that workers have heard accidental recordings of users’ personal lives, including doctor’s appointments, addresses, and even possible drug deals.

According to that contractor, Siri interactions are sent to workers, who listen to the recordings and grade them on a variety of factors: whether the request was intentional or a false positive that accidentally triggered Siri, and whether the response was helpful.

Apple does not explicitly say that humans are listening to these recordings, and whatever admissions it does make to that end are buried deep in a privacy policy that few (if any) Siri users have ever read.

Only a small number of Siri interactions are passed on to these outsourced quality-control (QC) contractors. Their gradings cover whether the activation was intentional or accidental, whether the request was something Siri could even handle, and whether its response was appropriate.

The recordings are “pseudonymized” (stripped of identifiable data) to protect the user’s identity, but they can contain request-related information, including app data, contacts, and locations. Apple says it uses this information only to determine what happened after the command and whether Siri’s response was appropriate.

“A small portion of Siri requests are analyzed to improve Siri and dictation,” an Apple spokesperson told The Guardian. “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”

Apple also noted that only a small subset of Siri interactions (less than one percent) is analyzed, and that these snippets are typically only a few seconds long.

The fact that humans listen to these interactions is not explicitly mentioned in Apple’s terms of service, which is where the company seems to have dropped the ball. Nor does it let users opt out of the sharing: Siri (and the sharing that comes with it) is either on or off.

Short of giving up smart assistants entirely, there isn’t much Siri users can do to avoid the issue, other than being careful about what they say around their iPhones and HomePods. Still, it’s a good reminder that when you agree to use these products, you’re often giving up more privacy than you think.
