Anytime you use a video teleconferencing app, you’re sending your audio data to the company hosting the service. And, according to a new study, that means all of your audio data: voice and background noise alike, whether you’re broadcasting or muted.
Researchers at the University of Wisconsin-Madison investigated “many popular apps” to determine the extent to which video conferencing apps capture data while users employ the in-software ‘mute’ button.
According to a university press release, their findings were substantial:
They used runtime binary analysis tools to trace raw audio in popular video conferencing applications as the audio traveled from the app to the computer audio driver and then to the network while the app was muted.
They found that all of the apps they tested occasionally gather raw audio data while mute is activated, with one popular app delivering data to its server at the same rate whether the microphone is muted or not.
Unfortunately, as this research remains unpublished, we’re unable to confirm the specific apps tested. So, for now, we cannot name and shame them.
That said, the paper’s credibility is not really in question: it has been accepted to the 2022 Privacy Enhancing Technologies Symposium. We’ll just have to wait and see who gets name-dropped when the paper is published in June.
Still, we can draw some conclusions. According to the researchers, this data could be used to extract meaningful information. And, with a little machine learning, that information can paint an incredibly vivid picture of a user’s reality – again, even with your microphone muted in the app.
The research team was able to determine what specific audio was being sent during testing and, by extrapolating that data, they were able to predict what inferences big tech could make.
Of course, big tech uses AI to parse everything. So the researchers built their own algorithms to study the data. What they found was astonishing.
Per the unpublished paper’s abstract:
Using network traffic that we intercept en route to the telemetry server, we implement a proof-of-concept background activity classifier and demonstrate the feasibility of inferring the ongoing background activity during a meeting – cooking, cleaning, typing, etc. We achieved 81.9% macro accuracy on identifying six common background activities using intercepted outgoing telemetry packets when a user is muted.
In other words: grad students at the University of Wisconsin-Madison were able to build machine learning models that could determine, with greater than 80% accuracy, what a teleconferencing app user was doing while their microphone was muted.
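The team’s actual features, model, and telemetry format aren’t public, so the sketch below only mimics the general idea: a standard classifier trained on coarse per-window features, with synthetic data and made-up activity labels standing in for intercepted telemetry.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical labels standing in for the paper's six background activities
ACTIVITIES = ["cooking", "cleaning", "typing", "music", "vacuuming", "silence"]

# Synthetic per-window "telemetry" features (think packet-size and energy
# statistics); purely illustrative -- nothing here comes from real app traffic.
n, d = 600, 8
X = rng.normal(size=(n, d))
y = rng.integers(0, len(ACTIVITIES), size=n)
X[np.arange(n), y] += 3.0  # inject a learnable signal so the demo beats chance

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"demo accuracy on synthetic data: {acc:.2f}")
```

The point isn’t the model choice: once any telemetry correlated with background sound leaves your machine, an off-the-shelf classifier is enough to label the activity behind it.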
File under: Imagine what Google or Microsoft could do with their massive AI models. Yikes!
Neural’s take: Should you be concerned? Yes. Absolutely. Should you stop using these apps? No, because that’s not really an option for everybody.
Our real concern is that big tech either does not care whether users know they’re being recorded even when they’re muted, or it never stopped to think users would care. Either way, it shows a disturbing level of detachment from the user experience.
Big tech’s unspoken mantra is “data at all costs,” and this just goes to prove how thirsty the beast is. There’s no good reason not to explicitly inform users in large print that the mute button does not stop their audio feed to the server.
Luckily, you’ve got options. If you truly want to mute your audio feed, you need to perform a double mute. If you’re fortunate enough to own a headset that has a physical mute button on it, use that to mute your headset and then use the button in the app as an extra layer of muting.
If your headset does not have a physical mute button, or you’re using an onboard microphone to communicate, you’ll need to mute your microphone at the operating-system level in addition to muting in the app.
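On most desktop systems the second mute can even be scripted. A minimal sketch, assuming PulseAudio/PipeWire on Linux and AppleScript on macOS (command names vary by OS and audio stack, so check yours first):

```shell
# Linux (PulseAudio/PipeWire): mute the default input source (microphone)
pactl set-source-mute @DEFAULT_SOURCE@ 1   # 1 = mute, 0 = unmute

# Verify the mute state (requires a recent pactl)
pactl get-source-mute @DEFAULT_SOURCE@

# macOS: zero the input volume via AppleScript
osascript -e "set volume input volume 0"
```

Remember this only covers the system half of the double mute; you still hit the in-app mute button on top of it.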
At the end of the day, it’s unclear exactly what big tech is doing with this data. Investigating the companies themselves was outside the study’s scope – and we’ll update this piece if Zoom, Microsoft, or Google get back to us with a statement.
But, no matter what, the fact that it’s being collected under such misleading circumstances is cause for major concern.
Forcing users to dig through operating system menus just to achieve a modicum of privacy is an anti-user policy. It also demonstrates how much more sensitive our audio data can be than our video data.
As lead author on the study Kassem Fawaz said in the university press release:
With a camera, you can turn it off or even put your hand over it, and no matter what you do, no one can see you. I do not think that exists for microphones.
Do not forget to double mute, folks. You’ll probably forget to double unmute from time to time, but the trade-off is keeping Google, Microsoft, and all the others from training their machines on the ambient sounds of your private life.