Here is what changed with WhatsApp and what didn’t:
Facebook bought WhatsApp in 2014, and since 2016, almost everyone using the messaging app has been (usually unknowingly) sharing information about their activity with Facebook.
Facebook knows the phone numbers being used, how often the app is opened, the resolution of the device screen, the approximate location estimated from the internet connection, and more.
Facebook uses this information to make sure WhatsApp works properly and to help a shoe company show you an ad on Facebook.
Facebook can’t peer at the content of texts or phone calls because WhatsApp communications are end-to-end encrypted — scrambled so that only the sender and recipient can read them. Facebook also says that it doesn’t keep records on who people are contacting in WhatsApp, and WhatsApp contacts aren’t shared with Facebook.
WhatsApp has a lot of positives. It is easy to use, and communications in the app are secure. But yes, WhatsApp is Facebook, a company many don’t trust.
There are alternatives, including Signal and Telegram — both of which have gotten a surge of new users recently. Digital privacy group Electronic Frontier Foundation says Signal and WhatsApp are good choices for most people.
The reason WhatsApp recently notified app users about revised privacy rules is that Facebook is trying to make WhatsApp a place to chat with an airline about a missed flight, browse for handbags and pay for stuff.
WhatsApp’s policies changed to reflect the possibility of commercial transactions involving the mingling of activity among Facebook apps — a handbag you browse in WhatsApp could pop up later in your Instagram app, for example.
I also want to touch on deeper reasons for the misunderstandings.
First, this is a hangover of Facebook’s history of being cavalier with our personal data and reckless with how it is used by the company or its partners. It is no wonder that people assumed Facebook had changed WhatsApp’s policies in invasive ways.
Second, people have come to understand that privacy policies are confusing, and that we really don’t have the power to make companies collect less data.
“This is the problem with the nature of privacy law in the United States,” Hill said. “As long as they tell you that they’re doing it in a policy that you probably don’t read, they can do whatever they want.”
That means digital services including WhatsApp give us an unappealing choice. Either we give up control over what happens to our personal information, or we don’t use the service. That’s it.
Another false belief floating around about WhatsApp — and again, this is WhatsApp’s fault, not yours — is that the app is just now removing an option for people to refuse to share their WhatsApp data with Facebook.
Not quite right.
Yes, when Facebook made major changes to WhatsApp privacy policies in 2016, there was a brief moment of choice. People could check a box to order Facebook not to use their data from WhatsApp for commercial purposes.
Facebook would still collect the data from WhatsApp users, as I explained above, but the company would not use the data to “improve its ads and product experiences,” like making friend recommendations.
But that option in WhatsApp existed for only 30 days in 2016. That was a lifetime ago in digital years and approximately 4 million Facebook data scandals ago.
For anyone who started using WhatsApp after 2016 — and that is many people — Facebook has been collecting a lot of information without an option to refuse.
“A lot of people didn’t know that until now,” Gennie Gebhart of the Electronic Frontier Foundation told me. And, she said, we are not to blame.
Understanding what happens with our digital data feels as if it requires advanced training in computer science and a law degree. And Facebook, a company with oodles of cash and a stock value of more than $700 billion, didn’t or couldn’t explain what was happening in a way that people could grasp.