Facebook Can Reportedly Access and Read Your WhatsApp Messages [UPDATED]
Written by Furqan Shahid
[UPDATE] Just moments after the original story was published, Android Central reached out to WhatsApp for a comment, and this is what the spokesperson had to say.
Every day WhatsApp protects over 100 billion messages with end-to-end encryption to help people communicate safely. We’ve built our service in a manner that limits the data we collect while providing us the ability to prevent spam, investigate threats, and ban those engaged in the worst kind of abuse. We value our trust and safety team who work tirelessly to provide over two billion users with the ability to communicate privately.
As per the ProPublica report, over 1,000 contract workers in Austin, Texas, Dublin, and Singapore go through “millions of pieces of users’ content.” The workers use special Facebook software to go through private messages, photos, and videos that have been reported by users as improper and then screened by Facebook’s AI systems.
WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service. This forwards five messages — the allegedly offending one along with the four previous ones in the exchange, including any images or videos — to WhatsApp in unscrambled form, according to former WhatsApp engineers and moderators. Automated systems then feed these tickets into “reactive” queues for contract workers to assess.
[ORIGINAL STORY] After the controversial terms and conditions WhatsApp introduced, many people started talking about how the company could actually read your private messages and share their contents with Facebook. WhatsApp has gone on record to deny these claims, saying that neither it nor Facebook can read your messages or listen to your calls placed on the network, thanks to end-to-end encryption. Not just that, the Facebook-owned service even went ahead and called out Telegram for not offering end-to-end encryption.
WhatsApp and Facebook Could Face Some Serious Legal Troubles if the Report is Accurate
However, the latest report suggests that Facebook can somehow view the contents of your private messages. The report comes from ProPublica, a non-profit investigative journalism organization with a solid track record, and it claims that both Facebook and WhatsApp can view the contents of your private WhatsApp messages. The report says the following.
[An] assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”
Those assurances are not true. WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users’ content. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through streams of private messages, images and videos that have been reported by WhatsApp users as improper and then screened by the company’s artificial intelligence systems. These contractors pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.
Many of the assertions by content moderators working for WhatsApp are echoed by a confidential whistleblower complaint filed last year with the U.S. Securities and Exchange Commission. The complaint, which ProPublica obtained, details WhatsApp’s extensive use of outside contractors, artificial intelligence systems and account information to examine user messages, images and videos. It alleges that the company’s claims of protecting users’ privacy are false. “We haven’t seen this complaint,” the company spokesperson said. The SEC has taken no public action on it; an agency spokesperson declined to comment.
Since WhatsApp has maintained that it uses end-to-end encryption, the “moderators” mentioned above should not be able to see the contents of your messages: end-to-end encryption means that only the sender and the recipient can decrypt a message. That does not seem to be the case here.
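To illustrate the principle the article relies on, here is a minimal sketch of end-to-end encryption using the PyNaCl library. This is a generic public-key example for illustration only, not WhatsApp’s actual protocol (which is based on the Signal protocol), and all variable names are hypothetical.

```python
# Minimal illustration of end-to-end encryption with PyNaCl (pip install pynacl).
# Generic public-key example only; not WhatsApp's actual Signal-based protocol.
from nacl.public import PrivateKey, Box

# Each participant generates a key pair; only the public halves are ever shared.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"hello, this is private")

# A server relaying `ciphertext` holds neither private key, so it cannot decrypt it.
# Only the recipient (or the sender) can derive the shared secret and read it.
plaintext = Box(recipient_key, sender_key.public_key).decrypt(ciphertext)
print(plaintext)  # b'hello, this is private'
```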
The report goes on to explain the following.
Because WhatsApp’s content is encrypted, artificial intelligence systems can’t automatically scan all chats, images and videos, as they do on Facebook and Instagram. Instead, WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service. This forwards five messages — the allegedly offending one along with the four previous ones in the exchange, including any images or videos — to WhatsApp in unscrambled form, according to former WhatsApp engineers and moderators. Automated systems then feed these tickets into “reactive” queues for contract workers to assess.
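For readers trying to square this with end-to-end encryption, the mechanism described above happens on the reporting user’s device, which already holds the decrypted messages. Below is a minimal sketch of what such a client-side report flow could look like; the function and field names (for example `build_report_payload` and the moderation upload step) are hypothetical and are not drawn from WhatsApp’s actual code.

```python
# Hypothetical sketch of the client-side report flow described by ProPublica:
# the reporting device already has the decrypted chat, so it can bundle the
# flagged message plus the four preceding ones and upload them in readable form.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Message:
    sender: str
    text: str                       # already decrypted on the reporting user's device
    media: Optional[bytes] = None   # any attached image or video

def build_report_payload(chat_history: List[Message], flagged_index: int) -> List[Message]:
    """Collect the flagged message and up to four preceding ones, per the report."""
    start = max(0, flagged_index - 4)
    return chat_history[start:flagged_index + 1]

def submit_report(payload: List[Message]) -> None:
    # Placeholder for an upload to a moderation service, where automated systems
    # would route the ticket into a "reactive" queue for human reviewers.
    print(f"Uploading {len(payload)} unscrambled messages for review...")

# Example: the user reports the most recent message in a short exchange.
history = [Message("alice", f"message {i}") for i in range(8)]
submit_report(build_report_payload(history, flagged_index=7))
```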
Following the article, a WhatsApp spokesperson said, “We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication.”
While the spokesperson failed to address the alleged lack of end-to-end encryption, they did say that “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp, we receive the content they send us.”
Whatever the situation might be, if the details in the ProPublica report are accurate, both Facebook and WhatsApp could face serious legal trouble and consumer backlash. We cannot comment further at this point, as too little information is available and this is a developing story.