By Peter Makossah
WhatsApp, the hugely popular free messaging application used by more than two billion people in over 180 countries to stay in touch with business partners, friends and family anytime and anywhere, may be blocked in Britain if the United Kingdom passes the Online Safety Bill into law.
Reacting to the new bill on online safety, which is expected to be tabled in Westminster, London, the head of WhatsApp, Will Cathcart, has announced that the firm may withdraw its service from the UK if the Online Safety Bill forces it to break users’ privacy and scan messages for harmful content.
He said the firm would not comply with the proposed legislation if it required the company to search for abuse material in users’ messages.
Cathcart said: “WhatsApp would stop British residents from using the mobile app rather than allow the UK Government to compel it to violate users’ privacy.”
According to the WhatsApp website, more than two billion people in over 180 countries use the service. The app’s end-to-end encryption means ‘your messages, photos, videos, voice messages, documents, status updates and calls are secured from falling into the wrong hands’.
This also means that WhatsApp cannot read users’ messages, with ‘only you and the person you’re communicating with’ able to read or listen in.
“Our users all around the world want security – 98% of our users are outside the UK; they do not want us to lower the security of the product. We’ve recently been blocked in Iran, for example. We’ve never seen a liberal democracy do that,” Cathcart said.
The WhatsApp CEO added: “We won’t lower the security of WhatsApp. We have never done that – and we have accepted being blocked in other parts of the world. When a liberal democracy says, ‘Is it OK to scan everyone’s private communication for illegal content?’ that emboldens countries around the world that have very different definitions of illegal content to propose the same thing.
“If companies installed software onto people’s phones and computers to scan the content of their communications against a list of illegal content, what happens when other countries show up and give a different list of illegal content?”
The Online Safety Bill is the UK’s landmark piece of internet safety legislation designed to give technology companies a legal duty of care toward their users by protecting them from illegal content and activity, including certain types of pornography and fraud.
The death of Molly Russell from Harrow in north-west London prompted calls for tougher rules to be imposed on online services used by teenagers.
Molly, who was 14, died in November 2017 after seeing content about suicide and self-harm.
In a witness statement read to the court on behalf of retired Metropolitan Police detective Michael Walker, the coroner was told Molly watched videos uploaded by an American social media influencer who spoke about “suicide and depression on a regular basis.”
The court was also told the youngster had followed a Twitter account which “displays depressing quotes”.
The Bill, which was published in draft form in May 2021 and continues its passage through Parliament, aims to clamp down on online trolling and illegal forms of pornography, placing more responsibility on the platforms.
However, human rights defenders and campaigners strongly believe the bill could limit freedom of expression.
The government’s Online Safety Bill, which was announced in the Queen’s Speech, comes with a promise of protecting debate, but with an emphasis that it is “especially” geared at keeping children safe and that “democratically important” content should be preserved.
According to Gov.uk, the bill is a new set of laws to protect children and adults online, making social media firms more responsible for their users’ safety. It could make social media companies legally responsible for ensuring the online safety of children and young people.
The UK government says the bill, when passed into law, would protect children by requiring social media platforms to:
- Remove illegal content quickly or prevent it from appearing in the first place – this includes removing content promoting self-harm.
- Prevent children from accessing harmful and age-inappropriate content.
- Enforce age limits and age-checking measures.
- Ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments.
- Provide parents and children with clear and accessible ways to report problems online when they do arise.
Are you a WhatsApp user? If it was banned in the UK, would it affect you? Let us know in the comments section.