Snapchat kicks few kids off its app in Britain, figures given to regulator show
LONDON, March 6 (Reuters) – Snapchat kicks dozens of children off its platform in Britain each month, compared with tens of thousands blocked by rival TikTok, according to internal data the companies shared with Britain’s media regulator Ofcom and which was also seen by Reuters.
Social media platforms such as Meta’s Instagram, ByteDance’s TikTok and Snap Inc.’s Snapchat require users to be at least 13 years old. These restrictions are designed to protect the privacy and safety of young children.
Ahead of the UK’s draft online safety law, which aims to protect social media users from harmful content such as child pornography, Ofcom asked TikTok and Snapchat how many suspected under-13 users they had removed from their platforms in a year.
According to data seen by Reuters, TikTok told Ofcom that it suspended an average of around 180,000 suspected underage accounts in Britain every month between April 2021 and April 2022, or around 2 million in that 12-month period.
During the same period, Snapchat disclosed that it removed approximately 60 accounts per month, or just over 700 in total.
A spokesperson for Snap told Reuters that the numbers misrepresented the company’s work to keep out children under 13. The spokesperson declined to provide further context or detail specific blocking measures taken by the company.
“We take these obligations seriously and every month in the UK we block and delete tens of thousands of attempts by underage users to create Snapchat accounts,” a Snap spokesperson said.
According to Ofcom’s latest research, both apps are roughly equally popular among minors. But compared with TikTok, children are more likely to use Snapchat through their own private account rather than a parent’s.
“It doesn’t make sense that Snapchat blocks a fraction of the number of kids that TikTok does,” said one Snapchat source, who asked not to be named.
Snapchat prohibits users from registering with a date of birth that puts them under 13. Reuters was unable to determine what protocols are in place for removing underage users once they have accessed the platform, and the spokesperson did not elaborate.
Ofcom told Reuters that assessing the steps video-sharing platforms take to protect children online remains a priority, and that the regulator will report on its findings later this year.
Currently, social media companies are responsible for setting age limits on their platforms. But under the long-awaited Online Safety Bill, they will be required by law to enforce those limits and demonstrate how they do so, for example by using age-verification technology.
Companies that do not comply with the law could be fined up to 10% of their annual turnover.
In 2022, Ofcom research found that 60% of children aged 8 to 11 had at least one social media account, often created using a false date of birth. The regulator also found that Snapchat is the most popular app for underage social media users.
RISKS TO CHILDREN
Social media poses serious risks to young children, say child safety advocates.
According to figures recently released by the NSPCC (National Society for the Prevention of Cruelty to Children), Snapchat accounted for 43% of recorded cases in which social media was used to distribute indecent images of children.
Richard Collard, from the NSPCC’s child safety online unit, said it was “incredibly alarming” how few underage users Snapchat removes.
Snapchat “needs to take much stronger steps to ensure that young children are not using the platform and that older children are safe from harm,” he said.
Britain, like the European Union and other countries, is looking for ways to protect social media users, especially children, from harmful content without infringing on freedom of expression.
Age enforcement is expected to be a key part of the online safety bill, along with ensuring companies remove content that is illegal or prohibited by their terms of service.
A TikTok spokesperson said the figures show the strength of the company’s efforts to remove suspected underage users.
“TikTok is strictly a 13+ platform and we have processes in place to enforce minimum age requirements, both at sign-up and through ongoing proactive removal of suspected underage accounts from our platform,” they said.
(Reporting by Martin Coulter; Editing by Matt Scuffham and Catherine Evans)