Sick sexualized ‘face swap’ apps target children as young as nine

VIDEO apps that can create instant sexualized fake footage are targeting children as young as nine, a Sun on Sunday investigation has found.

Youngsters simply upload a photo of a boy or girl’s face, and within moments it is superimposed on another body in a provocative pose.

Video apps that can create instant sexualized fakes are targeting children as young as nineCredit: Alamy
One of the AI-driven apps, Facemega, was removed from the Apple App Store and Google Play this week following our probe.

Still, app analytics firm AppMagic estimates the app has been downloaded more than a million times since it launched last year.

The use of artificial intelligence to create fake videos and photos has increased by 900 per cent since 2019.

Carolyn Bunting, chief executive of children’s cyber safety organization Internet Matters, said: “The creation of sexualized, non-consensual deepfakes in these apps is incredibly disturbing, as is the impact of this kind of content on children.

“Sexual use of someone’s likeness without their consent is extremely harmful to the child. This can lead to complex and persistent problems that affect their well-being.”

Before Facemega was removed from the app stores, it had climbed to number 77 on the entertainment chart – above Lego. It cost £7.49 a week and was rated as suitable for ages “nine and over”.

While putting young lives at risk, it has made millions of pounds for both the app stores and the developer, Ufoto Ltd, which is owned by Chinese parent Wondershare.

Users are never asked to prove their age when accessing the image-editing technology, and the videos on offer for face-swapping include scantily clad women in bikinis under a “Hot” category.

Within ten seconds of uploading a chosen photo, the AI wizard matches it to another body, often with alarming results.

Following our investigation, Facemega’s developer removed the “Hot” and “For Women” categories – which contained sexually provocative videos – from its app.

Apps like Facemega remain on mainstream platforms, weaving a troubling web of distorted reality.

FaceSwap was listed in the App Store this week as an app for children aged four and upCredit: Supplied

Face Swap Video by Deep Fake, made by the US company Deepfaker LLC, was listed in the App Store this week as an app for children aged four and over. Its advert features a young woman swapping her face with another person’s social media image – albeit in a non-sexual way.

Once the video plays, it is hard to tell what is real and what is fake.

A three-day free trial leads to a £7.99 weekly subscription.

Faceswap, which is also available on the App Store for over-nines, gives kids free access to deepfakes before charging a £19.99 annual subscription.

Our revelations come just months after ministers announced they would target deepfake pornography – the unauthorized creation and sharing of such images – under the Online Safety Bill, which has been passed by MPs but is not yet in force.

The children’s charity NSPCC has called on the government to impose a legal obligation on major app publishers to help protect those targeted by these apps, particularly women and girls.

Rani Govender, from the NSPCC, added: “App stores play an important role in preventing the risks of deepfake technology at source. The Government can also act through the Online Safety Bill by placing a legal obligation on companies to tackle online violence against women and girls.”

Communications regulator Ofcom said last year that fake or misleading images and videos were among the top 20 online potential harms facing UK internet users.

According to education platform Safer Schools, the number of deepfakes online rose from roughly 14,000 to 145,000 between 2019 and 2021 – a 900 per cent increase. Of these, 96 per cent contained pornographic material, and approximately 90 per cent featured indecent images of young women.

Children’s charity NSPCC has called on the government to protect those targeted by these apps, particularly women and girlsCredit: Alamy

The NSPCC’s Ms Govender added: “Deepfake technology is already having an insidious effect on children as it becomes ever easier to produce and share this degrading and harmful material.

“This rapidly evolving technology is quickly becoming a child abuse risk as it is introduced without consideration of the ways in which it fuels the abuse of intimate images.

“Girls and women suffer the most from apps like this, which exist in a toxic online culture of misogyny that is growing at a disturbing rate.”

Apple confirmed it had removed Facemega from the App Store and said it had no specific rules for deepfake apps, but that it bans apps containing pornographic, defamatory or discriminatory content.

A Google Play spokesperson confirmed it had removed Facemega from its platform but declined to comment on other apps.

Tory MP Siobhan Baillie called the deepfake technology frightening and added: “Clear age verification and additional protection should be considered.

“I applaud The Sun on Sunday for getting this app removed from the Apple App Store and Google Play. Our children must be protected from the deepfake threat.”

Three victims tell their stories

CHILDLINE has shared the accounts of three teenagers who were threatened with fake videos and photos, showing how traumatic the experience can be.

One 14-year-old girl told how she was threatened online with a fake video of her if she refused to send nude pictures to an abuser.

She said: “I was being friendly, just chatting with someone on Snapchat. They asked me what I looked like, so I sent a picture of my face, and then they kept asking for nudes.

“I told them no, but they said if I didn’t, they would edit my face onto nudes and sell them. I know I should report them, but it won’t change anything, as they’ll still have my pictures on their phone. Please help, I’m really worried.”

A terrified 13-year-old girl said: “Someone I know is threatening to post a fake image and say it’s me if I don’t send them real ones.

“He says he’s going to tag my friends and show them, saying it’s me. I’ve never sent nudes before and I’m afraid that if I do, my real friends will judge me.

“I met this person online, and we were friends, but we haven’t spoken in a while. I don’t understand why they are doing this to me. I don’t know what to do.”

A third teenager told Childline that police detectives were involved because of fake pornographic images.

She said: “I feel embarrassed and angry. Someone created a fake account on Instagram under my name and posted inappropriate, pornographic pictures with my face.

“I have reported the account and the police are trying to trace the person. I feel a little safer knowing that, but I’m worried that my friends will find out and I’ll get bullied.”
