How can we prevent widespread abuse of dating apps?

Dating apps are widely used because they promote conversation, personal relationships, and intimacy. Responding quickly and effectively to app-based abuse is part of their licence to operate, write Kath Albury of Swinburne University of Technology and Daniel Reeders of the Australian National University in this article republished from The Conversation.

According to the Australian Institute of Criminology’s 2022 survey, one in four app users have experienced online abuse or harassment while using dating apps. This included image-based abuse as well as abusive and threatening messages. Another third experienced personal or off-app abuse from people they met on apps.

These figures provide the background to a national roundtable convened by Communications Minister Michelle Rowland and Social Services Minister Amanda Rishworth on Wednesday.

App abuse is highly gendered and reflects pre-existing patterns of marginalization. The targets are typically women and members of LGBTIQA+ communities, while the perpetrators are usually men. People with disabilities, Aboriginal and Torres Strait Islander people and people from migrant backgrounds reported being directly targeted based on their perceived differences.

What do these patterns tell us? App abuse is neither new nor unique to digital technologies. It reflects long-standing patterns of offline behavior; abusers simply take advantage of the opportunities dating apps offer. With this in mind, how can we begin to address the problem of dating app abuse?

Searching for solutions

Survivors of app abuse and violence say the apps have been slow to respond and have failed to provide meaningful responses. In the past, users have reported abusive behavior only to be met with a chatbot. And blocking or reporting an abusive user does not automatically reduce in-app abuse; it simply leaves the abuser free to target someone else.


Wednesday’s roundtable explored how app makers can work better with law enforcement to respond to serious and persistent offenders. Although no official outcomes have been announced, one suggestion was that app users provide 100 points of identification to verify their profiles.

However, this proposal raises data protection concerns. It would create a database of the real identities of people from marginalized groups, including LGBTIQA+ communities. If this data were leaked, it could cause immeasurable harm.

Prevention is key

Moreover, even if the profile verification process were strengthened, regulators could still only respond to the most serious cases of harm, after the abuse has already happened. That’s why prevention is vital when it comes to dating app abuse. And this is where research into everyday patterns of app use adds value.

Abuse and harassment are often fueled by the stereotypical belief that men are “entitled” to sexual attention. They also play on widespread assumptions that women, queer people, and other marginalized groups don’t deserve the same level of respect and care in all their sexual encounters and relationships—from lifelong partnerships to casual relationships.

In response, app makers have launched PSA-style campaigns to change the culture of their users. For example, Grindr has a long-running “Kindr” campaign targeting sexual racism and fatphobic abuse among the gay, bisexual and trans people who use the platform.

[Image: icons of various dating apps on a mobile screen.] Match Group is one of the largest dating app companies. It owns Tinder, Match.com, Meetic, OkCupid, Hinge, and PlentyOfFish, among others. Shutterstock

Other apps have tried to build safety into their design for women. For example, on Bumble only women can initiate chats, to prevent unwanted contact from men. Tinder also recently made its “Report” button more visible and partnered with WESNET to provide users with safety advice.

Similarly, the Alannah & Madeline Foundation’s eSafety-funded intervention “Crushed But Okay” gives young men advice on how to respond to online rejection without becoming abusive. This content has been viewed and shared over a million times on TikTok and Instagram.


In our research, app users told us they wanted education and guidance for antisocial users – not just policing. This can be achieved by apps working with community support services and promoting a culture that challenges prevailing gender stereotypes.

Policy tools for change

Apps are widely used because they facilitate conversation, personal connections and intimacy. But they are for-profit businesses run by multinational corporations that generate revenue by serving ads and monetizing users’ data.

Quick and effective action against app-based abuse is part of their social licence to operate. App makers who breach this licence should face meaningful penalties.

The United Kingdom is on the verge of passing legislation that includes jail terms for social media managers who knowingly expose children to harmful content. Similar penalties, which hurt the bottom line of app makers, may likewise encourage action here.

In an age of widespread data breaches, app users already have good reason to be wary of handing over personally identifiable information. They won’t necessarily feel safer if they are required to provide more of it.

Our research shows that users want transparent, accountable and timely responses from app makers when they report behavior that makes them feel unsafe or unwelcome. They want more than chatbot-style replies to reports of abusive behavior. At the platform policy level, this could be addressed by hiring more local staff to respond to complaints and concerns transparently and promptly.

And while prevention is key, policing can still be an important part of the picture, especially when abusive behavior continues after users have taken the conversation off the app itself. In such cases, app makers must respond to police requests for access to data. Many apps, including Tinder, already have clear policies on cooperation with law enforcement.
