A #FAILED interview: dating apps’ responsibilities

Illustration by oliv dorsaz @oabsolu

Two months ago, a journalist contacted me about a “piece on dating apps’ responsibilities”, as he called it.

I answered immediately because he had been referred by an acquaintance who does amazing work in data protection and digital rights. Maybe there was too much enthusiasm, though… The guy never wrote back, and I never found the article online! haha

So I decided to share my text/opinion with anyone wandering onto my website. Here it is:

Do you think dating apps have been proactive in protecting vulnerable users of their services from abuse?

I would need to answer this with another question: what do you define as abuse?

This issue is tentacular.

When the definition concerns personal data protection, some dating websites are rather “reactive” than proactive. For instance, AdultFriendFinder had security vulnerabilities and was hacked twice within a year and a half, in March 2015 and in October 2016. Ashley Madison was also hacked. These platforms argue that they have since improved their systems.

When the definition concerns pervasive data collection practices, for instance tracking and storing users’ online behaviour for the sake of finding a “good match” (but more specifically to target advertisements), dating apps are more interested in exploiting features such as geolocation than in considering the negative consequences of sharing that type of information. We have seen in the news, for example, that Grindr shared users’ location and HIV status with third-party companies.

Users usually need to agree to share personal information when accessing the service, but we have little knowledge about what is actually collected and to what extent; not everything is visible in the graphical user interface. Terms and conditions are not always clear. More importantly, the algorithmic practices of dating companies remain a secret. In the case of Tinder, for instance, the CEO refers to a “desirability score” they calculate to suggest matches, without providing much detail. Referring to that score has perhaps helped them gain popularity rather than pushing them to implement transparent systems for users.

For my research, I have spent a year and a half trying to conduct a sociological study of programming practices in dating apps, but the companies are not interested, and sometimes I don’t even get a reply. It seems they do not see the value of a sociological perspective on developers’ work; they might believe instead that their secret formula will be stolen.

Actually, online abuse is a very broad phenomenon that extends to social media in general, not only dating apps. While the European Commission is making efforts to prevent breaches of antitrust rules and abuse of dominant positions by companies like Google and Facebook, users are contributing through personal narratives to raising awareness of the value of their data.

What do you think apps could do that would be easy to implement?

Reinvent their business model, which relies on trading data. That model is why we are currently living in “data cultures”, as some authors have put it. Within them, we are “dividuals”, that is, features of individuals dispersed across networks, as the philosopher Deleuze taught us.

This idea might not be easy to implement, but we need new models that are fair to all the actors involved, and that requires time. It could even provide better dating experiences.

And if companies have been ingenious enough to develop nice, user-friendly interfaces and sophisticated algorithms, then they could also find novel ways of being sustainable.

More broadly, they could also provide access to their algorithmic practices. By opening their doors to academia, and to sociologists and anthropologists in particular, we could gain insight into the dating phenomenon from another perspective. A humanist standpoint on their technical expertise could help both the companies and the users better understand what is happening behind closed doors.

Why do you think these things haven’t been done?

As I said, this phenomenon touches upon a broad range of online platforms, and offline ones as well. It is a whole system that should be rethought.

It is hard to answer, and the idea is not to generalize. Some applications might have good security systems, but the scandals usually get more space in the news. Some companies might have fairer policies for building an app, but they are not visible next to the oligarchic groups.

From a more day-to-day operational point of view, I have led a developer team, and it is hard to anticipate all the possible use cases so that your system does not break when you are developing an application. Hence, companies need to reinforce the testing phase, introduce technical protocols that include algorithmic literacy, take into account the social concerns attached to the subject (the person) rather than only the connected “dividual”, and start putting into practice legal and other theoretical frameworks that could help the dating industry control undesired scenarios and give deeper meaning to online experiences. By this I mean protecting not only their own interests but also those of their users, who are vital to the apps’ functioning. A small sketch of what I mean by reinforcing the testing phase follows below.
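To make the testing point a bit more concrete, here is a purely illustrative sketch in Python. The function and fields are hypothetical and do not come from any real dating app’s code; the idea is simply that a profile handler should degrade gracefully when optional, sensitive fields such as location are missing, and that such edge cases deserve explicit tests.

```python
# Hypothetical example: a defensive profile formatter and two edge-case tests.
# None of these names refer to a real dating app; they only illustrate
# "reinforcing the testing phase" around missing or sensitive fields.

def format_profile_card(profile: dict) -> dict:
    """Build the data shown on a profile card without assuming optional fields exist."""
    return {
        "name": profile.get("name", "Anonymous"),
        # Location is optional and sensitive: show a neutral label rather than
        # failing or exposing raw coordinates when it is absent.
        "distance_label": "nearby" if profile.get("location") else "location hidden",
    }


def test_profile_card_without_location():
    # Edge case: the user never granted geolocation access.
    card = format_profile_card({"name": "Sam"})
    assert card["distance_label"] == "location hidden"


def test_profile_card_with_empty_profile():
    # Edge case: an empty profile should still render instead of crashing.
    card = format_profile_card({})
    assert card["name"] == "Anonymous"
```

These tests can be run with a standard test runner such as pytest; the point is not the specific tool but treating “the user withheld this data” as a normal, tested scenario rather than an afterthought.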

How do you see the future of dating apps, privacy-wise, as users start to be more concerned about how their personal information is being used?

The more users look for alternatives to get a service (not only the most popular one), and the more they get involved in data protection concerns, the faster companies will need to adapt to the public’s requirements rather than the other way around.

I also think policy making has an important role to play here, if policy makers gain enough speed to lead the way in technological development.

Discussion is welcome!
