A Growing Collection of Tools and Services Against Deepfakes

Female athlete facing the dangers of deepfakes in cyberspace

While building out the Colossea service, I was researching statistics on right-of-publicity violations. These figures are important for making the case for the Defense side of our business.

Colossea lets athletes use their right of publicity to endorse brands they aspire to, or at least feel compatible with. This lets athletes go on the offense with their self-marketing rights, so to speak. Colossea Defense is meant to complement this service: helping athletes defend against violations of their right of publicity. The latter is where the statistics on such violations come into play, and where I started to discover some ready-made tools to use right away.

The main term that makes the rounds in this context seems to be "nonconsensual deepfakes", by the way. And, regrettably, what's really meant in the majority of cases is "nonconsensual deepfake pornography". This, in turn, most prominently refers to pornography depicting women – no big surprise to anyone but a naive, athletic nerd, I guess. What it means for Colossea Defense, though, seems to be: is it mostly going to be takedown requests for female athletes? Only time will tell.

Here's the point of this article at last: not only did I learn about the downsides of generative AI; I also started to discover services that counter deepfakes of the worst nature. Instead of compiling such a list over time and publishing it later, why not grow it on the fly? Here comes service number one; it's more than worth giving it another backlink and the little extra attention that comes with it:

Take It Down
This service is one step you can take to help remove online nude, partially nude, or sexually explicit photos and videos taken before you were 18.


P.S. For the article's header shot, I tried and failed to generate one of those blurred deepfake porn photos that generative AIs are said to return when you cross a line. That would have fit the article quite well. I didn't want to try too hard, though. And probably I shouldn't have written "blurred". Good that I tried. Good that I failed.