FBI warns of blackmailers using deepfaked nudes to bully and extort victims
Are you careful about what images of yourself you post online? If not, you may want to read this. The FBI has recently issued a warning about an alarming rise in extortion schemes involving fake nudes created with the help of AI editing tools.

According to reports, malicious actors find innocent images of their intended victims on social media and use AI to create realistic, sexually explicit content. The photos are then sent directly to the victims for harassment, with the aim of extorting real nude images or payments. Unfortunately, once the fake images are created, it can be difficult to stop them from circulating or to have them removed from the internet.

In light of this new threat, the FBI recommends exercising caution when sharing images online. Even so, only a handful of images is enough for someone to create a deepfake, making it difficult to fully protect yourself against such extortion schemes.

Nude deepfakes emerged online in 2017, when users began creating sexually explicit content of female celebrities by repurposing AI research tools. Despite some attempts to curb the spread of this content, tools and sites for creating deepfake nudes remain easily accessible.

It’s worth noting that such extortion schemes may violate federal criminal statutes in the US, though few laws worldwide criminalize the creation of non-consensual fake images. Virginia, for example, has outlawed deepfakes as a form of revenge porn, and the UK is considering making the sharing of such images illegal in its upcoming Online Safety Bill.

In short, vigilance matters more than ever. It’s up to us to keep ourselves safe by being mindful of the images we share online and by reporting any instances of harassment or extortion to the relevant authorities.