Summary
AI is increasingly exploited by criminals for fraud, cyberattacks, and child abuse, warns Alex Murray, the UK’s national police lead for AI.
Deepfake scams, such as impersonating executives for financial heists, and generative AI used to create child abuse images or “nudify” photos for sextortion are rising concerns.
Terrorists may exploit AI for propaganda and radicalization via chatbots.
Murray urged urgent action as AI becomes more accessible, realistic, and widely used, predicting significant crime growth by 2029.
If only we could have seen this coming somehow, and prepared for it.
I’m going to have to get my shocked face out of storage.
Want me to pick yours up whilst I’m there?
I actually see a positive in AI.
Ever got nudes leaked? Well, now you can just claim they are deepfakes. It’s such a relief that you can just dismiss any photos or videos of you doing embarrassing things as deepfakes.
I’m gonna be really honest, I think a big part of what feels violating about people seeing your nudes in the first place is being sexualized without your consent, and losing agency over who you allow yourself to be sexualized by.
That’s not any different with deepfakes. I don’t think that’d actually make that person feel much better. Like maybe they can save face on the fact that they took nudes depicting themselves in whatever way, but the thing that I think does the most emotional damage isn’t actually changed or affected by saying “it’s not actually me, those are deepfakes” :(
Your partner and family members may know particular hidden features of your body that reveal whether it is a deepfake or not, so it is still somewhat damaging in private even if it isn’t in public.
Tried that, but unfortunately, I have a distinctive tattoo in a distinctive place.
This is a low priority given the serious issues raised here, but I feel like a portmanteau of sex and extortion was a bad call