The Agenci 03455 760 999

Think Fake News is Bad? Try Deep Fakes


Not everything you read (or see) can be believed.


I received an email yesterday telling me I had won the Lottery! I was VERY excited for about three seconds, until I suddenly realised I don’t play the National Lottery. I’m pretty sure, however, that if I did, the National Lottery would want to call me, or even come and see me, to tell me I had just won £25,000!

Now of course you know where I’m headed, and you may already be thinking, “Ah, this is just a cautionary tale about phishing emails. How dull!”

But hang on… it’s not JUST about phishing emails. It’s about the power of persuasion, and about how our modern world has evolved to the point where we can’t believe everything (anything?) we see, read or even hear.

 

Fakes and Deep Fakes

Technology is now available in Hollywood to recreate actors who are no longer with us, such as Oliver Reed in ‘Gladiator’, Paul Walker in the ‘Fast & Furious’ films and Peter Cushing in ‘Rogue One: A Star Wars Story’. The technology is truly outstanding. But there is a darker side to its use that we should be aware of.

In 2017, a Reddit user calling themselves ‘deepfakes’ began posting videos in which they superimposed celebrities’ faces onto the bodies of women in pornographic movies. Stars like Emma Watson, Gal Gadot and Selena Gomez have all had their faces placed onto porn stars’ bodies in movies (not just images). Thankfully for them, they had the money to pay expensive lawyers to have the images and movies removed. But what about you and me? Would we have the same ability? Many are calling deepfakes another form of “revenge porn”, where sexualised images are posted deliberately to cause harm or distress.

The underlying technique was first created by a graduate student, Ian Goodfellow, in 2014. He called the deep-learning technique a “Generative Adversarial Network”, or GAN, and it is a way to algorithmically generate new data out of existing data sets. For instance, a GAN can look at thousands of photos of Barack Obama and then produce a new photo that approximates those photos without being an exact copy of any one of them, as if it had come up with an entirely new portrait of the former President, never actually taken. GANs can also be used to generate new audio from existing audio, or new text from existing text; it is a multi-use technology.
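To make the idea concrete, here is a deliberately tiny, hedged sketch of the adversarial training loop in Python with numpy. It is NOT how real deepfakes are built (those use deep convolutional networks trained on images); here the “real data” is just numbers drawn from a bell curve, the generator is a single linear layer that turns random noise into candidate numbers, and the discriminator is a single linear layer that scores how “real” each number looks. The two are trained against each other exactly as in Goodfellow’s GAN recipe: the discriminator learns to tell real from fake, while the generator learns to fool it.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_samples(n):
    # Stand-in for the real data set (e.g. "thousands of photos"):
    # numbers drawn from a normal distribution centred on 4.0
    return rng.normal(loc=4.0, scale=1.0, size=(n, 1))

# Generator: noise -> candidate sample (one linear layer, for brevity)
g_w, g_b = np.ones((1, 1)), np.zeros(1)
# Discriminator: sample -> probability it is real (linear layer + sigmoid)
d_w, d_b = np.ones((1, 1)), np.zeros(1)

lr, batch = 0.05, 32
for _ in range(2000):
    noise = rng.normal(size=(batch, 1))
    fake = noise @ g_w + g_b
    real = real_samples(batch)

    # Discriminator step: push scores for real data toward 1, fakes toward 0
    err_real = sigmoid(real @ d_w + d_b) - 1.0
    err_fake = sigmoid(fake @ d_w + d_b) - 0.0
    d_w -= lr * (real.T @ err_real + fake.T @ err_fake) / batch
    d_b -= lr * np.mean(err_real + err_fake)

    # Generator step: nudge its weights so its fakes score closer to "real"
    noise = rng.normal(size=(batch, 1))
    fake = noise @ g_w + g_b
    err_g = (sigmoid(fake @ d_w + d_b) - 1.0) * d_w[0, 0]
    g_w -= lr * (noise.T @ err_g) / batch
    g_b -= lr * np.mean(err_g)

# After training, the generator's outputs should resemble the real data
samples = rng.normal(size=(1000, 1)) @ g_w + g_b
print("generator output mean:", round(float(samples.mean()), 2),
      "(real data mean: 4.0)")
```

The same tug-of-war, scaled up to deep networks and millions of images, is what produces the fake faces and videos discussed below.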

Imagine the possibilities; The good. The bad. And the downright nasty!

If you’re interested in seeing this in action, take a look at the video link below of Barack Obama saying things that you really wouldn’t expect him to be saying (please be aware it contains some fruity language!).

https://www.youtube.com/watch?v=cQ54GDm1eL0&feature=youtu.be

 

These People Do Not Exist

Worryingly, the same technology is being used to create images of people who have NEVER actually existed at all! That’s right, your imaginary friend could now have a real face. If you’re interested, take a look at the faces in the title of this article, or go to https://thispersondoesnotexist.com/ to see just how good (scary?) this technology is.

 

What does all this mean?

If our senses can be tricked with fake videos and fake people, it’s little wonder we’re still falling for phishing emails. Yes, we can put technology in place to try to block these emails, images and videos, but we need to be more intelligent than that. I believe we need to upgrade our thinking as well as our hardware.

If we don’t question what we read (including fake news), and what we see and hear, then we’re going to fall prey to some of the tricks and scams that are out there.

As Barack Obama says in the video, we need to be more vigilant about what we trust on the internet. Wise words indeed. Even if he didn’t really say them at all.