Have you ever wondered how you would look if you were a movie star? For example, how you would look if you simply replaced Leonardo DiCaprio in Titanic?

Deepfake, new scary technology which can make you say things you never did. Should you be afraid?

Or that you could become the President of the United States for a few minutes and give an online speech in his place?


That would probably be really cool, right?

What if I told you that such a technology already exists?

And that it actually takes only a few seconds to make such a video.

Or that such a technology can be used to commit cybercrime, influence public opinion, or even affect elections?

Simply said, welcome to the era of DEEPFAKE technology.

What is deepfake?

According to Wikipedia, deepfake technology is a technique for human image synthesis based on artificial intelligence.

To put it in simpler language, deepfake technology uses machine learning to create fake content. That content can be of any kind: video, voice, or even text.
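To make the machine-learning part concrete: face-swapping deepfakes are commonly built from a pair of autoencoders that share one encoder. The network learns a compact representation of any face, and each person gets their own decoder, so feeding person A's encoded face into person B's decoder renders B's face with A's pose and expression. Here is a minimal sketch of that data flow, with tiny random-initialized linear layers standing in for real trained networks (all names, shapes, and weights are illustrative, not a working face swapper):

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT = 32          # size of the shared face representation
PIXELS = 64 * 64     # a flattened 64x64 grayscale face (illustrative)

# One shared encoder, one decoder per identity. In a real deepfake these
# weights are learned by reconstructing thousands of frames of each person.
encoder   = rng.normal(scale=0.01, size=(PIXELS, LATENT))
decoder_a = rng.normal(scale=0.01, size=(LATENT, PIXELS))
decoder_b = rng.normal(scale=0.01, size=(LATENT, PIXELS))

def swap_face(face_of_a: np.ndarray) -> np.ndarray:
    """Encode person A's face, then decode it with person B's decoder.

    With trained weights this would render B's face in A's pose and
    expression; with random weights it only demonstrates the data flow.
    """
    latent = face_of_a @ encoder        # shared compact representation
    return latent @ decoder_b           # reconstructed as person B

frame = rng.random(PIXELS)              # stand-in for one video frame
swapped = swap_face(frame)
print(swapped.shape)                    # same size as the input frame
```

The key trick is the shared encoder: because both identities pass through the same bottleneck, the representation captures pose and expression rather than identity, and the decoder supplies the identity.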

The history of deepfake technology.

Somewhere around 1997, the first faked videos appeared. They were made in a very simple way, and you could easily tell they were fake: in the original footage of people speaking, the original voice was replaced with audio played from a different source.

In 2017 the word DEEPFAKE was coined by an amateur Reddit user named “deepfakes“.

This community created many fake videos which were shared on Reddit and the internet.

The first deepfake videos shared by the “deepfakes” community appeared in pornography. Faces of porn actors were simply replaced with the faces of real celebrities like Scarlett Johansson, Katy Perry, Emma Watson, or Taylor Swift.

Those videos were quickly identified on platforms like Twitter, Reddit, and PornHub, and banned shortly after being published.

In 2018 the “deepfakes” community was banned for violating the content policy of Reddit.


Since then, many deepfake videos have been created and published daily. The technology is getting more advanced, and it is used in many different ways.

How can deepfake technology be used?

As I said, deepfake technology is now so advanced that it is very hard to immediately recognize what is real and what is fake.

Just take a look at this video from BuzzFeedVideo, which is in my opinion “unbelievable”.

Do you see why I think it is unbelievable? Just a few years ago, nothing like this would have been possible. And today, it takes only a few minutes to create.

And believe it or not, such a video with fake content can easily go viral on the internet and social networks, reaching millions of people in just a few days.

Last but not least, this kind of content can easily influence public opinion or even elections.

The first cybercrime using deepfake technology.

We have talked about fake video. But what about fake text messages, or even a fake voice?

Or a fake call that can cost your company $243,000?

Earlier this year, The Wall Street Journal reported an unusual fraud. According to the report, fraudsters tricked a CEO with a fake voice call imitating his boss. In the call, his German boss (the CEO of the parent company) demanded that he make a money transfer to a Hungarian supplier.


As he suspected nothing unusual and recognized the voice, he confirmed the transfer. Once the transfer was made, the money was moved on to Mexico and other locations. And of course, it disappeared.

This was one of the first cybercrimes in which artificial intelligence was used in such a way.

How to recognize and fight deepfake?

In the past few months, there have been many cases of famous people around the world fighting against deepfakes.

The website NotJordanPeterson.com created many videos using AI (artificial intelligence) with the face and voice of the real Dr. Jordan Peterson. The website was taken down after he said he would take legal action against it.

The only way to fight AI is to use AI against it.

During PennApps XX (the student-run college hackathon), the app that took first place was called DeFake.

DeFake is a Google Chrome extension which uses machine learning to recognize fake content. Thanks to its knowledge of the real public figure, a video can be checked in real time and deepfakes can be detected immediately.

According to the DeFake developers, you can even send them a link to the video in question. Later on, you will receive a result confirming or denying the video's authenticity.
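DeFake's internals are not public, but as a rough illustration of how such detectors generally work: a trained classifier scores each frame of the video for signs of manipulation, and the per-frame scores are aggregated into a single verdict. The sketch below uses a trivial placeholder score (the `artifact_score` field is entirely hypothetical) where a real system would run a neural network looking for blending artifacts, inconsistent lighting, or unnatural blinking:

```python
from statistics import mean

def classify_frame(frame: dict) -> float:
    """Stand-in for a trained classifier: returns the probability that a
    frame is fake. A real detector would analyze pixels; here we just read
    a hand-written placeholder score."""
    return frame.get("artifact_score", 0.0)  # hypothetical feature

def is_deepfake(frames: list, threshold: float = 0.5) -> bool:
    """Average the per-frame fakeness scores and compare to a threshold."""
    return mean(classify_frame(f) for f in frames) > threshold

# Toy usage: three "frames" with hand-written artifact scores.
video = [{"artifact_score": s} for s in (0.9, 0.8, 0.7)]
print(is_deepfake(video))  # True

clean = [{"artifact_score": s} for s in (0.1, 0.05, 0.1)]
print(is_deepfake(clean))  # False
```

Aggregating over many frames is what makes real-time checking feasible: a single frame can fool a classifier, but a consistent pattern of artifacts across a whole clip is much harder to hide.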

Facebook is taking up the fight against deepfake.


If you are worried about fake content, you are not alone – Facebook is too.

It has partnered with Microsoft and others to create THE DEEPFAKE DETECTION CHALLENGE. The challenge starts in October 2019 and has only one goal: to invite and incentivize participants to compete in detecting and fighting fake content.

If you are interested in participating in the challenge, check out the FAQ site for more details.

Conclusion.

So as you can see, we live in a very dangerous era where everything is possible.

The fight against deepfakes might not be easy, but we all have to stand up and try our best to eliminate them from our daily lives.

And as you can see, there are tools and ways that can help you with that.

Together we are stronger.


Related article: Discover TikTok – is it the future of social media?


Featured image: Jakob Owens on Unsplash

Facebook image: Alex Haney on Unsplash
