
Beware of Deepfake technology: A more sophisticated strategy used by cyber criminals

starconnect
Cyber criminals take to deepfake technology to defraud victims

Rotimi Onadipe | Friday, August 27, 2021

LAGOS, Nigeria – A deepfake is a video or audio recording that replaces someone’s face or voice with another person’s in a way that appears real. Deepfake technology can be used to make people believe something is genuine when it is not, and it is an increasingly sophisticated strategy that cybercriminals are using to defraud unsuspecting victims in today’s digital age.

Deepfake technology uses powerful techniques and artificial intelligence to manipulate visual and audio content in order to deceive unsuspecting victims. With deepfake technology, it is very difficult to tell a genuine video, audio clip or picture from a fake one.

Fraudsters can use this strategy in different forms. For example, they can place an unsuspecting victim’s face in a video to impersonate him and use his identity for criminal activities. They may also fake the victim’s voice on the phone to call his friends, loved ones or any company and defraud them, and the recipients will not know that the call is from a fraudster.

On several occasions, deepfake technology has also been used to make celebrities, politicians and other highly placed people appear to say or do things they never did, after which the fraudsters demanded ransom from them.

As deepfake technology advances, cybercriminals are stealing more identities and other personal information from unsuspecting victims on a daily basis, because many internet users are not aware of the dangers of deepfake technology or how to spot it.

Audio deepfakes have been used repeatedly in social engineering scams, and many internet users have fallen victim to this fraud scheme. A typical example occurred in the United Kingdom in 2019, when the CEO of a UK-based energy firm was scammed over the phone: a scammer used audio deepfake technology to impersonate the voice of the CEO of the firm’s parent company and instructed him to transfer over €200,000 into a bank account.

The damage caused by deepfake technology is not limited to financial losses; many people’s reputations have also been harmed by fake news, online misinformation and fake videos created with deepfake technology.

As technology advances, the use of deepfake technology is growing and becoming more sophisticated. However, awareness and vigilance are our greatest weapons and best defence against this threat.

How to detect Deepfakes:

1. Strange blinking or no blinking at all.

2. Movements that are not natural.

3. Inconsistent skin tone.

4. Screenshot the video and look at it carefully.

5. Look at the size and colour of the lips carefully.

6. Take a closer look at the eyes and eyebrows.

7. Watch out for lighting that doesn’t look real.

8. As you listen to the speech of the person in the video, take a closer look at the lips.

9. Check if there are shadows where they are not expected to be.

10. Investigate the source of the image or video.
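As a rough illustration of the first check above (strange blinking or no blinking at all), the idea can be sketched in a few lines of Python. The sketch assumes a hypothetical upstream face-analysis step has already labelled each video frame as eyes open or eyes closed; the blink-rate thresholds are illustrative assumptions, not calibrated values.

```python
# Sketch of the blink-pattern check (item 1 above).
# Assumption: a hypothetical face-analysis tool has already produced
# one True (eyes open) / False (eyes closed) flag per video frame.

def blink_rate_suspicious(eye_open_flags, fps=30, lo=0.1, hi=0.8):
    """Flag a clip whose blinks per second fall outside a plausible range.

    People typically blink every few seconds; some deepfakes blink far
    less often, or not at all. The lo/hi thresholds here are assumed
    values for illustration only.
    """
    # Count open-to-closed transitions (the start of each blink).
    blinks = sum(
        1
        for prev, cur in zip(eye_open_flags, eye_open_flags[1:])
        if prev and not cur
    )
    duration_s = len(eye_open_flags) / fps
    rate = blinks / duration_s if duration_s else 0.0
    return rate < lo or rate > hi

# Ten seconds of footage at 30 fps with no blinks at all is flagged.
no_blinks = [True] * 300
print(blink_rate_suspicious(no_blinks))  # True
```

This is only a heuristic: a real detector would combine several of the signals listed above, since any single cue can be faked or can occur naturally.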

How to protect yourself and your family from the danger of Deepfakes:

1. Educate yourself and your family on how to detect deepfakes.

2. Set up countermeasures against deepfakes.

3. Be cautious about what you share online and offline.

4. Keep sensitive documents in a very safe place.

5. Conduct regular searches on the internet and report any suspicious activity.

6. As far as internet safety is concerned, educate and monitor your children closely, because they are the most vulnerable to deepfake identity theft.

Rotimi Onadipe,

Tel: +234-8169121819

Email:  [email protected]
