Crypto investors have been urged to keep their eyes peeled for "deepfake" crypto scams to come, with the digital-doppelganger technology continuing to advance, making it harder for viewers to separate fact from fiction.
David Schwed, the chief operating officer of blockchain security firm Halborn, told Cointelegraph that the crypto industry is more "susceptible" to deepfakes than ever because "time is of the essence in making decisions," which leaves less time to verify the veracity of a video.
Deepfakes use deep learning artificial intelligence (AI) to create highly realistic digital content by manipulating and altering original media, such as swapping faces in videos, photos, and audio, according to OpenZeppelin technical writer Vlad Estoup.
Estoup noted that crypto scammers often use deepfake technology to create fake videos of well-known personalities to execute scams.
An example of such a scam was a deepfake video of FTX's former CEO in November, where scammers used old interview footage of Sam Bankman-Fried and a voice emulator to direct users to a malicious website promising to "double your cryptocurrency."
Over the weekend, a verified account posing as FTX founder SBF posted dozens of copies of this deepfake video offering FTX users “compensation for the loss” in a phishing scam designed to drain their crypto wallets pic.twitter.com/3KoAPRJsya
— Jason Koebler (@jason_koebler) November 21, 2022
Schwed said the volatile nature of crypto causes people to panic and take a "better safe than sorry" approach, which can lead to them getting suckered into deepfake scams. He noted:
"If a video of CZ is released claiming withdrawals will be halted within the hour, are you going to immediately withdraw your funds, or spend hours trying to figure out if the message is real?"
However, Estoup believes that while deepfake technology is advancing at a rapid rate, it's not yet "indistinguishable from reality."
How to spot a deepfake: Watch the eyes
Schwed suggests one quick way to spot a deepfake is to watch the subject's eyes: if the blinking looks unnatural, there's a good chance the video is a deepfake.
This is because deepfakes are generated from image files sourced on the internet, in which the subject usually has their eyes open, Schwed explained. As a result, a deepfake has to simulate the blinking of the subject's eyes.
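The blink heuristic Schwed describes can be made concrete. One well-known approach from the computer-vision literature is the eye aspect ratio (EAR): the eye narrows to near zero height during a blink, so a video segment whose EAR never dips is suspicious. The sketch below is illustrative only and assumes the six eye landmark coordinates per frame are already available (in practice they would come from a facial-landmark model such as dlib or MediaPipe); the function names and threshold are this example's own choices, not from the article.

```python
import math

def eye_aspect_ratio(landmarks):
    """Compute the eye aspect ratio from six (x, y) eye landmarks p1..p6.

    EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|)
    An open eye yields a ratio of roughly 0.25-0.35; a closed eye
    drops close to 0.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = landmarks
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2):
    """Count blinks as distinct dips of the EAR below the threshold."""
    blinks, below = 0, False
    for ear in ear_series:
        if ear < threshold and not below:
            blinks += 1
            below = True
        elif ear >= threshold:
            below = False
    return blinks
```

Under this heuristic, a clip of ordinary conversational length in which `count_blinks` returns zero, or in which blinks occur at an unnaturally regular rhythm, would be a red flag worth a closer look.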
Hey @elonmusk & @TuckerCarlson have you seen, what I assume is #deepfake paid ad featuring both of you? @YouTube how is this allowed? This is getting out of hand, its not #FreeSpeech it's straight #fraud: Musk Reveals Why He Financial Supports To Canadians https://t.co/IgoTbbl4fL pic.twitter.com/PRMfiyG3Pe
— Matt Dupuis (@MatthewDupuis) January 4, 2023
Schwed said the best identifier, of course, is to ask questions only the real individual can answer, such as "what restaurant did we meet at for lunch last week?"
Estoup said there is also AI software available that can detect deepfakes and suggests one should look out for big technological improvements in this area.
He also gave some age-old advice: "If it's too good to be true, it probably is."
Related: "Yikes!" Elon Musk warns users against latest deepfake crypto scam
Last year, Binance's chief communications officer, Patrick Hillman, revealed in an August blog post that a sophisticated scam was perpetrated using a deepfake of him.
Hillman noted that the team used previous news interviews and TV appearances over the years to create the deepfake and "fool several highly intelligent crypto members."
He only became aware of this when he started to receive online messages thanking him for his time talking to project teams about potentially listing their assets on Binance.com.
Earlier this week, blockchain security firm SlowMist noted there were 303 blockchain security incidents in 2022, with 31.6% of them caused by phishing, rug pulls and other scams.