Artificial intelligence is here, and it is already changing things in the crypto world. Coders use it to code, researchers use it to research and, unfortunately, scammers use it to scam. That’s the finding of a new report from blockchain analysis firm Elliptic on the emerging risks of artificial intelligence in facilitating the criminal use of cryptocurrency.
This is an excerpt from The Node newsletter, a daily digest of the most important crypto news from CoinDesk and beyond. You can subscribe to access the full newsletter here.
Note: The opinions expressed in this column are those of the author and do not necessarily reflect the views of CoinDesk, Inc. or its owners and affiliates.
“The rise of artificial intelligence has shown that there is great potential to spur innovation, especially in the crypto space,” the report reads. “However, as with any emerging technology, there remains a risk that threat actors may seek to misuse new developments for illegal purposes.”
While the risk currently remains small, the firm’s researchers have identified five “typologies” where AI is already being used in nefarious ways. These include creating and spreading deepfakes to make scams more convincing, launching AI-themed scam tokens to capitalize on hype, using large language models to design hacks, spreading disinformation, and building more convincing phishing websites and lures to facilitate identity theft.
Being aware of these new (or, frankly, old but now supercharged) scams means users can stay one step ahead. That means crypto users need to become more familiar with the most common types of crypto-related scams. CoinDesk has a good report on this front that covers all the bases, including social media scams, Ponzi schemes, rug pulls, and “romance scams” (now often referred to as “pig butchering”).
In a recent post about deepfakes, AI, and crypto, Pete Pachal, author of the excellent Media CoPilot Substack, wrote: “The reason there is no easy way to solve the problem is that there are really multiple problems, each with their own variables and solutions.”
According to Pachal, who recently spoke at the Consensus 2024 session titled “From Taylor Swift to the 2024 Election: Deepfakes vs. Truth,” deepfakes are becoming increasingly difficult to detect as artificial intelligence image generation improves. For example, earlier this month, a video circulated on social media of a fake Elon Musk promoting his fake trading platform Quantum AI, promising fake returns to users and apparently fooling several people.
Examples like this will likely become more common. Verification company Sumsub claims that almost 90% of deepfake scams detected in 2023 were related to cryptocurrencies. While it’s unclear how effective these scams are, the FBI’s online crime report found that crypto investment losses in the U.S. rose 53% to $3.9 billion last year.
See also: This is How Scammers Can Empty Your Crypto Wallet
However, it is worth noting that most cases of fraud in the crypto industry are only incidentally crypto-related, as crypto is a topic that attracts a lot of attention and is often confusing to people unfamiliar with the culture.
As CFTC Commissioner Summer Mersinger told CoinDesk: “I think it’s a little unfair because most of these cases are just run-of-the-mill scams: someone steals someone else’s money, or someone claims to buy crypto but doesn’t actually buy crypto. So we saw that play out no matter what the hottest topic was at the time.”
If there’s any consolation, it’s that AI-generated images, videos, and text are still relatively easy to spot if you know what to look for. But given how common it is for even high-profile individuals to be fooled by social engineering schemes or malicious scripts, cryptocurrency users in particular need to be careful.
MetaMask creator Taylor Monahan has some sage advice on this subject: always know when you’re a potential target, and verify that what you’re clicking on actually is what it claims to be.
Given the nature of the technology, crypto is already a low-trust environment. And trust may have to fall even further.