While deep fakes may seem like a niche problem reserved for celebrities, politicians or other high-profile individuals, their rapid growth may bring the issue into the mainstream. The term “deep fake” refers to AI-generated content, often pornographic, that features a specific person. For instance, a deep fake could be a pornographic video that appears to include an individual who was never actually there, or a politician saying something they never said.
It's a trend on a meteoric rise. In fact, a recent report shows that the number of deep fake videos created and available online has doubled every six months since the end of 2018. The proliferation of this harmful content undoubtedly worries those in the public eye, along with media companies and consumers, who may struggle to discern what is real.
But the threat doesn't stop there. Robert Ellenhorn, Payment Risk Specialist at EverC, noted that this nefarious activity also poses a significant risk to payment providers. EverC's technology helps merchant acquirers and payment aggregators illuminate the risks in their merchant portfolios.
“While deep fakes initially centered on the shadier parts of the internet which tend to transact in cryptocurrency and other high-risk payment methods, they're now becoming more mainstream,” he said. “This increases the chances that the people and businesses behind them will begin using credit cards for the transactions and seek to utilize the infrastructure of mainstream payments providers.”
By understanding deep fakes and leveraging technology to stay ahead of the risk, payment providers can get in front of the issue and protect their business.
Diving into deep fakes
Deep fakes are akin to photoshopped images, but built with new technology for a whole new era. The fake video or audio relies on AI or deep learning (the “deep” part of deep fake) to create and synthesize the content. Instead of a video editor manually altering each frame, the technology makes the changes across every frame automatically.
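A minimal conceptual sketch, in Python, of why that automation matters: one trained model is applied to every frame in a loop, with no human touching individual frames. This is not any specific tool's code; the apply_face_swap function is a hypothetical stand-in for a generative model.

```python
import cv2  # OpenCV, used here only to read and write video frames


def apply_face_swap(frame):
    """Hypothetical placeholder for a trained deep learning model that
    alters a face in a single frame. A real system would run a
    generative network here; this sketch returns the frame unchanged."""
    return frame


def process_video(input_path: str, output_path: str) -> None:
    reader = cv2.VideoCapture(input_path)
    fps = reader.get(cv2.CAP_PROP_FPS)
    width = int(reader.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(reader.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(
        output_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height)
    )

    # The key point: every frame passes through the model automatically.
    while True:
        ok, frame = reader.read()
        if not ok:
            break
        writer.write(apply_face_swap(frame))

    reader.release()
    writer.release()
```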
Ellenhorn noted that this automation means content creation happens much faster, and it's also far less expensive. Accelerating and automating the creation of deep fakes opens up the production of fraudulent videos and audio to a far broader pool of potential criminals. It's worth noting that the technology itself isn't necessarily inappropriate or ill-intended; many may leverage similar AI and deep learning for legitimate purposes. “There are reasonable use cases, and it's not all bad,” Ellenhorn said. For instance, film production companies may use deep fake technology to edit movie scenes without requiring the actors to reshoot them.
That regulations are still catching up to deep fake technology adds to the issue's complexity. There are no federal laws prohibiting the creation of fake videos. But legislation could be on the way. Texas and California have enacted laws that prohibit deep fakes created to influence elections. And Virginia has banned the creation of deep fake pornography.
What deep fakes mean for payments organizations
The business of deep fakes presents multiple risks to acquiring banks and payment providers. Ellenhorn noted that under existing laws, processing payments for deep fake transactions isn't illegal. However, that may change as state legislatures and the federal government become more aware of the issue.
In the meantime, unknowingly transacting with the darker side of the internet can still pose significant reputational risk. Financial institutions that find themselves with a deep fake content producer as a customer could see their brand damaged (i.e., become a top headline) and lose customers as a result. It could also lead to additional unsavory business. Ellenhorn noted that “the chances are high that the people building a business off deep fakes may also be involved in other illegal activity.” That could include money laundering to hide deep fake transactions, blackmail of deep fake subjects and more.
So what can the payments industry do to prepare? It starts with recognizing that deep fakes are becoming a big business. “It's something that you want to be paying attention to and getting in front of,” Ellenhorn said. Merchant acquirers and payment processors can also leverage leading solutions such as EverC to help them better monitor the risk in their merchant portfolios.
EverC uses AI and machine learning models to crawl millions of domains, giving the payments industry greater visibility into and understanding of its customers. The solution uncovers hidden risks and helps identify potential issues long before they become significant problems. “Part of the Know Your Customer [KYC] and due diligence processes is essentially understanding and verifying what your customers are really doing,” Ellenhorn said.
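To make the idea of domain-level monitoring concrete, here is a deliberately simplified, hypothetical sketch of how a merchant's web content might be scanned for risk signals. It is not EverC's implementation; a production system would rely on machine learning classifiers, image analysis and far broader signals than the assumed keyword list below.

```python
import requests

# Assumed example risk signals; a real system would use trained models, not keywords.
RISK_TERMS = {"deepfake", "deep fake", "face swap"}


def flag_risky_domains(domains):
    """Fetch each merchant domain and flag pages whose text contains risk terms."""
    flagged = []
    for domain in domains:
        try:
            response = requests.get(f"https://{domain}", timeout=10)
            text = response.text.lower()
        except requests.RequestException:
            continue  # unreachable sites would be handled separately in practice
        if any(term in text for term in RISK_TERMS):
            flagged.append(domain)
    return flagged


if __name__ == "__main__":
    # Placeholder merchant domains standing in for a real portfolio.
    print(flag_risky_domains(["example.com", "example.org"]))
```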
EverC can help reveal just that, giving payment industry players a clearer understanding of the risks present in their merchant portfolios. And while the risks continue to evolve, the right technology and compliance efforts can help payments companies stay ahead of the curve and proactively reduce risk.