“Pig butchering” is an unsavory term for a very specific kind of online scam, in which the scammer lures a wealthy target with the promise of romance and then takes them for all they’re worth. It’s hardly a new idea — they used to call this kind of thing “fleecing” — but new software tools are making it a lot easier and more effective.
A ring of scammers in Hong Kong managed to use live “deepfake” video to steal millions from their victims.
Police in Hong Kong arrested 27 people who operated out of an office and conspired to rip off wealthy people by pretending to be attractive romance prospects and getting them to invest big bucks in phony cryptocurrency schemes.
The methodology is familiar: Set yourself up as an attractive stranger with an alluring profile photo, slowly build up a rapport with the victim through text messages, and casually hint at the prospect of vast profits with a new crypto platform. Once the victim puts real money into the cryptocurrency scam, you vanish.
The new twist for this particular ring involved the use of fake real-time video to set their victims’ minds at ease. When a mark got suspicious and asked the scammer for a video call, they obliged — and used deepfake software tools to make the live video participant look like the attractive woman in the original AI-generated profile photo.
Satisfied that their romantic prospects appeared to be legitimate, the victims were taken for the equivalent of $46 million USD. Though the ring was based in Hong Kong, its victims were located in mainland China, Taiwan, India, and Singapore, according to Ars Technica.
Pig butchering has become a popular method to steal money in the last few years, but it’s a high-risk, high-reward enterprise. Organized crime rings in southeast Asia have set up these ventures on an almost industrial scale, targeting specific wealthy individuals and stringing them along with online romances that can take weeks or months before a victim is convinced to “invest” in a crypto scam via phony apps and fabricated profits. There are even reports that crime rings are luring people across national borders with job listings, only to effectively enslave them as unwilling participants in the lowest level of the scam.
AI-generated fake profile pics are one of the easiest elements of the pig butchering process, because it’s not hard to find a photo of an attractive person online and pretend to be them through text or even voice calls. But faking that person on video is obviously a lot harder.
The deepfake process (wherein one person’s face and other features are swapped with another’s frame-by-frame) isn’t new, but previously it had been restricted to pre-recorded videos. Now that deepfake video can be generated in real time, responding live to a conversation, victims can no longer trust live video calls to vet their remote partners.
All it takes is a few easy-to-find software tools and a beefy PC to make the video smooth and convincing enough. A single AI-generated photo can then be used to mask the face of the scammer, and any irregularities in the video can be chalked up to streaming hiccups.
Pig butchering expanded at a rapid pace during the COVID-19 pandemic, when potential victims were stuck at home and starved for interaction. That’s when the scams started targeting people all over the world, and they have now stolen as much as $75 billion, according to a University of Texas study. Though the technique is rampant in Asia, it’s expanding internationally: as of a year ago, a poll found that 12 percent of Americans active on dating apps had encountered a pig butchering attempt. This suggests that scammers now have the tools to cast a far wider net for potential marks and steal from less wealthy people.
With techniques becoming more sophisticated and more effective, it’s more important than ever to keep your guard up and beware of attractive strangers offering cryptocurrency investments.