CSG Law Alert: AI Can Serve Up More Tricks Than Treats
On Halloween, it is fun to dress up and pretend to be someone else; online, however, when people “masquerade” as someone else, it can lead to scams, fraud, identity theft and more.
With artificial intelligence (“AI”) tools readily available, it has become much easier for bad actors to dupe people into sharing credentials, granting access and wiring funds. And given the sophistication of the technology, simulated or “cloned” images and voices can be created easily and convincingly.
While a truly high-quality voice clone may require several minutes of recorded audio, the technology needs only three seconds of your recorded voice to produce a passable clone. And a bad actor who lacks the time or skill can simply hire a third party to create the clone. While many are familiar with RaaS (ransomware as a service), an emerging (and insidious) tool of bad actors is VCaaS, or voice cloning as a service.
If you regularly speak at events, and your presentation is recorded and posted online, or if you speak on a podcast, you have put out into the world the footage needed to create your cloned voice.
Deepfakes are also very easy to create these days, and the “tells” are no longer readily apparent. This is why it is crucial not to trust everything you see and to learn where to look for the fingerprints of digital manipulation. In some cases, the fake can be detected only by watching the “speaker’s” facial expression closely during moments when they are not speaking; in others, fingers and toes may look a bit “off,” as generative AI still struggles to render them accurately.
As the famous magician Harry Houdini said, “What the eyes see and the ears hear, the mind believes.” As trusting as many people tend to be, it is imperative to verify information in this age of rampant scams.
- If you receive an urgent call from a loved one, hear their voice, and are asked to send money or otherwise compromise yourself, ask for a detail only that person would know, or say you will call them back on their cellphone in two minutes because you are in a spot with bad reception. The odds are good the caller will hang up, knowing they have been detected.
- If you see an image or recording of an event or action that provokes an immediate emotional reaction, take a deep breath and then examine the source carefully. Step back and consider whether it could be a fake before accepting the conclusion the powerful image or recording is pushing you toward.
- Fakes can often be detected from the embedded information, or metadata, about the device that created the image – including date, time and more. Forensic experts can frequently “decode” this information and debunk deepfakes.
In short, do not be too quick to react. Stop, think, verify the information and proceed with caution. And if you do fall victim to a scam, you are not to blame! Report it as soon as possible – whether to law enforcement, a trusted advisor, your bank or others as appropriate. If you need further guidance, please contact our team here at CSG Law.