Speaking with Paola Suro for regional outlet 11 Alive, Debbie Shelton Moore detailed the six-minute phone call, which included a demand for a $50,000 ransom payment. Thankfully, Moore's 22-year-old daughter, Lauren, was quickly determined not to be in danger despite the call's dire-sounding claims.
"The man had said, 'Your daughter’s been kidnapped and we want $50,000,'" Moore recalled. "Then they had her crying, you know, 'Mom Mom!' in the background and it was her voice."
Though initially convinced her daughter was in peril because the voice on the call "sounded so much like her," Moore was relieved to find that she was indeed safe. Warning others of just how believable the ultimately fake call was, Moore noted she is "very well aware" of scams in general. Hearing her daughter's voice, however, rendered her incapable of thinking "clearly" about the situation at hand.
Complex has reached out to the Cherokee County Sheriff’s Office and the Kennesaw Police Department, both of which were involved in the response to the scam, for comment. This story may be updated.
AI, generally speaking, has remained a source of intense debate in recent months. In fact, the need for regulation of AI is a key facet of the WGA and SAG-AFTRA strikes, both of which remained ongoing at the time of this writing.
Sam Altman, the CEO of ChatGPT developer OpenAI, warned during a Senate Judiciary Committee hearing in May that he was "nervous" about the risks moving forward.
"I think if this technology goes wrong, it can go quite wrong," Altman said at the time. "And we want to be vocal about that. We want to work with the government to prevent that from happening."
Terminator director James Cameron, meanwhile, has echoed these concerns as recently as this week.
Unhelpfully, there have also been instances of the term "AI" being misleadingly used as a catchall, even for projects that do not involve the generative technology the term most often brings to mind.