Tuesday, September 5, 2023

AI Is the Newest Fad: It Can Easily Fool Anyone, and Right Now People Don't Seem to Care Much

AI can fool the best of fools, right?
(Donald T. loves Doc Fauci)

Election 2024 may be on track to be the worst in our political history with AI booming, as this story from ROLL CALL shows. It is a very informative and very concerning article with this headline:

“AI deepfakes in campaigns may be detectable, but will it matter?”

Picture this: at some point in the months leading up to the 2024 election, an audio tape leaks that confirms voters’ worst fears about President Joe Biden. It may be a bit grainy and muffled, as if it were recorded from a phone in someone’s pocket.

But it sounds like a confused 80-year-old, perhaps seeming to forget that he’s even president, before turning murderously angry – that’s Joe Biden’s voice, people will say.

It may arrive in journalists’ inboxes from an anonymous whistleblower, or just go viral on social media.

Or maybe the uproar will be over audio of Trump saying something that his supporters find disqualifying.

Whether such a clip is real or the work of new, startlingly realistic generative AI models, the affected politician will call it a fake and evidence of the other side’s willingness to lie, cheat, and steal their way to the White House.

While generative AI experts say they will most likely be able to detect such fakes, it may be impossible to prove a recording is real. And it’s another question, and a doubtful one at that, whether such evidence of an audio clip’s provenance will matter to partisan voters so ready to reject any data point that doesn’t conform to their worldviews.

“Deepfake” audio (authentic-sounding but false recordings built from short snippets of a subject talking) has become so realistic that it can fool even your relatives, close friends, or an online search, presenting painfully obvious potential for underhanded, dirty political tactics and stunts.

AI developers warn that the technology’s rapid development and widespread deployment risk ushering in an epistemological dystopia that would undermine the foundations of representative democracy.

·  Hany Farid, a generative AI expert at the University of California, Berkeley, said: “Campaigns are high stakes. We know that we have state-sponsored actors interfering, we know the campaigns are going to play dirty tricks. We know the trolls are going to do it. We know the supporters are going to do it. We know the PACs are going to do it.”

·  Sam Altman, OpenAI’s CEO, called AI’s capability to generate disinformation personalized to its targets, one by one, one of his gravest concerns in his testimony before a Senate Judiciary subcommittee in May.

·  A UN adviser recently told Fox News that a “deepfake” 2024 October surprise was his deepest worry.

So, how can we spot a “deepfake” or phony AI? This NPR site helps with some hints. Check it out.

My 2 Cents: I have said it before and will say it again: yes, some AI has good uses, but keep in mind that the first word in AI is “artificial,” and artificial means NOT REAL.

Imagine the chaos and possible total destruction of our election as valid ballots get tossed out for no good reason except that, for example, many appear to come from the same or various places with the same voter names and city locations, e.g., John Jones, Jim Smith, Tom Brown, Jack Jones, Bill Smith, Sam Brown, et al.

The result will be turmoil run amok, and so would go our faith in the entire system, all by careful, slick design and the click of an AI tool by someone’s finger.

The result: bye-bye American democracy, along with any hope of saving an electoral system of free, fair, safe, and secure voting instead of utter chaos.

Our country that we all say we love and cherish would disappear with a simple AI computer click or two. 

Yes, it’s damn scary, and that is not hyperbolic, since hundreds, if not thousands, of AI experts are voicing that same fear.

Or imagine something far worse?

Stay tuned, be prepared, and stay alert. We have been forewarned. This is not a joke, not one bit.

Thanks for stopping.
