• Disinformation, meaning the dissemination of falsehood, could be rife in the 2024 election.
• Social media platforms are gutting their disinformation teams, meaning they’re leaving voters vulnerable.
• Generative AI deepfakes are a game-changer in terms of disinformation.
As of this week, the candidate who is leading the race to become the Republican nominee for President of the United States in 2024 is on his third indictment, and faces a federal investigation and 40 felony charges.
The federal investigation includes charges linked to the January 6th insurrection – the first in United States history to challenge the peaceful transition of power after an election – and involves counts of conspiracy to defraud the United States, conspiracy to obstruct an official proceeding, obstruction of and attempt to obstruct an official proceeding, and conspiracy against rights.
The 40 felony charges involve illegally taking classified documents out of the White House and showing them to people who were never in any sense cleared to see them.
You know his name – it’s on absolutely everything he touches.
He’s currently polling some 37 percentage points ahead of Governor DeSantis, his nearest rival.
As of August 5th, there’s just one percentage point separating Donald Trump and incumbent President Joe Biden.
This is not a political hack job on Donald Trump.
This is by way of showing that the 2024 Presidential election could easily go either way, and the power of incumbency appears to be doing President Biden no favors.
It’s also by way of demonstrating something that should be keeping Democrats awake at night. The more indictments and charges Donald Trump accumulates, the less it appears to matter to his rock-solid base – which is currently significantly larger than that of any other Republican candidate.
That means a long-term project to energize the Republican base against what Trump himself first described as “the fake news media” is paying Republicans rich dividends, making backers of the 45th President keen to find their “news” in alternative places, particularly online. To the Trump die-hards, nothing that plays negatively for him will shake their belief that he’s hard done by, maligned, and the man who deserves the office.
Joe Biden has absolutely nothing like that rock-solid base of true believers. In fact, you could make a compelling case that he won the election in 2020 on the basis of two factors. Firstly, he wasn’t Donald Trump. And secondly – much more crucially – he hadn’t just overseen a Covid response in which, according to a panel that investigated the event after the fact, some 40% of those who died of the disease did so entirely unnecessarily. Trump had – and some of the swing votes were cast essentially to punish him for those unnecessary losses.
Biden has nothing like the gift of a Trump incumbency and a deadly plague to carry him over the line in 2024.
Decision of the undecided.
That means the 2024 Presidential election won’t be decided by either die-hard Democrats or hardcore Trump Republicans. It will be decided… by the undecided. The swing voters. Ohioans.
And that is why the 2024 election will depend more than any previous contest on the availability of trustworthy electronic information.
As of 2021, after the last election and the January 6th insurrection, a survey by the respected Pew Research Center found that 71% of Americans got at least some of their news about the world directly from social media, with 53% saying they got their news that way “sometimes” or “often.”
53% of Americans – assuming a politically useful geographical distribution – will hand you the keys to the White House. Based on social media news.
That’s why two worrying trends are making analysts turn six-day-dead pale and set their hair on fire to get people to listen to them.
The first of those trends is the reduction in staffing, at this crucial time, of the misinformation and disinformation teams at some of the leading social media platforms – meaning a significant loosening of the governance of fact-based reporting and opinion pieces.
Everyone is by now aware that Elon Musk’s Twitter more or less obliterated its fact-checking and content moderation teams within months of his takeover of the platform. In fact, that has played a large part in prompting some of the remaining content moderation leadership within the company to quit, though some, as far as we know, have been replaced.
The point is, content moderation is overtly no longer something Twitter can be relied on to provide. Asked directly by Fox News host Tucker Carlson about the cuts to the content moderation structure within Twitter, Musk made his view of its unimportance clear.
What Musk calls “censorship and activism” is essentially the job of content moderation teams, and with the teams who do that job more than decimated, the likelihood of Twitter being a platform on which news of the election’s progress can be found unbiased by either party’s die-hards – and by the vast sums of money they’ll burn on winning over the undecided in the run-up to 2024 – is almost comically small.
It’s worth noting though that in the recent Twitter X-odus (We’re very sorry, it was there, we had to use it), many of those who flocked to the likes of first Mastodon and then Meta Threads will have been of a more socially liberal political persuasion.
Which brings us to Meta.
The meaning of disinformation.
Facebook, one of Meta’s prime platforms with a US membership of 239 million, has an unfortunate history of cynical use to sway political votes. In 2016, it was credited with helping Donald Trump take the White House in the first place and – through what is now referred to as the Cambridge Analytica scandal – with swinging the ultra-narrow Brexit vote in the UK, the repercussions of which continue to rock the country seven years on.
Now, it’s been revealed that in the run-up to the launch of its Twitter alternative, Meta Threads, Meta laid off a significant number of members of a global team specifically employed to tackle misinformation and disinformation around elections – an almost textbook crippling of the defences against “fake news” across Facebook, Instagram, and now Threads, just when, arguably, they’re most needed.
Meanwhile, a federal judge, US District Judge Terry Doughty of Louisiana, has dealt the Biden administration a potentially crippling blow as the nation gears up for the 2024 election.
Judge Doughty ordered a number of federal agencies – and more than a dozen top officials – not to communicate with social media companies about taking down “content containing protected free speech.”
That’s an important framing. Information may be knowingly or unknowingly false, it may be factually flawed, it may be economically illiterate or politically inept – technically, Americans still have a right to say it, read it, hear it, believe it, and take it to their hearts. The Constitution says so.
That means one side of the 2024 contest has had its hands tied behind its back in fighting the potential tide of disinformation that will be aimed at voters, meaning such disinformation is far more likely to achieve its goals.
For an administration absolutely depending on the ability of voters to receive good, solid, factual information and not to receive chronic or cynical disinformation – meaning blatantly, provably false assertions like the claim that the 2020 election was stolen, rather than mere differences on policy direction or character – that’s almost the worst news that could possibly be imagined.
Disinformation II – the meaning of reality.
But then there’s Thing #2.
Yes, all the arguments over disinformation so far are still only part of the first troubling technological development that’s pivotal to the 2024 election – meaning the best, or worst, depending on your point of view, is yet to come.
The best or worst revolves around the big new player in the tech world – generative AI.
Since October, 2022, we’ve seen generative AI explode into the world and change practically everything we thought we knew. One of the things we thought we knew was that humans were ultimately responsible for creative arts, be they speechwriting, news reporting, inventive artwork, or video creation. We thought we knew reality from, say, CGI.
As it happens, more often than not we were already wrong about that, but the age of generative AI is the age of the deepfake video – so convincing that neither humans nor any technological system yet devised can reliably tell the best of them from actual footage of a real event.
That’s why, for instance, early proof-of-concept deepfakes usually did just enough to convince people they were watching real footage, and then threw in increasingly surreal content to prove they were faked. Because these days, unless the creators of a deepfake come out and say it’s a deepfake, there’s no way to prove it is.
That is obviously a potentially horrifying power to unleash ahead of a finely balanced presidential election, particularly because politically invested human beings are a perfect case study in the art of confirmation bias. They don’t want to know that negative images of their opponents were faked; they inherently want to believe the bad in their opponents and the good in their candidate, so their critical faculties are turned down to a minimum when presented with seemingly incontrovertible video evidence that reinforces their view.
That is akin to the golden gun of disinformation, and it’s available ahead of 2024, meaning the nature of observable reality as we’ve known it no longer exists.
If you think that sounds ridiculous, you’re a little behind the curve. Both Donald Trump and Ron DeSantis have so far been caught sharing deepfakes about one another (identified in one case because the nature of the fake was fairly obvious, and in the other because the footage differed from a range of other captures of the events in question).
DeSantis’ most recent deepfake is a bizarre image of Trump hugging Dr Anthony Fauci, the former director of the National Institute of Allergy and Infectious Diseases, with whom Trump had a public falling-out over the US Covid response. Trump’s use of deepfakes, by contrast, was almost comical, casting DeSantis as a character from the popular TV show The Office.
The point being that neither of the candidates’ campaigns initially acknowledged that the deepfakes were deepfakes.
Disinformation and the meaning of confirmation bias.
Imagine the impact of that on an electorate – if voters see things that are put out and endorsed by candidate campaigns, they’re likely to believe that what they’re seeing is real, because the whole of electoral history throughout the TV age has depended on that level of integrity. Deepfakes built with generative AI are, or can be, indistinguishable from verified reality, meaning disinformation can be indistinguishable from fact.
All of these issues exist ahead of the 2024 election, and analysts already expect disinformation to play an enormous role in the knife-edge contest between (in all likelihood) Donald Trump and Joe Biden, meaning technological transparency and legal consequence have never been as important as they are right now in presidential politics.
Donald Trump coined the phrase “fake news” for any media outlet that dared disagree with his view of the world, even in cases of verifiable fact. Indeed, his White House counselor Kellyanne Conway is on record as coining the phrase “alternative facts” for the former president’s view of reality.
It seems as though 2024 may be an election based on genuinely “fake news” – in every conceivable form.