Depending on which sources you consult, there is no shortage of “ages” defining the breadth of human history (and within them, an endless parade of “sub-ages”).
The Stone Age, the Bronze Age, the Iron Age; the Dark Ages, the Middle Ages, the Age of Exploration; the Industrial Age, the Machine Age, the Age of Oil, the Jet Age, the Nuclear Age, the Space Age… the list (and its subcategories) goes on.
According to historians, the Information Age began sometime in the mid-20th century, marking the shift from traditional technologies developed during the Industrial Revolution to a society centred on information technology.
Many say it is still ongoing today—though evidence suggests its decline began sometime around 2008, with the proliferation of smartphones, high-speed internet and social media.
But at the dawn of the so-called Information Age, there were surely plenty of idealistic visions of a utopian society, replete with all the knowledge and understanding necessary for endless peace and prosperity.
So how on Earth did we find ourselves here on the brink of absolute uncertainty and confusion, our technology proving more hindrance than help in ushering in a new Age of Enlightenment?
That’s a big question, best left for future historians to muddle through once the dust has settled.
For now, it’s enough to accept that the Information Age is dead or dying fast, and that facts are now merely a suggestion.
Case in point: the enthusiastic re-election, south of the border, of the walking embodiment of misinformation and outright lies.
For all the information we’re bombarded with, many sure don’t seem capable of parsing it to get at any sort of agreeable truth.
If the Information Age is over, what comes next? Bloggers and columnists have probed the question for well over a decade already, coming up with lame suggestions like the Data Age or the Experience Age.
But in light of recent developments, there may be a better description for the time period we’re now entering: The Age of Ignorance.
Take some of the top-trending Google searches in the wake of the U.S. election as proof: queries like “did Biden drop out?”; “what do tariffs do?”; and “can I change my vote?” show just how ill-informed the average voter is.
We’re not immune in Canada. As the editor of a small, weekly newspaper, my inbox is regularly blessed with absolute, batshit nonsense from faceless info warriors, sharing links to conspiracy theories and bad-faith “news” stories from muckrakers muddying the waters of truth and decency.
But then, it doesn’t matter what the truth is anymore—we form our beliefs first, then build the “proof” around them. And nobody can tell us any different.
The well of information that is the internet has grown so deep that you can find said “proof” for any wild opinion you might have, along with a small army of like-minded information warriors ready to take up the fight alongside you. It doesn’t matter that many of them could be literal bots paid for by a foreign government.
With almost zero guardrails in place, it’s only going to get worse as artificial intelligence improves. How long until we can’t trust our own eyes anymore?
On election night in the U.S., a deepfake video of Martin Luther King Jr. spouting pro-Trump rhetoric was circulating—much to the disgust of the civil rights leader’s family.
“It’s vile, fake, irresponsible, and not at all reflective of what my father would say,” MLK’s daughter Bernice King wrote on social media. “And you gave no thought to our family.”
But how many saw the fake video and took it at face value? Likely far more than the number who saw Bernice King’s rebuttal.
The broad damage is already done—the internet and social media have muddied the waters so effectively that we no longer have a common baseline of understanding to work from. But not all hope is lost.
Speaking at the International Institute of Communications (IIC) conference in Ottawa last month, Frank Graves, president of EKOS Research, and Don Lenihan, an expert in public engagement, broke down Graves’ “Vicious Cycle of Disinformation” for attendees.
The cycle consists of five stages: economic insecurity; cultural insecurity and ordered populism; mistrust; disinformation; and polarization on key issues.
In their presentation to the IIC, Graves and Lenihan noted that, with AI evolving rapidly, its potential to amplify Graves’ cycle—or disrupt it—can’t be overstated.
Deepfakes like the bogus MLK Jr. video have the power to further erode trust in traditional information sources, they say, but as far as AI goes, deepfakes are “yesterday’s news.”
The real danger lies in AI’s ability to persuade.
“Recent studies show chatbots are already 20 per cent more effective than humans at using facts and arguments to challenge deeply held beliefs,” Lenihan said in a summary of the presentation. “That’s impressive, but suppose we give the bot a bit of personal information about the subject—say, their age, gender, or education level. The bot then performs twice as well as humans with the same information.”
Further, AI bots can now draw on sensory input, reading a human’s tone, body language and other cues to gauge emotional state.
“With this kind of personalization, bots are becoming adept at persuasion faster than we can test or regulate their capabilities. This raises an important question: How might people use these bots to shape public views?” Lenihan said.
“Bots are programmed to do as they’re told, so their methods depend entirely on the goals of those controlling them. Programmed respectfully and empathetically, they could be immensely helpful, say, in supporting aging or ill individuals. But if their goal is to sell a defective car or push a conspiracy theory, they’ll use disinformation, deception, and emotional manipulation to get the job done.”
Now imagine millions of bad-faith bots tasked with sowing social discord and disrupting accepted narratives—the Vicious Cycle of Disinformation becomes a roadmap to societal collapse.
But the authors are optimistic, stating that with the right approach, Canadians, and democracies the world over, can “halt the cycle of disinformation” using two simple steps: AI literacy and individual opportunity.
First, Canadians must deepen their understanding of disinformation and how to counter it, especially “sophisticated forms driven by AI and algorithms … Democracies must make AI work for them, rather than against them,” they say.
Second, addressing disinformation requires fixing the social and economic conditions that make people vulnerable to bad info in the first place.
“Governments must address root causes like economic inequality and wealth concentration to ensure that citizens have real opportunities for productive and meaningful lives. Democracy is nothing without individual opportunity,” Graves and Lenihan say. “These steps are ambitious but crucial. Canadians from all walks of life are calling for solutions to the threats posed by disinformation and polarization. Democracies are, it seems, at a tipping point. Can we mobilize the leadership and public will to make this vision a reality?”