Many are already aware how artificial intelligence (AI) is fooling people online with images such as the Pope wearing a fashionable jacket, or Donald Trump being chased by police.

But Gordon Crovitz, CEO of NewsGuard, a company focusing on preventing misinformation in media, warns the technology is also creating believable news articles with an "authoritative" voice.

"The internet of a few months ago, before AI, already was full of misinformation and what looked like reliable news sources," Crovitz told CTV's Your Morning on Friday. "What we now have is AI-enhanced internet."

Crovitz says that makes it even more important to rely on trusted, factual news sources for information rather than AI tools like ChatGPT and Google Bard.

The rapid advancement of AI over the past few months concerns Crovitz because the technology can produce highly believable misinformation.

"They get, very often, answers from these AI tools that are very well-written explanations on topics in the news, perfect grammar, highly persuasive, and entirely false," he said. "There's no way for people to understand that the response from an AI machine that's written to this authoritative style, often with no citations, may be entirely made up."

NewsGuard tested whether ChatGPT would write false information based on misleading ideas, and the technology complied 100 per cent of the time.

Crovitz says AI-generated answers were also "highly convincing."

"So convincing that, just yesterday, a NewsGuard analyst found what we think is the first example of a Chinese disinformation effort, citing ChatGPT as an authoritative source," Crovitz said.

As AI continues to evolve, Crovitz encourages people to question what they read online before taking it as fact. But the onus is not just on people using AI; Crovitz believes companies need to be diligent as well.

"If the humans in charge of training the AI, actually train the AI, then we could see a reduction in the propensity to misinform," he said.


To hear the full interview, click the video at the top of this article.