Be careful about who you're arguing with on Twitter these days.

The reopen America movement is a hot topic on the social media network, but researchers say as many as half of the accounts fueling conversations about Covid-19 could be automated bots.

There has been a push to reopen the economy and ease restrictions, with people protesting in cities across the US to end lockdowns and stay-at-home orders.

Researchers from Carnegie Mellon University (CMU) have discovered that much of the discussion on Twitter about the pandemic and stay-at-home orders is being propagated by misinformation campaigns using Twitter bots.

The researchers collected 200 million tweets discussing the coronavirus since January and found that 82 per cent of the top 50 most influential retweeters are bots, as are 62 per cent of the top 1,000 retweeters.

Accounts that are possibly humans with bot assistants generated 66 per cent of the tweets, while accounts that are definitely bots generated 34 per cent, according to the research.

"We're seeing up to two times as much bot activity as we'd predicted based on previous natural disasters, crises and elections," said Kathleen Carley, a computer science professor at Carnegie Mellon.

While there is no universally shared definition of a bot, and not all bots are considered bad, a bot is generally viewed as a software program that controls one or more Twitter accounts and automates tasks like tweeting or retweeting. In theory, one person can control thousands of accounts this way.

Carley's research team used multiple methods to determine who is or isn't a bot. Artificial-intelligence tools processed account information and examined factors like the number of followers, how frequently an account tweets and the account's mentions network.
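As a rough illustration of what feature-based bot scoring can look like, the sketch below rates an account using the kinds of signals the researchers describe. The features, thresholds and weights are invented for this example; the CMU team's actual detectors are trained machine-learning models, not hand-set rules like these.

```python
# Toy sketch of feature-based bot scoring, loosely modelled on the factors
# the CMU researchers describe (followers, tweet frequency, mentions network).
# All thresholds and weights below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AccountFeatures:
    followers: int            # number of followers
    tweets_per_day: float     # average posting frequency
    distinct_mentions: int    # unique accounts this account @-mentions
    mention_events: int       # total @-mentions sent

def bot_score(acct: AccountFeatures) -> float:
    """Return a rough 0-1 score; higher means more bot-like (illustrative only)."""
    score = 0.0
    if acct.tweets_per_day > 100:   # extremely high posting rate
        score += 0.4
    if acct.followers < 20:         # very few followers despite heavy activity
        score += 0.2
    if acct.mention_events > 0:
        # A narrow mentions network (the same handful of accounts mentioned
        # over and over) is a common amplification pattern.
        diversity = acct.distinct_mentions / acct.mention_events
        if diversity < 0.1:
            score += 0.4
    return min(score, 1.0)

# Example: a hyperactive account with few followers and a repetitive mentions network
print(bot_score(AccountFeatures(followers=5, tweets_per_day=300,
                                distinct_mentions=3, mention_events=500)))  # 1.0
```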

Carley said the surge in bot accounts and activity could be attributed to more people being at home with the time to create their own bots. She said there has also been an increase in firms that have been hired to run bot accounts.

"Because it's global, it's being used by various countries and interest groups as an opportunity to meet political agendas," Carley said.

The researchers found that a subset of tweets about "reopening America" cited baseless conspiracy theories, like the debunked theory linking coronavirus to 5G cell towers.

"Conspiracy theories increase polarization in groups -- it's what many misinformation campaigns aim to do," Carley added. "People have real concerns about health and the economy, and people are preying on that to create divides."

Carley said spreading conspiracy theories can lead to more extreme behavior with "real-world consequences" like affecting voting behavior and "hostility toward ethnic groups."

"We're prioritizing the removal of COVID-19 content when it has a call to action that could potentially cause harm," according to a statement from Twitter. "As we've said previously, we will not take enforcement action on every Tweet that contains incomplete or disputed information about COVID-19."

Twitter introduced these new policies on March 18, and the company said it has removed more than 2,600 tweets. The company said its automated tools have also challenged more than 4.3 million accounts that were targeting discussions around coronavirus with "spammy or manipulative behaviors."

"We permanently suspend millions of accounts every month that are automated or spammy, and we do this before they ever reach an eyeball in a Twitter Timeline or Search," wrote Nick Pickles and Yoel Roth, the company's director of global public policy strategy and development and head of site integrity, respectively, in a blog post this week.

The Carnegie Mellon researchers said users should closely examine Twitter accounts for signs that an account could be a bot: links with subtle typos, many tweets posted in rapid succession, or a username and profile image that don't appear to match.
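For readers who want a sense of how two of those checks could be automated, here is a minimal sketch that flags links whose domains are a near miss for well-known sites and bursts of tweets posted seconds apart. The domain list, similarity threshold and time gap are assumptions made for illustration, not part of the researchers' guidance.

```python
# Rough sketch of two bot-spotting heuristics: typo-like domains in links,
# and tweets fired off within seconds of each other. Thresholds are assumptions.
from datetime import datetime
from difflib import SequenceMatcher
from urllib.parse import urlparse

KNOWN_DOMAINS = ["cdc.gov", "who.int", "nytimes.com", "cnn.com"]

def looks_like_typo_domain(url: str) -> bool:
    """Flag domains that are very similar to, but not exactly, a known domain."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    for known in KNOWN_DOMAINS:
        ratio = SequenceMatcher(None, domain, known).ratio()
        if 0.8 <= ratio < 1.0:   # close but not identical, e.g. "cnn.c0m"
            return True
    return False

def rapid_fire(timestamps: list[datetime], max_gap_seconds: int = 5) -> bool:
    """True if consecutive tweets were posted within a few seconds of each other."""
    ts = sorted(timestamps)
    return any((b - a).total_seconds() < max_gap_seconds for a, b in zip(ts, ts[1:]))

print(looks_like_typo_domain("https://www.cnn.c0m/2020/05/reopen"))  # True
```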

"Even if someone appears to be from your community, if you don't know them personally, take a closer look, and always go to authoritative or trusted sources for information," Carley said. "Just be very vigilant."