TORONTO -- Tech giants Facebook, Twitter and Google are facing mounting criticism after a U.S. lawmaker pressed the companies to improve how they handle disinformation about COVID-19 vaccines.

Representative Mike Doyle made the call during a congressional hearing on Thursday, describing how easily he and his staff were able to find anti-vax propaganda on the three platforms.

He called on Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Twitter CEO Jack Dorsey to remove the accounts of 12 people accused of spreading disinformation about vaccines and downplaying the dangers of COVID-19.

Although the platforms have not publicly responded to Doyle’s call, Facebook said this week it is reviewing the accounts to see whether they violate its standards.

The Centre for Countering Digital Hate (CCDH), a non-profit advocacy group, found that about two-thirds of anti-vaccine content on major social media sites is linked to 12 people with large online followings, dubbed the “disinformation dozen.”

“Anti-vaccine activists on Facebook, YouTube, Instagram and Twitter reach more than 59 million followers, making these the largest and most important social media platforms for anti-vaxxers,” according to the report released in March.

According to the report, the disinformation dozen have repeatedly violated Facebook’s and Twitter's terms of service agreements.

None of the 12 is reported to have been banned from all of the platforms.

“Anti-vaxxers are using social media platforms to target Black Americans, exploiting higher rates of vaccine hesitancy in that community to spread conspiracies and lies about the safety of COVID-19 vaccines,” according to the report.

In addition to recommending the removal of repeat offenders, the report outlines steps social media platforms can take to reduce the amount of misinformation in users' feeds.

Despite the recent crackdown on misinformation, these individuals have largely been allowed to maintain their presence across mainstream social media platforms, the CCDH says.

On Monday, Facebook said it took down 1.3 billion fake accounts between October and December and that it had more than 35,000 people working on tackling disinformation on its platform. The social network said it has also removed more than 12 million pieces of content about COVID-19 and vaccines which health experts flagged as misinformation.

Earlier this month, Twitter also announced it would monitor and remove dangerous falsehoods about vaccines. The company said it has removed a total of 8,400 tweets spreading anti-vax propaganda and “challenged” 11.5 million accounts worldwide.

False claims and conspiracy theories have proliferated throughout the pandemic, most notably around the 2020 U.S. presidential election, when thousands of accounts were removed for casting doubt on the election outcome.

Similar steps have been taken in Canada over the years to remove accounts that use social media platforms to spread fake news and hate speech.

During the 2019 Canadian federal election, Facebook monitored its platform for attempts to spread misinformation or disinformation as part of an election integrity initiative.

CTV News reached out to Facebook and Twitter for comment about the “disinformation dozen,” but did not immediately get replies.