VANCOUVER -- Taleeb Noormohamed, the Liberal candidate for Vancouver-Granville, looks at his Twitter feed and reads aloud a few of the messages directed at him.

“Oh great another Muslim entering politics, how unique. Not!” one of the messages says. “And no, we don't want your Sharia law here in Canada, ever. You can fool the useful idiots but you can't fool us all. The only reason the left is siding with the Muslims is that they hate Christianity.”

Researchers say millions of online messages will be sent to candidates running in this federal election. Some of them will be abusive, racist or gendered, and targeted at candidates of colour and women. The Attention Control podcast investigated how online hate and abuse could be hurting our democracy.

Researchers Chris Tenove and Heidi Tworek will be looking at these kinds of online messages as they study online harassment and abuse experienced by all candidates this federal election. They’ll be paying special attention to the experiences of women and candidates of colour.

“Women, people from religious minorities and people of colour will generally experience more of that abuse,” says Tworek, pointing to research done in the U.K. and Germany. “But part of the other reason why we need to do this is to see how this is exactly playing out in a Canadian context.”

It’s a complex project that is more than just running a few keywords through a spreadsheet. Tenove and Tworek will be building an algorithm to help them sort through those millions of messages to detect and categorize different types of harassment in a Canadian context.

“It's very easy to run programs that will give you a very, very general guess at toxic language and that is helpful in some respect, but it can miss so much,” says Tenove. “Terms used in hateful comments are often colloquial, often difficult to know them all, unless you're familiar with the group.”
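
To illustrate the gap Tenove describes, here is a minimal, purely illustrative sketch in Python of the kind of generic keyword filter he says “can miss so much.” The term list, sample messages and logic are invented for this example and are not the researchers’ actual tool or data.

import re

# Illustrative only: a generic, non-colloquial term list of the sort a
# off-the-shelf filter might use. Real hateful language is often coded,
# colloquial or context-dependent and would not appear on such a list.
GENERIC_TOXIC_TERMS = {"idiot", "idiots", "hate", "stupid"}

def flag_message(text: str) -> bool:
    """Flag a message if it contains any term from the generic list."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return bool(words & GENERIC_TOXIC_TERMS)

samples = [
    "You can fool the useful idiots but you can't fool us all.",  # flagged: contains "idiots"
    "Go back to where you came from.",                            # missed: no listed term
    "People like you don't belong in Canadian politics.",         # missed: hostility is contextual
]

for msg in samples:
    print(flag_message(msg), "-", msg)

As the second and third sample messages show, a filter like this catches only the bluntest insults, which is why the researchers aim to build something attuned to the colloquial language of specific Canadian contexts.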

The researchers will also be interviewing candidates to hear about their experiences handling these messages and their strategies for dealing with them. Their hope is that the data will point to specific policies that could make our political landscape more inclusive.

For example, if they find that staff for candidates of colour or women spend much more of their time dealing with abusive messaging instead of campaigning, “a policy recommendation quite simply could then be to give extra staff to those offices,” says Tworek. “We want them to be on a fair, level playing field.”

Noormohamed tries to engage with some of the messages in the hope of challenging stereotypes about people of colour.

“People say to me sometimes, ‘Oh isn't running as a brown Muslim guy going to be a disadvantage?’ I actually see it as an opportunity,” says Noormohamed.

“I see it as a great opportunity for people to realize that, despite what they may think of the name on the sign, despite what they may see, that they have so much more in common with me as being like this pretty average Canadian guy that, you know, happens to look different or happens to have a long name.”

The online world can be a reflection of the hatreds and prejudices that exist offline, warns Tworek, reflecting on the recent video of a man in Montreal telling NDP Leader Jagmeet Singh to cut off his turban so he would “look like a Canadian.” Many praised Singh for his calm reaction to the racist comments.

Noormohamed says the downside when you are a candidate who happens to be a person of colour, or an Indigenous person or a woman, is that “you actually carry the weight of far more than yourself. And so I have to, as a Muslim, as a person of colour, I have to do everything possible to make sure that I am not just the best representative of my riding, but that when people look at me because of something that I do, I don't unfairly impact somebody else who may be a Muslim or who may be a minority who wants to run. That's a lot of pressure to put on people, but it is where we are.”

Online abuse is 'targeting' groups already underrepresented

Michael Morden, research director at the Samara Centre for Democracy, says social media is not an equal opportunity offender.

“Researchers have found that high profile women attract considerably more incivility than high profile men,” he says. “There's pretty strong indications that people of colour attract a lot more incivility than white people on social media. So social media incivility is a problem for equity, too, particularly because it's targeting groups that are already underrepresented in our politics.”

Toxic messages directed at certain groups can discourage them from participating in online conversations and from running as candidates.

According to Equal Voice, 42% of candidates running in this election are women, even though women aged 15 and older make up more than 50% of Canada’s population.

The NDP has the highest proportion of women candidates at 49%, and the People’s Party has the lowest at 17%.

Attention Control asked all the federal parties for a breakdown of their candidates by gender and ethnicity. The NDP, the Greens and the Conservatives provided demographic and gender breakdowns. The Liberals provided only a gender breakdown, and the People’s Party did not respond.

Morden says he’ll be keeping track of these numbers this election to see how many people of colour and women will ultimately get elected.

The role of tech companies and online users

Many of the major social media platforms have human and automated moderation systems to look for abusive content online and assess whether it should be removed.

Globally, YouTube took down more than 110,000 videos it found to be hateful or abusive between April and June of this year. Facebook removed more than four million pieces of hateful content worldwide between January and March, according to the companies’ transparency reports.

But the process of content moderation still needs to be much more transparent, says Tworek. “Germany, for example, does get specific transparency reports because of a law that it has pushed through,” she says. “But in terms of actually understanding what's going on within Canada, we don't have the data from social media companies that breaks that down.”

“In essence it means that social media companies are privatizing the enforcement of speech online,” she says, pointing to “an enormous debate about the balance between social media companies, governments and civil society and who gets to regulate speech in a meaningful way.”

Morden is wary of relying on tech companies and government to deal with harmful speech online and believes users and citizens need to step up.

“I don't know that it's feasible to wait for platforms to alter the way they operate in order to bring about better outcomes in terms of healthier conversations, any more than it makes sense to wait for McDonald's to start serving us healthier food,” he says. He believes citizens need to take a more active role in how our online conversations are moderated.

Morden outlines different strategies in his recent Field Guide to Online Political Discussions. “We've prescribed a bit of a process where you could kind of kick this to a citizens’ assembly or follow the lead of a parliamentary committee chaired by the opposition because I really do think process matters.”

The report lists seven main ways users can have more productive online conversations:

1. Lead by example: Being civil can cause others in a conversation to follow your lead.

2. Police your own side: Calling out incivility is most effective when you're addressing someone on the same political team.

3. Practice slow politics: Small changes in the way you use technology can reduce the likelihood of using social media on the go, cutting down on thoughtless and aggressive exchanges.

4. Get into the weeds: Inviting people to provide detailed explanations of what political choices they support, and doing so yourself, can reduce polarization.

5. Reframe your language: Thinking about the moral foundations of an argument, and reflecting those foundations in your own language, can reduce the psychological distance between you and the person you're having a discussion with.

6. Remind us what we share: Priming someone to consider the identities that unite us (like civic identity) rather than the identities that divide us (like party affiliations) can reduce polarization.

7. Spot a bot: Recognize fake accounts, and don't give them what they want—attention.

“At the end of the day we're really contending with human behavior and unfortunately it's kind of up to us to fix this,” says Morden.

“Attention Control with Kevin Newman” is a new podcast from Antica Productions that will be investigating the intersection of data, technology, and democracy during the federal election campaign. Every week during the campaign, the show will bring listeners data-driven investigations that help separate fact from fiction, as well as timely, in-depth interviews with tech industry insiders and their critics.