NEW YORK -- It took bloodshed in Charlottesville to get tech companies to do what civil rights groups have long been calling for: take a firmer stand against accounts used to promote hate and violence.

In the wake of the deadly clash at a white-nationalist rally last weekend in Virginia, major companies such as Google, Facebook and PayPal are banishing a growing cadre of extremist groups and individuals for violating service terms.

What took so long? For one thing, tech companies have long seen themselves as bastions of free expression.

But the Charlottesville rally seemed to have a sobering effect. It showed how easily technology can be used to organize and finance such events, and how extreme views online can translate into violence offline.

"There is a difference between freedom of speech and what happened in Charlottesville," said Rashad Robinson, executive director of Color of Change, an online racial justice group. The battle of ideas is "different than people who show up with guns to terrorize communities."

A SLOW REACTION

Tech companies are in a bind. On one hand, they want to be open to as many people as possible so they can show them ads or provide rides, apartments or financial services. On the other hand, some of these users turn out to be white supremacists, terrorists or child molesters.

Keegan Hankes, an analyst at the Southern Poverty Law Center's intelligence project, said his group has been trying for more than a year to get Facebook and PayPal to shut down such accounts. Even now, he said, the two companies are taking action only in the most extreme cases.

"They have policies against violence, racism, harassment," said Hankes, whose center monitors hate groups and extremism. "The problem is that there has been no enforcement."

Case in point: The neo-Nazi website Daily Stormer has been around since 2013. But it wasn't effectively kicked off the internet until it mocked the woman killed while protesting the white nationalists in Charlottesville.

SHIFTING LINE

PayPal said groups that advocate racist views have no place on its service, but added that there is a "fine line" when it comes to balancing freedom of expression with taking a stand against violent extremism.

Other companies like Facebook, Twitter and Google struggle with the same balancing act. The fine line is constantly moving and being tested.

Ahead of the rally, Airbnb barred housing rentals to people it believed were traveling to participate. Before and after Charlottesville, PayPal cut off payments to groups that promote hate and violence. GoDaddy and Google yanked the domain name for Daily Stormer following the rally. Facebook, Twitter and Instagram are removing known hate groups from their services, and the music streaming service Spotify dropped what it considers hate bands.

"Companies are trying to figure out what the right thing is to do and how to do it," said Steve Jones, a professor at the University of Illinois at Chicago who focuses on communication technology. What happens from here is "partly going to depend on the individual leadership at these companies and company culture -- and probably resources, too."

CAT AND MOUSE

While traditional brands such as Tiki had no way of knowing that their torches were being bought for the rally, tech companies have tools to identify and ban people with extremist views.

That's thanks to the troves of data they store on people and to their ability to easily switch off access to users. Airbnb users can link to social media profiles, and the company said it used its existing background checks and "input from the community" to identify users who didn't align with its standards.

Yet these services also allow for anonymity, which makes their jobs more difficult. Banned people can sign up again with a different email address, something they can easily obtain anonymously.

Facebook spokeswoman Ruchika Budhraja said hate groups also know the site's policies and try to keep things just benign enough to ensure they are not in violation.

For instance, the event page for the "Unite the Right" rally in Charlottesville looked fairly innocuous. Budhraja said there was nothing on the page that would suggest it was created by a hate organization. It has since been removed.

Facebook's technology is designed to automatically flag posts at the absolute extreme -- those that clearly violate the company's policies. They are sometimes removed before users can even see them. What Facebook can't leave to automation are posts, events and groups in that ever-growing grey area.

THE BROADEST REACH

The First Amendment offers hate groups a lot of speech protection, but it applies only to government and public settings. A private company is typically free to set its own standards.

Christopher Cantwell, a self-described white nationalist who has been labeled an extremist by the Southern Poverty Law Center, said he was banned from Facebook, Instagram and PayPal because the companies are trying to silence him for his views.

"Everybody is going through extraordinary lengths to make sure we are not heard," Cantwell told The Associated Press.

Even Cloudflare, a security company that prides itself on providing services regardless of content, terminated Daily Stormer's account on Wednesday, in what appears to be the site's final blow.

Daily Stormer founder Andrew Anglin said in an email to the AP that these private companies are "de facto monopolies and oligopolies" and should be regulated as "critical infrastructure."

The Daily Stormer and other banned groups could move to darker corners of the web, where extreme views are welcome. But this won't help with recruitment and won't allow them to disseminate their views as broadly as they could on Facebook or Twitter.

"These are the platforms everyone is using," Hankes said. "They don't want to be pushed to the margins because they want influence."

Because of that, the industry's efforts might just be a game of Whac-A-Mole, with extremist views returning, perhaps in different guises, once public outrage dies down.

------

Associated Press Writers Michael Casey in Concord, New Hampshire, and Michael Kunzelman in Baton Rouge, Louisiana, contributed to this report.