TORONTO -- Ascribing lifelike qualities to virtual assistants could lead consumers to reveal more personal information to the tech companies that own them than they otherwise would, according to researchers at the University of Waterloo.

The study reveals that people tend to interact with products such as Amazon Alexa or Siri based on their perception of how the assistants might behave or look in real life -- right down to how they might dress.

Researchers say it comes down to a phenomenon called anthropomorphism: the act of attributing human form or personality to things that are not human.

“People are anthropomorphizing these conversation agents which could result in them revealing information to the companies behind these agents that they otherwise wouldn’t,” Edward Lank, computer science professor at the University of Waterloo, said in a press release.

“These agents are data gathering tools that companies are using to sell us stuff.”

Study participants were asked to interact with Amazon Alexa, Google Assistant, and Siri and provide feedback on their perception of the voice assistants’ personalities.

The 10 men and 10 women involved in the study were then asked to create an avatar of what they thought each assistant looked like.

“Google and Alexa were generally seen as more genuine, whereas Siri was more commonly seen as disingenuous and cunning -- more likely to be working in the interest of a third party,” lead researcher Anastasia Kuzminykh told CTVNews.ca by phone Friday.

“Visualization wise, Siri was described as ‘eccentric’ or having a bit of an edge. One person described her as having a cheerleader trope.”

Kuzminykh noted that while the study showed Alexa came across as the most genuine and caring, participants largely found the voice assistant’s personality to be neutral compared with Google and Siri.

The participants perceived Alexa to be of average height and slightly older than the other voice assistants, likely wearing casual, nondescript clothing, with dark, wavy hair.

Siri was commonly described as being of average height, younger than the other agents, and wearing casual but fashionable clothes, with long, straight blonde or black hair.

Interestingly, Google Assistant was perceived to be taller, with lighter-coloured hair, dressed in “Silicon Valley”-style clothing (i.e. jeans and a hoodie, according to Kuzminykh).

“Unfortunately, this brings up a lot of biases,” she said.

“How an agent is perceived impacts how it’s accepted and how people interact with it; how much people trust it, how much people talk to it, and the way people talk to it.”

Kuzminykh said the study brings up an interesting question for consumers to consider when interacting with their voice assistant of choice: do you interact with it as a person, or as an actual piece of technology?