Carnegie Mellon University
June 22, 2020

New Study Explores User Comfort With Privacy Assistants

By Daniel Tkacik

Jessica Colnago believes that in the future, walking down the street will be a little weird.

"You know how every time you enter a website, it says: 'We use cookies. Do you consent?' Imagine that same thing walking down the street, but for a light pole, or a surveillance camera or an energy sensor on a house," Colnago said.

Colnago, a Ph.D. student in the Institute for Software Research's Societal Computing program, works with a research team developing personalized privacy assistants (PPAs) — technologies that help people make privacy decisions about devices around them. Without PPAs, "… it's going to be unbearable to live in a world with internet of things devices everywhere giving you notice and asking for consent," Colnago said.

In a new study presented at the Conference on Human Factors in Computing Systems (CHI 2020) in May, Colnago and her co-authors from ISR, CyLab and the Heinz College outlined their efforts to discover how much autonomy people feel comfortable giving PPAs. (You can watch a video of their presentation on YouTube.)

"We found that people are definitely interested in having some sort of assistance like that provided by a PPA, but what that assistance looks like varies," Colnago said.

The team conducted 17 interviews to gauge participants' reactions to three increasingly autonomous versions of PPAs. The first would simply notify users that devices were near them. A majority of participants reacted positively to this version, while a few noted it would fuel their anxiety.

Among the people who indicated they would like to receive such notifications, the majority noted that they would ideally also want to have some control over the data collected about them, rather than just being told about something they couldn't control.

"… it's going to be unbearable to live in a world with internet of things devices everywhere giving you notice and asking for consent,"
Jessica Colnago

The researchers then presented the study participants with a second version of the PPA, which would know users' personal privacy preferences and use that information to make recommendations. A majority of participants also reacted positively to this version, though some would prefer recommendations based on authoritative sources rather than their own preferences.

The last version was the most autonomous: the PPA would leave the user out of the decision-making process entirely and make privacy decisions for them based on their preferences. Reception was mixed.

"I would consider owning such an appliance," one participant said. "I don't like to be fully controlled by a device, you know?" another noted.

"These interviews told us that there is no single version of a PPA that everyone would be comfortable with," Colnago said. "What we develop needs to include an array of features users can choose from to fit their individual needs and comfort levels."

Moving forward, the team aims to develop a system they can actually test with users to determine how they'd react in a more realistic situation.

"We gained important insights from these 17 participants, but the scenarios we gave them were all hypothetical," Colnago said. "We need to measure how people would actually behave."

Other authors on the study included Yuanyuan Feng, an ISR post-doctoral researcher; Tharangini Palanivel, a master's student in the Heinz College; Sarah Pearman, a Ph.D. student in Societal Computing; Megan Ung, an undergraduate computer science student; Alessandro Acquisti, a professor in the Heinz College; Lorrie Cranor, director of CyLab and a professor in both ISR and Engineering and Public Policy; and Norman Sadeh, a professor in ISR and director of the PPA project.