Easier, faster, smoother, better ... but first we need a few details

Joanne Moore | 10:22 UK time, Monday, 9 January 2012

Online services frequently gather data from users: some of it users disclose explicitly, and other data is captured implicitly. Services also differ in how much access and openness they offer over the data they hold. Users are not always given the opportunity to view the collected data or to manage how it will be used. With emerging technologies, more user data will be gathered, often with the intention of enhancing the user experience; but gathering user data opens up issues of privacy and security, to the detriment of the very experience it was intended to enhance. This literature review explores issues around personal data from the perspective of users, and is part of our work on the .

Drawing of a person holding a box with an arrow pointing at it labelled 'Your data'

Security and privacy considerations for end users are complex and ever changing

Privacy and security are complex concepts that change constantly with the context the user is in. They cannot be considered in isolation: people make decisions about them as part of another task with a different purpose, such as keeping up to date with friends or buying some groceries, and they approach each situation according to past experience and current expectations. There is a recognised paradox between what users say they are comfortable disclosing and what they actually disclose. This is partly because some studies asked security questions out of context, without a purpose users could relate to; another explanation is that users can be enticed into decisions by the potential benefits rather than the costs of disclosure. Even when users are less comfortable disclosing information, if the benefits are compelling enough they sometimes find hacks and workarounds to maintain some sense of privacy while using the system.

User research findings

Three categories of attitudes towards privacy

Categories were identified by Westin (1967) and have been used as a basis for understanding user attitudes (Liu, C. et al. 2004; Iachello, G. and Hong, J. 2007; Langheinrich, M. 2001). Although the percentages vary across studies, in general about half of the population are privacy pragmatists: they are aware of privacy implications and take care when considering activities that may put their data at risk. The next group, the unconcerned, are the least cautious. The smallest group, the fundamentalists, are the most reluctant to disclose personal information about themselves; they are also more likely to have been victims of identity theft or similar security breaches.

Comfort levels – some information is more sensitive than others

When participants were surveyed on which information they are always or usually comfortable disclosing, most were comfortable disclosing personal preferences, such as a favourite TV show or snack, and email addresses. About half were comfortable disclosing their full name and postal address, and the fewest were comfortable disclosing health information, phone numbers or credit card details. Interestingly, although a full name and address could be used to contact someone directly, participants were less likely to disclose their phone number (Liu, C. et al. 2004). However, as a study of Facebook users listed fear of being stalked as a concern, address information in that context would have been rated lower (Strater, K. and Richter Lipford, H. 2008). This is an example of different results in different contexts.

Dichotomy between what users say they will disclose vs. what they actually disclose

Although survey data is problematic, since self-reported preferences do not reflect real-world behaviour, this paradox is not unique to survey results (Liu, C. et al. 2004).

People say they will disclose one thing and actually disclose another, probably because decisions are based on the potential benefits users expect to receive (Iachello, G. and Hong, J. 2007; Wang, Y. 2010; Mathiasen, N. R. and Bødker, S. 2011). Another explanation for the dichotomy is that, in reality, security decisions do not occur in isolation, so trying to understand them out of context is unrealistic and can produce unrealistic findings.

Benefits override privacy concerns – even with extreme data logging

When the benefits of logging or collecting personal information are compelling, users will disclose more information than researchers expect, even personally sensitive mobile phone data, as explored in a field study using automatic, persistent logging on mobile phones (Kärkkäinen, T. et al. 2010). The study found that users didn't feel their privacy was threatened, because the benefits outweighed their initial concerns. The benefit users identified was simply an interest in looking back on what they had done using their mobiles. Their only concern with the system was that logged text messages were stored behind a relatively simple password. Surprisingly, when asked about controlling or deleting their data, most preferred to have everything logged, for fear of missing out on or forgetting things they had done.
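
To make that trade-off concrete, here is a minimal sketch (ours, in Python, with made-up names – not the system from the Kärkkäinen et al. study): everything is captured by default, while review and deletion stay in the user's hands.

    # Illustrative sketch only - not the logging system from the study.
    # Pattern: capture everything by default, keep review and deletion
    # in the user's hands, and flag sensitive entries for extra care.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class LogEntry:
        timestamp: datetime
        category: str          # e.g. "call", "sms", "location"
        detail: str
        sensitive: bool = False

    class ActivityLog:
        def __init__(self):
            self._entries = []

        def record(self, category, detail, sensitive=False):
            # Log by default: participants preferred complete capture,
            # for fear of forgetting things they had done.
            self._entries.append(
                LogEntry(datetime.now(), category, detail, sensitive))

        def review(self, include_sensitive=False):
            # The benefit users identified: looking back at what they did.
            return [e for e in self._entries
                    if include_sensitive or not e.sensitive]

        def delete_where(self, predicate):
            # "Remaining in control" means deletion stays with the user.
            self._entries = [e for e in self._entries if not predicate(e)]

    log = ActivityLog()
    log.record("sms", "Hi mum", sensitive=True)
    log.record("location", "Manchester")
    log.delete_where(lambda e: e.category == "sms")  # the user decides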

Workarounds and hacks for controlling private data

Even when using a relatively public space to share private information, if users feel that their data will be put at risk by the service they want to use, they can sometimes find workarounds to use it in the way they want. For example, a year-long ethnographic study of YouTube use (Lange, P.G. 2008) included participants sharing private videos with friends and family. A private option is available, but it felt too restrictive, as all family members would need a YouTube account to access the video. Workarounds employed to protect privacy included limited or vague tags and cryptic references. Participants also monitored the view counts of videos to ensure they hadn't become too public. In fact, Don Norman (2009) states that the more secure a system is designed to be, the less secure it gets, as people find workarounds and adaptations to do what they want to do when privacy and security get in the way.

Users want to be in control of their data…

Control over data is the main privacy requirement users express. In line with Westin's findings (1967), they want to know what will be done with their data and to control what is done with it. In one study (Liu, C. et al. 2004), participants said they were against automatic data transfer, partly because it removed or limited their control over what their data would be used for and by whom.

“I want to be in charge of all information sent to other companies. Just because they are similar, doesn’t mean I [want] my information shared with them.”

“I want to be in control of what is done. This way I know what was done,” and “I don’t want anything sent automatically. I want to check out everything I am applying for.”
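
That preference suggests a simple design rule: no silent transfers. The sketch below is purely illustrative (the function names and the console prompt are our assumptions); every transfer to a third party requires an explicit, per-request approval.

    # Hypothetical sketch: nothing is sent automatically; every transfer
    # to a third party needs an explicit, per-request approval.
    def request_consent(prompt):
        # Stand-in for a real consent dialogue in the user interface.
        return input(prompt + " (y/n): ").strip().lower() == "y"

    def share_with_partner(partner, data):
        fields = ", ".join(data)
        if request_consent("Send " + fields + " to " + partner + "?"):
            # The actual transfer would happen here.
            return True
        print("Nothing sent to " + partner + ".")
        return False

    share_with_partner("Similar Co.", {"name": "A. User", "email": "a@example.com"})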

…however, managing data is a task usually done once, and rarely returned to

In interviews with Facebook users who were concerned about the visibility of their personal information, adjusting privacy settings emerged as a one-time action, rarely revised. Participants adjusted their settings once the limitations of the current ones were revealed to them, either by the researchers or when they experienced a breach of their privacy (Strater, K. and Richter Lipford, H. 2008). The issue remains that the system itself didn't make the ongoing implications of past privacy decisions clear: despite selecting privacy settings, users' private data was more visible than they thought.
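
One way a system could make those ongoing implications clear is to audit the effective visibility of each piece of profile data, rather than leaving it buried in a settings panel. Again a purely illustrative sketch, with assumed audience levels and field names:

    # Hypothetical sketch: report any profile field whose effective
    # visibility is broader than "friends", so past privacy decisions
    # stay visible to the user. The audience levels are assumptions.
    AUDIENCES = ["only_me", "friends", "friends_of_friends", "everyone"]

    def effective_visibility(field_setting, default_setting):
        # A field with no explicit setting inherits the account default,
        # which is often broader than users assume.
        return field_setting or default_setting

    def audit(profile, default_setting="everyone"):
        for field_name, setting in profile.items():
            vis = effective_visibility(setting, default_setting)
            if AUDIENCES.index(vis) > AUDIENCES.index("friends"):
                print(field_name + " is visible to: " + vis)

    audit({"phone": "", "hometown": "friends_of_friends", "photos": "friends"})
    # -> phone is visible to: everyone
    # -> hometown is visible to: friends_of_friends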

Conclusion

When building systems that have privacy and security implications, we can make assumptions and best guesses about user responses; but based on the evidence, we won't really know how end users will respond until they are able to sample the experience for themselves.

References

Iachello, G. and Hong, J. (2007). End-user privacy in human-computer interaction. Foundations and Trends in Human-Computer Interaction, 1(1):1-137.

KÀrkkÀinen, T., Vaittinen, T., and VÀÀnÀnen-Vainio-Mattila, K. (2010). I don't mind being logged, but want to remain in control: a field study of mobile activity and context logging. In Proceedings of the 28th international conference on Human factors in computing systems (CHI '10). ACM, New York, NY, USA, 163-172

Lange, P. G. (2008). Publicly private and privately public: Social networking on YouTube. Journal of Computer-Mediated Communication, 13(1):361-380.

Liu, C., Marchewka, J. T., Lu, J., and Yu, C-S. (2004). Beyond concern: a privacy-trust-behavioral intention model of electronic commerce. Information & Management, 42(1).

Mathiasen, N. R. and Bødker, S. (2011). Experiencing security in interaction design. In Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA.

Norman, D. A. (2009). The way I see it: When security gets in the way. Interactions, 16(6):60-63.

Schaar, P. (2010). Privacy by design. Identity in the Information Society, 3(2):267-274.

Strater, K. and Richter Lipford, H. (2008). Strategies and struggles with privacy in an online social networking community. In Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction - Volume 1 (BCS-HCI '08). British Computer Society, Swinton, UK, 111-119.

Wang, Y. (2010). Respecting User Privacy in Cross-System Personalization. Downloaded 17th November 2011 from

Westin, A. F. (1967). Privacy and Freedom. New York, NY: Atheneum.

Comments

  • Comment number 1.

    Is this blog trying to explain the BBC's profound discomfort about the steps it will have to take to comply with the ?

    Russ

  • Comment number 2.

    On the topic of β€œsome information is more sensitive than others”, it is important to keep in mind that there are strong cultural differences in what people consider sensitive/private.

    This table circulating on the web in the past few weeks is a good illustration:

  • Comment number 3.

    The second part of “Users want to be in control of their data…however, managing data is a task usually done once, and rarely returned to” may be a little misleading. Managing _privacy settings_ is a task usually done once / rarely returned to. Which would mean that people want to control their data, but can't be bothered with privacy settings panels.

    Does that perhaps point to a need for privacy/user data management done through the management of one's data (here's your data, what shall we do with it?) rather than through settings?
