Aligning Privacy Expectations with Technical Tools

In the world of privacy protections, the devil often resides in the details. People do care about privacy. In 1974, Alan Westin conducted privacy surveys to measure privacy attitudes. He found that most participants were either privacy fundamentalists (those with high privacy concerns) or privacy pragmatists (those with moderate concerns). Westin’s findings have stood the test of time: Chris Hoofnagle and Joseph Turow’s work demonstrates that people do indeed have a taste for privacy. According to other research, 44% of social network users take steps to limit the amount of personal information appearing about them online, and 71% change the privacy settings on their profiles to limit what they share.

An important new study by Columbia’s Michelle Madejski, Maritza Johnson, and Steven Bellovin reveals that Facebook’s default privacy settings fail to capture users’ real-world expectations. Facebook lets users share types of information, such as profile details, photographs, videos, and wall posts, with designated audiences: friends, friends of friends, networks, and everyone (i.e., strangers). The Columbia study found that 93.8% of participants revealed some information that they did not want disclosed, and that 84.6% hid some information that they actually wanted to share. In other words, under Facebook’s privacy settings nearly every user shared too much, too little, or both. The study also showed that users have specific sharing preferences that Facebook’s settings fail to capture. For instance, the nature of the information changes users’ privacy expectations: participants had different views about disclosure depending on whether the information concerned work, sex, religion, politics, personal identifiers, strong dislikes, interests, family, profanity, drugs, alcohol, or academics. The study traced Facebook’s privacy failures to its reliance on gross data types (photographs, wall postings, etc.) to define default settings, rather than on the “context of its information,” determined by users’ preferences with regard to specific categories of information.

Consider one possibility: Facebook’s privacy settings could let users show information about sex, drugs, and dislikes only to friends, while sharing their views about politics, academics, and religion with friends of friends. Had the default privacy settings been organized around specific categories of information in this way, most of the violations the study identified would have been avoided. A key privacy improvement would be to automatically classify each piece of information into a predicted context, with each context carrying its own default privacy configuration reflecting the user’s intent for that data; a rough sketch of the idea appears below. The Columbia study confirms the wisdom of Dan Solove’s pragmatic taxonomy and of Helen Nissenbaum’s contextual integrity theory. The task now is to encourage online social networks to align their technical settings with these contextual expectations.
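To make the idea concrete, here is a minimal sketch in Python of what context-aware defaults might look like. It is an illustration under stated assumptions, not anything Facebook actually implements: the category names, the per-category audience table, and the toy keyword classifier (which stands in for a real content-classification model) are all hypothetical.

```python
from enum import Enum

class Audience(Enum):
    ONLY_ME = 0
    FRIENDS = 1
    FRIENDS_OF_FRIENDS = 2
    EVERYONE = 3

# Hypothetical per-category defaults a user configures once.
# The category of the content, not its gross data type
# (photo, wall post), determines the default audience.
DEFAULT_AUDIENCE = {
    "sex": Audience.FRIENDS,
    "drugs": Audience.FRIENDS,
    "dislikes": Audience.FRIENDS,
    "politics": Audience.FRIENDS_OF_FRIENDS,
    "academics": Audience.FRIENDS_OF_FRIENDS,
    "religion": Audience.FRIENDS_OF_FRIENDS,
    "interests": Audience.EVERYONE,
}

# Toy keyword lists standing in for a real classifier.
KEYWORDS = {
    "politics": {"election", "senator", "vote"},
    "academics": {"exam", "thesis", "class"},
    "drugs": {"beer", "weed"},
}

def predict_category(text: str) -> str:
    """Assign a post to a predicted context category."""
    words = set(text.lower().split())
    for category, vocab in KEYWORDS.items():
        if words & vocab:
            return category
    return "interests"  # fallback when nothing matches

def default_audience(text: str) -> Audience:
    """Derive the default audience from the predicted context;
    the user could still override it on a per-post basis."""
    return DEFAULT_AUDIENCE[predict_category(text)]

if __name__ == "__main__":
    post = "Excited to vote in the election tomorrow!"
    print(predict_category(post), default_audience(post))
    # -> politics Audience.FRIENDS_OF_FRIENDS
```

The point of the sketch is the order of operations: classify the content first, then look up the audience in the user’s per-category defaults, rather than keying the default to whether the item happens to be a photograph or a wall post.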
