Privacy nudges protect information

Privacy nudges will safeguard Internet surfers’ personal information. (credit: Sibel Ergener)

As more of our affairs are managed online, privacy on the Internet becomes more of an issue. Research at Carnegie Mellon has centered on the idea of “privacy nudges,” which warn users as they are about to input personal information online. A team of professors — Lorrie Cranor, an associate professor of computer science; Alessandro Acquisti, an associate professor of information technology and public policy; and Norman Sadeh, a professor in the School of Computer Science — has combined computer science, economics, and psychology to create this solution.

The “privacy nudges” themselves can be presented in a variety of situations. For example, if a person is about to enter information on a website — whether it’s a name, a birthday, a photograph, a credit card number, or a Social Security number — a message would come up warning the user that his or her information might be compromised if he or she continues. From that point, the user would decide whether or not to proceed.
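The flow described above — detect sensitive input, warn, and let the user make the final call — can be sketched in a few lines of Python. The patterns and function names here are illustrative assumptions for the sake of the example, not the team’s actual implementation:

```python
import re

# Hypothetical patterns for a few kinds of sensitive input; a real nudge
# system would use far more robust detection than simple regexes.
SENSITIVE_PATTERNS = {
    "social_security_number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "birthday": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def detect_sensitive(text):
    """Return the categories of sensitive data found in the text."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

def nudge_before_submit(text, confirm):
    """Warn if sensitive data is present; `confirm` is a callback that
    asks the user whether to proceed and returns True or False."""
    found = detect_sensitive(text)
    if not found:
        return True  # nothing sensitive detected, submit immediately
    warning = "Warning: this form contains " + ", ".join(found) + ". Continue?"
    return confirm(warning)  # the user makes the final decision
```

The key design point is that the nudge never blocks the user outright — it only interposes a moment of reflection before the information leaves the browser.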

Other examples include suggestions to change a program’s default settings or scanning an e-mail before it is sent. In the latter example, the system scans the e-mail for any indications that it might be considered a “flame” (a hostile or angry message), withholds it, and asks at a later time whether it should still be sent. As Acquisti describes it, nudges are “a shortcut, in a way,” to easier decision-making.
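The e-mail example above — scan, withhold, and ask again later — might look something like the following sketch. The word list and class names are invented for illustration; a research system would score tone far more carefully:

```python
from datetime import datetime, timedelta

# Hypothetical hostile-word list; purely for illustration.
FLAME_WORDS = {"idiot", "stupid", "hate", "ridiculous"}

def looks_like_flame(body):
    """Naive check: does the message contain hostile language?"""
    return bool(set(body.lower().split()) & FLAME_WORDS)

class OutboxNudge:
    """Hold back flame-like messages and surface them again for review
    after a cooling-off delay, rather than sending them immediately."""

    def __init__(self, delay=timedelta(minutes=10)):
        self.delay = delay
        self.held = []  # list of (release_time, message) pairs

    def send(self, body, now=None):
        now = now or datetime.now()
        if looks_like_flame(body):
            self.held.append((now + self.delay, body))
            return "held"  # withheld; the user will be asked again later
        return "sent"

    def due_for_review(self, now=None):
        """Messages whose cooling-off period has elapsed."""
        now = now or datetime.now()
        return [msg for release, msg in self.held if release <= now]
```

As with the form-field nudge, nothing is deleted or forbidden — the system simply inserts a delay between impulse and action.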

Acquisti’s main focus is the “economics of privacy” — combining psychology and classical economics to see how individuals act with regard to their personal privacy. The field carries some uncertainty, since cognitive biases, the mental factors specific to each individual, play different roles in decision-making. From 2003 to 2007, Acquisti focused on identifying these biases in how people make privacy decisions. “It was how we can explain [decision-makers’] biases, but it was also how we help people overcome their biases,” said Acquisti. “It was about pushing safe decision-making.”

From there, Acquisti described the merits of “soft paternalism,” in which the government provides individuals with suggestions rather than regulations they must follow. He emphasized the importance of letting users select their own privacy preferences.

Although the team favors individual preference in privacy settings, members acknowledge that certain governmental regulations, some suggested by Congress and the Federal Trade Commission, can be coordinated with the nudges in order to safeguard against the greatest number of potential privacy issues.

One of the dangers that Acquisti wishes to avoid is desensitizing the users to the warnings. If the “nudges” happen too often, the users will get annoyed and disregard them. When working on the project, Acquisti always asks himself, “When we design a new information system, how can we provide privacy to the user [without seeming insistent]?”
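One simple way to guard against the habituation Acquisti describes is to throttle how often warnings appear. This minimal sketch (my own illustration, not the team’s design) shows at most one nudge per cooldown period:

```python
class NudgeThrottle:
    """Show at most one warning per `cooldown` seconds, so users
    don't become desensitized and start ignoring nudges entirely."""

    def __init__(self, cooldown=300.0):
        self.cooldown = cooldown
        self.last_shown = None  # timestamp of the most recent nudge

    def should_show(self, now):
        """Return True only if enough time has passed since the last nudge."""
        if self.last_shown is None or now - self.last_shown >= self.cooldown:
            self.last_shown = now
            return True
        return False
```

A real system would likely be more adaptive — raising the threshold for low-risk disclosures and lowering it for high-risk ones — but even a fixed cooldown captures the basic trade-off between coverage and annoyance.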

Other groups have also done research on privacy settings online. A group at Stanford University is focused on using human voices to give users verbal warnings, while a group at Princeton University is working to redesign the web browser completely.

Next in the process is more research on privacy behaviors in a number of different environments. Work is also being done on Locaccino, a location-tracking service that allows users to share their location with their friends. In the meantime, the group awaits the approval of its National Science Foundation grant proposal. Google has funded the nudge research; according to Acquisti, “They’re keeping their eyes very open to research in academia.”