
Table of Contents - Volume 45, Issue 1 - February 2015

by Flo Appel last modified 2015-03-29 18:04

Access the entire issue or individual articles.

Current issue in the Digital Library

(If you do not have access to the SIGCAS newsletter in the DL, you can register as a guest to view the current issue or any of the individual articles.)

Table of contents

How to throw the race to the bottom: revisiting signals for ethical and legal research using online data
Erin Kenneally
Pages: 4-10
doi>10.1145/2738210.2738211
Full text: PDF

With research using data available online, researcher conduct is not fully prescribed or proscribed by formal ethical codes of conduct or law because of ill-fitting "expectations signals" -- indicators of legal and ethical risk. This article describes where these ordering forces break down in the context of online research and suggests how to identify and respond to these grey areas by applying common legal and ethical tenets that run across evolving models. It is intended to advance the collective dialogue, a work in progress, toward a path that revisits and harmonizes more appropriate ethical and legal signals for research using online data between and among researchers, oversight entities, policymakers and society.

Visual cryptography on postage stamps
Andreas O. Bender
Pages: 11-13
doi>10.1145/2738210.2738212
Full text: PDF

Some national post offices offer postage stamps showing an image chosen by the customer, but in at least one case, an image was refused for political reasons. Visual cryptography was used to get the post office to produce two stamps showing images closely related to the rejected image. They look like random graphic designs, but the original image appears when the two designs are held one on top of the other and lit from the back.
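The two-stamp trick described above follows the classic 2-out-of-2 visual cryptography scheme: each pixel of a binary image is split into subpixels across two shares so that each share alone looks random, but stacking them (a logical OR of dark subpixels) reveals the image. A minimal sketch in Python, with illustrative function names of my own choosing rather than anything from the article:

```python
import random

def make_shares(image):
    """Split a binary image (1 = black, 0 = white) into two random-looking shares.

    Each pixel expands to a pair of subpixels per share. A white pixel gets
    identical random pairs in both shares (half-dark when stacked); a black
    pixel gets complementary pairs (fully dark when stacked).
    """
    share1, share2 = [], []
    for row in image:
        r1, r2 = [], []
        for pixel in row:
            pattern = random.choice([(0, 1), (1, 0)])
            r1.extend(pattern)
            if pixel:  # black pixel: complementary subpixels
                r2.extend((1 - pattern[0], 1 - pattern[1]))
            else:      # white pixel: identical subpixels
                r2.extend(pattern)
        share1.append(r1)
        share2.append(r2)
    return share1, share2

def overlay(share1, share2):
    """Stack two printed shares: a subpixel is dark if dark in either share."""
    return [[a | b for a, b in zip(r1, r2)]
            for r1, r2 in zip(share1, share2)]
```

Because a single share is uniformly random regardless of the hidden pixel, neither stamp alone leaks the rejected image; only the physical overlay does.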

How do we decide how much to reveal?
Aylin Caliskan-Islam
Pages: 14-15
doi>10.1145/2738210.2738213
Full text: PDF

How do we decide how much to share online, given that information can spread to millions in large social networks? Is it always our own decision, or are we influenced by our friends? Let's isolate this problem to one variable: private information. How much private information are we sharing in our posts, and are we the only authority controlling how much private information to divulge in our text messages? Understanding how privacy behavior is formed could give us key insights for choosing our privacy settings, friend circles, and how much privacy to sacrifice in social networks. Before analyzing end users' privacy behavior, we had the intuition that privacy behavior might be influenced by network phenomena. Christakis and Fowler's network analytics studies showing that obesity spreads through social ties [2] and that smoking cessation is a collective behavior [3] prompted us to further investigate network properties of privacy behavior.

Usability versus privacy instead of usable privacy: Google's balancing act between usability and privacy
Paul Gerber, Melanie Volkamer, Karen Renaud
Pages: 16-21
doi>10.1145/2738210.2738214
Full text: PDF

A smartphone is an indispensable device that also holds a great deal of personal and private data. Contact details, party or holiday photos and emails --- all carried around in our pockets and easily lost. On Android, the most widely-used smartphone operating system, access to this data is regulated by permissions. Apps request these permissions at installation, and they ideally only ask for permission to access data they really need to carry out their functions. Users are expected to check, and grant, requested permissions before installing an app. Their privacy can potentially be violated if they fail to check the permissions carefully. In June 2014 Google changed the Android permission screen, perhaps attempting to improve its usability. Does this mean that all is well in the Android ecosystem, or was this update a retrograde move? This article discusses the new permission screen and its possible implications for smartphone owner privacy.

Predicting privacy and security attitudes
Serge Egelman, Eyal Peer
Pages: 22-28
doi>10.1145/2738210.2738215
Full text: PDF

While individual differences in decision-making have been examined within the social sciences for several decades, this research has only recently begun to be applied by computer scientists to examine privacy and security attitudes (and ultimately behaviors). Specifically, several researchers have shown how different online privacy decisions are correlated with the "Big Five" personality traits. However, in our own research, we show that the five factor model is actually a weak predictor of privacy preferences and behaviors, and that other well-studied individual differences in the psychology literature are much stronger predictors. We describe the results of several experiments that showed how decision-making style and risk-taking attitudes are strong predictors of privacy attitudes, as well as a new scale that we developed to measure security behavior intentions. Finally, we show that privacy and security attitudes are correlated, but orthogonal.

"Shadow security" as a tool for the learning organization
Iacovos Kirlappos, Simon Parkin, M. Angela Sasse
Pages: 29-37
doi>10.1145/2738210.2738216
Full text: PDF

Traditionally, organizations manage information security through policies and mechanisms that employees are expected to comply with. Non-compliance with security is regarded as undesirable, and often sanctions are threatened to deter it. But in a recent study, we identified a third category of employee security behavior: shadow security. This consists of workarounds employees devise to ensure primary business goals are achieved; they also devise their own security measures to counter the risks they understand. Whilst not compliant with official policy, and sometimes not as secure as employees think, shadow security practices reflect the working compromise staff find between security and "getting the job done". We add to this insight in this paper by discussing findings from a new interview study in a different organization. We identified additional shadow security practices, and show how they can be transformed into effective and productivity-enabling security solutions, within the framework of a learning organization.

Valuing data security and privacy using cyber insurance
Anand Shah, Shishir Dahake, Sri Hari Haran J.
Pages: 38-41
doi>10.1145/2738210.2738217
Full text: PDF

What should be the minimum value of data security or privacy to a customer? We reason that at a minimum this value should be equal to the premium charged by an insurer for cyber insurance that compensates the customer for the claims resulting from the data security and privacy breaches. We calculate the premium for cyber insurance and the percentage coverage availed by a customer using Monte Carlo simulations.
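The premise above — that the minimum value of data security to a customer equals the premium an insurer would charge — can be illustrated with a toy Monte Carlo estimate. This is a minimal sketch under assumptions of my own (breach probability, exponentially distributed losses, a flat coverage fraction, a loading factor); the article's actual model and parameters are not reproduced here:

```python
import random

def estimate_premium(n_trials=100_000,
                     breach_prob=0.05,     # assumed annual probability of a breach
                     mean_loss=200_000.0,  # assumed mean loss given a breach
                     coverage=0.8,         # fraction of loss the policy covers
                     loading=0.25,         # insurer's expense/profit loading
                     seed=42):
    """Monte Carlo estimate of an annual cyber-insurance premium.

    Each trial simulates one policy year: a breach occurs with probability
    `breach_prob`, and the resulting loss is drawn from an exponential
    distribution with mean `mean_loss`. The fair premium is the average
    insured payout; the quoted premium adds a loading factor on top.
    """
    rng = random.Random(seed)
    total_payout = 0.0
    for _ in range(n_trials):
        if rng.random() < breach_prob:
            loss = rng.expovariate(1.0 / mean_loss)
            total_payout += coverage * loss
    fair_premium = total_payout / n_trials
    return fair_premium * (1.0 + loading)
```

With these illustrative numbers the fair premium centers near breach_prob x mean_loss x coverage = 8,000 per year, and the loaded premium near 10,000 — the floor this abstract suggests for valuing the customer's data security.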

'Twas a week before x-mas
Ed Nichols
Pages: 42-42
doi>10.1145/2738210.2738218
Full text: PDF

