
MIS Distinguished Speaker Series

Temple University


privacy

Jan 27 – David Lanter – “Enabling Data Protection by Design with Data Provenance Metadata”

January 17, 2023 By Aleksi Aaltonen

Time: Friday, 27 January 2023, 10:30–12:00
Room: LW420

Abstract

Privacy by design is part of the larger “data protection by design” achieved through security architecture; it focuses on building privacy protection principles, controls, and privacy-enhancing technologies into the design of information management technologies and systems. The European Union’s General Data Protection Regulation (GDPR) makes protecting privacy and personal data a default requirement of system and service behavior, one that must be thought through and designed in. Although the concept of privacy and data protection by design found its way into the GDPR, the authors of the legislation acknowledged that “its concrete implementation remains unclear.” Data provenance information, however, offers a way to meet these requirements. This presentation provides an overview of pioneering research the author conducted combining information systems and their processing workflows with digital provenance metadata capture and processing, to augment scientific reproducibility, comparison, and trust, and to otherwise improve information system assisted decision support.
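To make the idea of provenance metadata capture concrete, here is a minimal, hypothetical sketch, not the speaker’s implementation: each step of a data-processing workflow is wrapped so that its inputs, parameters, outputs, and timestamps are recorded, producing a lineage trail that can support reproducibility and GDPR-style accountability. All names and structures below are illustrative assumptions.

```python
# Illustrative only: a toy provenance-capture decorator, not the speaker's system.
import functools
import hashlib
import json
from datetime import datetime, timezone

PROVENANCE_LOG = []  # in a real system this would be durable, queryable storage


def _fingerprint(obj):
    """Short content hash used to link a step's inputs to its outputs."""
    payload = json.dumps(obj, sort_keys=True, default=str).encode()
    return hashlib.sha256(payload).hexdigest()[:12]


def capture_provenance(step_name):
    """Decorator recording what was done, to what, with which parameters, and when."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(data, **params):
            record = {
                "step": step_name,
                "started_at": datetime.now(timezone.utc).isoformat(),
                "input_id": _fingerprint(data),
                "parameters": params,
            }
            result = func(data, **params)
            record["output_id"] = _fingerprint(result)
            PROVENANCE_LOG.append(record)
            return result
        return wrapper
    return decorator


@capture_provenance("pseudonymize")
def pseudonymize(records, fields=()):
    """Replace direct identifiers with hashed tokens (a data-protection safeguard)."""
    return [
        {k: (_fingerprint(v) if k in fields else v) for k, v in row.items()}
        for row in records
    ]


if __name__ == "__main__":
    raw = [{"name": "Alice", "zip": "19122"}, {"name": "Bob", "zip": "19104"}]
    safe = pseudonymize(raw, fields=("name",))
    print(json.dumps(PROVENANCE_LOG, indent=2))  # the captured provenance trail
```

The point of the sketch is only that provenance records accumulate automatically as workflow steps run, so how a derived dataset was produced can be reconstructed after the fact.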

Bio

David Lanter has been an Assistant Professor of Practice and Director of the Information Technology Auditing and Cyber Security (ITACS) programs at Temple University’s Fox School since 2016. Prior to coming to Temple University, he was Vice President of Information Management Systems at CDM Smith, Research Director at Rand McNally, Software Design Engineer at Microsoft, Assistant Professor of Geography and Research Fellow at the University of California – Santa Barbara, Systems Analyst at Grumman Data Systems, Software Engineer at Navigation Sciences, and President of Geographic Designs Inc. Dave earned his Bachelor of Arts degree with honors in Science, Technology and Society from Clark University, a Master of Arts degree in Geographic Information System design from SUNY-Buffalo, a Master of Science degree in Information Technology Auditing and Cyber Security from Temple University, and his Ph.D. in Geographic Information Processing from the University of South Carolina. He is a certified Geographic Information Systems Professional (GISP), Certified Information Systems Auditor (CISA), and Certified Information Systems Security Professional (CISSP). Dr. Lanter is a member of ISACA and ISACA’s Philadelphia Chapter, where he authors and presents continuing professional education webinars and workshops. He is also a member of the Urban and Regional Information Systems Association (URISA), where he serves on the faculty of URISA’s GIS Leadership Academy, instructs workshops on GIS data quality assurance and cybersecurity, and is a past chair of URISA’s Workshop Development Committee.

Tagged With: data provenance, Design, GDPR, privacy

Nov 20 – Ozgur Turetken to present “A Comprehensive Privacy Calculus Model – The Case of Smartphones”

December 18, 2020 By Sezgin Ayabakan

A Comprehensive Privacy Calculus Model – The Case of Smartphones

by

Ozgur Turetken

Professor
Associate Dean for Research
Ted Rogers School of Information Technology Management
Ryerson University

Friday, Nov 20

10 – 11 am | Zoom

(send an email to ayabakan@temple.edu to get the Zoom link)

Abstract:

Advances in data collection abilities, the rapid diffusion of smartphones, and recent large-scale data breaches are causing consumers’ privacy awareness and concerns to rise. The privacy-related literature contains several models, such as the Privacy Calculus, used to understand privacy behaviours and privacy concerns. The Privacy Calculus involves a rational decision-making process whereby an individual engages in a cost-benefit analysis of competing beliefs to decide whether to disclose their personal information. The current research extends the original Privacy Calculus. Our model reflects a novel cost-benefit analysis of competing beliefs on smartphone privacy concerns. We extend this model with variables representing influencers and perceived protection. Overall, six categories, composed of 14 variables, are considered in our comprehensive calculus model of privacy concerns: Benefits, Risks, External Influence, Internal Influence, External Protection, and Internal Protection. A survey instrument is distributed to 603 smartphone users to collect both quantitative and qualitative responses. Structural equation modeling (SEM) and a manual content analysis are employed to analyze the data. The quantitative results reveal that factors within the Benefit, Risk, Internal Protection, and Internal Influence categories have a significant impact on privacy concerns, while the External Protection and External Influence categories do not. From the qualitative results, 12 core factors and 22 sub-factors are determined to influence smartphone privacy concerns. Overall, the novel theoretical model extends the original privacy calculus for a more thorough and granular understanding of individuals’ conceptualization of privacy concerns and their subsequent intentions to disclose personal information. This research is timely, as organizations need to balance their need for customer information against rapidly increasing privacy concerns. The findings also have significant practical implications for other stakeholders such as smartphone developers, smartphone service providers, and government regulators.
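As a rough illustration of the calculus logic described above, the sketch below scores a hypothetical respondent: some category scores push privacy concern up, others pull it down, and disclosure intention falls as concern rises. The variable names, scales, signs, and weights are invented for illustration; the abstract reports only which categories are significant, not the authors’ estimates.

```python
# Toy illustration of an extended privacy calculus; all weights are hypothetical.
from dataclasses import dataclass


@dataclass
class CalculusInputs:
    """Category scores for one respondent, assumed on a 1-7 Likert scale."""
    benefits: float             # e.g., convenience, personalization
    risks: float                # e.g., breach exposure, secondary data use
    internal_influence: float   # e.g., prior privacy experiences
    external_influence: float   # e.g., media coverage (not significant in the study)
    internal_protection: float  # e.g., confidence using privacy settings
    external_protection: float  # e.g., faith in regulation (not significant in the study)


def privacy_concern(x: CalculusInputs) -> float:
    """Toy linear combination over the categories the study found significant;
    the signs and magnitudes here are assumptions, not reported results."""
    return (0.5 * x.risks
            + 0.3 * x.internal_influence
            - 0.4 * x.benefits
            - 0.2 * x.internal_protection)


def disclosure_intention(x: CalculusInputs) -> float:
    """Willingness to disclose personal information drops as concern grows."""
    return max(1.0, min(7.0, 4.0 - privacy_concern(x)))


if __name__ == "__main__":
    respondent = CalculusInputs(benefits=6, risks=4, internal_influence=3,
                                external_influence=5, internal_protection=5,
                                external_protection=2)
    print(f"concern score: {privacy_concern(respondent):+.2f}")
    print(f"disclosure intention (1-7): {disclosure_intention(respondent):.2f}")
```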

Tagged With: privacy, Privacy Calculus, Smartphones, Structured equation modeling

Nov 1 – Idris Adjerid to present “Consumer Consent and Firm Targeting after GDPR: The Case of a Large Telecom Provider”

October 25, 2019 By Sezgin Ayabakan

Consumer Consent and Firm Targeting after GDPR: The Case of a Large Telecom Provider

by

 

Idris Adjerid

Associate Professor
Pamplin College of Business
Virginia Tech

Friday, November 1

10:30 – 12:00 pm | Speakman 200

Abstract:

The General Data Protection Regulation (GDPR) represents a dramatic shift in global privacy regulation. In this manuscript, we focus on the impact of GDPR’s enhanced consumer consent requirements, which center on the transparent and active elicitation of data allowances. We evaluate the effect of enhanced consent on consumer opt-in behavior and on firm targeting after consent is solicited. Utilizing an experiment at a large telecommunications provider with operations in Europe, we find that opt-in for different data types and uses increased once GDPR-compliant consent was elicited. We also find that firm targeting, revenue, and lock-in increased after consumer consent was elicited. Our analysis suggests that these gains to the firm stem from its ability to utilize more targeted marketing campaigns once consumers provide additional data allowances. Our results have significant implications for firms and policymakers and provide insights relevant to the emerging debate on the balance between consumer privacy protection and firms’ collection and use of personal information.
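For readers who want a feel for how such an opt-in effect might be checked, the snippet below runs a two-proportion z-test on made-up counts of customers who consented under an old consent flow versus a GDPR-compliant one. The numbers are invented and the test choice is an assumption; this is not the analysis reported in the paper.

```python
# Hypothetical data and a simple test; not the paper's data or methodology.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

# Invented counts: customers who opted in to a given data use,
# under the old consent flow vs. the GDPR-compliant consent flow.
opted_in = np.array([1200, 1650])   # successes: [old flow, GDPR-compliant flow]
solicited = np.array([5000, 5000])  # customers asked in each condition

# H1: opt-in rate under the old flow is smaller than under the GDPR-compliant flow.
stat, p_value = proportions_ztest(opted_in, solicited, alternative="smaller")
print(f"opt-in rate: {opted_in[0] / solicited[0]:.1%} -> {opted_in[1] / solicited[1]:.1%}")
print(f"z = {stat:.2f}, p = {p_value:.4f}")  # a small p-value supports an increase
```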

Tagged With: consumer consent, experiment, General Data Protection Regulation, privacy

April 19 – Alessandro Acquisti to Present “The Sense of Privacy”

April 15, 2019 By Jing Gong

The Sense of Privacy

by

Alessandro Acquisti

Professor of Information Technology and Public Policy, PwC William W. Cooper Professor of Risk and Regulatory Innovation

Heinz College, Carnegie Mellon University

Friday, April 19, 2019

10:30 AM – noon

Speakman Hall Suite 200

 

Abstract

Many factors affect privacy behavior, in both conscious and unconscious ways. Some of those factors are sensorial cues: hearing, seeing, or smelling the presence of others. Human beings may be wired to react to those cues even when the cues carry no information about the actual trade-offs associated with privacy choices and thus should not, normatively, influence the privacy calculus. In four experiments (N=829), we examine the effect on privacy-relevant behavior (the disclosure of personal information) of sensorial cues signaling the presence of other humans, including cases where that presence does not materially affect the risks or benefits associated with personal disclosures. Four types of sensorial cues (proximity, visual, auditory, and olfactory), each signaling the presence of another person around the participant’s physical space, produce a consistent and significant inhibitory effect on the disclosure of personal, intimate information in an online survey. The findings suggest a visceral, and in part unconscious, influence of sensorial stimuli on privacy choices. We discuss the implications of these findings for privacy (and security) decision making in a digital age, where the physical cues human beings may have adapted to use for detecting threats may be absent or even manipulated by third parties.

Bio

Alessandro Acquisti is a Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University (CMU), and the PwC William W. Cooper Professor of Risk and Regulatory Innovation. He is the director of the Peex (Privacy Economics Experiments) lab at CMU, and the co-director of Carnegie Mellon’s CBDR (Center for Behavioral and Decision Research). Alessandro investigates the economics of privacy. His studies have spearheaded the investigation of privacy and disclosure behavior in online social networks, and the application of behavioral economics to the study of privacy and information security decision making. Alessandro has been the recipient of the PET Award for Outstanding Research in Privacy Enhancing Technologies, the IBM Best Academic Privacy Faculty Award, the IEEE Cybersecurity Award for Innovation, Heinz College School of Information’s Teaching Excellence Award, and numerous Best Paper awards. His studies have been published in journals, books, and proceedings across a variety of fields, including Science, Proceedings of the National Academy of Sciences, Management Science, Journal of Economic Literature, Marketing Science, Journal of Consumer Research, Journal of Personality and Social Psychology, and Journal of Experimental Psychology. Alessandro has testified before the U.S. Senate and House committees on issues related to privacy policy and consumer behavior, and has been frequently invited to consult on privacy policy issues by various government bodies, including the White House’s Office of Science and Technology Policy and the Council of Economic Advisers, the Federal Trade Commission, the National Telecommunications and Information Administration, and the European Commission. Alessandro’s findings have been featured in national and international media outlets, including the Economist, the New York Times, the Wall Street Journal, the Washington Post, the Financial Times, Wired.com, NPR, CNN, and 60 Minutes; his TED talks on privacy and human behavior have been viewed over 1.2 million times online. His 2009 study on the predictability of Social Security numbers was featured in the “Year in Ideas” issue of the NYT Magazine (the SSN assignment scheme was changed by the US Social Security Administration in 2011). Alessandro holds a PhD from UC Berkeley, and Master’s degrees from UC Berkeley, the London School of Economics, and Trinity College Dublin. He has held visiting positions at the Universities of Rome, Paris, and Freiburg (visiting professor); Harvard University (visiting scholar); the University of Chicago (visiting fellow); Microsoft Research (visiting researcher); and Google (visiting scientist). He has been a member of the National Academies’ Committee on public response to alerts and warnings using social media, is a member of the Board of Regents of the National Library of Medicine (NLM), and is a Carnegie Fellow (inaugural class).

Tagged With: Alessandro Acquisti, Carnegie Mellon, privacy
