PRIVED
Privacy and Digital Economics. An experimental challenge
20th-21st May 2019
Objective
To further research on solutions that increase the security and anonymity of users of internet services.
This workshop is part of the development of PRIVED, a project funded by the BBVA Foundation.
Audience
International researchers interested in the effect of privacy on economic decision-making processes.
Location
Department of Economic Analysis (third floor), Room 3P16, Faculty of Economics
Valencia, Spain
Programme
MONDAY 20th MAY
09.30 Registration and Workshop Opening
Session 1 Chair: Adriana Alventosa
10.00 Zvika Neeman (Tel Aviv University)
10.45 Daniel Bird (Tel Aviv University)
11.30 Coffee break
12.00 Amparo Urbano (University of Valencia and ERI-CES)
12.45 Discussion
13.30 Lunch Break
Session 2 Chair: Penélope Hernández
15.00 Ran Eilat (Ben-Gurion University of the Negev)
15.45 Coffee break
16.15 Mariola Sánchez (University of Valencia and ERI-CES)
17.00 Rann Smorodinsky (Technion - Israel Institute of Technology)
TUESDAY 21st MAY
Session 3 Chair: José Manuel Pavía
10.00 Galit Ashkenazi (Tel Aviv University)
10.45 Matthew Ellman (Institut d’Anàlisi Econòmica - CSIC)
11.30 Coffee break
12.00 Antonio Morales (University of Málaga)
12.45 Discussion and Workshop Closure
13.30 Lunch Break
Speakers
Galit Ashkenazi
Title: Optimal use of information in Bayesian games with incomplete information on one side.
Abstract: TBA
Daniel Bird
Title: Preference Based Privacy (joint with Zvika Neeman)
Abstract: TBA
Ran Eilat
Title: Optimal Privacy-Constrained Mechanisms (joint with Kfir Eliaz and Xiaosheng Mu)
Abstract:
Modern information technologies make it possible to store, analyze and trade unprecedented amounts of detailed information about individuals. This has led to public discussions on whether individuals' privacy should be better protected by restricting the amount or the precision of information that commercial institutions collect about their participants. We contribute to this discussion by proposing a Bayesian approach to measure loss of privacy and applying it to the design of optimal mechanisms. Specifically, we define the loss of privacy associated with a mechanism as the difference between the designer's prior and posterior beliefs about an agent's type, where this difference is calculated using Kullback-Leibler divergence, and where the change in beliefs is triggered by actions taken by the agent in the mechanism. We consider both ex-ante (the expected difference in beliefs over all type realizations cannot exceed some threshold k) and ex-post (for every realized type, the maximal difference in beliefs cannot exceed some threshold k) measures of privacy loss. Using these notions we study the properties of optimal privacy-constrained mechanisms and the relation between welfare/profits and privacy levels.
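The KL-based privacy measure described in the abstract can be illustrated with a small numerical sketch. The two-type setup, the direction of the divergence (posterior relative to prior), and all probabilities below are illustrative assumptions, not values from the paper:

```python
from math import log

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Designer's prior over two agent types, and the posterior belief after
# observing the action each type takes in the mechanism (illustrative
# numbers, not taken from the paper).
prior = [0.5, 0.5]
posterior_by_type = {
    "low":  [0.8, 0.2],   # belief after the low type's action
    "high": [0.3, 0.7],   # belief after the high type's action
}

# Ex-post measure: for every realized type, the belief shift must stay below k.
ex_post = {t: kl_divergence(post, prior) for t, post in posterior_by_type.items()}

# Ex-ante measure: the expected belief shift over type realizations must stay below k.
ex_ante = prior[0] * ex_post["low"] + prior[1] * ex_post["high"]

k = 0.5
print(ex_post["low"], ex_post["high"], ex_ante, ex_ante <= k)
```

Under these illustrative numbers the mechanism satisfies the ex-ante constraint; the ex-post constraint binds on whichever type's action moves the designer's belief the most.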
Matthew Ellman
Title: TBA
Abstract: TBA
Antonio Morales
Title: Privacy in a digital world: An online experiment (joint with Penélope Hernández, Zvika Neeman and José Manuel Pavía)
Abstract:
We design an online experiment to measure the value of privacy for online users on a shopping website where subjects have a shopping experience lasting 30 periods. Subjects' private information is their type, either “mostly A shopper” or “mostly B shopper”, and their willingness to pay for each product (a or b). The website has an algorithm that learns the subject's type from their shopping behaviour, and can use that information to decide which product to offer and at which price. On the theory side, we characterize the optimal shopping strategy that minimizes the chances that the algorithm learns the shopper's type (the optimal concealing strategy) and the optimal shopping strategy that maximizes the shopper's welfare (the welfare-maximising strategy). On the experimental side, we run several online experiments under three main scenarios: one where the payoffs from the optimal concealing strategy are roughly the same as the payoffs from the welfare-maximising strategy, and two opposite cases where one strategy pays more than the other. When they finish their online shopping experience, in a surprise restart, shoppers are offered the opportunity to use an app that implements the optimal concealing strategy, and we investigate the shoppers' willingness to accept the app.
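An algorithm of the kind the abstract describes, one that learns a shopper's type from purchase behaviour, can be sketched as a sequence of Bayesian updates. The purchase probabilities, the prior, and the purchase history below are illustrative assumptions, not the experiment's actual design:

```python
# Assumed conditional purchase probabilities (illustrative, not from the paper):
# type-A shoppers buy product a with probability 0.7, type-B shoppers with 0.3.
P_A_BUYS_A = 0.7   # assumed P(buys a | mostly A shopper)
P_B_BUYS_A = 0.3   # assumed P(buys a | mostly B shopper)

def update_belief(belief_a, bought_a):
    """One Bayesian update of P(type = mostly A) after observing one purchase."""
    like_a = P_A_BUYS_A if bought_a else 1 - P_A_BUYS_A
    like_b = P_B_BUYS_A if bought_a else 1 - P_B_BUYS_A
    num = like_a * belief_a
    return num / (num + like_b * (1 - belief_a))

belief = 0.5                          # uninformative prior over the two types
history = [True, True, False, True]   # observed purchases of product a
for bought_a in history:
    belief = update_belief(belief, bought_a)
print(round(belief, 3))
```

In this framing, a concealing strategy is one whose purchase sequence keeps the website's belief close to the 0.5 prior, at the cost of sometimes buying the less-preferred product.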
Zvika Neeman
Title: How Bayesian Persuasion can Help Reduce Illegal Parking and Other Socially Undesirable Behavior (joint with Penélope Hernández)
Abstract: TBA
Mariola Sánchez
Title: Security in digital markets (joint with Amparo Urbano)
Abstract:
This paper contributes to the literature on security in digital markets. We analyze a two-period monopoly market in which consumers have privacy concerns. We make three assumptions about privacy: first, that it evolves over time; second, that it has a value that is unknown by all market participants in the first period; and third, that it may affect market participants' willingness to pay for products. The monopolist receives a noise signal about consumers' average privacy. This signal allows the monopolist to adjust the price in the second period and engage in price discrimination. The monopolist's price in period two acts as a signal to consumers about privacy. This signal, together with consumers' purchase experiences from the first period, determines demand. We address two scenarios: direct investment in security to improve consumers' experiences and investment in market signal precision.
Rann Smorodinsky
Title: Privacy, Patience, and Protection (joint with Ronen Gradwohl)
Abstract:
Should players with privacy concerns advocate for the implementation of privacy protection in the form of either regulation or technology? We analyze repeated games in which players have private information about their levels of patience and in which they would like to maintain the privacy of this information vis-a-vis third parties. We show that privacy protection in the form of shielding players' actions from outside observers is harmful, as it limits and sometimes eliminates the possibility of attaining Pareto-optimal payoffs.
Amparo Urbano
Title: Demand for privacy, selling consumer information, and consumer hiding vs. opt-out (joint with S. P. Anderson, N. Larson and M. Sánchez)
Abstract:
We consider consumers choosing whether to buy a good, when they know that information about them can be sold to another firm selling another good they might also buy. This causes some consumers to hide their types by not buying the first good, which generates an endogenous demand for privacy and renders the demand for the second good more inelastic. But it can also give the firm in the first market a greater incentive to harvest consumers to sell to the second firm; therefore, the upstream price can fall while the downstream price rises. We determine whether information selling improves upstream profits, consumer surplus, and total welfare, and we examine the consequences of allowing consumers to opt out of having their information sold by the upstream firm.
If you have any questions, please contact us: