TY - JOUR
T1 - Locally private Gaussian estimation
AU - Joseph, Matthew
AU - Kulkarni, Janardhan
AU - Mao, Jieming
AU - Wu, Zhiwei Steven
PY - 2019
Y1 - 2019
N2 - We study a basic private estimation problem: each of n users draws a single i.i.d. sample from an unknown Gaussian distribution N(µ, σ²), and the goal is to estimate µ while guaranteeing local differential privacy for each user. As minimizing the number of rounds of interaction is important in the local setting, we provide adaptive two-round solutions and nonadaptive one-round solutions to this problem. We match these upper bounds with an information-theoretic lower bound showing that our accuracy guarantees are tight up to logarithmic factors for all sequentially interactive locally private protocols.
AB - We study a basic private estimation problem: each of n users draws a single i.i.d. sample from an unknown Gaussian distribution N(µ, σ²), and the goal is to estimate µ while guaranteeing local differential privacy for each user. As minimizing the number of rounds of interaction is important in the local setting, we provide adaptive two-round solutions and nonadaptive one-round solutions to this problem. We match these upper bounds with an information-theoretic lower bound showing that our accuracy guarantees are tight up to logarithmic factors for all sequentially interactive locally private protocols.
UR - http://www.scopus.com/inward/record.url?scp=85090176189&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85090176189&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85090176189
SN - 1049-5258
VL - 32
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Y2 - 8 December 2019 through 14 December 2019
ER -