Make Some ROOM for the Zeros: Data Sparsity in Secure Distributed Machine Learning

Authors: Schoppmann, P., Gascón, A., Raykova, M., and Pinkas, B.
Published in: CCS '19: Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, 1335–1350
Year: 2019
Type: Academic article
DOI: 10.1145/3319535.3339816

Exploiting data sparsity is crucial for the scalability of many data analysis tasks. However, while there is increasing interest in efficient secure computation protocols for distributed machine learning, data sparsity has so far not been considered in a principled way in that setting. We propose sparse data structures together with their corresponding secure computation protocols to address common data analysis tasks while utilizing data sparsity. In particular, we define a Read-Only Oblivious Map primitive (ROOM) for accessing elements in sparse structures, and present several instantiations of this primitive with different trade-offs. Then, using ROOM as a building block, we propose protocols for basic linear algebra operations such as Gather, Scatter, and multiple variants of sparse matrix multiplication. Our protocols are easily composable by using secret sharing. We leverage this, at the highest level of abstraction, to build secure protocols for non-parametric models (k-nearest neighbors and naive Bayes classification) and parametric models (logistic regression) that enable secure analysis on high-dimensional datasets. The experimental evaluation of our protocol implementations demonstrates a manyfold improvement in efficiency over state-of-the-art techniques across all applications. Our system is designed and built mirroring the modular architecture of scientific computing and machine learning frameworks, and is inspired by the Sparse BLAS standard.
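To make the ROOM abstraction concrete, the following is a minimal plaintext sketch of its ideal functionality, not one of the paper's secure instantiations: one party holds a sparse key-value structure, the other holds a list of query keys, and the output is an additive secret sharing of the looked-up values, with a fixed default for keys absent from the database. The names `room_functionality`, `server_db`, and `client_queries`, and the choice of the ring Z_{2^32}, are illustrative assumptions rather than the paper's API.

```python
# Hypothetical plaintext sketch of the ROOM *functionality* (no cryptography):
# the real protocols in the paper realize this securely, so that neither
# party learns the other's input beyond the secret-shared output.
import secrets

MOD = 2**32  # ring for additive secret sharing, Z_{2^32} (assumed for the sketch)

def room_functionality(database, queries, default=0):
    """Ideal ROOM: returns one share vector per party such that
    shares_a[i] + shares_b[i] == database.get(queries[i], default) (mod MOD)."""
    values = [database.get(k, default) % MOD for k in queries]
    shares_a = [secrets.randbelow(MOD) for _ in values]           # random masks
    shares_b = [(v - a) % MOD for v, a in zip(values, shares_a)]  # complements
    return shares_a, shares_b

# Example: a sparse vector held by one party, queried at the other
# party's (private) index set; index 4 is absent, so the default 0 is shared.
server_db = {3: 7, 10: 42, 17: 5}   # sparse vector as index -> nonzero value
client_queries = [10, 4, 17]

a, b = room_functionality(server_db, client_queries)
assert [(x + y) % MOD for x, y in zip(a, b)] == [42, 0, 5]
```

In this reading, Gather corresponds to exactly such a shared lookup over a set of indices, while Scatter and the sparse matrix multiplication variants in the paper compose lookups of this kind with standard secret-shared arithmetic.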



Connected HIIG researchers

Phillipp Schoppmann

Former Associated Researcher: Data, actors, infrastructures

  • Open Access
