The growing power of digital platforms raises the question of democratic control, or at least containment. In light of the transformative impact of platforms on markets, the public sphere, elections, and employment conditions, governments and civil society alike are demanding more transparency and accountability. Shedding light on the principles and practices of algorithmic ordering promises to limit the power of platforms by subjecting their hidden operations to regulatory inspection. This article questions the popular image of an openable ‘black box’. Based on a critical reflection on transparency as a panacea for curtailing platform power, we propose the concept of observability to deal more systematically with the problem of studying complex algorithmic systems. We set out three broad principles as regulatory guidelines for making platforms more accountable. These principles concern the normative and analytical scope, the empirical and temporal dimension, and the capacities required for learning and knowledge generation.