How can we obtain scientific benefits from statistical analysis of sensitive data without compromising the privacy of the individuals who contribute their data? The past decade has seen rapid progress on this question due to the emergence of a mathematically rigorous privacy framework known as differential privacy. Informally, differential privacy provides a robust individual privacy guarantee, ensuring that no adversary, regardless of their capabilities, can learn much more about an individual user than they could have learned had that user's data never been collected.
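To make the informal guarantee concrete, a minimal sketch of one standard differentially private mechanism may help: the Laplace mechanism applied to a counting query. This is an illustrative example, not a mechanism specified in the text above; the function names (`laplace_sample`, `dp_count`) and the parameter choices are assumptions made here for exposition.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling of a uniform draw."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Release a count satisfying epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one
    person's record changes the true count by at most 1. The
    Laplace mechanism therefore adds noise with scale 1/epsilon.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_sample(1.0 / epsilon)

# Example: a noisy count of individuals over age 40.
ages = [25, 47, 31, 62, 55, 19]
noisy = dp_count(ages, lambda a: a > 40, epsilon=0.5)
```

The released value is the true count plus calibrated random noise, so any single individual's presence or absence changes the output distribution only slightly, which is the formal counterpart of the guarantee described informally above.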