Computer scientists have recently begun designing systems that appear, at least at first glance, to be surprisingly, wastefully inefficient. A stock exchange forces all electronic trades to travel through a thirty-eight-mile length of fiber-optic cable coiled up in a box; the Bitcoin protocol compels participants to solve difficult yet useless math problems with their computers; and the iPhone locks users out for many painful seconds after a mistyped password, a delay that increases with each subsequent mistake. We draw these examples and others together into a common, emerging, and underappreciated approach to digital system design, which we name "desirable inefficiency." Designers have turned to desirable inefficiency when the efficient alternative fails to provide or protect some essential human value, such as fairness or trust. Desirable inefficiency is an example of a design pattern that engineers have organically and voluntarily adopted to make space for human values. Regulators should study these emergent engineering responses and actively impose design patterns like desirable inefficiency to protect values important to society.
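The Bitcoin puzzle mentioned above can be made concrete with a minimal proof-of-work sketch. This is an illustration, not the actual Bitcoin protocol (which double-hashes block headers against a far larger difficulty target); the function name and parameters here are invented for exposition. The deliberate inefficiency is the point: finding the answer requires brute-force computation, while checking it requires a single hash.

```python
import hashlib

def proof_of_work(data: bytes, difficulty: int) -> int:
    """Search for a nonce such that SHA-256(data + nonce) begins with
    `difficulty` zero hex digits. The work is intentionally wasteful:
    the result is costly to produce but nearly free to verify."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# A small difficulty keeps this demo fast; real systems tune difficulty
# so the search takes minutes of network-wide effort.
nonce = proof_of_work(b"example block", 4)

# Verification is a single hash: recompute and check the prefix.
check = hashlib.sha256(b"example block" + str(nonce).encode()).hexdigest()
assert check.startswith("0000")
```

The asymmetry between producing and verifying the answer is what lets mutually distrustful participants agree on a shared ledger without a central authority, which is the "trust" value the abstract says efficiency alone could not provide.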