Many systems in nature can be described using discrete input–output maps. Such maps describe any system in which an input space or parameter space is mapped into an output space. Without knowing details about a map, there may seem to be no a priori reason to expect that a randomly chosen input would be more likely to generate one output rather than another. Here, by extending fundamental results from algorithmic information theory, we show instead that for many real-world input–output maps, the a priori probability P(x) that randomly sampled inputs generate a particular output x decays exponentially with the approximate Kolmogorov complexity K(x) of that output. These input–output maps are biased towards simplicity. We derive an upper bound P(x) ≲ 2^(−aK(x)−b), which is tight for most inputs. The constants a and b, as well as many properties of P(x), can be predicted with minimal knowledge of the map. We explore this strong bias towards simple outputs in a wide range of systems, ranging from the biophysical problem of RNA folding into secondary structures, to systems of coupled ordinary differential equations, to a stochastic financial trading model.
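As an illustrative sketch (not part of the paper itself), the claimed bias can be observed in a toy input–output map. The choices below are assumptions for illustration only: the map is an elementary cellular automaton (the 8-bit rule number is the input, the row after a fixed number of steps is the output), and zlib-compressed length stands in as a crude, imperfect proxy for the Kolmogorov complexity K(x).

```python
import zlib
from collections import Counter

def ca_step(state, rule):
    # Apply an elementary cellular automaton rule to one row of bits
    # (periodic boundary; Wolfram numbering: neighbourhood value = 4l + 2c + r).
    n = len(state)
    return tuple(
        (rule >> (state[(i - 1) % n] * 4 + state[i] * 2 + state[(i + 1) % n])) & 1
        for i in range(n)
    )

def run_map(rule, width=40, steps=16):
    # Input: an 8-bit rule number; output: the row reached from a single
    # seeded cell after `steps` updates, as a bit string.
    state = tuple(1 if i == width // 2 else 0 for i in range(width))
    for _ in range(steps):
        state = ca_step(state, rule)
    return ''.join(map(str, state))

def approx_K(x):
    # Compressed length as a rough stand-in for Kolmogorov complexity.
    return len(zlib.compress(x.encode()))

# Sample the entire input space (all 256 rules) and tally output frequencies,
# i.e. estimate P(x) exactly for this tiny map.
freq = Counter(run_map(r) for r in range(256))

# Bias towards simplicity: the most probable output should sit at the
# low-complexity end of the outputs actually produced.
most_common, _ = freq.most_common(1)[0]
ks = sorted(approx_K(x) for x in freq)
assert approx_K(most_common) <= ks[len(ks) // 2]
```

Many rules collapse the seeded cell into the same simple pattern (e.g. the all-zero row), so high-probability outputs cluster at low compressed length, while complex outputs are each produced by only a few rules, in qualitative agreement with the exponential bound.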