This is from Gregory on HipChat:
The colormap can/should be thought of (and perhaps coded) as orthogonal to "stretch" issues like lin/log/asinh.
If you are representing a *single* dimension as a function of position in a plane (e.g., a CCD image in a single filter band, as opposed to something like a 3-band color image, where you are representing a tuple of information at each position), you can think of the process as decomposing into two steps. First, map the "z axis" value (e.g., flux) from its native domain to the range [0, 1]; that's the stretch. Then map [0, 1] to a one-dimensional path in the N-dimensional colorspace of the display (N = 3 for RGB computer displays); that's the colormap.
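That decomposition can be sketched in a few lines. This is a toy illustration, not matplotlib's or any library's actual implementation: the asinh softening parameter and the two endpoint colors of the straight-line RGB path are made-up values for demonstration.

```python
import numpy as np

def asinh_stretch(z, z_min, z_max, softening=1.0):
    # Stretch: map native-domain values (e.g., flux) to [0, 1].
    # The softening scale is an illustrative free parameter.
    t = np.arcsinh((z - z_min) / softening) / np.arcsinh((z_max - z_min) / softening)
    return np.clip(t, 0.0, 1.0)

def toy_colormap(t):
    # Colormap: map [0, 1] along a 1-D path in RGB space.
    # Here the path is just a straight line between two arbitrary colors.
    start = np.array([0.0, 0.0, 0.3])   # dark blue
    end = np.array([1.0, 1.0, 0.6])     # pale yellow
    return start + np.asarray(t)[..., None] * (end - start)

# A fake 2x2 "CCD image" of fluxes spanning several decades:
flux = np.array([[0.0, 10.0], [100.0, 1000.0]])
t = asinh_stretch(flux, flux.min(), flux.max())  # stretch step
rgb = toy_colormap(t)                            # colormap step, shape (2, 2, 3)
```

The point is that the two functions are independent: you can swap the asinh stretch for log or linear without touching the colormap, and vice versa.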
The issue that matplotlib's "viridis" and other similar contemporary colormaps are trying to address is to make the "perceptual distance" for each similar-sized step along [0, 1] the same. Naive colormaps that correspond, for instance, to mapping [0, 1] linearly to perceived colors along the wavelength axis from red to violet (i.e., "across the rainbow") are (very) far from having that property. This is important both for making visualizations intelligible to people with limited color vision and for exposing the information content of the display in a uniform way that assists people in reasoning about what they are seeing.
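A rough way to see the rainbow's non-uniformity numerically, using only the standard library: compute CIE L* lightness (one crude proxy for perceptual distance; real colormap design, as for viridis, works in a full perceptual space like CAM02-UCS, which this sketch does not implement) along a hue sweep versus along a simple gray ramp.

```python
import colorsys

def srgb_to_lightness(r, g, b):
    # CIE L* of an sRGB color: linearize, take luminance Y, then L*.
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    y = 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b)
    return 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y

n = 10
# "Rainbow" map: sweep hue from red toward violet at full saturation/value.
rainbow = [colorsys.hsv_to_rgb(0.75 * i / (n - 1), 1.0, 1.0) for i in range(n)]
# Gray ramp: equal steps in sRGB gray value, a stand-in for a lightness-ordered map.
gray = [(i / (n - 1),) * 3 for i in range(n)]

rainbow_L = [srgb_to_lightness(*c) for c in rainbow]
gray_L = [srgb_to_lightness(*c) for c in gray]

def step_spread(L):
    # Spread between the largest and smallest lightness step along the path:
    # zero would mean equal-lightness steps for equal steps in [0, 1].
    steps = [abs(b - a) for a, b in zip(L, L[1:])]
    return max(steps) - min(steps)
```

Printing the two lists shows the rainbow's lightness swinging up and down (bright at yellow, dark at blue) while the gray ramp climbs steadily, which is exactly the kind of defect viridis was designed to avoid.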
Much more is understood about this now than, say, 20 years ago.