LinearSegmentedNorm

class LinearSegmentedNorm(levels, vmin=None, vmax=None, clip=False)

Bases: matplotlib.colors.Normalize

Normalizer that scales data linearly with respect to average position in an arbitrary monotonically increasing list of levels. This is the same algorithm used by LinearSegmentedColormap to select colors between indices in the segment data tables. It is the default normalizer paired with DiscreteNorm whenever levels are non-linearly spaced, and it can be used explicitly by passing norm='segmented' to any command that accepts cmap.
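The idea can be sketched with a small NumPy helper (a hypothetical function illustrating the documented behavior, not ProPlot's actual implementation): each level boundary maps to an evenly spaced position in [0, 1], and values between two boundaries are interpolated linearly.

```python
import numpy as np

def segmented_norm(values, levels):
    # Hypothetical sketch of segmented normalization: map the N level
    # boundaries onto N evenly spaced positions in [0, 1], then
    # interpolate each value linearly between its bracketing boundaries.
    levels = np.asarray(levels, dtype=float)
    positions = np.linspace(0, 1, len(levels))
    return np.interp(values, levels, positions)

# With levels [1, 10, 100], the value 10 is the middle boundary, so it
# maps to 0.5 even though it lies only 9% of the way from 1 to 100.
segmented_norm([1, 10, 100], [1, 10, 100])  # -> [0.0, 0.5, 1.0]
```

This is why unevenly spaced levels still receive evenly spaced colors: position in the level list, not absolute data value, drives the mapping.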

Parameters
  • levels (list of float) – The level boundaries.

  • vmin, vmax (None) – Ignored. vmin and vmax are set automatically to the minimum and maximum of levels.

  • clip (bool, optional) – Whether to clip values falling outside the minimum and maximum levels.

Example

In the example below, unevenly spaced levels are passed to contourf, and LinearSegmentedNorm is applied automatically.

>>> import proplot as plot
>>> import numpy as np
>>> levels = [1, 2, 5, 10, 20, 50, 100, 200, 500, 1000]
>>> data = 10 ** (3 * np.random.rand(10, 10))
>>> fig, ax = plot.subplots()
>>> ax.contourf(data, levels=levels)

Methods Summary

__call__(value[, clip])

Normalize the data values to 0-1.

inverse(value)

Inverse operation of __call__.
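The relationship between the two methods can be illustrated with a self-contained sketch (hypothetical helpers mimicking the documented behavior, not the library's code): __call__ interpolates from data coordinates to [0, 1], and inverse interpolates back, so a round trip recovers the original values.

```python
import numpy as np

# Unevenly spaced level boundaries and their even positions in [0, 1].
levels = np.array([1.0, 2.0, 5.0, 10.0])
positions = np.linspace(0, 1, len(levels))

def call(value):
    # Mimics __call__: normalize data values to 0-1.
    return np.interp(value, levels, positions)

def inverse(value):
    # Mimics inverse: map normalized 0-1 values back to data coordinates.
    return np.interp(value, positions, levels)

x = np.array([1.5, 4.0, 8.0])
round_trip = inverse(call(x))  # recovers the original in-range values
```

For values inside [vmin, vmax] the round trip is exact, since both directions use the same piecewise-linear segments.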