nibabel.volumeutils.scale_min_max

nibabel.volumeutils.scale_min_max(mn, mx, out_type, allow_intercept)

Return scaling and intercept to fit data into the range of an output type

Returns a scalefactor and intercept that best fit data with minimum mn and maximum mx into the range [type_min, type_max] of the output data type.

The calculated scaling is therefore:

scaled_data = (data-intercept) / scalefactor
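
As a worked illustration (a sketch of the relationship above, not the library's implementation), the case from the examples below with mn=0, mx=127 and an int8 output can be solved by pinning the data extremes onto the ends of the output type's range:

import numpy as np

# Illustrative only: map the data range [mn, mx] onto the full output range
# [type_min, type_max] using scaled_data = (data - intercept) / scalefactor.
mn, mx = 0.0, 127.0
info = np.iinfo(np.int8)
type_min, type_max = float(info.min), float(info.max)   # -128.0, 127.0

# Solve (mn - intercept) / scalefactor = type_min and
#       (mx - intercept) / scalefactor = type_max for the two unknowns.
scalefactor = (mx - mn) / (type_max - type_min)          # ~0.498
intercept = mn - type_min * scalefactor                  # ~63.75

assert np.allclose((mn - intercept) / scalefactor, type_min)
assert np.allclose((mx - intercept) / scalefactor, type_max)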
Parameters :

mn : scalar

data minimum value

mx : scalar

data maximum value

out_type : numpy type

numpy type of output

allow_intercept : bool

If True, allow calculation of a non-zero intercept. Otherwise, the returned intercept is always 0.0.

Returns :

scalefactor : numpy scalar, dtype=np.maximum_sctype(np.float)

scalefactor by which to divide data after subtracting intercept

intercept : numpy scalar, dtype=np.maximum_sctype(np.float)

value to subtract from data before dividing by scalefactor

Examples

>>> scale_min_max(0, 255, np.uint8, False)
(1.0, 0.0)
>>> scale_min_max(-128, 127, np.int8, False)
(1.0, 0.0)
>>> scale_min_max(0, 127, np.int8, False)
(1.0, 0.0)
>>> scaling, intercept = scale_min_max(0, 127, np.int8, True)
>>> np.allclose((0 - intercept) / scaling, -128)
True
>>> np.allclose((127 - intercept) / scaling, 127)
True
>>> scaling, intercept = scale_min_max(-10, -1, np.int8, True)
>>> np.allclose((-10 - intercept) / scaling, -128)
True
>>> np.allclose((-1 - intercept) / scaling, 127)
True
>>> scaling, intercept = scale_min_max(1, 10, np.int8, True)
>>> np.allclose((1 - intercept) / scaling, -128)
True
>>> np.allclose((10 - intercept) / scaling, 127)
True
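
A typical use of the returned values is to pack floating point data into the output integer type and later undo the scaling. The inverse step below (data ~ scaled * scalefactor + intercept) is an assumed usage sketch, not part of this function:

import numpy as np
from nibabel.volumeutils import scale_min_max

data = np.linspace(0, 127, 11)          # floating point data in [0, 127]
scalefactor, intercept = scale_min_max(data.min(), data.max(), np.int8, True)

# Pack into int8 using the relationship scaled = (data - intercept) / scalefactor
scaled = np.round((data - intercept) / scalefactor).astype(np.int8)

# Assumed inverse; the int8 rounding limits the recovery error to about
# half a scalefactor step.
recovered = scaled * scalefactor + intercept
assert np.allclose(recovered, data, atol=abs(scalefactor))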

Notes

Large integer types give Python long integers as the max / min values for the type. To contain the rounding error, we need to use the maximum numpy float type when casting these values to float.
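
For example (an illustrative sketch of the precision issue, not code from nibabel), casting the int64 maximum to float64 rounds it, while the widest available float keeps more precision on most platforms:

import numpy as np

big = np.iinfo(np.int64).max            # 9223372036854775807, a Python int

# float64 cannot represent this integer exactly, so the cast rounds it.
print(int(np.float64(big)) - big)       # nonzero rounding error (1)

# The widest float type (longdouble, which np.maximum_sctype(float) selects
# on older numpy) retains more precision where the platform supports it.
print(int(np.longdouble(big)) - big)    # 0 with an 80-bit extended longdouble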