image processing – Downsample array in Python

scikit-image has implemented a working version of downsampling here, although they shy away from calling it downsampling because it is not downsampling in the DSP sense, if I understand correctly. But it works very well, and it is the only downsampler I found in Python that can handle np.nan in the image. I have downsampled gigantic images with it very quickly.
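A minimal sketch of the NaN-tolerant use case, assuming scikit-image's `skimage.measure.block_reduce` is the implementation being referred to (the function names and block size here are illustrative, not from the original answer):

```python
import numpy as np
from skimage.measure import block_reduce

img = np.random.rand(100, 200)
img[10:12, 30:32] = np.nan  # simulate a small patch of missing data

# func=np.nanmean averages each 5x5 block while ignoring the NaNs
# inside it (a block that is entirely NaN would still come out NaN)
small = block_reduce(img, block_size=(5, 5), func=np.nanmean)
print(small.shape)  # (20, 40)
```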

When downsampling, interpolation is the wrong thing to do. Always use an aggregated approach.
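A small illustration of why sampling-based approaches can mislead where aggregation does not (a toy checkerboard example, not from the original answer):

```python
import numpy as np

# An 8x8 checkerboard: the finest pattern this grid can represent.
checker = np.indices((8, 8)).sum(axis=0) % 2

# Naive decimation keeps every other sample: the alternating pattern
# aliases to a constant 0, misrepresenting the data entirely.
decimated = checker[::2, ::2]

# Aggregating each 2x2 block instead yields 0.5 everywhere --
# the true average intensity of the original.
aggregated = checker.reshape(4, 2, 4, 2).mean(axis=(1, 3))
```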

I use block means to do this, using a factor to reduce the resolution.

import numpy as np
from scipy import ndimage

def block_mean(ar, fact):
    # fact should evenly divide both dimensions of ar
    assert isinstance(fact, int), type(fact)
    sx, sy = ar.shape
    X, Y = np.ogrid[0:sx, 0:sy]
    # assign a unique integer label to each fact x fact block
    # (note the // floor division: plain / breaks under Python 3)
    regions = sy // fact * (X // fact) + Y // fact
    # average the pixels within each labelled block
    res = ndimage.mean(ar, labels=regions, index=np.arange(regions.max() + 1))
    res.shape = (sx // fact, sy // fact)
    return res

For example, a (100, 200) array downsampled with a factor of 5 (5×5 blocks) yields a (20, 40) result:

ar = np.random.rand(20000).reshape((100, 200))
block_mean(ar, 5).shape  # (20, 40)

imresize and ndimage.interpolation.zoom look like they do what you want.

I haven't tried imresize before, but here is how I have used ndimage.interpolation.zoom:

import numpy as np
from scipy import ndimage

a = np.arange(64).reshape((8, 8))  # np.array(64) would be a 0-d scalar and cannot be reshaped
a = ndimage.interpolation.zoom(a, 0.5)  # decimate resolution

a is then a 4×4 array of interpolated values.
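One detail worth knowing when trying this yourself: zoom defaults to cubic spline interpolation (order=3), which can overshoot at sharp edges. A sketch using the order parameter (my addition, not part of the original answer):

```python
import numpy as np
from scipy import ndimage

a = np.arange(64, dtype=float).reshape((8, 8))

# order=1 (bilinear) avoids the ringing that the default cubic
# spline can introduce near sharp intensity jumps
b = ndimage.zoom(a, 0.5, order=1)
print(b.shape)  # (4, 4)
```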
