Scalar field interpolation/filter methods

gysvanzyl
Posts: 4
Joined: Fri Jan 22, 2021 2:25 am

Scalar field interpolation/filter methods

Post by gysvanzyl »

Hi all,

When interpolating a scalar field from one cloud onto another, is there a way to use the minimum of the n nearest points instead of the mean? Or maybe the mean minus 2 or 3 standard deviations?

Alternatively, is there a way to apply a filter (similar to a Gaussian or bilateral filter) to a scalar field that assigns each point the minimum of the scalar field values at its n nearest points?

It doesn't look like either of these are possible in CloudCompare directly, but might there be a way to specify a custom function as an interpolator method in the CloudComPy interpolatorParameters class?
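To make the first idea concrete, here is roughly what I mean, sketched with plain numpy/scipy outside CloudCompare (`interpolate_conservative` is just an illustrative name, not an existing API):

```python
import numpy as np
from scipy.spatial import cKDTree

def interpolate_conservative(src_pts, src_vals, dst_pts, k=5, n_sigma=2.0):
    """For each destination point, look up the k nearest source points and
    return mean(values) - n_sigma * std(values) as a conservative estimate."""
    tree = cKDTree(src_pts)
    _, idx = tree.query(dst_pts, k=k)   # (m, k) indices into src_pts
    neigh = src_vals[idx]               # (m, k) neighbor values
    return neigh.mean(axis=1) - n_sigma * neigh.std(axis=1)
```

Using the minimum of the k neighbors instead would just replace the last line with `return neigh.min(axis=1)`.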

Cheers,
Gys
daniel
Site Admin
Posts: 8158
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France
Contact:

Re: Scalar field interpolation/filter methods

Post by daniel »

Indeed, this is not readily available in CC (apart from the Rasterize tool, but it only works in 2.5D).

Either you would have to adapt the C++ code, or maybe write a Python script with CloudComPy or the Python plugin. But you'll have to ask directly on these projects' GitHub pages (as they are separate projects, and their maintainers rarely come to this forum).
Daniel, CloudCompare admin
gysvanzyl

Re: Scalar field interpolation/filter methods

Post by gysvanzyl »

For the record, I did manage to write a working Python plugin script to do this. The code could probably be better, but at least it works:

Code:

import pycc
import numpy as np
from scipy.spatial import KDTree

def min_smoothing(radius=0.005):
    '''
    Creates a new scalar field from the currently displayed one by
    setting the value of each point as the minimum of all the values
    within a specified radius.
    '''
    cc = pycc.GetInstance()

    entities = cc.getSelectedEntities()
    if not entities:
        raise RuntimeError("No entities selected")
    
    cloud = entities[0].getAssociatedCloud()

    # Get point coordinates
    points = np.array([list(cloud.getPoint(i)) for i in range(cloud.size())], dtype=np.float32)
    
    # Get the currently displayed scalar field
    SF = cloud.getCurrentDisplayedScalarField()
    scalar_values = np.array([SF.getValue(i) for i in range(SF.size())], dtype=np.float32)
    
    # Use a KD-tree to find all neighbors within the specified radius
    tree = KDTree(points)
    neighbors_list = tree.query_ball_point(points, r=radius)  # list of neighbor index lists
    
    # Compute the minimum scalar field value among neighbors within radius
    min_scalar_values = np.empty(cloud.size(), dtype=np.float32)
    for i, neighbors in enumerate(neighbors_list):
        if neighbors:
            min_scalar_values[i] = np.min(scalar_values[neighbors])
        else:
            min_scalar_values[i] = np.nan  # or assign a default/fallback value
    
    new_sf_name = f"{SF.getName()}_minR_{radius}"
    new_sf_index = cloud.addScalarField(new_sf_name)
    new_sf = cloud.getScalarField(new_sf_index)
    
    for i, val in enumerate(min_scalar_values):
        new_sf.setValue(i, val)
    
    new_sf.computeMinAndMax()
    
if __name__ == "__main__":
    min_smoothing(radius=0.005)
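And in case anyone wants the k-nearest-neighbour version from my original question rather than a radius, the core of it (minus the pycc plumbing; `min_of_k_nearest` is just an illustrative name) would look something like this:

```python
import numpy as np
from scipy.spatial import cKDTree

def min_of_k_nearest(points, values, k=8):
    """Replace each point's value with the minimum of the values at its
    k nearest points (the point itself counts as one of the neighbors)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)   # (n, k) neighbor indices
    return values[idx].min(axis=1)
```

This is fully vectorized, so it should also be faster than the per-point loop above when all neighborhoods have the same size k.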
daniel

Re: Scalar field interpolation/filter methods

Post by daniel »

Thanks for sharing!
Daniel, CloudCompare admin