I'm trying to construct a KD-tree for finding the K nearest neighbours. I've created a class called Point which holds the attributes pointID, lat (latitude) and lon (longitude). The input to build_index is an array points containing all Point instances.
In the code below I'm trying to construct the KD-tree, but I'm having issues retrieving just the lat and lon of each Point to sort by. I understand that using points directly will not work, as it is an array of class instances.
Thanks for the help in advance!
class KDTreeNN(NearestNeigh):
    def build_index(self, points: [Point]):
        depth = 0
        n = len(points)
        if n < 0:
            return None
        axis = depth % 2
        sorted_points = sorted(points, key=lambda point: point[axis])
        depth += 1
        return {
            'point': sorted_points[int(n/2)],
            'left': self.build_index(sorted_points[:int(n/2)]),
            'right': self.build_index(sorted_points[int(n/2) + 1:])
        }
CodePudding user response:
point[axis] is not working because point does not support such bracket notation. There are several solutions:

Define Point as a named tuple. Instead of defining Point as something like:

    class Point:
        def __init__(self, pointID, lat, lon):
            self.pointID = pointID
            self.lat = lat
            self.lon = lon

...define it as a named tuple. Make sure to make lat and lon its first two members:

    Point = namedtuple("Point", "lat,lon,pointID")

Also ensure that wherever you create Point instances, you put the arguments in that order. With that change it will work. This requires that you treat Point instances as immutable.

Alternatively, access the attribute (lon or lat) depending on a string variable:

    axis = ("lon", "lat")[depth % 2]
    sorted_points = sorted(points, key=lambda point: getattr(point, axis))
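Putting the two suggestions together, here is a minimal runnable sketch. The field order, the sample coordinates, and the standalone build_index function are illustrative assumptions; note also that depth is threaded through the recursion as a parameter, since resetting it to 0 inside the method (as in the question's code) would sort on the same axis at every level. The axis tuple here is ("lat", "lon") to match the named tuple's field order; either order works as long as it is used consistently.

```python
from collections import namedtuple

# lat and lon first, so point[axis] with axis = depth % 2 would also work.
Point = namedtuple("Point", "lat lon pointID")

def build_index(points, depth=0):
    """Recursively build a KD-tree dict, alternating the split axis per level."""
    if not points:
        return None
    axis = ("lat", "lon")[depth % 2]              # attribute name for this level
    points = sorted(points, key=lambda p: getattr(p, axis))
    mid = len(points) // 2                        # median index after sorting
    return {
        'point': points[mid],
        'left': build_index(points[:mid], depth + 1),
        'right': build_index(points[mid + 1:], depth + 1),
    }

pts = [Point(35.0, 139.0, "a"), Point(51.5, -0.1, "b"), Point(40.7, -74.0, "c")]
tree = build_index(pts)
print(tree['point'].pointID)   # -> c (the median point by latitude)
```

In the asker's method, the same change would mean making depth a parameter of build_index (or a helper) rather than a local reset on every call.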