testing coordinates distance with a function


I have a point with coordinates

p = (x,y)

I am trying to write a test as a lambda function that determines whether p = (x, y) lies inside the unit circle, i.e. whether its distance from (0, 0) is less than 1. So the function should look like this:

test_unit = lambda p: (here is the test)

I have tried the following, but I am not sure it works properly.

sum_items = lambda x,y: x > 0 and y > 0
print(sum_items(-0.8, -0.2))

I think my approach could be wrong, because it does not take the point p into account, and simply checking the signs of the coordinates may not fit the case. If anyone has a clue, please drop a comment below.

Thanks

CodePudding user response:

To calculate the distance between two points (x1, y1) and (x2, y2) you should use the following formula: sqrt((x1 - x2)**2 + (y1 - y2)**2)

For your particular case, where you want to check that the distance from the origin (0, 0) is less than 1, you do not need to take the square root: since the square root is monotonic, comparing the squared distance against 1 gives the same answer. This should do:

within_unit_circle = lambda x, y: x**2 + y**2 <= 1
print(within_unit_circle(-0.8, -0.2))

Please, look up https://en.wikipedia.org/wiki/Pythagorean_theorem
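If you prefer to work with the actual distance, Python's math.hypot computes sqrt(x**2 + y**2) directly. A minimal sketch showing that both tests agree (the second lambda name is just for illustration):

import math

# Squared-distance test (no square root needed).
within_unit_circle = lambda x, y: x**2 + y**2 <= 1

# Equivalent test using the explicit Euclidean distance.
within_unit_circle_dist = lambda x, y: math.hypot(x, y) <= 1

print(within_unit_circle(-0.8, -0.2), within_unit_circle_dist(-0.8, -0.2))  # True True
print(within_unit_circle(0.9, 0.9), within_unit_circle_dist(0.9, 0.9))      # False False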

[EDIT] Show how to pass a point as a tuple:

within_unit_circle = lambda p: p[0]**2 + p[1]**2 <= 1
p = (-0.8, -0.2)
print(within_unit_circle(p))
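A quick usage sketch with a few sample points (the list below is made up for illustration). Note that with <= a point exactly on the circle, such as (1, 0), counts as inside; switch to < 1 if you want the strict "distance less than 1" condition from the question:

within_unit_circle = lambda p: p[0]**2 + p[1]**2 <= 1

points = [(-0.8, -0.2), (0.5, 0.5), (1, 0), (0.9, 0.9)]
for p in points:
    print(p, within_unit_circle(p))
# (-0.8, -0.2) True
# (0.5, 0.5) True
# (1, 0) True   (on the boundary; use < 1 for a strict test)
# (0.9, 0.9) False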