I have a circle plotted on a graph using matplotlib:
import random
import numpy as np
import matplotlib.pyplot as plt

angle = np.linspace(0, 4 * np.pi, 150)
radius = 0.2
x = radius * np.cos(angle) + 0.5
y = radius * np.sin(angle) + 0.5
fig, ax = plt.subplots()
ax.set_aspect(1)
ax.plot(x, y)
I also have some code that randomly plots scatter points on the graph:
q = [random.uniform(0.3, 0.7) for n in range(900)]
b = [random.uniform(0.3, 0.7) for n in range(900)]
ax.scatter(q, b, color='black', marker='.', s=1)
I want to count how many of the randomly plotted points fall within the circle. Is there a way I can do this?
CodePudding user response:
You have to find the points whose distance from the center of the circle is less than or equal to the radius. Here the center of the circle is (0.5, 0.5). You can do that like this:
import numpy as np

# Convert the lists to arrays so the math below is vectorized.
q, b = np.array(q), np.array(b)
# The mask is a boolean array: True where a point lies inside
# the circle, False where it lies outside.
mask = np.sqrt((q - 0.5)**2 + (b - 0.5)**2) <= radius
# np.count_nonzero counts the number of True elements in an array.
number_inside = np.count_nonzero(mask)
number_inside is then the number of points falling inside the circle.
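As a side note (not part of the original answer), the square root can be skipped by comparing squared distances; this produces the same mask and is slightly cheaper:

# Equivalent test: compare squared distances to the squared radius.
mask = (q - 0.5)**2 + (b - 0.5)**2 <= radius**2

As a rough sanity check: the points are drawn uniformly from the square [0.3, 0.7] x [0.3, 0.7], which has area 0.4**2 = 0.16, while the circle has area π · 0.2² ≈ 0.126, so about π/4 ≈ 78.5% of the 900 points, roughly 707, should land inside. If you want to verify visually, you could re-plot the inside points in a different color, e.g. ax.scatter(q[mask], b[mask], color='red', marker='.', s=1).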