I'm trying to plot a scatter with values from my array. Everything works until I try to scale the size of the dots with a value from my array.
For example my array looks like this:
['50', ' 50', ' 0.6352952']
The first value is x, the second is y, and the third is the one I want to scale with.
My plotting code currently looks like this:
for i in range(0, len(convertedResults)):
    if convertedResults[i][0] and convertedResults[i][1]:
        plt.scatter(convertedResults[i][0], convertedResults[i][1], s=1*convertedResults[i][2])
plt.show()
I can easily plot without convertedResults[i][2], but if I try to set the size of each dot, the following error appears:
TypeError: ufunc 'sqrt' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
I also tried to set the s size manually to s=0.653123123 (for example) and it worked, so it seems there is a problem with my array. I don't know if it's necessary, but my array is built like this:
[['50', ' 50', ' 0.6352952'], ['60', ' 50', ' 2.8389171199999996'], ['70', ' 50', ' 2.8389171199999996'], ['50', ' 60', ' 0.6352952']]
CodePudding user response:
The data in convertedResults is of type string, so you are trying to pass a string into s. Even when you multiply convertedResults[i][2] by 1, the result is still a string, because multiplying a string by an integer in Python repeats the string. You need to use s = float(convertedResults[i][2]) in the scatter call.
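For reference, a minimal sketch of the corrected loop, using the sample data from the question (converting x and y to float as well is optional here, but it keeps the axes numeric; float() also tolerates the leading spaces in the strings):

import matplotlib.pyplot as plt

# Sample data in the same string format as in the question
convertedResults = [['50', ' 50', ' 0.6352952'],
                    ['60', ' 50', ' 2.8389171199999996'],
                    ['70', ' 50', ' 2.8389171199999996'],
                    ['50', ' 60', ' 0.6352952']]

for i in range(0, len(convertedResults)):
    if convertedResults[i][0] and convertedResults[i][1]:
        # Convert the strings to floats before passing them to scatter
        x = float(convertedResults[i][0])
        y = float(convertedResults[i][1])
        size = float(convertedResults[i][2])
        plt.scatter(x, y, s=size)
plt.show()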