I have a JSON file from which I have pulled a list of coordinates of 10-sided polygons (example attached).
I need to create an array from these coordinates that places them on a 128 x 128 square as filled-in polygons, so that I can multiply it element-wise with another array.
import json

f = open('139cm_2000_frame27.json')
data = json.load(f)
shapes = data["shapes"]
for i in shapes:
    for c in i["points"]:
        print(c)
This code prints the coordinates like this:
[59.10047478151446, 18.879430437885873]
[58.04359793578868, 16.37330203102605]
[58.661631924538575, 13.724584936383643]
[60.71850877026435, 11.94499905752918]
[63.42857142857143, 11.714285714285715]
[65.75666807562841, 13.120569562114127]
[66.81354492135418, 15.62669796897395]
[66.19551093260428, 18.275415063616357]
[64.13863408687851, 20.05500094247082]
[88.71428571428572, 82.42857142857143]
[85.63470409582908, 81.33512050565437]
[83.78598455752919, 78.6403674679397]
[83.8742751273506, 75.3736163845474]
[85.86585180850763, 72.78265513654777]
[89.0, 71.85714285714285]
.............
The first 10 points make one polygon, the next 10 another, and so on. I need to take the coordinates of these polygons and put them onto a 128 x 128 array. Essentially it will look like the attached plot, but in array form; the attachment is just plotted using matplotlib.
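(For context, the attached plot was produced with matplotlib roughly along these lines; this is not my exact code, and the styling doesn't matter, it just draws each shape's points as one filled patch:)
import json
import matplotlib.pyplot as plt
from matplotlib.patches import Polygon

with open('139cm_2000_frame27.json') as f:
    shapes = json.load(f)["shapes"]

fig, ax = plt.subplots()
for shape in shapes:
    # each shape's "points" list is treated as one closed polygon
    ax.add_patch(Polygon(shape["points"], closed=True))
ax.set_xlim(0, 128)
ax.set_ylim(0, 128)
ax.set_aspect('equal')
plt.show()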
CodePudding user response:
You can use the json library to work with JSON data. If you need to load the JSON from a file, use json.load() like so:
import json

with open('example.json', 'r') as f:
    versions = json.load(f)['versions']
and use json.loads() when reading from a string:
import json
json_str = '{"versions": 10}'
versions = json.loads(json_str)['versions']
CodePudding user response:
OK, you've made it part of the way. You now have to navigate the 'dicts within lists within dicts'. This code simply prints out the label and all the coordinates of every shape:
import json

f = open('139cm_2000_frame27.json')
data = json.load(f)
shapes = data["shapes"]
for i in shapes:
    print(i['label'])        # prints the label first
    for c in i['points']:
        print(c)             # each c is a list containing one coordinate pair
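From there, one way to turn each shape's points into filled regions on a 128 x 128 array is to rasterise the polygons with matplotlib.path.Path.contains_points. This is a sketch, not tested against your exact file, and it assumes the JSON's [x, y] coordinates map directly onto the array's columns and rows:
import json
import numpy as np
from matplotlib.path import Path

size = 128
mask = np.zeros((size, size), dtype=float)

# (x, y) coordinates of every pixel in the 128 x 128 grid
yy, xx = np.mgrid[0:size, 0:size]
pixel_centres = np.column_stack((xx.ravel(), yy.ravel()))

with open('139cm_2000_frame27.json') as f:
    shapes = json.load(f)["shapes"]

for shape in shapes:
    poly = Path(shape["points"])                   # polygon outline from the JSON points
    inside = poly.contains_points(pixel_centres)   # True where a pixel falls inside the polygon
    mask[inside.reshape(size, size)] = 1.0

# mask is now a 128 x 128 array of 0s and 1s
You can then multiply it element-wise with your other array, e.g. result = mask * other_array, provided the other array is also 128 x 128.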