Lisbon has open data from sensors around the city, covering several parameters. The JSON query is the following:
http://opendata-cml.qart.pt/measurements/RULAEQ0001?startDate=202104010000&endDate=202108310000
This returns the registered noise levels (RULAEQ)
for station 0001
between those two dates, expressed as YYYYMMDDHHMM.
There are 80 monitoring stations across the city, numbered from 0001 to 0080, and I would like to get the data for each one of them.
What is the best way to automate this process in Python?
Thank you in advance.
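For reference, the query URL can be assembled from a station number and two datetimes — a small sketch (the `build_url` helper is just an illustration; the strftime format matches the YYYYMMDDHHMM pattern above):

```python
from datetime import datetime

BASE = 'http://opendata-cml.qart.pt/measurements'

def build_url(station: int, start: datetime, end: datetime) -> str:
    """Build the measurements query URL for one RULAEQ station."""
    fmt = '%Y%m%d%H%M'  # dates encoded as YYYYMMDDHHMM
    return (f'{BASE}/RULAEQ{station:04d}'  # zero-pad station to 4 digits
            f'?startDate={start.strftime(fmt)}&endDate={end.strftime(fmt)}')

print(build_url(1, datetime(2021, 4, 1), datetime(2021, 8, 31)))
# → http://opendata-cml.qart.pt/measurements/RULAEQ0001?startDate=202104010000&endDate=202108310000
```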
Note 1: I have the code working for a single query:
# -*- coding: utf-8 -*-
# Import libraries
import pandas as pd
import urllib.request, json
from flatten_json import flatten
# Query URL
url = 'http://opendata-cml.qart.pt/measurements/RULAEQ0001?startDate=202104010000&endDate=202108310000'
# Read JSON from URL and decode
with urllib.request.urlopen(url) as response:
    data = json.loads(response.read().decode())
# Flatten JSON data
data_flattened = [flatten(d) for d in data]
# Create dataframe
df = pd.DataFrame(data_flattened)
# Print Dataframe
print(df)
Note 2 : As usual, this is for a non-profit project and no commercial exploitation will be done.
CodePudding user response:
# -*- coding: utf-8 -*-
# Import libraries
import pandas as pd
import urllib.request, json
from flatten_json import flatten
for station in range(1, 81):  # range excludes the end value, so 81 reaches station 0080
    # Query URL
    url = f'http://opendata-cml.qart.pt/measurements/RULAEQ{str(station).zfill(4)}?startDate=202104010000&endDate=202108310000'
    print(station)
    # Read JSON from URL and decode
    with urllib.request.urlopen(url) as response:
        data = json.loads(response.read().decode())
    # Flatten JSON data
    data_flattened = [flatten(d) for d in data]
    # Create dataframe
    df = pd.DataFrame(data_flattened)
    # Print Dataframe
    print(df)
for station in range(1, 81):
loops over station numbers 1 through 80; Python's range excludes its end value, so the upper bound must be 81 (range(1, 80) would stop at station 79).
url = f'http://opendata-cml.qart.pt/measurements/RULAEQ{str(station).zfill(4)}?startDate=202104010000&endDate=202108310000'
inserts that number into the URL, left-padding it with zeros to make it 4 digits long.
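One caveat: in the loop above, df is overwritten on every iteration, so only the last station's data survives. A sketch that collects each station's frame and concatenates them at the end — the station column and the fetch_all/station_url helpers are my additions, and stations that fail to respond are simply skipped:

```python
import json
import urllib.request
from urllib.error import HTTPError, URLError

import pandas as pd

BASE = 'http://opendata-cml.qart.pt/measurements'

def station_url(station: int,
                start: str = '202104010000',
                end: str = '202108310000') -> str:
    """URL for one RULAEQ station, number zero-padded to 4 digits."""
    return f'{BASE}/RULAEQ{station:04d}?startDate={start}&endDate={end}'

def fetch_all(first: int = 1, last: int = 80) -> pd.DataFrame:
    """Download every station and return one combined DataFrame."""
    from flatten_json import flatten  # third-party: pip install flatten_json
    frames = []
    for station in range(first, last + 1):  # range is end-exclusive, hence +1
        try:
            with urllib.request.urlopen(station_url(station)) as response:
                data = json.loads(response.read().decode())
        except (HTTPError, URLError) as err:
            print(f'station {station:04d} failed: {err}')  # skip dead stations
            continue
        df = pd.DataFrame([flatten(d) for d in data])
        df['station'] = f'{station:04d}'  # remember the source station
        frames.append(df)
    return pd.concat(frames, ignore_index=True)
```

Keeping a `station` column means the combined frame can later be grouped or filtered per station instead of juggling 80 separate DataFrames.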