I am creating a function that sends data to a remote server. I am currently using the pandas library to read a CSV file into a DataFrame. What I need to do is loop through that DataFrame, convert each row to JSON, and send the rows to my database.
The reason I need to loop through is that datasets that are too big (currently 100 rows by 21 columns) produce a request URL that is too long. What I need to do is loop through and send batches of 10 or so rows at a time.
Below is where I am at the moment:
import requests

def UploadData(root, self, data):
    i = 0
    data_arr = []
    for row in data:
        if i % 5 == 0:
            # Add row to the batch, then send the batch
            data_arr.append(row)
            json_str = data_arr.to_json(orient='records')
            url = 'https://newsimland.com/~db/JSON/?tok={"tok":"YOUR TOKEN HERE","cmd":{"STORE":"test_database","VALUE":' + json_str + '}}'
            r = requests.get(url)
        else:
            # Add row to the batch
            data_arr.append(row)
        i += 1

    data = r.json()
    if r.status_code != 200:
        Alert(title="Error", text="Data upload unsuccessful")
    else:
        Alert(title="Success", text="Data upload successful")
One of the problems with this is that .to_json(orient='records') is meant for a DataFrame, not the list I am appending to. Also, if the original DataFrame has fewer than 5 rows, it won't send the data to the database.
Does anyone know how I could achieve this?
CodePudding user response:
If I understand you correctly, you want to send your DataFrame in parts of five or fewer rows. In that case I recommend splitting the DataFrame in the following way, so that each chunk remains a DataFrame and you can still use to_json.
import numpy as np

def UploadData(root, self, data):
    size = 5
    # np.arange(size, len(data), size) gives the row indices to split at,
    # so every chunk is a DataFrame of at most `size` rows (note: len(data)
    # counts rows, whereas data.size would count rows x columns)
    for chunk in np.split(data, np.arange(size, len(data), size)):
        if not chunk.empty:
            json_str = chunk.to_json(orient='records')
            # Send your data here!
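For completeness, here is a minimal end-to-end sketch combining the chunking above with the upload from the question. It reuses the newsimland endpoint and the "YOUR TOKEN HERE" placeholder from the question; the input file data.csv and the function name upload_data are hypothetical, so adapt both to your setup.

import numpy as np
import pandas as pd
import requests

def upload_data(data, batch_size=5):
    # Split the DataFrame into chunks of at most batch_size rows
    for chunk in np.split(data, np.arange(batch_size, len(data), batch_size)):
        if chunk.empty:
            continue
        json_str = chunk.to_json(orient='records')
        # Endpoint and token placeholder taken from the question
        url = ('https://newsimland.com/~db/JSON/?tok={"tok":"YOUR TOKEN HERE",'
               '"cmd":{"STORE":"test_database","VALUE":' + json_str + '}}')
        r = requests.get(url)
        if r.status_code != 200:
            return False  # stop at the first failed batch
    return True

df = pd.read_csv('data.csv')  # hypothetical input file
if upload_data(df):
    print('Data upload successful')
else:
    print('Data upload unsuccessful')

Since each chunk's JSON still ends up in the URL, very wide rows can hit URL length limits even at 5 rows per batch; if the endpoint accepts POST requests, sending the JSON in the request body would sidestep that limit entirely.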