Post files, but endpoint limits 50 posts per minute


Good afternoon, I'm completely new to coding and I need some help. I split a JSON file into ~2800 smaller JSON files and need to POST them to a certain endpoint; however, the endpoint's limit is ~50 files per minute.

Currently I have made this in Python:

import requests

url = 'testtest'
headers = {'Authorization': 'testtest', 'Accept': '*/*', 'Content-Type': 'application/json'}
r = requests.post(url, data=open(r'C:\Accountmanager\json1.json', 'rb'), headers=headers)

The file names run from json1.json to json2800.json.

Currently it only posts one file, and I need to post all 2800 files with a limit of 50 per minute. Is there someone who can help out? :)

CodePudding user response:

If your code is correct for one request, you should be able to do all of them like this:

import requests
from time import sleep

url = 'testtest'
headers = {'Authorization': 'testtest', 'Accept': '*/*', 'Content-Type': 'application/json'}
for i in range(1, 2801):  # for each file
    r = requests.post(url, data=open(rf'C:\Accountmanager\json{i}.json', 'rb'), headers=headers)
    sleep(60 / 50)  # for 50 requests per minute (60 s)

I recommend replacing 60/50 on the last line with something like 60/48, to be sure there is no problem due to lag.
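A fixed sleep also ignores the time each request itself takes, so the real rate drifts below target. A minimal sketch of pacing against the clock instead, assuming the same url, headers, and file naming as above (the 48-per-minute target is just the safety margin suggested here, not a requirement of the endpoint):

import requests
from time import monotonic, sleep

url = 'testtest'
headers = {'Authorization': 'testtest', 'Accept': '*/*', 'Content-Type': 'application/json'}
INTERVAL = 60 / 48  # assumed target: at most 48 requests per minute

for i in range(1, 2801):
    start = monotonic()
    with open(rf'C:\Accountmanager\json{i}.json', 'rb') as f:
        requests.post(url, data=f, headers=headers)
    # sleep only for whatever is left of the interval after the request itself
    remaining = INTERVAL - (monotonic() - start)
    if remaining > 0:
        sleep(remaining)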

CodePudding user response:

I would suggest listing the directory that has all the files, then iterating through each one and posting it. Then use the time library to control how often a file is posted.

import os
import time
import requests

url = 'test'
headers = {'Authorization': 'testtest', 'Accept': '*/*', 'Content-Type': 'application/json'}

# take just the file names from the top level of the directory
_, _, files = next(os.walk(r"C:\Accountmanager"))
file_count = len(files)
for name in files:
    # the path passed to data=open is the directory joined with each file name inside it
    r = requests.post(url, data=open(os.path.join(r"C:\Accountmanager", name), 'rb'), headers=headers)
    time.sleep(2)

Now, in all honesty, I don't have any experience with the requests library, so I haven't been able to get the code running (I get an error relating to the url variable). I'm assuming you've got that part set up, and hopefully I've helped with the question at hand.
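One caveat worth adding: os.walk returns file names in arbitrary order, so if the files need to go out as json1.json, json2.json, and so on, the list should be sorted numerically first. A minimal sketch, assuming the jsonN.json naming from the question:

import re

def numeric_key(name):
    # pull the number out of 'json123.json' so 'json2' sorts before 'json10'
    match = re.search(r'\d+', name)
    return int(match.group()) if match else 0

files = sorted(files, key=numeric_key)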

CodePudding user response:

So you have a large JSON file that needs to be split into smaller component parts. Each part has to be saved locally in its own file. Those saved files have to be POSTed to some URL. You need to limit the flow to no more than 50 per minute.

from json import load, dumps
from requests import post
from time import sleep
from os.path import join

URL = '...'
HEADERS = {'Authorization': 'testtest', 'Accept': '*/*', 'Content-Type': 'application/json'}
BASEDIR = r'C:\Accountmanager'
filelist = []
with open('filename.json') as json_file:
    data = load(json_file)
    for counter, dict_ in enumerate(data, 1):
        new_json = dumps(dict_, sort_keys=True, indent=4)
        filename = join(BASEDIR, f'json{counter}.json')
        with open(filename, 'w') as new_file:
            new_file.write(new_json)
            filelist.append(filename)

for filename in filelist:
    with open(filename, 'rb') as data:
        post(URL, data=data, headers=HEADERS).raise_for_status()
        sleep(1.2)  # 60 s / 50 requests
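Note that raise_for_status will throw as soon as the endpoint pushes back. If the server answers 429 Too Many Requests, it may include a Retry-After header worth honoring; a minimal sketch of that (the retry loop, its defaults, and the helper name are assumptions, not part of the answer above):

from time import sleep
import requests

def post_with_backoff(url, path, headers, max_retries=3):
    """POST one file, waiting out HTTP 429 responses before retrying (hypothetical helper)."""
    for attempt in range(max_retries):
        with open(path, 'rb') as f:
            r = requests.post(url, data=f, headers=headers)
        if r.status_code != 429:
            r.raise_for_status()
            return r
        # honor the server's Retry-After header if present, else wait 60 s (assumed default)
        sleep(float(r.headers.get('Retry-After', 60)))
    raise RuntimeError(f'still throttled after {max_retries} attempts: {path}')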

CodePudding user response:

Just modify your requests.post call to put it in a loop with a sleep timer to throttle the requests, like the following:

import requests
import time

url = 'testtest'
headers = {'Authorization': 'testtest', 'Accept': '*/*', 'Content-Type': 'application/json'}
for i in range(1, 2801):
    filename = rf"C:\Accountmanager\json{i}.json"
    r = requests.post(url, data=open(filename, 'rb'), headers=headers)
    time.sleep(2)

You can play with the sleep timer to see where the throttle limit is. As written, this posts one file every 2 seconds, so you'll be well under your 50-per-minute limit.
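For a rough sense of the trade-off (simple arithmetic, not from the answer above): 2 s per file means 30 posts per minute, while 1.25 s per file stays right at the 48-per-minute safety margin suggested earlier:

# back-of-envelope runtime for 2800 files at different pacings
for pause in (2.0, 1.25):
    total_minutes = 2800 * pause / 60
    print(f'{pause} s per file -> {60 / pause:.0f} posts/min, ~{total_minutes:.0f} min total')
# 2.0 s per file -> 30 posts/min, ~93 min total
# 1.25 s per file -> 48 posts/min, ~58 min total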
