I am creating a FastAPI Python application in which a user uploads a file to be processed. I do not want the file to exceed a given size X (in bytes).
How do I limit the upload size before the POST request stores the file in memory?
I am using uvicorn for testing, but I expect to deploy this code on Google Cloud Platform (GCP). I am not sure whether this can be done on the Python code side or on the server configuration side.
Code Snippet:
from fastapi import (
    FastAPI,
    Path,
    File,
    UploadFile,
)

app = FastAPI()

@app.post("/")
async def root(file: UploadFile = File(...)):
    text = await file.read()
    text = text.decode("utf-8")
    return len(text)
CodePudding user response:
I found a Python library that takes care of this via FastAPI middleware. If the uploaded file is too large, it throws an HTTP 413 error: "Request Entity Too Large".
from starlette_validation_uploadfile import ValidateUploadFileMiddleware
from fastapi import (
    FastAPI,
    File,
    UploadFile,
)

app = FastAPI()

# add this after the FastAPI app is declared
app.add_middleware(
    ValidateUploadFileMiddleware,
    app_path="/",
    max_size=1048576,  # 1 MB
    file_type=["text/plain"],
)

@app.post("/")
async def root(file: UploadFile = File(...)):
    # ...do something with the file
    return {"status": "upload successful"}
CodePudding user response:
This is usually controlled by the web server (nginx, Apache, etc.), but if you want to enforce it in the application code you can use something like this:
from fastapi import (
    FastAPI,
    File,
    HTTPException,
    UploadFile,
)

app = FastAPI()

@app.post("/")
async def root(file: UploadFile = File(...)):
    # note: this still reads the whole body into memory before checking
    contents = await file.read()
    if len(contents) > 8388608:  # 8 MB
        raise HTTPException(status_code=413, detail="Your file is more than 8MB")
    return {"size": len(contents)}
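For completeness, the web-server-side control mentioned above is a single directive in nginx. Assuming you run an nginx reverse proxy in front of uvicorn (whether you can configure this on GCP depends on the product you deploy to; a VM running your own nginx gives you full control, while managed platforms impose their own request-size limits):

```nginx
# reject request bodies larger than 1 MB before they reach the app;
# nginx itself answers with 413 Request Entity Too Large
client_max_body_size 1m;
```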