Why worker in heroku cannot read file from web main app in heroku and how to do so?

Time:10-16

I have a Streamlit app deployed to Heroku, with the following structure:

  • main.py: writes a CSV file into the same folder when a button is clicked
  • scheduler.py: located in the same folder; takes the CSV file generated by main.py and writes it to BigQuery

This is deployed to Heroku with a Procfile specifying:

  • web: sh setup.sh && streamlit run main.py
  • worker: python scheduler.py

All the code works fine locally. However, once deployed to Heroku, scheduler.py runs and attempts the push to BigQuery, but no data arrives. Is this because the worker and web processes run in different environments, so the worker cannot read the file written by main.py?

How can I do a background push to BigQuery without affecting main.py?

CodePudding user response:

Is this because the worker and web processes run in different environments, so the worker cannot read the file written by main.py?

That's correct. You can't pass information between your processes that way: each process type runs on its own dyno, and each dyno has its own ephemeral filesystem, so a file written by the web dyno is invisible to the worker dyno.

I suggest you store your data elsewhere, e.g. in a database like Heroku Postgres. If you would prefer to keep using CSV files, object storage such as Amazon S3 or Azure Blob Storage works too. Both of your dynos can connect to these external services.
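A minimal sketch of the S3 variant, assuming boto3 is installed and AWS credentials are set as config vars on both dynos; the bucket name, object key, and helper names are made up for illustration:

```python
import csv
import io

# Assumed names for illustration only.
BUCKET = "my-app-data"
KEY = "exports/latest.csv"


def rows_to_csv_bytes(rows):
    """Serialize a list of dicts to CSV bytes, ready for upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")


def upload_csv(rows):
    """In main.py: upload the CSV to S3 when the button is clicked."""
    import boto3  # imported lazily; needs AWS credentials in the environment

    boto3.client("s3").put_object(
        Bucket=BUCKET, Key=KEY, Body=rows_to_csv_bytes(rows)
    )


def download_csv():
    """In scheduler.py: fetch the same CSV back before pushing to BigQuery."""
    import boto3

    body = boto3.client("s3").get_object(Bucket=BUCKET, Key=KEY)["Body"]
    return list(csv.DictReader(io.StringIO(body.read().decode("utf-8"))))
```

The web dyno calls upload_csv and the worker calls download_csv, so the CSV never has to live on either dyno's filesystem.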

Side note: I'm not sure exactly what your scheduler.py does, but if it's mostly sitting around idle and occasionally running a job I wouldn't run it as a worker. You'll be paying for it to do nothing most of the time.

Heroku has a Scheduler add-on that would be a better fit. Simply schedule a job that runs the underlying command to push to BigQuery.
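Roughly, that setup looks like this (the job command is the one from your Procfile; the add-on is configured through the dashboard):

```shell
# Remove the worker line from the Procfile, then add the Scheduler add-on:
heroku addons:create scheduler:standard

# Open its dashboard and add a job whose command is:
#   python scheduler.py
heroku addons:open scheduler
```

The job then runs on a one-off dyno at the interval you choose, so you only pay for the minutes it actually runs.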
