I have a long script that I run on a remote server, and I want to log all output as well as any error messages to a file.
- I know how to log all terminal output (e.g. from print()) to a .txt file:
# in script:
import sys
sys.stdout = open('./myfile.txt', 'w')
# ...
sys.stdout.close()
# in terminal:
python myscript.py > ./myfile.txt
This writes all print() output to the file, which is what I want. But it does not write error messages to the file when the script fails, because Python writes tracebacks to stderr rather than stdout.
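A minimal (hypothetical) demo of the problem:
# fails.py (made-up demo script)
print('this goes to stdout')
1 / 0  # the ZeroDivisionError traceback is written to stderr
# in terminal:
# python fails.py > ./myfile.txt
# myfile.txt contains only the print() line; the traceback still
# appears on the terminal, because > redirects stdout only.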
- I know how to log errors:
import logging

try:
    # ...
    pass
except ZeroDivisionError as e:
    logging.error(e)  # ERROR:root:division by zero
The problem with all solutions based on the logging module is that I need to know in advance where the error will occur, so that I can wrap that code in a try/except with logging.error(). But I don't always know where my errors will occur.
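For example (a made-up illustration), an error outside the wrapped block crashes the script without ever being logged:
import logging

try:
    x = 1 / 0
except ZeroDivisionError as e:
    logging.error(e)  # this error is caught and logged

open('does_not_exist.txt')  # not wrapped: this FileNotFoundError
                            # crashes the script without being logged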
=> How do I write (1) all output and (2) the error that makes my script fail to a file?
CodePudding user response:
You have to redirect Python's error output, sys.stderr, to a file as well:
import sys

# the interpreter prints uncaught tracebacks to sys.stderr,
# so after this line they land in the file
sys.stderr = open("errors.txt", "w")
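A sketch that redirects both streams to one file (the file name is just an example), so print() output and the fatal traceback end up together:
import sys

log = open('./myfile.txt', 'w')  # example name
sys.stdout = log  # print() output goes here
sys.stderr = log  # uncaught tracebacks go here too

print('normal output')      # written to the file
raise RuntimeError('boom')  # the traceback is written to the file as well
The same effect is available from the shell, without changing the script, via python myscript.py > ./myfile.txt 2>&1 (2>&1 merges stderr into stdout).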
CodePudding user response:
It is possible to create a log file that receives all the messages you want to write. The configuration would look something like this:
import logging

logging.basicConfig(filename='std.log', filemode='w',
                    format='%(name)s - %(levelname)s - %(message)s')
This writes all logging messages to the std.log file.
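To cover the asker's point that the failure location is unknown, one common pattern (a sketch, not the only way) is a single try/except around the whole script; logging.exception() records the message plus the full traceback, wherever the error happened:
import logging

logging.basicConfig(filename='std.log', filemode='w',
                    format='%(name)s - %(levelname)s - %(message)s')

def main():
    # ... the actual work of the script ...
    1 / 0  # example failure; it could be anywhere inside main()

try:
    main()
except Exception:
    logging.exception('script failed')  # ERROR level + full traceback
    raise  # optional: still exit with a traceback and nonzero status
An alternative with the same effect is to assign a custom sys.excepthook that forwards the exception info to logging.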