Create 2 loggers in the same Python process


I have a logger instance I pass to functions.

def function(inp1, logger):
    logger.info('important')
    print('sessional')

Some less important info I print; the rest I log (to keep the output file from growing too large).

I want to change all the "print" calls to "logger" calls with a maximum file size, used in round-robin fashion: new messages overwrite the oldest ones.

Can I do that on the same logger instance so I won't need to pass another instance? That is, one logger would always log, and the second would log round-robin.

CodePudding user response:

You can attach more than one handler to a logger.

That means that you could attach to your logger:

  • one FileHandler that will log events at or above the important level (say WARNING)
  • one RotatingFileHandler that will log all events (say at or above INFO or DEBUG)

With such a config, the second handler would log all events also logged by the first one. If this is a problem, you could add a filter to that second one to reject events having a level at (or above) your important level.
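
A minimal sketch of that setup (the file names, the 100 KB size limit, and the chosen levels are assumptions for illustration):

import logging
from logging.handlers import RotatingFileHandler

logger = logging.getLogger('app')
logger.setLevel(logging.DEBUG)

# Handler 1: only important events (WARNING and above) go to the main log.
important_handler = logging.FileHandler('important.log')
important_handler.setLevel(logging.WARNING)

# Handler 2: everything at INFO and above goes to a size-capped rotating log.
rotating_handler = RotatingFileHandler('verbose.log', maxBytes=100_000, backupCount=3)
rotating_handler.setLevel(logging.INFO)

# Optional: keep important events out of the rotating log to avoid duplication.
rotating_handler.addFilter(lambda record: record.levelno < logging.WARNING)

logger.addHandler(important_handler)
logger.addHandler(rotating_handler)

logger.warning('important')   # written to important.log only (because of the filter)
logger.info('sessional')      # written to verbose.log only

With this single logger instance passed around, the handlers decide which file each message ends up in, so the calling code does not change.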

CodePudding user response:

From the logging documentation: multiple calls to getLogger() with the same name will always return a reference to the same Logger object.

So what you want to do in your script is just call getLogger() again with the same name.

# test script
import logging

url_info_logger = logging.getLogger('URL_Fetcher')
general_logger = logging.getLogger('GENERAL')

def function(inp1):
    # getLogger() with the same name returns the same logger everywhere,
    # so there is no need to pass a logger instance around.
    url_info_logger.info('important')
    general_logger.info('sessional')

getLogger() returns the same logging object, since it was already created the first time you called it with that name.
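
A short sketch of that same-name behaviour across call sites (the handler setup, file name, and size limit are illustrative assumptions):

import logging
from logging.handlers import RotatingFileHandler

# Configure the named logger once, e.g. at program start-up.
general = logging.getLogger('GENERAL')
general.setLevel(logging.INFO)
general.addHandler(RotatingFileHandler('general.log', maxBytes=50_000, backupCount=2))

# Anywhere else in the process, the same name returns the same object:
same_logger = logging.getLogger('GENERAL')
assert same_logger is general
same_logger.info('sessional')   # handled by the rotating handler configured above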
