I have an extension method for Microsoft.Extensions.Logging.ILogger that checks whether the given log level is enabled before doing the actual logging. When I unit test this method, logger.IsEnabled(logLevel) always returns false for every log level in the unit test project, causing my tests to fail.
If I remove the IsEnabled check, the unit test passes.
This happens even though a default log level is set in appsettings.json in both the actual project and the unit test project.
The code for my extension method is:
public static class LoggerExtensions
{
    public static void LogErrorExt(this ILogger logger, string? message, params object?[] args)
    {
        if (logger.IsEnabled(LogLevel.Error))
        {
            logger.LogError(message, args);
        }
    }
}
The code for my unit test class is this:
[TestClass]
public class ImprovedExtensionsTests
{
    private readonly Mock<ILogger> _logger;

    public ImprovedExtensionsTests()
    {
        _logger = new Mock<ILogger>();
    }

    [TestMethod]
    public void TestOne()
    {
        _logger.Object.LogErrorExt("Testing ", 12);

        _logger.Verify(x => x.Log(
            LogLevel.Error,
            It.IsAny<EventId>(),
            It.IsAny<It.IsAnyType>(),
            It.IsAny<Exception>(),
            (Func<It.IsAnyType, Exception, string>)It.IsAny<object>()), Times.Once);
    }
}
How can I set the log levels in the unit test project so that they are enabled while the tests run?
CodePudding user response:
A Moq mock does not read appsettings.json; unless you configure a member, it returns the default value, which is false for IsEnabled. You have to create the necessary setup for your ILogger mock, e.g.:
_logger.Setup(m => m.IsEnabled(LogLevel.Error)).Returns(true);
before the call to LogErrorExt (e.g. directly after _logger = new Mock<ILogger>();).
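For illustration, a minimal sketch of the test class with that setup added in the constructor (assuming the same MSTest/Moq setup and the LoggerExtensions class from the question):

using System;
using Microsoft.Extensions.Logging;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using Moq;

[TestClass]
public class ImprovedExtensionsTests
{
    private readonly Mock<ILogger> _logger;

    public ImprovedExtensionsTests()
    {
        _logger = new Mock<ILogger>();
        // The loose mock returns false for IsEnabled by default,
        // so explicitly enable the levels the tests exercise.
        _logger.Setup(m => m.IsEnabled(LogLevel.Error)).Returns(true);
    }

    [TestMethod]
    public void TestOne()
    {
        _logger.Object.LogErrorExt("Testing ", 12);

        _logger.Verify(x => x.Log(
            LogLevel.Error,
            It.IsAny<EventId>(),
            It.IsAny<It.IsAnyType>(),
            It.IsAny<Exception>(),
            (Func<It.IsAnyType, Exception, string>)It.IsAny<object>()), Times.Once);
    }
}

If several tests need different levels, you could instead stub IsEnabled for any level with _logger.Setup(m => m.IsEnabled(It.IsAny<LogLevel>())).Returns(true);.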