
How to introduce logging in a Python project

Use of the Logging module

Basic usage

The logging module provides several convenient module-level functions.

They are: debug(), info(), warning(), error(), critical()

import logging

logging.debug('debug log test')
logging.info('info log test')
logging.warning('warning log test')
logging.error('error log test')
logging.critical('critical log test')

Output results:

WARNING:root:warning log test
ERROR:root:error log test
CRITICAL:root:critical log test

Only the warning, error and critical messages are output because the logging module uses the WARNING level by default, which means that only messages at WARNING level and above are displayed.

The log levels, from highest to lowest, are as follows

Level      Value   When to use
CRITICAL   50      A serious error; the program may be unable to continue running
ERROR      40      Due to a more serious problem, some function of the program could not be performed
WARNING    30      Something unexpected happened, or may happen soon, but the program still works as expected
INFO       20      Confirmation that things are working as expected
DEBUG      10      Detailed information, of interest only when diagnosing problems
NOTSET     0       No level restriction
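
These levels are just integers defined on the logging module, so they can be compared directly, and basicConfig() also accepts the level name as a string. A minimal sketch (not from the original article):

import logging

print(logging.DEBUG)                      # 10
print(logging.ERROR > logging.WARNING)    # True

# The level can be given either as a constant or as its name
logging.basicConfig(level='INFO')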

We just need to change the default log level with logging.basicConfig().

import logging

# Configure the logging level
logging.basicConfig(level=logging.DEBUG)

logging.debug('debug log test')
logging.info('info log test')
logging.warning('warning log test')
logging.error('error log test')

The output is as follows:

DEBUG:root:debug log test
INFO:root:info log test
WARNING:root:warning log test
ERROR:root:error log test

Specify the log output style

Of course, we can also specify the log output format

import logging


# Log output format
log_format = '%(levelname)s %(asctime)s %(filename)s %(lineno)d %(message)s'
logging.basicConfig(format=log_format, level=logging.DEBUG)

logging.debug('debug log test')
logging.info('info log test')
logging.warning('warning log test')
logging.error('error log test')
logging.critical('critical log test')

The output is as follows:

DEBUG 2021-05-27 00:04:26,327  65 debug log test
INFO 2021-05-27 00:04:26,327  66 info log test
WARNING 2021-05-27 00:04:26,327  67 warning log test
ERROR 2021-05-27 00:04:26,327  68 error log test
CRITICAL 2021-05-27 00:04:26,327  69 critical log test

The format placeholders used above mean the following:

  • %(levelname)s , log level
  • %(asctime)s , time
  • %(filename)s , file name
  • %(lineno)d , line number
  • %(message)s , log message

These placeholder names are fixed and cannot be made up freely. There are many more formatting options; only the common ones are introduced here. You can check the official documentation for the full list: /zh-cn/3.7/l...
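
As a small supplement (a sketch, not from the original article), a few more placeholders that the logging module supports, such as %(name)s, %(funcName)s and %(process)d, can be combined with the datefmt argument:

import logging

# Extra placeholders: logger name, function name, process id
log_format = '%(asctime)s %(name)s %(funcName)s %(process)d %(levelname)s %(message)s'
logging.basicConfig(format=log_format, datefmt='%Y-%m-%d %H:%M:%S', level=logging.DEBUG)

logging.info('formatted log test')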

Logging to file

You can write log messages to a file by setting the filename argument of logging.basicConfig().

import logging


# Log output format
log_format = '%(levelname)s %(asctime)s %(filename)s %(lineno)d %(message)s'
logging.basicConfig(
    filename='test.log',   # log file name (the original name was omitted; this one is assumed)
    format=log_format,
    level=logging.DEBUG
)

logging.debug('debug log test')
logging.info('info log test')
logging.warning('warning log test')
logging.error('error log test')
logging.critical('critical log test')

After running the program, the log messages are written to the log file instead of the console.
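
By default basicConfig() opens the log file in append mode, so repeated runs keep adding to it. As a small sketch (the file name here is only an example), passing filemode='w' overwrites the file on every run:

import logging

logging.basicConfig(
    filename='test.log',   # example file name
    filemode='w',          # 'a' (append) is the default; 'w' overwrites on each run
    format='%(levelname)s %(asctime)s %(message)s',
    level=logging.DEBUG
)

logging.info('this run overwrites the previous log file')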

Custom Logging Configuration

Usually, in a project we customize some common logging configuration and then use it globally throughout the project. Once written, the configuration can be copied into other projects and tweaked as needed. Let's see how to configure it.

Preparing Logging Configuration Information

To configure logging in more detail, we need to import logging.config and load the logging configuration information.

First prepare the logging configuration information dictionary

log_dict = {
    'version': 1,
    'disable_existing_loggers': False,  # Whether to disable already existing loggers

    # Log message formatting configuration
    'formatters': {

        # Simple log output
        'simple': {
            'format': '%(levelname)s %(module)s %(lineno)d %(message)s'
        },

        # Detailed log output
        'verbose': {
            'format': '%(levelname)s %(asctime)s %(filename)s %(lineno)d %(message)s'
        },
    },

    # Log handler configuration
    'handlers': {

        # Output logs to the terminal
        'console': {
            'level': 'DEBUG',                     # Levels handled: DEBUG and above
            'class': 'logging.StreamHandler',     # Log handler
            'formatter': 'simple'                 # Log formatting configuration
        },

        # Output logs to a file
        'file': {
            'level': 'INFO',                                    # Levels handled: INFO and above
            'class': 'logging.handlers.RotatingFileHandler',    # Use the rotating file handler
            'formatter': 'verbose',                             # Log formatting configuration
            'filename': './logs/server.log',  # Log file location (file name assumed; it was omitted in the original)
            'maxBytes': 1024 * 1024,        # Each log file is at most 1 MB, unit: bytes
            'backupCount': 20,              # When a file fills up, it rolls over; at most 20 backup files are kept
            'encoding': 'utf8',
        },
    },

    # Default root logger
    'root': {
        'level': 'DEBUG',           # Lowest log level accepted
        'handlers': ['console']     # Handlers to use
    },

    # Custom loggers
    'loggers': {
        'server': {
            'level': 'DEBUG',
            'handlers': ['file'],
            'propagate': True       # Set to False to stop passing log messages up to the parent logger's handlers
        }
    }
}


The top-level keys of the dictionary, such as version, formatters, handlers, root and loggers, are fixed configuration items. Some of the sub-options can be named freely, for example:

  • Under formatters, you can change the names of simple and verbose to what you want.
  • The console and file under handlers can also be modified.
  • The server under loggers can be modified in the same way.

The specific options are annotated one by one in the dictionary above, so I won't go through them all; I will only introduce the handler configuration.

The logging module provides a number of handler classes; in PyCharm you can use autocomplete on the logging module to bring up the most basic ones.

The StreamHandler used above is a stream handler: logs are written to the system's standard output stream, which is what the PyCharm terminal, the console, and so on display.

RotatingFileHandler is a subclass of FileHandler. Its main job is to write logs to a file; when the file reaches its size limit it automatically rolls over to a new file, achieving log file rotation.
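
As an illustration of what the dictionary configures under the hood (a minimal sketch, not part of the original article; the file name is only an example), a RotatingFileHandler can also be attached to a logger by hand:

import logging
from logging.handlers import RotatingFileHandler

handler = RotatingFileHandler(
    'demo.log',               # example file name
    maxBytes=1024 * 1024,     # roll over once the file reaches about 1 MB
    backupCount=5,            # keep at most 5 rotated files (demo.log.1 ... demo.log.5)
    encoding='utf8'
)
handler.setLevel(logging.INFO)
handler.setFormatter(logging.Formatter('%(levelname)s %(asctime)s %(message)s'))

logger = logging.getLogger('rotating_demo')
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
logger.info('rotating file handler test')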

Load Log Configuration Information

The log configuration is then loaded using the logging.config.dictConfig() method, which accepts the configuration dictionary as an argument.

#!/usr/bin/python3
# -*- coding: utf-8 -*-
# @Author: Hui
# @Desc: { use of the logging module }
# @Date: 2021/05/26 23:14
import logging
import logging.config

log_dict = {
    'version': 1,
    'disable_existing_loggers': False,  # Whether to disable already existing loggers

    # ... formatters and handlers omitted, same as above ...

    # Default root logger
    'root': {
        'level': 'DEBUG',  # Accepted log levels
        'handlers': ['console']
    },

    # Custom loggers
    'loggers': {
        'server': {
            'level': 'DEBUG',
            'handlers': ['file'],
            'propagate': True       # Set to False to stop passing log messages up to the parent logger's handlers
        }
    }
}


def setup_logging():
    """
    Configure logging
    :return:
    """
    logging.config.dictConfig(log_dict)
    logger = logging.getLogger()
    # logger = logging.getLogger('root')

    logger.debug('debug log test')
    logger.info('info log test')
    logger.warning('warning log test')
    logger.error('error log test')


def main():
    setup_logging()


if __name__ == '__main__':
    main()

Use logging.getLogger() to get the configured logger; it accepts a logger name. If you don't pass one, the root logger is used by default, which is the same as logging.getLogger('root').
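
A common convention (a side note, not from the original article) is to name the logger after the current module, so the %(name)s placeholder shows where each message came from:

import logging

logger = logging.getLogger(__name__)   # logger named after the module
logger.warning('named logger test')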

If you run the program and hit the following error:

ValueError: Unable to configure handler 'file'

That's because the 'file' handler in the logging configuration stores its log file at the path given by the filename entry, in this case:

./logs/		# i.e. in the logs directory under the current path

The logging module does not create the directory for us automatically, so simply create a logs directory in the current path.
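
If you prefer, the directory can also be created from code before the configuration is loaded, for example (a small sketch, assuming log_dict is the dictionary above):

import os
import logging.config

os.makedirs('./logs', exist_ok=True)   # make sure the logs directory exists
logging.config.dictConfig(log_dict)    # then load the configuration as before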

The final program runs as follows

DEBUG main 74 debug log test
INFO main 75 info log test
WARNING main 76 warning log test
ERROR main 77 error log test

To use the server logger instead of the root logger, change the code as follows:

import logging
import logging.config

log_dict = {...}  # same as above, omitted


def setup_logging():
    """
    Configure logging
    :return:
    """
    logging.config.dictConfig(log_dict)
    # logger = logging.getLogger()

    logger = logging.getLogger('server')
    logger.debug('debug log test')
    logger.info('info log test')
    logger.warning('warning log test')
    logger.error('error log test')


def main():
    setup_logging()


if __name__ == '__main__':
    main()

The results of the run are as follows:

Console output:

DEBUG main 75 debug log test
INFO main 76 info log test
WARNING main 77 warning log test
ERROR main 78 error log test

Contents of the log file under logs/:

Since the server logger is set to 'propagate': True, it passes log messages up to the parent (root) logger's handlers, so the messages are shown on the console as well as written to the file. However, the file handler's level is set to INFO, so the DEBUG messages do not appear in the file.
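
If you do not want the messages to reach the console, propagation can be switched off, either with 'propagate': False in the dictionary or at runtime (a minimal sketch, not from the original article):

import logging

logging.getLogger('server').propagate = False   # keep 'server' messages out of the root handlers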

Using a Logging Configuration File

Here I'm going to use a YAML logging configuration file. The configuration is the same as above, with the addition of an error_file_handler, which keeps the error logs in a separate file for later troubleshooting.

Creating the logging configuration file

Create a YAML configuration file with the following contents:

version: 1
disable_existing_loggers: true

# Log message formatting configuration
formatters:
    simple:
        format: '%(levelname)s %(filename)s %(lineno)d %(message)s'
    verbose:
        format: '%(levelname)s %(asctime)s -Loc %(filename)s -Row %(lineno)d -%(name)s %(message)s'

# Log handler configuration
handlers:
    console:
        class: logging.StreamHandler
        level: DEBUG
        formatter: simple
        stream: ext://sys.stdout

    # Error logs are handled separately
    error_file_handler:
        class: logging.handlers.RotatingFileHandler
        level: ERROR
        formatter: verbose
        filename: ./logs/errors.log   # Error log file location (file name assumed; it was omitted in the original)
        maxBytes: 10485760            # 10 MB maximum per log file
        backupCount: 20               # When a file fills up, it rolls over; at most 20 backup files are kept
        encoding: utf8

    server_file_handler:
        class: logging.handlers.RotatingFileHandler
        level: INFO                   # Only logs at INFO level and above are written to the file
        formatter: verbose
        filename: ./logs/server.log   # Project log file recording all log messages (file name assumed)
        maxBytes: 10485760            # 10 MB
        backupCount: 30
        encoding: utf8

# Root logger
root:
    level: DEBUG
    handlers: [console]

# Custom loggers
loggers:
    server:
        level: DEBUG      # Allow DEBUG and above logs
        handlers: [server_file_handler, error_file_handler]
        propagate: True   # Set to False to stop passing log messages up to the parent logger's handlers

The log configuration loading function

# log_test.py file

import os
import yaml
import logging
import coloredlogs
import logging.config


# Project root path
BASE_DIR = os.path.dirname(os.path.abspath(__file__))

# Log configuration file (file name assumed; it was omitted in the original)
LOG_CONF_FILE = os.path.join(BASE_DIR, 'logging.yaml')


def setup_logging(default_path=LOG_CONF_FILE, default_level=logging.INFO, env_key='LOG_CFG'):
    """
    Configure the project's logging
    :param default_path: default path of the log configuration file
    :param default_level: default log level (logging.INFO assumed here; it was omitted in the original)
    :param env_key: name of the system environment variable
    :return:
    """
    path = default_path

    value = os.getenv(env_key, None)  # Get the value of the corresponding environment variable
    if value is not None:
        path = value

    if os.path.exists(path):
        with open(path, mode='r', encoding='utf-8') as f:
            try:
                logging_yaml = yaml.safe_load(f.read())
                logging.config.dictConfig(logging_yaml)
                coloredlogs.install(level='DEBUG')
            except Exception as e:
                print(e)
                print('Unable to load the logging configuration file; check whether the logs directory exists. Falling back to the default logging configuration')
                logging.basicConfig(level=default_level)
                coloredlogs.install(level=default_level)
    else:
        logging.basicConfig(level=default_level)
        coloredlogs.install(level=default_level)
        print('Logging configuration file does not exist; falling back to the default logging configuration')

The following third-party libraries are used:

  • PyYAML is used to read the log configuration file in YAML format.
  • coloredlogs is used to display the logs in color on the console (see the short sketch below).
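
A minimal sketch of how coloredlogs is typically used on its own (not from the original article):

import logging
import coloredlogs

coloredlogs.install(level='DEBUG')   # add a colorized handler to the root logger
logging.info('colored log test')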

Then we just run the setup_logging() logging configuration function in our project.

Other modules can then call logging.getLogger('server') directly to get the configured logger.

# log_demo.py file

import logging

logger = logging.getLogger('server')  # Keep one module-level logger object

logger.debug('debug log test')


def log_test1():
    logger.info('info log test')


def log_test2():
    try:
        a = 1 / 0
    except Exception as e:
        logger.error(e)


class LogDemo(object):

    @staticmethod
    def log_test():
        logger.warning('warning log test')

# log_test.py

def main():
    setup_logging()

    logger = logging.getLogger('server')
    logger.debug('debug log test')
    logger.info('info log test')
    logger.warning('warning log test')
    logger.error('error log test')

    # Demonstrate logging from other modules
    import log_demo
    log_demo.log_test1()
    log_demo.log_test2()
    log_demo.LogDemo.log_test()


if __name__ == '__main__':
    main()

Log output demonstration

Running log_test.py gives the following results.

Console output

Contents of the full project log file

Contents of the error log file

Source code

The source code has been uploaded to GitHub: LogSetupDemo. You're welcome to take a look.

The above covers the details of how to introduce logging into a Python project. For more information about adding logging to Python projects, please check out my other related articles!