SoFunction
Updated on 2025-04-14

A magical journey through Python file processing

Introduction

Have you ever felt helpless when facing a pile of messy files? Have you ever wished you could read, modify, and store data with ease? Python file processing may be exactly the magic you are looking for.

The importance of file processing

File processing is crucial to:

  • Data persistence: Save the data to disk for subsequent use.
  • Configuration Management: Read and write configuration files to control program behavior.
  • Logging: Record information while the program runs, making debugging and monitoring easier.

Basic concepts

Before we dive into the file processing, we need to understand some basic concepts:

  • File object: The abstraction Python uses to represent an open file.
  • File handle: The internal representation the operating system uses to access the file.
  • Opening and closing files: Use the open() function to open a file, and close it once the operation is complete.
  • Read and write modes: A file can be opened in modes such as read ('r'), write ('w'), and append ('a').
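The file-object and handle concepts above can be made concrete in a short sketch: open() returns a file object whose attributes describe the underlying handle, and close() releases it. The filename demo.txt is hypothetical:

```python
# open() returns a file object; the OS-level handle is managed through it.
f = open('demo.txt', 'w')          # 'demo.txt' is a hypothetical filename
print(f.name, f.mode, f.closed)    # demo.txt w False
f.write('hello\n')
f.close()                          # releases the underlying file handle
print(f.closed)                    # True
```

In practice the with statement shown later is preferred, since it calls close() automatically even when an error occurs.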

Main part

Read the file

In Python, reading a file usually involves the following steps:

1. Use the open() function to open the file in read mode.

2. Use the file object's read() or readline() method to read the content.

3. Close the file to free up system resources.

with open('example.txt', 'r') as file:  # hypothetical filename
    content = file.read()
    print(content)
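read() loads the whole file at once; the readline() method and direct iteration over the file object read one line at a time, which is friendlier to memory for large files. A small self-contained sketch (sample.txt is a hypothetical filename):

```python
# Write a small sample file so the example is self-contained.
with open('sample.txt', 'w') as f:
    f.write('line 1\nline 2\nline 3\n')

# readline() returns one line per call, including the trailing newline.
with open('sample.txt', 'r') as f:
    first = f.readline()
    print(first.strip())

# Iterating the file object reads lazily, line by line.
with open('sample.txt', 'r') as f:
    for line in f:
        print(line.strip())
```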

Write to a file

Writing to a file is similar to reading, but the file needs to be opened in write mode:

1. Use the open() function to open the file in write mode.

2. Use the file object's write() method to write the content.

3. Close the file.

with open('example.txt', 'w') as file:  # hypothetical filename
    file.write('Hello, World!')
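Write mode ('w') truncates an existing file, while append mode ('a') keeps the existing content and writes at the end. A small sketch of the difference (notes.txt is a hypothetical filename):

```python
# 'w' overwrites; 'a' keeps existing content and writes at the end.
with open('notes.txt', 'w') as f:
    f.write('Hello, World!\n')

with open('notes.txt', 'a') as f:
    f.write('Appended line\n')

with open('notes.txt', 'r') as f:
    print(f.read())  # both lines are present
```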

Modify the file

Modifying a file usually involves reading existing content, making changes, and then writing back to the file:

with open('example.txt', 'r') as file:  # hypothetical filename
    lines = file.readlines()

# Modify the first line
lines[0] = 'Modified line\n'

with open('example.txt', 'w') as file:
    file.writelines(lines)

Handle different types of files

Text file

Reading and writing text files is the most common file operation. Use the open() function and specify an appropriate encoding (e.g. 'utf-8').
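A minimal round-trip sketch with an explicit encoding, which avoids platform-dependent defaults when the file contains non-ASCII text (greeting.txt is a hypothetical filename):

```python
# Explicit encoding makes the byte representation predictable
# regardless of the platform's default encoding.
text = 'héllo wörld'

with open('greeting.txt', 'w', encoding='utf-8') as f:
    f.write(text)

with open('greeting.txt', 'r', encoding='utf-8') as f:
    print(f.read())
```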

CSV file

Python's csv module provides functions for reading and writing CSV files. Using csv.reader and csv.writer simplifies the processing of CSV files.

import csv

# 'data.csv' and 'output.csv' are hypothetical filenames
with open('data.csv', 'r') as file:
    reader = csv.reader(file)
    for row in reader:
        print(row)

with open('output.csv', 'w', newline='') as file:
    writer = csv.writer(file)
    writer.writerow(['Name', 'Age', 'City'])
    writer.writerow(['John', 30, 'New York'])

JSON files

JSON is a lightweight data interchange format; Python's json module makes serialization and deserialization easy.

import json

data = {'name': 'John', 'age': 30, 'city': 'New York'}
with open('data.json', 'w') as file:  # hypothetical filename
    json.dump(data, file)
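Deserialization is the mirror image: json.load reads a JSON file back into Python objects. A self-contained round-trip sketch (data.json is a hypothetical filename):

```python
import json

# json.dump serializes to the file; json.load deserializes from it.
record = {'name': 'John', 'age': 30, 'city': 'New York'}

with open('data.json', 'w') as f:
    json.dump(record, f)

with open('data.json', 'r') as f:
    restored = json.load(f)

print(restored == record)  # the round trip preserves the data
```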

Sample code

Let's demonstrate the application of Python file processing in real projects through a case study. In this case, we will simulate a simple log analysis task where we need to extract error information from a series of log files and generate a report containing error statistics.

Suppose we have the following log file format:

2024-06-07 12:00:00 INFO Starting application...
2024-06-07 12:00:05 ERROR Failed to load module!
2024-06-07 12:00:10 INFO User logged in.
2024-06-07 12:00:15 ERROR Database connection failed.
...

Our goal is to count the number of occurrences of each error type and write the result to a new file.

# encoding='utf-8'
from collections import defaultdict
import os
import re

# Directory where the log files reside
log_directory = 'logs'
# Pattern of a log line: timestamp, level, message
log_pattern = re.compile(r'^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (\S+) (.*)$')

# Dictionary for storing error counts
error_counts = defaultdict(int)

# Traverse all files in the log directory
for filename in os.listdir(log_directory):
    if filename.endswith('.log'):
        with open(os.path.join(log_directory, filename), 'r') as file:
            for line in file:
                match = log_pattern.match(line)
                if match:
                    _, level, message = match.groups()
                    if level == 'ERROR':
                        # Use the error message as the error type
                        error_counts[message] += 1

# Write error statistics to the report file
with open('error_report.txt', 'w') as report_file:
    report_file.write('Error Report\n')
    report_file.write('============\n')
    for error_type, count in error_counts.items():
        report_file.write(f'{error_type}: {count}\n')

print('Error report generated successfully.')

Code explanation

Import modules: We import defaultdict for error counting, os for file and directory operations, and re for regular expression matching.

Define the log directory and pattern: We define the directory where the log files reside and the regular expression pattern for a log line.

Iterate over log files: We traverse all .log files in the specified directory and read their content line by line.

Match and count: For each line, we use the regular expression to match the date, time, and log level. If the level is ERROR, the error message is extracted and its count is updated.

Generate a report: Finally, we write the error counts to a file named error_report.txt.

This case shows how to use Python for file reading, regular expression matching, data collection and report generation, which are common application scenarios for file processing in actual projects.


in conclusion

In this article, we explored the basic concepts and practices of file processing in Python. Mastering these skills is crucial for any Python developer. Remember to always follow best practices, such as using the with statement to manage file opening and closing automatically, and handling exceptions.
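As a closing illustration, here is one way exception handling can be combined with the with statement; config.txt is a hypothetical filename:

```python
# The 'with' statement closes the file even when an error occurs,
# and except clauses report common failure modes.
try:
    with open('config.txt', 'r') as f:
        data = f.read()
except FileNotFoundError:
    print('File not found; using defaults.')
except OSError as e:
    print(f'Could not read file: {e}')
```

Catching FileNotFoundError separately lets the program fall back to defaults, while the broader OSError clause covers other I/O failures such as permission errors.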

