1. Automated file management
1.1 Batch rename files
Suppose you have a batch of files whose names need to be modified according to certain rules; you can use the os and re libraries to do this.
```python
import os
import re

# Set the directory path
directory = 'C:/path/to/your/files'

# Get the file list
files = os.listdir(directory)

# Batch rename files
for filename in files:
    # Replace the matching part of the file name
    new_name = re.sub(r'old_pattern', 'new_pattern', filename)
    os.rename(os.path.join(directory, filename), os.path.join(directory, new_name))

print("File renaming is done!")
```
1.2 Automatically classify files
Automatically classify files into different folders according to the file extension.
```python
import os
import shutil

# Set the directory path
directory = 'C:/path/to/your/files'

# Get the file list
files = os.listdir(directory)

# Define file classification rules
file_types = {
    'images': ['.jpg', '.jpeg', '.png', '.gif'],
    'documents': ['.pdf', '.txt', '.docx'],
    'audio': ['.mp3', '.wav']
}

# Create the target folders if they do not exist
for folder in file_types:
    if not os.path.exists(os.path.join(directory, folder)):
        os.makedirs(os.path.join(directory, folder))

# Move each file into the matching folder
for filename in files:
    file_path = os.path.join(directory, filename)
    if os.path.isfile(file_path):
        moved = False
        for folder, extensions in file_types.items():
            if any(filename.lower().endswith(ext) for ext in extensions):
                shutil.move(file_path, os.path.join(directory, folder, filename))
                moved = True
                break
        if not moved:
            print(f"File {filename} was not classified!")

print("File classification is completed!")
```
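The same classification logic can also be written with pathlib, which makes extension handling tidier. The `classify` helper and the temporary-directory demo below are an illustrative sketch, not part of the original script:

```python
import shutil
import tempfile
from pathlib import Path

def classify(directory, rules):
    """Move each file into the subfolder whose extension list matches its suffix."""
    base = Path(directory)
    for path in list(base.iterdir()):
        if not path.is_file():
            continue
        for folder, extensions in rules.items():
            if path.suffix.lower() in extensions:
                target = base / folder
                target.mkdir(exist_ok=True)
                shutil.move(str(path), str(target / path.name))
                break

# Demo in a throwaway temporary directory
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / 'photo.JPG').touch()
    (Path(tmp) / 'notes.txt').touch()
    classify(tmp, {'images': ['.jpg'], 'documents': ['.txt']})
    print(sorted(p.name for p in (Path(tmp) / 'images').iterdir()))  # ['photo.JPG']
```

Comparing `path.suffix.lower()` against the rule list also catches upper-case extensions like `.JPG`, which the plain `endswith` check would miss.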
2. Automated scheduled tasks
2.1 Use the schedule library to run tasks on a schedule
schedule is a lightweight Python library designed for job scheduling. You can use it to set up tasks that run at fixed intervals.
```python
import schedule
import time

# Define the task to be performed
def job():
    print("The task begins!")

# Run the task every 10 seconds
schedule.every(10).seconds.do(job)

# Keep checking for pending tasks
while True:
    schedule.run_pending()
    time.sleep(1)
```
2.2 Use APScheduler for complex scheduled tasks
APScheduler is a more powerful scheduling library that supports multiple trigger types (interval, cron, and one-off dates).
```python
from apscheduler.schedulers.blocking import BlockingScheduler
import datetime

# Define the task to be performed
def print_time():
    print(f"Current time: {datetime.datetime.now()}")

# Create a scheduler
scheduler = BlockingScheduler()

# Add a task that runs every minute
scheduler.add_job(print_time, 'interval', minutes=1)

# Start the scheduler
scheduler.start()
```
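When a third-party scheduler is not an option, a simple recurring job can be approximated with the standard library's threading.Timer. The `schedule_repeating` helper below is an illustrative sketch, not the APScheduler API; note that intervals drift slightly because each timer only starts after the previous run finishes:

```python
import threading

def schedule_repeating(interval, func, runs):
    """Call `func` every `interval` seconds, `runs` times, using threading.Timer."""
    state = {'remaining': runs, 'done': threading.Event()}

    def tick():
        func()
        state['remaining'] -= 1
        if state['remaining'] > 0:
            threading.Timer(interval, tick).start()  # chain the next run
        else:
            state['done'].set()  # signal that all runs have finished

    threading.Timer(interval, tick).start()
    return state['done']

results = []
done = schedule_repeating(0.01, lambda: results.append('tick'), runs=3)
done.wait(timeout=5)
print(results)  # ['tick', 'tick', 'tick']
```

Returning an Event lets the caller block until the job series completes, which is handy in scripts that must not exit early.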
3. Automatically send emails
Using the smtplib library, you can automate email sending, such as sending reports to customers or team members on a schedule.
```python
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

def send_email(subject, body, to_email):
    from_email = "your_email@"
    password = "your_password"

    # Set the email content
    msg = MIMEMultipart()
    msg['From'] = from_email
    msg['To'] = to_email
    msg['Subject'] = subject
    msg.attach(MIMEText(body, 'plain'))

    # Send the email
    try:
        server = smtplib.SMTP('', 587)
        server.starttls()
        server.login(from_email, password)
        text = msg.as_string()
        server.sendmail(from_email, to_email, text)
        server.quit()
        print("The email was sent successfully!")
    except Exception as e:
        print(f"Email sending failed: {e}")

# Call the function to send an email
send_email("Automation Report", "This is the content of the email sent automatically", "recipient_email@")
```
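The MIMEMultipart/MIMEText pair still works, but since Python 3.6 the standard library also offers the higher-level email.message.EmailMessage API, which is less error-prone for plain-text mail. The `build_report` helper and the example.com addresses below are illustrative placeholders:

```python
from email.message import EmailMessage

def build_report(subject, body, sender, recipient):
    """Build a plain-text message with the modern EmailMessage API."""
    msg = EmailMessage()
    msg['From'] = sender
    msg['To'] = recipient
    msg['Subject'] = subject
    msg.set_content(body)
    return msg

msg = build_report("Automation Report", "Weekly numbers attached.",
                   "sender@example.com", "recipient@example.com")
print(msg['Subject'])  # Automation Report

# Sending the built message would then look like (server details are placeholders):
#   import smtplib
#   with smtplib.SMTP('smtp.example.com', 587) as server:
#       server.starttls()
#       server.login(user, password)
#       server.send_message(msg)
```

`send_message` reads the sender and recipient from the message headers, so they only need to be set once.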
4. Automated web crawlers
Using the requests and BeautifulSoup libraries, web content can be crawled automatically and saved to files.
```python
import requests
from bs4 import BeautifulSoup

# Define the target URL to crawl
url = ""

# Send an HTTP request to obtain the web page content
response = requests.get(url)

# Parse the web page content
soup = BeautifulSoup(response.text, 'html.parser')

# Get the page title
title = soup.title.string

# Print the title
print(f"Web page title: {title}")

# Save the page content to a file (the file name is illustrative)
with open('page.html', 'w', encoding='utf-8') as f:
    f.write(response.text)
```
5. Automate data processing
5.1 Using the Pandas library to process data
If you often need to process CSV or Excel files, you can use the pandas library for data reading, processing, and export.
```python
import pandas as pd

# Read the CSV file
df = pd.read_csv('')

# Process the data (for example: keep rows where the value is greater than 100)
df_filtered = df[df['column_name'] > 100]

# Save the processed data to a new CSV file
df_filtered.to_csv('filtered_data.csv', index=False)

print("Data processing is complete!")
```
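When pandas is unavailable, the same filter can be done with the standard library's csv module. The `filter_rows` helper and the sample data below are illustrative:

```python
import csv
import io

def filter_rows(csv_text, column, threshold):
    """Return the rows whose `column` value exceeds `threshold`, using only the csv module."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if float(row[column]) > threshold]

data = "name,amount\napples,50\nbananas,150\ncherries,200\n"
print(filter_rows(data, 'amount', 100))
```

DictReader keeps every value as a string, so the numeric column has to be converted with `float` before comparing; pandas infers the dtype for you, which is the main convenience you give up.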
5.2 Regularly back up a database
Databases can be backed up on a schedule through Python scripts, reducing manual intervention.
```python
import datetime
import os

def backup_database():
    # Database connection configuration (the client library is unspecified here;
    # pymysql would be a typical choice, but the mysqldump call below does not need it)
    # db = pymysql.connect(host="localhost", user="your_user",
    #                      password="your_password", database="your_database")

    # Create a backup file name
    backup_filename = f"backup_{datetime.datetime.now().strftime('%Y%m%d_%H%M%S')}.sql"

    # Use mysqldump for the backup
    os.system(f"mysqldump -u your_user -p'your_password' your_database > {backup_filename}")

    print(f"Database backup completed! Backup file: {backup_filename}")

# Run a backup
backup_database()
```
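mysqldump only covers MySQL; if the data lives in SQLite, the standard library can make a consistent copy on its own via the online backup API. The `backup_sqlite` helper and the throwaway table below are illustrative:

```python
import os
import sqlite3
import tempfile

def backup_sqlite(source_path, backup_path):
    """Copy a SQLite database safely with the built-in online backup API."""
    src = sqlite3.connect(source_path)
    dst = sqlite3.connect(backup_path)
    try:
        src.backup(dst)  # consistent even if other connections are writing
    finally:
        src.close()
        dst.close()

# Demo with a throwaway database
with tempfile.TemporaryDirectory() as tmp:
    db_path = os.path.join(tmp, 'app.db')
    backup_path = os.path.join(tmp, 'app.backup.db')
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    conn.commit()
    conn.close()
    backup_sqlite(db_path, backup_path)
    check = sqlite3.connect(backup_path)
    print(check.execute("SELECT x FROM t").fetchall())  # [(1,)]
    check.close()
```

Unlike copying the file directly, `Connection.backup` takes a snapshot that is safe while the database is in use.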
6. Automated image processing
If you need to process image files automatically (e.g. batch resizing or format conversion), you can use the Pillow library.
```python
from PIL import Image
import os

# Set the image directory
image_directory = 'C:/path/to/your/images'

# Get all image files
files = os.listdir(image_directory)

# Batch resize images
for filename in files:
    if filename.endswith('.jpg'):
        image_path = os.path.join(image_directory, filename)
        with Image.open(image_path) as img:
            img = img.resize((800, 600))  # Resize to 800x600
            img.save(os.path.join(image_directory, f"resized_{filename}"))

print("Image processing is complete!")
```
7. Automate web operations
If you need to automate interaction with web pages, you can use Selenium to simulate browser operations.
```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.chrome.service import Service

# Set up the WebDriver (Selenium 4 takes the driver path via a Service object)
driver = webdriver.Chrome(service=Service("path/to/chromedriver"))

# Open the web page
driver.get("")

# Find and click a button
button = driver.find_element(By.XPATH, "//button[@id='submit']")
button.click()

# Get the web page content
page_content = driver.page_source
print(page_content)

# Close the browser
driver.quit()
```
Summary
Automating daily tasks with Python can greatly improve efficiency and reduce repetitive work. With Python's many libraries (os, shutil, schedule, smtplib, requests, pandas, Pillow, and others), you can easily implement automated tasks such as file management, scheduled jobs, email sending, web crawling, and data processing.
This concludes the article on example code for automating daily tasks with Python. For more on Python automation, please search my previous articles or continue browsing the related articles below. I hope you will continue to support me!