
Implementing a log monitoring and alarm system in Linux Shell

1. Log monitoring basics

Monitor file changes:

  • tail -f: follow the end of a file and print new lines in real time.
  • tail -n N: view only the most recent N lines.

Combine with a pipeline to filter for key content:

Use grep to extract lines containing specific keywords.

Example:

tail -f /var/log/syslog | grep "error"
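If a live feed is not needed, the same filter can be applied to a fixed number of recent lines, for example (the path is only for illustration):

tail -n 200 /var/log/syslog | grep -i "error"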

2. Log file rotation

Understand how log files are rotated (for example, rotated by size or by time).

Use logrotate for log management.
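A minimal logrotate configuration sketch for a hypothetical log at /var/log/myapp.log (the path and policy are placeholders, adjust them to your environment), typically dropped into /etc/logrotate.d/:

# Rotate once the file exceeds 10 MB, keep 5 compressed copies
/var/log/myapp.log {
    size 10M
    rotate 5
    compress
    missingok
    notifempty
}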

3. Real-time alarm mechanism

Combine with awk to match specific patterns and trigger an alarm.

Mail alarm:

Use the mail or sendmail command to send mail.

Terminal or desktop notification:

Use echo and notify-send.

Example:

tail -f  | awk '/ERROR/ {print "Error detected: "$0}'
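A hedged sketch of wiring these alarm channels to the same pattern match (the log path and email address below are placeholders; notify-send requires a desktop session, and mail requires a configured MTA):

# Desktop notification for every new ERROR line
tail -f /var/log/myapp.log | grep --line-buffered "ERROR" | while read -r line; do
  notify-send "Log alarm" "$line"
done

# Or mail each ERROR line instead
tail -f /var/log/myapp.log | grep --line-buffered "ERROR" | while read -r line; do
  echo "$line" | mail -s "Log alarm" admin@example.com
done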

4. Complex monitoring scenarios

Multi-keyword matching and hierarchical alarms.

Monitor multiple files and aggregate results.

5. Experimental examples

Experiment 1: Simple log monitoring

Goal: Monitor a log file in real time and extract lines containing "ERROR".

Experimental code:

#!/bin/bash
 
logfile=""
 
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
echo "Monitoring $logfile for 'ERROR'..."
tail -f "$logfile" | grep "ERROR"

Experiment 2: Dynamically monitor logs and send alarms

Goal: Detect error messages in the log file and display an alarm in the terminal.

Experimental code:

#!/bin/bash
 
logfile=""
 
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
monitor_log() {
  tail -f "$logfile" | while read -r line; do
    if echo "$line" | grep -q "ERROR"; then
      echo "[ALERT] $(date): $line"
    fi
  done
}
 
monitor_log

Experiment 3: Log keyword hierarchical alarm

Goal: Classify alarms by log content, for example "ERROR" triggers a high-priority alarm and "WARNING" triggers a normal alarm.

Experimental code:

#!/bin/bash
 
logfile=""
 
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
tail -f "$logfile" | while read -r line; do
  if echo "$line" | grep -q "ERROR"; then
    echo "[HIGH PRIORITY ALERT] $(date): $line"
  elif echo "$line" | grep -q "WARNING"; then
    echo "[Warning] $(date): $line"
  fi
done

Experiment 4: Monitoring multiple log files

Goal: Monitor multiple log files at the same time and merge the results.

Experimental code:

#!/bin/bash
 
logfiles=("/var/log/syslog" "/var/log/")
 
for logfile in "${logfiles[@]}"; do
  if [[ -f $logfile ]]; then
    tail -f "$logfile" | awk -v file="$logfile" '{print "[" file "] " $0}' &  # prefix each line with its source file and run in the background
  else
    echo "File not found: $logfile"
  fi
done
 
wait

Experiment 5: Customized alarm system

Goal: Send email notifications based on the log content.

Experimental code:

#!/bin/bash
 
logfile=""
email="admin@"
 
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
tail -f "$logfile" | while read -r line; do
  if echo "$line" | grep -q "CRITICAL"; then
    echo "Critical alert detected: $line" | mail -s "Critical Alert" "$email"
    echo "Email sent for alert: $line"
  fi
done
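Note: the mail command relies on a working local MTA (for example postfix or sendmail, usually installed alongside a mailutils or bsd-mailx package); without one, the alert mail will not be delivered.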

6. Practical exercises

Write a script that monitors /var/log/syslog, extracts the lines containing "Failed", and counts how many times they occur.

#!/bin/bash
 
logfile="/var/log/syslog"
count=0
 
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
echo "Monitoring $logfile for 'Failed'..."
 
# Note: the while loop runs in a pipeline subshell, so the running count is only visible inside the loop
tail -f "$logfile" | while read -r line; do
  if echo "$line" | grep -q "Failed"; then
    count=$((count + 1))
    echo "$line"
    echo "Total 'Failed' entries: $count"
  fi
done
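For a one-off count over the file's existing contents (rather than a live running total), grep -c is sufficient:

grep -c "Failed" /var/log/syslog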

Implement a script that monitors a specified directory and reports changes in its file count.

#!/bin/bash
 
monitor_dir="/path/to/your/directory"
 
if [[ ! -d $monitor_dir ]]; then
  echo "Directory not found: $monitor_dir"
  exit 1
fi
 
echo "Monitoring file changes in $monitor_dir..."
prev_count=$(ls "$monitor_dir" | wc -l)
 
while true; do
  current_count=$(ls "$monitor_dir" | wc -l)
  if [[ $current_count -ne $prev_count ]]; then
    echo "$(date): File count changed from $prev_count to $current_count"
    prev_count=$current_count
  fi
  sleep 2
done
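Counting with ls | wc -l miscounts if a filename contains a newline; a more robust count sketch (assuming GNU find) is:

current_count=$(find "$monitor_dir" -mindepth 1 -maxdepth 1 -printf '.' | wc -c)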

Improve the code of Experiment 3 to support specifying keywords and alarm levels through a configuration file.

#!/bin/bash
 
logfile=""
config_file=""
 
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
if [[ ! -f $config_file ]]; then
  echo "Config file not found: $config_file"
  exit 1
fi
 
declare -A keywords
 
while IFS=: read -r keyword level; do
  keywords["$keyword"]=$level
done < "$config_file"
 
tail -f "$logfile" | while read -r line; do
  for keyword in "${!keywords[@]}"; do
    if echo "$line" | grep -q "$keyword"; then
      echo "[${keywords[$keyword]} PRIORITY] $(date): $line"
    fi
  done
done
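A hypothetical configuration file in the keyword:level format that the read loop above expects (the keywords and levels are only illustrative):

ERROR:HIGH
WARNING:NORMAL
CRITICAL:HIGH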

Use tail -f and awk to monitor a log in real time and count the number of accesses per minute.

#!/bin/bash
 
logfile=""
if [[ ! -f $logfile ]]; then
  echo "Log file not found: $logfile"
  exit 1
fi
 
echo "Monitoring $logfile for access counts per minute..."
 
tail -f "$logfile" | awk '
{
  timestamp = substr($4, 2, 17)                              # extract "dd/MMM/yyyy:HH:mm" from the bracketed field
  split(timestamp, time_parts, ":")
  minute = time_parts[1] ":" time_parts[2] ":" time_parts[3] # keep date, hour, and minute
  access_counts[minute]++
  print "Access count for " minute ": " access_counts[minute]
}'
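This assumes an Apache/Nginx-style access log in which the fourth whitespace-separated field is the bracketed timestamp, for example a line such as:

127.0.0.1 - - [14/Apr/2025:10:15:30 +0000] "GET / HTTP/1.1" 200 612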

Write a script to automatically archive and compress a log file once it exceeds a specified size.

#!/bin/bash
 
logfile=""
max_size=1048576  # 1 MB in bytes
archive_dir="archives"
 
mkdir -p "$archive_dir"
 
while true; do
  if [[ -f $logfile ]]; then
    log_size=$(stat -c%s "$logfile")
    if (( log_size > max_size )); then
      timestamp=$(date +'%Y%m%d_%H%M%S')
      mv "$logfile" "$archive_dir/application_$timestamp.log"
      gzip "$archive_dir/application_$timestamp.log"
      echo "Archived and compressed $logfile at $timestamp"
      > "$logfile"  # Recreate an empty log file in place of the archived one
    fi
  fi
  sleep 10
done

This concludes the article on implementing a log monitoring and alarm system with Linux Shell.