As an operations and maintenance engineer, you process logs and analyze data all the time. The sort command is Linux's magic tool for efficient text sorting: it can quickly sort, deduplicate, and count file content. This article helps you master its core usage in the simplest possible way.
1. Basic sorting: tame messy text in seconds
# By default, sort works in ascending dictionary order (on a file or an input stream)
sort file.txt
# Example: sort a log by time (assuming the first column is the timestamp)
sort /var/log/nginx/
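sort also reads from a pipe, so you can try it without touching any file. A quick sketch with made-up input:
printf 'banana\napple\ncherry\n' | sort   # prints apple, banana, cherry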
2. Practical parameters: precise control over sorting
Sort by numeric value with -n
When processing numbers, be sure to use -n, otherwise "10" gets sorted before "2"!
sort -n numbers.txt
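To see why this matters, compare the default dictionary order with -n on the same made-up numbers:
printf '10\n2\n1\n' | sort     # dictionary order: 1, 10, 2
printf '10\n2\n1\n' | sort -n  # numeric order:    1, 2, 10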
Reverse order with -r
Sort from large to small, or from Z to A:
sort -nr large_numbers.txt  # numbers in descending order
Sort by a specified column with -k
Use -k to pick the column and -t to specify the delimiter (such as a comma or colon):
# Sort a CSV file by column 2 (numerically)
sort -t',' -k2n data.csv
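Note that -k2 alone means "from field 2 to the end of the line"; to sort strictly on column 2, restrict the key with -k2,2 (data.csv is just a placeholder name):
sort -t',' -k2,2n data.csv   # numeric sort on the second column only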
Deduplicate with -u
Quickly clean up duplicate lines (the lines are sorted first, then duplicates are removed):
sort -u ips.txt > unique_ips.txt
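A handy follow-up when analyzing logs is counting how many distinct lines remain; a small sketch, assuming ips.txt holds one IP per line:
sort -u ips.txt | wc -l   # number of distinct IPs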
Ignore case with -f
Treat "Apple" and "apple" as the same:
sort -f mixed_case.txt
3. Practical operation and maintenance scenarios
1. Count IP access frequency in a log
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr
Step breakdown:
- awk extracts the IP column
- sort sorts the IPs so that uniq can count adjacent duplicates
- uniq -c counts each IP
- sort -nr orders the result by visit count, descending
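If you only care about the heaviest visitors, append head. A sketch assuming a standard nginx access.log with the client IP in the first column:
awk '{print $1}' access.log | sort | uniq -c | sort -nr | head -n 10
# each output line looks like: "  1523 192.0.2.10"  (count, then IP)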
2. Sort processes by memory usage
ps aux --sort=-%mem | head -n 10
- --sort=-%mem is equivalent to sort -k4nr (reverse numeric order on column 4, the %MEM column)
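A roughly equivalent pipeline using sort itself might look like this (tail -n +2 drops the ps header line so it does not get mixed into the sort):
ps aux | tail -n +2 | sort -k4nr | head -n 10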
3. Merge multiple sorted files
sort -m sorted1.txt sorted2.txt > merged.txt
- -m (merge) is far more efficient than re-sorting large files from scratch, because the inputs are already sorted
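A typical use is the split-sort-merge pattern for a file too large to sort comfortably in one pass; a sketch with hypothetical file names (huge.log, chunk_ prefix):
split -l 1000000 huge.log chunk_                 # split into 1M-line pieces
for f in chunk_*; do sort "$f" -o "$f"; done     # sort each piece in place
sort -m chunk_* > huge_sorted.log                # merge the already-sorted pieces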
4. Avoiding common pitfalls
Performance optimization:
When processing very large files, use -T to specify a temporary directory (to avoid running out of space on the default partition):
sort -T /mnt/big_disk/tmp/ huge_file.txt
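On GNU sort you can also raise the in-memory buffer and use several CPU cores; a sketch, with illustrative size and core count:
sort -T /mnt/big_disk/tmp/ -S 2G --parallel=4 huge_file.txt > sorted.txt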
Locale:
If non-English text sorts unexpectedly, set LC_ALL=C to disable localization rules:
LC_ALL=C sort file.txt
Stable sorting:
If you need to preserve the original order of rows with equal keys, add -s (stable sort).
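For example, when sorting only on one key column, -s keeps rows whose keys are equal in their original input order (data.csv is again a placeholder):
sort -s -t',' -k2,2n data.csv   # ties in column 2 keep their input order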
5. Summary
The combination of sort with awk and uniq is a Swiss army knife for operations and maintenance data analysis. Master the core parameters -n (numeric), -k (column), -t (separator), -r (reverse) and -u (deduplicate), and you can cover 90% of everyday sorting needs.
Remember: before processing data, test the command on a small sample with head first, so a mistake does not blow up on a huge file!
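For example, a quick dry run on just the first lines of a large file (file names are illustrative):
head -n 1000 huge_file.txt | sort -t',' -k2,2n | head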
This concludes the quick-start guide to the Linux sort command. For more on the sort command, search my earlier articles or browse the related articles below, and I hope you will continue to support me!