Java Socket reports a "Too many open files" error
Caused by: : Too many open files
    at .socket0(Native Method)
    at (:415)
    at .<init>(:88)
    at (:56)
    at (:108)
There are many possible causes.
A common explanation online is that the system's default limit on the number of open files is too low; for network-heavy or highly concurrent workloads, you can often fix the error by raising the relevant system parameters.
Check the current settings first:
$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1418
virtual memory          (kbytes, -v) unlimited
As you can see, the "open files" limit on my personal machine is 256.
It can be changed with the -n option:
ulimit -n 4096
The problem with this approach is that it only takes effect temporarily; the setting is lost after a reboot, for example. To make it permanent, Linux systems let you change the limits configuration file instead:
* soft nofile 65535
* hard nofile 65535
Adding these lines to /etc/security/limits.conf raises the upper limit on open files for all users' processes to 65535.
Here * means all users, and soft/hard set the soft and hard limits respectively; the change can also be restricted to a particular user or group (see the comments in the file for the exact syntax). The new limits take effect only after the system is restarted.
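If you want to confirm the limit that a running JVM actually sees after such changes, the JDK-specific com.sun.management extension exposes it on Unix-like systems. The following is only a minimal illustrative sketch (not part of the original write-up), and the class name is my own; it works on HotSpot/OpenJDK-style JVMs:

```java
import java.lang.management.ManagementFactory;

import com.sun.management.UnixOperatingSystemMXBean;

public class ShowFdLimit {
    public static void main(String[] args) {
        Object os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof UnixOperatingSystemMXBean) {
            // Reports the effective per-process descriptor limit,
            // i.e. what `ulimit -n` resolves to for this JVM process.
            System.out.println("max file descriptors: "
                    + ((UnixOperatingSystemMXBean) os).getMaxFileDescriptorCount());
        }
    }
}
```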
Sometimes, though, adjusting the parameters alone is not enough, because the root cause lies in the program itself. For example, I once had a task that failed to connect to a server; its retry policy handled failed attempts but put no limit on their number, so it kept opening a new socket every second. The system quickly exhausted its file descriptors and threw this error.
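A minimal sketch of the kind of fix this needs is shown below; the host, port, timeout, and retry constants are illustrative, not taken from the original program. The two key points are to cap the number of attempts and to close the socket of every failed attempt so its descriptor is released:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class BoundedReconnect {
    private static final int MAX_ATTEMPTS = 5;        // give up instead of retrying forever
    private static final long RETRY_DELAY_MS = 1000;  // pause between attempts

    public static Socket connect(String host, int port) throws IOException, InterruptedException {
        IOException lastFailure = null;
        for (int attempt = 1; attempt <= MAX_ATTEMPTS; attempt++) {
            Socket socket = new Socket();
            try {
                socket.connect(new InetSocketAddress(host, port), 3000);
                return socket;            // caller closes it when done
            } catch (IOException e) {
                socket.close();           // release the descriptor of the failed attempt
                lastFailure = e;
                Thread.sleep(RETRY_DELAY_MS);
            }
        }
        throw lastFailure;                // surface the last error once the retry budget is spent
    }
}
```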
In other cases, programs that do nothing more than read and write files also fail to complete once the limit is reached, for example:
java.net.SocketException: Too many open files (Accept failed)
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
Caused by: java.io.FileNotFoundException: /tmp/data_xxxxxx.dat (Too many open files)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
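When the culprit is file handles that are opened but never closed, which is a common program-side cause of traces like the one above, wrapping the stream in try-with-resources guarantees the descriptor is released even if an exception is thrown while reading. A minimal sketch (the path parameter and method name are only for illustration):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadWithAutoClose {
    // Counts lines in a file; the reader (and its file descriptor)
    // is closed automatically when the try block exits.
    public static long countLines(String path) throws IOException {
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            long lines = 0;
            while (reader.readLine() != null) {
                lines++;
            }
            return lines;
        }
    }
}
```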
There was even an error that looked like an out-of-memory problem:
out of disk space or the JVM running out of memory
How to troubleshoot the problem
If you have already determined that a particular program is at fault, you can analyze its logs, or check from the command line whether the process has opened too many file handles.
- First, find the process ID by keyword, for example for a Java process:
ps -ef | grep java
- Then use the process ID to list the files the process has open:
lsof -p 1902
- You can also simply count how many files are open:
lsof -p 1902 | wc -l
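As a complement to lsof, you can also watch descriptor usage from inside the JVM itself. This relies on the JDK-specific com.sun.management.UnixOperatingSystemMXBean interface (HotSpot/OpenJDK on Unix-like systems); the class below is only an illustrative sketch, not from the original article:

```java
import java.lang.management.ManagementFactory;

import com.sun.management.UnixOperatingSystemMXBean;

public class FdUsage {
    public static void main(String[] args) throws InterruptedException {
        Object os = ManagementFactory.getOperatingSystemMXBean();
        if (!(os instanceof UnixOperatingSystemMXBean)) {
            System.out.println("File descriptor counters are not exposed by this JVM/OS.");
            return;
        }
        UnixOperatingSystemMXBean unixOs = (UnixOperatingSystemMXBean) os;
        // Print the open-descriptor count every 5 seconds, roughly the
        // in-process equivalent of `lsof -p <pid> | wc -l`.
        while (true) {
            System.out.println("open file descriptors: " + unixOs.getOpenFileDescriptorCount());
            Thread.sleep(5000);
        }
    }
}
```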
Summary
The above is based on my personal experience; I hope it gives you a useful reference, and thank you for your support.