
Analysis of common Java thread pools: principles and usage

1. Introduction

What is a thread pool?

Everyone has probably heard of the concept of a pool. A pool is a container that holds resources ready for immediate use; Java has thread pools, connection pools, and so on. A thread pool creates a number of idle threads when the system starts or when the pool is instantiated, and these threads wait to be scheduled for work. After executing a task, a thread is not destroyed immediately; instead it returns to the idle state and waits for the next task.

What is the working mechanism of a thread pool?

In the thread pool programming model, tasks are not submitted directly to a thread but to the pool. When the pool receives a task, it checks whether there is an idle thread; if so, the task is handed to that thread for execution. If not, the task enters the waiting queue and waits for a thread to become free. If the maximum amount of accepted work is exceeded, the pool's rejection policy is triggered.

Why use a thread pool?

Creating and destroying threads consumes a significant amount of resources, and doing so repeatedly is clearly wasteful. Pooling also gives fast response: a ready thread is picked up when needed, with no time spent waiting for thread creation. In addition, a thread pool lets the system manage its threads, such as their number and scheduling, in one place.

2. Introduction to common thread pools

In Java, ExecutorService is the parent interface implemented by thread pools (it is not the top-level interface; that is Executor). The four commonly used thread pools below are all created through the Executors factory class, and each can be used as an ExecutorService.

Single thread pool (Executors.newSingleThreadExecutor())
There is only a single worker thread for task scheduling, which guarantees that tasks are executed in order (FIFO, LIFO).

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PoolTest {
	public static void main(String[] args) {
		// Create a single thread pool
		ExecutorService singleThreadExecutor = Executors.newSingleThreadExecutor();
		List<String> list = new ArrayList<String>();
		list.add("first");
		list.add("second");
		list.add("third");
		list.forEach(o -> {
			// Traverse the collection and submit tasks
			singleThreadExecutor.execute(new Runnable() {

				@Override
				public void run() {
					System.out.println(Thread.currentThread().getName() + " : " + o);
					try {
						// 1s interval
						Thread.sleep(1000);
					} catch (InterruptedException e) {
						e.printStackTrace();
					}
				}
			});
		});
	}
}

Execution results:

pool-1-thread-1 : first

pool-1-thread-1 : second

pool-1-thread-1 : third

Cacheable thread pool (Executors.newCachedThreadPool())

If the pool contains an idle thread, it is reused; otherwise a new thread is created and added to the pool. The number of threads in the pool can grow up to Integer.MAX_VALUE. It is usually used for large numbers of short-lived tasks.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PoolTest {
	public static void main(String[] args) {
		// Create a cacheable thread pool
		ExecutorService cachedThreadPool = Executors.newCachedThreadPool();
		List<String> list = new ArrayList<String>();
		list.add("first");
		list.add("second");
		list.add("third");
		list.forEach(o -> {

			try {
				// 3s interval between submissions
				Thread.sleep(3000);
			} catch (InterruptedException e) {
				e.printStackTrace();
			}

			// Traverse the collection and submit tasks
			cachedThreadPool.execute(new Runnable() {

				@Override
				public void run() {
					System.out.println(Thread.currentThread().getName() + " : " + o);
					try {
						// 1s interval
						Thread.sleep(1000);
					} catch (InterruptedException e) {
						e.printStackTrace();
					}
				}
			});
		});
	}
}

Execution results:

pool-1-thread-1 : first

pool-1-thread-1 : second

pool-1-thread-1 : third

Because the interval between submissions (3s) is longer than the task's running time (1s), the previous task has already finished by the time the next one is submitted, so the same thread can be reused. If the interval is shorter, new threads are created to run some of the tasks.

Reduce the interval before each submission from 3s to 1s:

Execution results:

pool-1-thread-1 : first

pool-1-thread-2 : second

pool-1-thread-1 : third

Fixed-length thread pool (Executors.newFixedThreadPool(int nThreads))
Creates a thread pool with a fixed number of threads; the number is passed in as a parameter.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class PoolTest {
	public static void main(String[] args) {
		// Create a fixed-length thread pool with 3 threads
		ExecutorService fixedThreadPool = Executors.newFixedThreadPool(3);
		List<String> list = new ArrayList<String>();
		list.add("first");
		list.add("second");
		list.add("third");
		list.add("fourth");
		list.forEach(o -> {

			try {
				// 1s interval between submissions
				Thread.sleep(1000);
			} catch (InterruptedException e) {
				e.printStackTrace();
			}

			// Traverse the collection and submit tasks
			fixedThreadPool.execute(new Runnable() {

				@Override
				public void run() {
					System.out.println(Thread.currentThread().getName() + " : " + o);
					try {
						// 1s interval
						Thread.sleep(1000);
					} catch (InterruptedException e) {
						e.printStackTrace();
					}
				}
			});
		});
	}
}

Execution results:

pool-1-thread-1 : first

pool-1-thread-2 : second

pool-1-thread-3 : third

pool-1-thread-1 : fourth

Timed thread pool (Executors.newScheduledThreadPool(int corePoolSize))
Creates a fixed-length thread pool that supports timed (delayed) and periodic task execution.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PoolTest {
	public static void main(String[] args) {
		// Create a fixed-length thread pool that supports timed, delayed and periodic tasks
		ScheduledExecutorService scheduledThreadPool = Executors.newScheduledThreadPool(3);
		scheduledThreadPool.scheduleAtFixedRate(new Runnable() {

			@Override
			public void run() {
				System.out.println(Thread.currentThread().getName() + " : Execute every 3 seconds after 1 second");
			}
		}, 1, 3, TimeUnit.SECONDS);
	}
}

Execution results:

pool-1-thread-1: Execute every 3 seconds after 1 second

pool-1-thread-1: Execute every 3 seconds after 1 second

pool-1-thread-2: Execute every 3 seconds after 1 second

pool-1-thread-2: Execute every 3 seconds after 1 second

pool-1-thread-2: Execute every 3 seconds after 1 second

pool-1-thread-2: Execute every 3 seconds after 1 second

pool-1-thread-2: Execute every 3 seconds after 1 second
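
In addition to periodic execution with scheduleAtFixedRate, the scheduled pool can also run one-shot delayed tasks via schedule(). Below is a minimal sketch; the 5 second delay and the class name DelayDemo are illustrative values, not taken from the original example.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class DelayDemo {
	public static void main(String[] args) {
		ScheduledExecutorService scheduledThreadPool = Executors.newScheduledThreadPool(3);
		// Run once, 5 seconds after submission
		scheduledThreadPool.schedule(new Runnable() {

			@Override
			public void run() {
				System.out.println(Thread.currentThread().getName() + " : Execute once after a 5 second delay");
			}
		}, 5, TimeUnit.SECONDS);
	}
}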

3. Custom thread pool

Common constructors:

ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue<Runnable> workQueue)

Overloads of this constructor additionally accept a ThreadFactory and a RejectedExecutionHandler, which is why seven parameters are described below.

Parameter description:

1. corePoolSize: the core pool size. When the number of threads is < corePoolSize, a new thread is created to execute the submitted runnable.

2. maximumPoolSize: the maximum number of threads. When the number of threads is >= corePoolSize, new runnables are first placed in the workQueue.

3. keepAliveTime: the maximum time an idle thread may stay alive when the number of threads is greater than corePoolSize.

4. unit: the time unit of keepAliveTime.

5. workQueue: the blocking queue that holds waiting tasks.

6. threadFactory: the factory used to create new threads.

7. handler: the rejection policy (a sketch using all seven parameters follows this list).
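
As a hedged illustration of the full seven-parameter constructor, the following sketch also supplies a ThreadFactory and a rejection policy. The class name, the thread-name prefix "my-pool-", and the pool sizes are made-up example values.

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class FullConstructorDemo {
	public static void main(String[] args) {
		// Custom thread factory so pool threads get a recognizable name (example prefix "my-pool-")
		ThreadFactory factory = new ThreadFactory() {
			private final AtomicInteger count = new AtomicInteger(1);

			@Override
			public Thread newThread(Runnable r) {
				return new Thread(r, "my-pool-" + count.getAndIncrement());
			}
		};

		// All seven parameters: core size, max size, keep-alive time, unit, work queue, thread factory, rejection policy
		ThreadPoolExecutor pool = new ThreadPoolExecutor(
				2, 4, 30, TimeUnit.SECONDS,
				new LinkedBlockingQueue<Runnable>(100),
				factory,
				new ThreadPoolExecutor.AbortPolicy());

		pool.execute(() -> System.out.println(Thread.currentThread().getName() + " : hello"));
		pool.shutdown();
	}
}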

Task execution order:

1. When the number of threads is less than corePoolSize, a new thread is created to execute the task.

2. When the number of threads is greater than or equal to corePoolSize and the workQueue is not full, the task is placed in the workQueue.

3. When the number of threads is greater than or equal to corePoolSize, the workQueue is full, and the total number of threads is less than maximumPoolSize, a new thread is created to execute the task.

4. When the total number of threads equals maximumPoolSize and the workQueue is full, the handler's rejectedExecution is invoked, i.e. the rejection policy (see the sketch after this list).
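
The sketch below makes this order concrete. The class name and the sizes (corePoolSize = 1, maximumPoolSize = 2, queue capacity = 1) are invented for illustration: the first task occupies the core thread, the second waits in the queue, and the third forces a non-core thread to be created.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class OrderDemo {
	public static void main(String[] args) {
		// corePoolSize = 1, maximumPoolSize = 2, queue capacity = 1 (illustrative values)
		ThreadPoolExecutor pool = new ThreadPoolExecutor(
				1, 2, 10, TimeUnit.SECONDS, new ArrayBlockingQueue<Runnable>(1));

		Runnable slowTask = () -> {
			System.out.println(Thread.currentThread().getName() + " running");
			try {
				Thread.sleep(2000);
			} catch (InterruptedException e) {
				e.printStackTrace();
			}
		};

		pool.execute(slowTask); // 1st task: the core thread is created
		pool.execute(slowTask); // 2nd task: core thread busy, task goes to the queue
		pool.execute(slowTask); // 3rd task: queue full, a non-core thread is created
		// A 4th execute() here would trigger the rejection policy (AbortPolicy by default)

		pool.shutdown();
	}
}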

ThreadPoolExecutor provides four built-in rejection policies:

1. new ThreadPoolExecutor.AbortPolicy(): directly throws a RejectedExecutionException (the default).

2. new ThreadPoolExecutor.CallerRunsPolicy(): runs the rejected task's run method directly in the calling thread, blocking the submitter (demonstrated in the sketch after this list).

3. new ThreadPoolExecutor.DiscardPolicy(): silently discards the newly submitted task.

4. new ThreadPoolExecutor.DiscardOldestPolicy(): discards the task at the head of the queue and retries the submission.
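
The following sketch shows CallerRunsPolicy in action; the class name and the deliberately tiny pool sizes are assumed for the example. When the pool and queue are saturated, the third task is handed back to the submitting (main) thread instead of an exception being thrown.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionDemo {
	public static void main(String[] args) {
		// A deliberately tiny pool: 1 core thread, 1 max thread, queue capacity 1
		ThreadPoolExecutor pool = new ThreadPoolExecutor(
				1, 1, 10, TimeUnit.SECONDS,
				new ArrayBlockingQueue<Runnable>(1),
				new ThreadPoolExecutor.CallerRunsPolicy());

		for (int i = 0; i < 3; i++) {
			final int taskNo = i;
			pool.execute(() -> {
				System.out.println(Thread.currentThread().getName() + " : task " + taskNo);
				try {
					Thread.sleep(1000);
				} catch (InterruptedException e) {
					e.printStackTrace();
				}
			});
		}
		// The 3rd task is rejected by the saturated pool and therefore runs on the "main" thread

		pool.shutdown();
	}
}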

Buffer queue BlockingQueue:

A BlockingQueue is a thread-safe queue whose insert and remove operations block when the queue is full or empty. Implementations such as LinkedBlockingQueue use separate locks for putting and taking, so one thread can store elements while another fetches them at the same time, which improves queue access efficiency while remaining safe under concurrency.
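
A minimal producer/consumer sketch (the capacity of 2 and the messages are illustrative assumptions): one thread puts elements while another takes them, and both calls block when the queue is full or empty.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class QueueDemo {
	public static void main(String[] args) {
		// Bounded queue with capacity 2: put() blocks when full, take() blocks when empty
		BlockingQueue<String> queue = new LinkedBlockingQueue<String>(2);

		// Producer: puts three messages into the queue
		new Thread(() -> {
			try {
				queue.put("first");
				queue.put("second");
				queue.put("third"); // blocks until the consumer has taken an element
			} catch (InterruptedException e) {
				e.printStackTrace();
			}
		}, "producer").start();

		// Consumer: takes the messages, blocking when the queue is empty
		new Thread(() -> {
			try {
				for (int i = 0; i < 3; i++) {
					System.out.println(Thread.currentThread().getName() + " took " + queue.take());
				}
			} catch (InterruptedException e) {
				e.printStackTrace();
			}
		}, "consumer").start();
	}
}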

Several commonly used BlockingQueues:

  • ArrayBlockingQueue(int i): a BlockingQueue of a specified size; the capacity must be given at construction. The objects it contains are ordered FIFO.
  • LinkedBlockingQueue() or LinkedBlockingQueue(int i): a BlockingQueue without a fixed size. If a size is given at construction, the queue is bounded by it; if not, the bound is Integer.MAX_VALUE. The objects it contains are ordered FIFO.
  • PriorityBlockingQueue() or PriorityBlockingQueue(int i): similar to LinkedBlockingQueue, but the objects it contains are not ordered FIFO; they are ordered by their natural ordering or by the Comparator passed to the constructor.
  • SynchronousQueue(): a special BlockingQueue in which every insert must wait for a corresponding remove by another thread, so puts and takes alternate.
import java.util.concurrent.LinkedBlockingDeque;
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolTest {
	public static void main(String[] args) {
		// Work queue
		LinkedBlockingDeque<Runnable> workQueue = new LinkedBlockingDeque<Runnable>();
		// Reject policy (AbortPolicy is used here as an example; it is also the default)
		RejectedExecutionHandler handler = new ThreadPoolExecutor.AbortPolicy();
		ThreadPoolExecutor threadPoolExecutor = new ThreadPoolExecutor(2, 10, 20, TimeUnit.SECONDS, workQueue, handler);
		threadPoolExecutor.execute(new Runnable() {

			@Override
			public void run() {
				System.out.println("Custom thread pool");
			}
		});
	}
}

The above is all the content of this article. I hope it will be helpful to everyone's study and I hope everyone will support me more.