
Detailed explanation of commonly used blocking queues and non-blocking queues in Java

Queue Overview and Common Methods

A queue is an ordered list that can be implemented with an array or a linked list and follows the first-in, first-out (FIFO) principle. On this basis, queues are further divided into blocking queues, non-blocking queues, priority queues, and so on.

All queues in Java implement the Queue interface, which defines the following methods:

Method | Description | Note
add(E) | Inserts an element; returns true on success | Throws an exception if the queue is full
offer(E) | Inserts an element; returns true on success | Returns false if the queue is full
remove() | Returns and removes the head element | Throws an exception if the queue is empty
poll() | Returns and removes the head element | Returns null if the queue is empty
element() | Returns the head element without removing it | Throws an exception if the queue is empty
peek() | Returns the head element without removing it | Returns null if the queue is empty
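
As a quick illustration of these contracts, the sketch below uses a bounded ArrayBlockingQueue (covered later in this article) with capacity 1, so both the "full queue" and "empty queue" cases can be triggered. The class name and values are chosen only for the demonstration.

import java.util.Queue;
import java.util.concurrent.ArrayBlockingQueue;

public class QueueMethodDemo {
    public static void main(String[] args) {
        Queue<String> queue = new ArrayBlockingQueue<>(1); // capacity 1 so the queue fills up quickly

        System.out.println(queue.offer("a"));   // true, the element is added
        System.out.println(queue.offer("b"));   // false, the queue is full
        try {
            queue.add("b");                     // throws IllegalStateException on a full queue
        } catch (IllegalStateException e) {
            System.out.println("add() on a full queue: " + e);
        }

        System.out.println(queue.peek());       // "a", head is not removed
        System.out.println(queue.poll());       // "a", head is removed
        System.out.println(queue.poll());       // null, the queue is empty
        try {
            queue.remove();                     // throws NoSuchElementException on an empty queue
        } catch (java.util.NoSuchElementException e) {
            System.out.println("remove() on an empty queue: " + e);
        }
    }
}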

1 Non-blocking queues

1 ConcurrentLinkedQueue (thread safe)

An unbounded, non-blocking concurrent queue based on a singly linked list; thread safety is achieved with CAS.

  • Bounded: no
  • Thread safe: yes
  • Performance: high
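
A minimal usage sketch, assuming two producer threads and the arbitrary counts shown below: the queue needs no external locking because its internal CAS operations keep the structure consistent.

import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentLinkedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        ConcurrentLinkedQueue<Integer> queue = new ConcurrentLinkedQueue<>();

        // Two producer threads add elements concurrently without explicit locking
        Runnable producer = () -> {
            for (int i = 0; i < 1000; i++) {
                queue.offer(i); // never blocks and never fails: the queue is unbounded
            }
        };
        Thread t1 = new Thread(producer);
        Thread t2 = new Thread(producer);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("Size after both producers finished: " + queue.size()); // 2000
        System.out.println("Head element: " + queue.poll()); // removes and returns the head, or null if empty
    }
}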

2 ConcurrentLinkedDeque (thread safe)

An unbounded, non-blocking concurrent queue based on a doubly linked list; thread safety is achieved with CAS. Compared with ConcurrentLinkedQueue, ConcurrentLinkedDeque offers more flexible operations in some scenarios, such as inserting and removing elements at both ends.

  • Bounded: no
  • Thread safe: yes
  • Performance: slightly lower than ConcurrentLinkedQueue
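
A minimal sketch showing operations at both ends; the element values are arbitrary.

import java.util.concurrent.ConcurrentLinkedDeque;

public class ConcurrentLinkedDequeDemo {
    public static void main(String[] args) {
        ConcurrentLinkedDeque<String> deque = new ConcurrentLinkedDeque<>();

        // Elements can be added and removed at either end without locking
        deque.offerLast("first");   // tail insert (normal queue behaviour)
        deque.offerLast("second");
        deque.offerFirst("urgent"); // head insert, jumps ahead of the others

        System.out.println(deque.pollFirst()); // "urgent"
        System.out.println(deque.pollLast());  // "second"
        System.out.println(deque.pollFirst()); // "first"
    }
}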

3 PriorityQueue (thread unsafe)

PriorityQueue is a priority-heap-based queue in Java, implemented on top of an array. It orders elements by priority and keeps this ordering as elements are inserted and removed. Elements in a PriorityQueue must implement the Comparable interface, or a Comparator must be supplied through the constructor to compare and sort the elements.

  • Bounded: no
  • Thread safe: no (use PriorityBlockingQueue if thread safety is required)
  • Performance: slightly lower than ordinary queues (both insertion and removal require heap adjustment)

PriorityQueue is backed by an array that expands automatically as elements are added. When elements are inserted or removed, PriorityQueue re-adjusts the heap according to element priority, so the element at the top of the heap is always the highest-priority one.
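
A minimal sketch, assuming a Comparator-based ordering in which larger numbers mean higher priority:

import java.util.Comparator;
import java.util.PriorityQueue;

public class PriorityQueueDemo {
    public static void main(String[] args) {
        // A Comparator supplied at construction time defines the ordering;
        // here larger numbers are treated as higher priority.
        PriorityQueue<Integer> queue = new PriorityQueue<>(Comparator.reverseOrder());

        queue.offer(3);
        queue.offer(1);
        queue.offer(2);

        // poll() always returns the current highest-priority element,
        // and the heap is re-adjusted after every insertion and removal.
        System.out.println(queue.poll()); // 3
        System.out.println(queue.poll()); // 2
        System.out.println(queue.poll()); // 1
    }
}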

2 Blocking queues (thread safe)

The blocking queue interface BlockingQueue extends the Queue interface. Blocking queues are thread-safe and add two important blocking methods on top of the basic queue operations: put() and take().

Operation | Throws exception | Returns special value | Blocks | Times out
Insert | add(e) | offer(e) | put(e) | offer(e, time, unit)
Remove | remove() | poll() | take() | poll(time, unit)
Examine | element() | peek() | not applicable | not applicable

Notice:

ThreadPoolExecutor uses a blocking queue so that idle worker threads block on take() (or a timed poll()) instead of exiting; this is how thread reuse is achieved.
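
The following is a greatly simplified sketch of that idea, not ThreadPoolExecutor's actual implementation: a single worker thread loops forever and blocks on take() when the queue is empty, so the same thread is reused for every task. All names and values are chosen only for the demonstration.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class WorkerReuseSketch {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Runnable> taskQueue = new LinkedBlockingQueue<>();

        // A single worker thread reused for every submitted task:
        // instead of terminating when there is no work, it blocks in take().
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    Runnable task = taskQueue.take(); // blocks while the queue is empty
                    task.run();
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // allow the worker to exit
            }
        });
        worker.start();

        taskQueue.put(() -> System.out.println("task 1 on " + Thread.currentThread().getName()));
        taskQueue.put(() -> System.out.println("task 2 on " + Thread.currentThread().getName()));

        Thread.sleep(500);   // give the worker time to run both tasks
        worker.interrupt();  // stop the worker for this demo
    }
}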

The commonly used blocking queues in JUC (java.util.concurrent) are as follows:

1. ArrayBlockingQueue

A bounded blocking queue implemented with an array structure. Its capacity must be specified when the ArrayBlockingQueue object is created, and a fairness policy can also be specified. It is non-fair by default, meaning the thread that has waited longest is not guaranteed to access the queue first.

It uses a synchronization mechanism to ensure that concurrent access to the queue does not cause race conditions. Specifically, ArrayBlockingQueue uses a ReentrantLock together with Condition objects to guarantee thread safety and to implement blocking and wake-up.

  • Bounded: yes
  • Thread safe: yes
  • Performance: high
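
A minimal producer-consumer sketch, assuming a capacity of 2 and the fair policy enabled (both values are arbitrary choices for the demonstration):

import java.util.concurrent.ArrayBlockingQueue;

public class ArrayBlockingQueueDemo {
    public static void main(String[] args) {
        // Capacity is mandatory; the second argument enables the fair policy (non-fair by default)
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(2, true);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 5; i++) {
                    queue.put("message-" + i); // blocks while the queue is full
                    System.out.println("produced message-" + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 1; i <= 5; i++) {
                    String msg = queue.take(); // blocks while the queue is empty
                    System.out.println("consumed " + msg);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
    }
}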

2. LinkedBlockingQueue

A blocking queue implemented with a linked-list structure. If no capacity is specified when the LinkedBlockingQueue object is created, the default capacity is Integer.MAX_VALUE, which makes it effectively unbounded.

  • Bounded: optionally; effectively unbounded (Integer.MAX_VALUE) by default
  • Thread safe: yes
  • Performance: lower than ArrayBlockingQueue

LinkedBlockingQueue uses two locks internally: one for producer threads (put) and one for consumer threads (take). This reduces contention in highly concurrent scenarios while still guaranteeing thread safety.

LinkedBlockingQueue has slightly lower performance than ArrayBlockingQueue because it uses linked lists instead of arrays to store elements.

However, because LinkedBlockingQueue is backed by a linked list, its size simply grows and shrinks with the number of stored elements, which makes it more flexible when the required capacity cannot be fixed in advance.

In general, LinkedBlockingQueue is a very practical data structure and is widely used in multi-threaded scenarios such as producer-consumer models and thread pools.
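
A minimal sketch of the thread-pool usage mentioned above, assuming an explicitly bounded LinkedBlockingQueue and arbitrary pool parameters:

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class LinkedBlockingQueuePoolDemo {
    public static void main(String[] args) {
        // Giving the queue an explicit capacity (100 here, an arbitrary value for this sketch)
        // avoids the effectively unbounded default of Integer.MAX_VALUE.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS,
                new LinkedBlockingQueue<>(100));

        for (int i = 1; i <= 5; i++) {
            final int taskId = i;
            pool.execute(() -> System.out.println(
                    "task " + taskId + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}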

3. SynchronousQueue

SynchronousQueue stores no elements internally; a thread blocks only while waiting for another thread to insert into or remove from the queue. Each put operation must wait for a corresponding take operation, otherwise the element cannot be handed over.

  • Bounded: yes (zero capacity; it stores no elements)
  • Thread safe: yes
  • Performance: - (no storage; it only hands elements directly between threads)

Specifically, when a thread calls SynchronousQueue's put() method and no other thread is waiting to remove an element from the queue, the calling thread blocks until another thread calls take() to receive that element.

Similarly, when a thread calls the take() method, if no other threads are waiting to insert elements into the queue, the current thread will also block until another thread calls the put() method to insert elements.

SynchronousQueue suits special scenarios such as efficient hand-off between producers and consumers: the producer thread hands the element directly to the consumer thread without an intermediate buffer. It can also be used to implement certain concurrency algorithms and data structures, such as fair exchangers.

It should be noted that SynchronousQueue does not allow null elements. If you try to insert null elements, a NullPointerException will be thrown.
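
A minimal hand-off sketch: the consumer starts waiting first, and put() only returns once the consumer has taken the element. The sleep duration and the element value are arbitrary.

import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        SynchronousQueue<String> queue = new SynchronousQueue<>();

        Thread consumer = new Thread(() -> {
            try {
                // take() blocks until a producer hands an element over
                System.out.println("received: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        Thread.sleep(200);   // let the consumer start waiting first
        queue.put("hello");  // blocks until the consumer takes the element
        System.out.println("hand-off completed");
    }
}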

4. LinkedTransferQueue

An unbounded blocking queue implemented with a linked-list structure that combines the capabilities of SynchronousQueue and LinkedBlockingQueue. It uses CAS (compare-and-swap) operations to ensure concurrency safety. Unlike lock-based blocking queues, LinkedTransferQueue builds the queue from nodes, each holding an element and a pointer to the next node, which avoids the performance bottleneck that locks cause in concurrent scenarios. Its performance is higher than LinkedBlockingQueue (no lock operations), and unlike SynchronousQueue it can also buffer elements.

  • Bounded: no
  • Thread safe: yes
  • Performance: high

Compared with ordinary blocking queues, the TransferQueue interface adds several methods:

public interface TransferQueue<E> extends BlockingQueue<E> {
    // Transfers the element to a waiting consumer immediately, if possible.
    // More precisely, if a consumer is already waiting to receive it (in take() or a timed poll(long, TimeUnit)),
    // the specified element is handed over at once; otherwise false is returned.
    boolean tryTransfer(E e);

    // Transfers the element to a consumer, waiting if necessary.
    // More precisely, if a consumer is already waiting to receive it (in take() or a timed poll(long, TimeUnit)),
    // the element is handed over at once; otherwise the call waits until a consumer receives the element.
    void transfer(E e) throws InterruptedException;

    // Same as transfer(), but gives up and returns false if the timeout elapses first
    boolean tryTransfer(E e, long timeout, TimeUnit unit) throws InterruptedException;

    // Returns true if at least one consumer is waiting
    boolean hasWaitingConsumer();

    // Returns an estimate of the number of waiting consumers
    int getWaitingConsumerCount();
}

However, when using LinkedTransferQueue, you need to pay attention to some details:

  • LinkedTransferQueue does not support null elements. If a null element is inserted, a NullPointerException will be thrown.
  • When the transfer() method is used, the calling thread blocks until another thread takes the element away. Pay attention to this blocking behaviour to avoid deadlocks or thread starvation.
  • LinkedTransferQueue's iterator can only be used to traverse elements in the current queue, and there is no guarantee that elements in the queue will not be modified or deleted during iteration.
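
A minimal sketch of tryTransfer()/transfer(), assuming a single consumer thread and arbitrary payload values:

import java.util.concurrent.LinkedTransferQueue;

public class LinkedTransferQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

        // No consumer is waiting yet, so tryTransfer() returns false immediately
        System.out.println("tryTransfer: " + queue.tryTransfer("ignored"));

        Thread consumer = new Thread(() -> {
            try {
                System.out.println("consumer got: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        Thread.sleep(200); // let the consumer block in take() first
        System.out.println("waiting consumers: " + queue.hasWaitingConsumer());
        queue.transfer("payload"); // blocks until the consumer has received the element
        System.out.println("transfer completed");
        consumer.join();
    }
}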

5. LinkedBlockingDeque

A double-ended blocking queue implemented with a doubly linked list. Elements can be inserted and removed at both ends, so it supports both queue and stack operations.

  • Bounded: unbounded by default, but a capacity can be set
  • Thread safe: yes
  • Performance: high

The following points should be noted when using LinkedBlockingDeque:

  • LinkedBlockingDeque does not support null elements. If a null element is inserted, a NullPointerException will be thrown.
  • The blocking behaviour of LinkedBlockingDeque can cause thread starvation, i.e. some threads never get access to the queue. To mitigate this, you can specify a capacity when creating the LinkedBlockingDeque, or use the non-blocking offerFirst()/offerLast()/pollFirst()/pollLast() methods instead of the blocking putFirst()/putLast()/takeFirst()/takeLast() methods.
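
A minimal sketch showing both queue-style and stack-style usage, assuming an arbitrary capacity of 3:

import java.util.concurrent.LinkedBlockingDeque;

public class LinkedBlockingDequeDemo {
    public static void main(String[] args) throws InterruptedException {
        // An explicit capacity (3 here, chosen arbitrarily) makes the deque bounded
        LinkedBlockingDeque<String> deque = new LinkedBlockingDeque<>(3);

        // Queue (FIFO) usage: insert at the tail, remove from the head
        deque.putLast("a");
        deque.putLast("b");
        System.out.println(deque.takeFirst()); // "a"

        // Stack (LIFO) usage: insert and remove at the head
        deque.putFirst("x");
        deque.putFirst("y");
        System.out.println(deque.takeFirst()); // "y"

        // Non-blocking alternatives return a special value instead of waiting
        System.out.println(deque.offerLast("z")); // true if there is room, false if full
        System.out.println(deque.pollLast());     // element, or null if empty
    }
}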

6. PriorityBlockingQueue

A priority-based blocking queue, implemented with an array and unbounded in capacity. It orders elements by priority and dequeues them in priority order, so each element removed is the one with the highest priority.

The precautions for PriorityBlockingQueue are as follows:

  • PriorityBlockingQueue does not allow null elements, nor does it allow elements that cannot be compared with each other; inserting an incomparable element throws a ClassCastException, and inserting null throws a NullPointerException.
  • The iteration order of PriorityBlockingQueue is not the priority order. Although each element taken from the queue is guaranteed to be the current highest-priority element, no ordering is guaranteed when iterating over the queue.
  • PriorityBlockingQueue orders elements when they are inserted (via offer, add, or put). If an element's priority changes while it is already in the queue, the queue does not re-sort automatically; the element has to be removed and re-inserted for the new priority to take effect.
  • remove(Object) has to scan the underlying array to locate the element, so it is relatively expensive. If specific elements need to be removed frequently, one option is to copy the elements into another collection, delete the unwanted ones there, and rebuild the PriorityBlockingQueue from the result.
  • PriorityBlockingQueue uses a ReentrantLock to guarantee thread safety when iterating, inserting, and removing elements. When using it, take care to avoid deadlocks and starvation.

Simple example:

import java.util.concurrent.PriorityBlockingQueue;

class Task implements Comparable<Task> {
    private String name;
    private int priority;

    public Task(String name, int priority) {
        this.name = name;
        this.priority = priority;
    }

    public String getName() {
        return name;
    }

    public int getPriority() {
        return priority;
    }

    @Override
    public int compareTo(Task o) {
        // Higher priority values are ordered first
        return Integer.compare(o.getPriority(), this.getPriority());
    }
}

public class PriorityBlockingQueueExample {
    public static void main(String[] args) throws InterruptedException {
        // Create a PriorityBlockingQueue object
        PriorityBlockingQueue<Task> queue = new PriorityBlockingQueue<Task>();
        // Add tasks to the queue
        queue.offer(new Task("Task1", 3));
        queue.offer(new Task("Task2", 2));
        queue.offer(new Task("Task3", 1));
        // Take tasks from the queue in priority order
        while (!queue.isEmpty()) {
            Task task = queue.take();
            System.out.println("Execute task " + task.getName() + " with priority " + task.getPriority());
        }
    }
}

Output result:
Execute task Task1 with priority 3
Execute task Task2 with priority 2
Execute task Task3 with priority 1

7. DelayQueue

A delayed blocking queue that stores elements implementing the Delayed interface. Each element has an expiration time, and an element can only be taken from the queue once its expiration time has been reached.

  • Bounded: no; in practice the application should control how many elements are added to avoid problems such as memory exhaustion
  • Thread safe: yes
  • Performance: slightly lower than ordinary blocking queues (insertion and removal both require heap adjustment)

Internally, DelayQueue is implemented with a PriorityQueue; elements are ordered by expiration time, so the element that expires soonest sits at the head of the queue. Insertion into a DelayQueue never blocks, but take() blocks until the queue contains an element whose delay has expired.

The following points should be noted when using DelayQueue:

  • Elements of a DelayQueue must implement the Delayed interface and override getDelay() and compareTo(). getDelay(TimeUnit) returns the remaining delay in the requested time unit, and compareTo() compares elements by expiration time.
  • DelayQueue uses ReentrantLock and Condition internally for synchronization and blocking, so it is thread-safe.
  • The take() operation of DelayQueue can block. If you need a call that returns immediately when no element has expired, use poll() instead of take().
  • DelayQueue is theoretically unbounded because it stores its elements in a PriorityQueue, whose length is not limited. Its only alternative constructor accepts an existing collection and merely pre-populates the queue with those elements; it does not impose a capacity limit, so if the queue size must be bounded, the limit has to be enforced by the application.

Below is a simple example of using DelayQueue. The queue orders elements by expiration time automatically, and take() waits until an element has expired before returning it.

Note that in actual use, the expiration time and comparison method of elements need to be set according to the specific business scenario:

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayElement> queue = new DelayQueue<>();

        // Add elements to the queue
        queue.put(new DelayElement("A", 3000));
        queue.put(new DelayElement("B", 2000));
        queue.put(new DelayElement("C", 1000));

        // Take the elements; each take() blocks until the head element's delay expires
        System.out.println(queue.take());
        System.out.println(queue.take());
        System.out.println(queue.take());
    }
}

class DelayElement implements Delayed {
    private String name;
    private long expireTime;

    public DelayElement(String name, long delayTime) {
        this.name = name;
        this.expireTime = System.currentTimeMillis() + delayTime;
    }

    @Override
    public long getDelay(TimeUnit unit) {
        // Remaining delay converted to the requested time unit
        return unit.convert(expireTime - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
    }

    @Override
    public int compareTo(Delayed o) {
        return Long.compare(this.expireTime, ((DelayElement) o).expireTime);
    }

    @Override
    public String toString() {
        return "DelayElement{" +
                "name='" + name + '\'' +
                '}';
    }
}

Summary

The above is based on personal experience. I hope it provides a useful reference, and thank you for your support.