Java thread pool and thread pool tool classes

Introduction to thread pool

Starting a new thread involves interacting with the operating system, which is relatively expensive. A thread pool creates a number of idle threads when the program starts. The program passes a task object (Runnable or Callable) to the pool, and the pool uses one of its threads to execute the task's run() or call() method. After the method finishes, the thread does not die; it returns to the pool in an idle state and waits for the next task. Using a thread pool avoids the frequent creation and destruction of threads by reusing the threads already created, and it also makes it possible to control the number of concurrent threads in the system.

Thread pool usage and parameter introduction

Thread pool usage

ThreadPoolExecutor threadPoolExecutor = new ThreadPoolExecutor(parameters);

public static void main(String[] args) {
    // 3 core threads, 3 maximum threads and a bounded queue of 10 tasks;
    // the default rejection policy is AbortPolicy.
    ThreadPoolExecutor threadPool = new ThreadPoolExecutor(3, 3, 0L, TimeUnit.MILLISECONDS,
            new ArrayBlockingQueue<>(10));

    for (int i = 0; i < 100; i++) {
        Runnable target = () -> {
            for (int j = 0; j < 10; j++) {
                System.out.println("j = " + j);
            }
        };
        // Submitting 100 tasks to a pool that can hold at most 3 running and
        // 10 queued tasks will trigger the rejection policy (AbortPolicy by
        // default), which throws RejectedExecutionException.
        threadPool.submit(target);
    }
    threadPool.shutdown();
}

Source code of the full-parameter ThreadPoolExecutor constructor:

    public ThreadPoolExecutor(int corePoolSize,
                              int maximumPoolSize,
                              long keepAliveTime,
                              TimeUnit unit,
                              BlockingQueue<Runnable> workQueue,
                              ThreadFactory threadFactory,
                              RejectedExecutionHandler handler) {
        if (corePoolSize < 0 ||
            maximumPoolSize <= 0 ||
            maximumPoolSize < corePoolSize ||
            keepAliveTime < 0)
            throw new IllegalArgumentException();
        if (workQueue == null || threadFactory == null || handler == null)
            throw new NullPointerException();
        this.acc = System.getSecurityManager() == null ?
                null :
                AccessController.getContext();
        this.corePoolSize = corePoolSize;
        this.maximumPoolSize = maximumPoolSize;
        this.workQueue = workQueue;
        this.keepAliveTime = unit.toNanos(keepAliveTime);
        this.threadFactory = threadFactory;
        this.handler = handler;
    }

Introduction to thread pool parameters

Seven core parameters

corePoolSize
Specifies the number of core threads in the thread pool
maximumPoolSize
Specifies the maximum number of threads in the thread pool: the number of core threads plus the number of non-core threads
keepAliveTime
When the number of threads in the pool exceeds corePoolSize, the survival time of the extra (non-core) threads, i.e. how long an idle non-core thread may stay alive before it is destroyed
unit
The time unit of keepAliveTime
workQueue
The task queue, which holds submitted tasks that have not yet been executed
threadFactory
Thread factory for creating threads
handler
Reject policy. How to reject a task when there are too many tasks to handle.
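
As a hedged illustration of the seven parameters above, the following sketch constructs a pool with all of them specified explicitly; the pool sizes, queue capacity and thread-name prefix are arbitrary example values:

import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class FullParamPoolDemo {
    public static void main(String[] args) {
        // A named ThreadFactory so threads of this pool are easy to identify in logs.
        ThreadFactory threadFactory = new ThreadFactory() {
            private final AtomicInteger counter = new AtomicInteger(1);
            @Override
            public Thread newThread(Runnable r) {
                return new Thread(r, "demo-pool-" + counter.getAndIncrement());
            }
        };

        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                    // corePoolSize
                4,                                    // maximumPoolSize
                60L,                                  // keepAliveTime for idle non-core threads
                TimeUnit.SECONDS,                     // unit of keepAliveTime
                new ArrayBlockingQueue<>(8),          // workQueue: bounded queue of 8 tasks
                threadFactory,                        // threadFactory
                new ThreadPoolExecutor.CallerRunsPolicy()); // handler: run rejected tasks in the caller

        pool.submit(() -> System.out.println(Thread.currentThread().getName() + " is working"));
        pool.shutdown();
    }
}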

Waiting queue

SynchronousQueue
Synchronous queue has no capacity: each put operation must wait for a corresponding take operation. When a SynchronousQueue is used, a submitted task is not stored in the queue; it is handed directly to a thread for execution.
ArrayBlockingQueue
A bounded blocking queue backed by an array. The constructor of ArrayBlockingQueue requires the maximum capacity of the queue to be specified. The queue orders elements first-in, first-out. You can choose whether access to the queue is fair via the constructor new ArrayBlockingQueue(int capacity, boolean fair); by default access is unfair.
LinkedBlockingQueue
A blocking queue based on a linked list. LinkedBlockingQueue uses separate locks for producers and consumers to synchronize data, which means that under high concurrency producers and consumers can operate on the queue in parallel, improving the throughput of the whole queue.
PriorityBlockingQueue
A priority task queue that controls the order in which tasks are executed. Elements can be ordered either by implementing Comparable (compareTo) or by passing a Comparator to the constructor.
DelayQueue
An unbounded blocking queue that supports delayed retrieval of elements, implemented on top of a priority queue. Elements in the queue must implement the Delayed interface; each element carries a delay specified when it is created, and an element can only be taken from the queue after its delay has expired.
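
As a small, hedged sketch of the DelayQueue behaviour described above (the DelayedTask class and the delays are made up for illustration):

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

// Hypothetical element: becomes available delayMillis after it is created.
class DelayedTask implements Delayed {
    private final String name;
    private final long readyAt; // absolute time in milliseconds

    DelayedTask(String name, long delayMillis) {
        this.name = name;
        this.readyAt = System.currentTimeMillis() + delayMillis;
    }

    @Override
    public long getDelay(TimeUnit unit) {
        return unit.convert(readyAt - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
    }

    @Override
    public int compareTo(Delayed other) {
        return Long.compare(getDelay(TimeUnit.MILLISECONDS), other.getDelay(TimeUnit.MILLISECONDS));
    }

    @Override
    public String toString() {
        return name;
    }
}

public class DelayQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<>();
        queue.put(new DelayedTask("task-2s", 2000));
        queue.put(new DelayedTask("task-1s", 1000));
        // take() blocks until the head element's delay has expired, so
        // "task-1s" comes out first, roughly one second after being put.
        System.out.println(queue.take());
        System.out.println(queue.take());
    }
}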

Reject policy

Built in rejection policy in JDK

When the thread pool cannot accept more tasks (the queue is full and the maximum number of threads has been reached), the reject policy is applied. Four rejection policies are built into the JDK:

  • AbortPolicy: directly throws a RejectedExecutionException for the rejected task (this is the default policy);
  • CallerRunsPolicy: as long as the thread pool is not shut down, the rejected task is run directly in the caller's thread. This can sharply slow down the thread that submits tasks;
  • DiscardOldestPolicy: discards the oldest task in the queue (the one that would be executed next) and then tries to submit the current task again;
  • DiscardPolicy: silently discards the task that cannot be processed.

Custom implementation rejection policy

Implement the RejectedExecutionHandler interface yourself, as in the sketch below.
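
A minimal sketch of such a handler, assuming we simply want to log and drop the rejected task (the class name LogAndDiscardPolicy is made up for illustration):

import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;

// Hypothetical handler: logs the rejection instead of throwing or silently dropping.
class LogAndDiscardPolicy implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        System.out.println("Task " + r + " rejected; pool size = " + executor.getPoolSize()
                + ", queue size = " + executor.getQueue().size());
    }
}

The handler is passed as the last constructor argument of ThreadPoolExecutor, or set afterwards with executor.setRejectedExecutionHandler(new LogAndDiscardPolicy()).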

Thread pool execution process

Brief description: core threads (if a core thread is free or can be created, execute the task, otherwise go to the next step) -> waiting queue (if there is room, enqueue the task, otherwise go to the next step) -> non-core threads (if the maximum has not been reached, create a thread to execute the task, otherwise go to the next step) -> reject policy. A sketch demonstrating this flow follows the steps below.
Specific steps:

  • When adding a task, the thread will make the following judgment:
    (1) If the number of executing threads is less than corePoolSize, create a thread to execute the task immediately;
    (2) If the number of executing threads is greater than or equal to corePoolSize, put the task into the waiting queue;
    (3) If the queue is full and the number of running threads is less than maximumPoolSize, create a non-core thread to process the task;
    (4) If the queue is full and the number of running threads is greater than or equal to maximumPoolSize, the rejection policy set for the thread pool (RejectedExecutionHandler) is invoked.
  • When a thread completes a task, it will take a task from the queue to execute.
  • When a thread has been idle for longer than the configured keepAliveTime, if the number of threads in the pool is greater than corePoolSize, that thread is terminated; in this way the pool eventually shrinks back to corePoolSize threads.
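
To make the flow concrete, here is a hedged sketch (pool sizes and task counts are arbitrary example values) that prints the pool and queue sizes while tasks are submitted, so you can watch the core threads, the queue and then the non-core threads fill up in that order:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ExecutionFlowDemo {
    public static void main(String[] args) {
        // 2 core threads, 4 maximum threads, a queue of 3 slots and a policy
        // that silently discards tasks that cannot be accepted.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(2, 4, 10L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(3), new ThreadPoolExecutor.DiscardPolicy());

        for (int i = 1; i <= 9; i++) {
            pool.execute(() -> {
                try {
                    Thread.sleep(2000); // keep the thread busy so later tasks pile up
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            // Tasks 1-2 start core threads, 3-5 wait in the queue,
            // 6-7 start non-core threads, 8-9 hit the rejection policy.
            System.out.println("submitted " + i + ": poolSize=" + pool.getPoolSize()
                    + ", queueSize=" + pool.getQueue().size());
        }
        pool.shutdown();
    }
}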

Thread pool tool class to create thread pool

Thread pools with specific functions are provided in the java.util.concurrent.Executors tool class.

newFixedThreadPool

This method creates a reusable thread pool with a fixed number of threads. The number of threads in the pool is always the same. When a new task is submitted, if there are idle threads in the thread pool, it will be executed immediately. If not, the new task will be put into the waiting queue.

Source code analysis:

public static ExecutorService newFixedThreadPool(int nThreads) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>());
}

The number of core threads in the thread pool is equal to the maximum number of threads. LinkedBlockingQueue is used as the waiting queue. The default capacity of the queue is Integer.MAX_VALUE, so when tasks are submitted faster than they can be processed, a large number of tasks may accumulate in the queue.
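
A short, hedged usage sketch (the pool size and task bodies are arbitrary example values):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class FixedPoolDemo {
    public static void main(String[] args) {
        // A fixed pool of 4 threads; extra tasks wait in the unbounded queue.
        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 8; i++) {
            final int taskId = i;
            pool.submit(() -> System.out.println(
                    Thread.currentThread().getName() + " runs task " + taskId));
        }
        pool.shutdown(); // stop accepting new tasks and let queued ones finish
    }
}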

newSingleThreadExecutor

This method creates a thread pool with only a single thread. If the thread is busy when a task is submitted, the task enters the waiting queue; when the thread becomes idle, the queued tasks are executed in first-in, first-out order.
Source code analysis:

public static ExecutorService newSingleThreadExecutor() {
    return new FinalizableDelegatedExecutorService
        (new ThreadPoolExecutor(1, 1,
                                0L, TimeUnit.MILLISECONDS,
                                new LinkedBlockingQueue<Runnable>()));
}

Both the core pool size and the maximum pool size are 1. LinkedBlockingQueue is used as the waiting queue. The default capacity of the queue is Integer.MAX_VALUE, which may allow a large number of tasks to accumulate.

newCachedThreadPool

This method creates a thread pool with a caching function. The system creates threads as needed, and these threads are cached in the pool. The number of threads in the pool is not fixed: if there are idle threads that can be reused, they are used first; if all threads are busy, a new thread is created to handle the task. After a task finishes, its thread returns to the pool for reuse.
Source code analysis:

public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>());
}

The number of core threads is 0 and the maximum number of threads is Integer.MAX_VALUE. The pool uses a SynchronousQueue as its waiting queue, which does not store tasks: a submitted task must wait for a thread to take it. Idle threads are recycled after 60 seconds. With this pool, if a large number of tasks are submitted at once and they do not finish quickly, the system will create a large number of threads, which may quickly exhaust system resources.

newScheduledThreadPool

This method creates a thread pool with a specified number of threads, which can execute a task after a specified delay or execute a task periodically. It returns a ScheduledExecutorService object.
Source code analysis:

public static ScheduledExecutorService newScheduledThreadPool(int corePoolSize) {
    return new ScheduledThreadPoolExecutor(corePoolSize);
}
public ScheduledThreadPoolExecutor(int corePoolSize) {
    super(corePoolSize, Integer.MAX_VALUE, 0, NANOSECONDS,
          new DelayedWorkQueue());
}
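
A brief, hedged usage sketch of the returned ScheduledExecutorService (the delays and periods are arbitrary example values):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledPoolDemo {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        // Run once, 3 seconds after submission.
        scheduler.schedule(
                () -> System.out.println("delayed task"), 3, TimeUnit.SECONDS);

        // Run repeatedly: first after 1 second, then every 5 seconds.
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("periodic task"), 1, 5, TimeUnit.SECONDS);

        // Calling scheduler.shutdown() here would cancel the periodic task
        // (by default periodic tasks do not continue after shutdown), so a
        // real program usually keeps the scheduler alive.
    }
}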

newSingleThreadScheduledExecutor

This method returns the ScheduledExecutorService object. The size of the thread pool is 1.
Source code analysis:

public static ScheduledExecutorService newSingleThreadScheduledExecutor() {
    return new DelegatedScheduledExecutorService
        (new ScheduledThreadPoolExecutor(1));
}

newWorkStealingPool

This method creates a pool that maintains enough threads to support the given level of parallelism, and it uses multiple work queues to reduce contention.
Source code analysis:

    public static ExecutorService newWorkStealingPool() {
        return new ForkJoinPool
            (Runtime.getRuntime().availableProcessors(),
             ForkJoinPool.defaultForkJoinWorkerThreadFactory,
             null, true);
    }
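
A brief, hedged usage sketch; the task count is arbitrary, and the main thread waits for termination because the worker threads of the underlying ForkJoinPool are daemon threads:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class WorkStealingPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // Parallelism defaults to the number of available processors.
        ExecutorService pool = Executors.newWorkStealingPool();
        for (int i = 0; i < 8; i++) {
            final int taskId = i;
            pool.submit(() -> System.out.println(
                    Thread.currentThread().getName() + " runs task " + taskId));
        }
        pool.shutdown();
        // Wait for the tasks to finish before the JVM exits,
        // since ForkJoinPool workers are daemon threads.
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}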

Custom exception handler

1. When customizing the thread pool, override the protected void afterExecute(Runnable r, Throwable t) method to handle exceptions (see the sketch after the handler class below);
2. Implement your own exception handler and set it on a thread with Thread's setUncaughtExceptionHandler(new MyExHandler()) method, for example:

class MyExHandler implements Thread.UncaughtExceptionHandler {
    @Override
    public void uncaughtException(Thread t, Throwable e) {
        System.out.println("thread " + t + " threw an exception: " + e);
    }
}
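
For approach 1, a hedged sketch of a custom pool that overrides afterExecute; the class name ExceptionAwarePool and its pool sizes are made up for illustration, and the Future handling follows the pattern documented for ThreadPoolExecutor:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.CancellationException;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Hypothetical subclass: logs any exception thrown by a task once it has finished.
class ExceptionAwarePool extends ThreadPoolExecutor {
    ExceptionAwarePool(int core, int max) {
        super(core, max, 0L, TimeUnit.MILLISECONDS, new ArrayBlockingQueue<>(10));
    }

    @Override
    protected void afterExecute(Runnable r, Throwable t) {
        super.afterExecute(r, t);
        // For execute(): t already holds the uncaught exception.
        // For submit(): the exception is wrapped inside the returned Future.
        if (t == null && r instanceof Future<?>) {
            try {
                ((Future<?>) r).get();
            } catch (CancellationException e) {
                t = e;
            } catch (ExecutionException e) {
                t = e.getCause();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        if (t != null) {
            System.out.println("task failed: " + t);
        }
    }
}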

Tags: Java Multithreading Concurrent Programming
