High Concurrency

When dealing with highly concurrent applications, we often run into a number of common problems. In this article I will focus on resource problems (CPU and memory), and on the most typical and most direct solutions to them.

When we discover threads and the advantages of parallel processing, it is easy to end up abusing them. We create a lot of threads (100? 1,000?) running simultaneously, and the processor keeps jumping from one to another without pause, never letting any of them finish, no matter how fast their actual execution is. Over time there are more and more threads, which only slows the whole process down. On top of the execution cost of each thread, we must also consider the added cost of creating and destroying threads, which can become significant when we are talking about so many threads at once.

High Concurrency with the Thread Pool Pattern

Threads: the holy grail

In this case, the first approach that comes to mind is the Thread Pool Pattern. This pattern limits the number of threads running at the same time.
Instead of creating new threads, we create tasks, which are queued. We also have a pool of threads that pick these tasks up and run them as soon as possible. A classic example of this pattern can be found in SwingWorker. If we want to implement our own version by hand, we should take a look at the ExecutorService interface.
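
As a rough sketch (not from the original post), this is how a fixed-size pool built on ExecutorService could look; the pool size of 4, the number of tasks and the task bodies are arbitrary values chosen for the example:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ThreadPoolExample {
    public static void main(String[] args) throws InterruptedException {
        // A fixed pool of 4 worker threads; extra tasks wait in the queue.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (int i = 0; i < 100; i++) {
            final int taskId = i;
            // Tasks are queued instead of each one spawning its own thread.
            pool.submit(() -> System.out.println(
                    "Task " + taskId + " on " + Thread.currentThread().getName()));
        }

        pool.shutdown();                            // stop accepting new tasks
        pool.awaitTermination(1, TimeUnit.MINUTES); // wait for the queued tasks to finish
    }
}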

If we have a background thread that makes heavy use of the processor, but we do not mind slowing it down to improve overall performance, we can use sleep (Thread.sleep(...)) to periodically release the processor, allowing other threads to run faster.
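
A minimal sketch of that idea follows; the 500 ms pause and the placeholder housekeeping comment are assumptions made for the example:

public class MaintenanceSleepExample {
    public static void main(String[] args) throws InterruptedException {
        Thread maintenance = new Thread(() -> {
            while (!Thread.currentThread().isInterrupted()) {
                // ... heavy housekeeping work would run here ...
                try {
                    // Give the processor back to other threads for a while.
                    Thread.sleep(500);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt(); // restore the flag and stop
                    break;
                }
            }
        });
        maintenance.setDaemon(true); // the JVM should not stay alive just for this thread
        maintenance.start();

        Thread.sleep(2000);          // the rest of the application keeps working meanwhile
    }
}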

Sleeping like this is useful for threads running in maintenance mode, which must keep running but do not need to respond in real time. Another way to temporarily stop a running thread while another one works is the join method (Thread.join()), which makes one thread wait until another thread ends. This is most useful when one thread has a clearly higher priority than another, but it is not viable if the lower-priority thread cannot get a reference to the higher-priority thread it has to wait for.
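
A minimal sketch of join, where the "important" thread and its trivial body are just stand-ins for the real higher-priority work:

public class JoinExample {
    public static void main(String[] args) throws InterruptedException {
        // 'important' stands in for the higher-priority work; its body is made up.
        Thread important = new Thread(() -> System.out.println("doing the important work"));
        important.start();

        // The calling (lower-priority) thread blocks here until 'important' finishes.
        important.join();
        System.out.println("important work done, low-priority work can resume");
    }
}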

High Concurrency issues

But high concurrency is not only a matter of processor usage. It may be that multiple threads need access to large amounts of information almost simultaneously. These threads will not only be duplicating that information in memory, but often they will also be repeating the entire process of extracting it.

This problem is already solved in most data access libraries (mostly database ones). For example, we have ehcache, which uses thread-aware storage of information (Thread-Specific Storage Pattern). This way, access to and storage of this information is shared, decreasing both the memory required and the processor time needed to extract and shape the information. As the threads need this information, they ask ehcache for the data, and ehcache optimizes those hits.
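
As a hedged sketch of that idea using the Ehcache 2.x API (net.sf.ehcache): the cache name "data", the key and the loader method are invented for the example, and the exact calls may differ in other Ehcache versions:

import net.sf.ehcache.Cache;
import net.sf.ehcache.CacheManager;
import net.sf.ehcache.Element;

public class SharedCacheExample {
    public static void main(String[] args) {
        // One CacheManager shared by the whole application.
        CacheManager manager = CacheManager.getInstance();
        manager.addCache("data");               // "data" is an arbitrary name for this sketch
        Cache cache = manager.getCache("data");

        // The first thread that needs the value stores it once...
        cache.put(new Element("report", loadReportFromDatabase()));

        // ...and every other thread reads the same shared copy instead of extracting it again.
        Element hit = cache.get("report");
        if (hit != null) {
            System.out.println(hit.getObjectValue());
        }

        manager.shutdown();
    }

    private static String loadReportFromDatabase() {
        return "expensive data";                // placeholder for a costly database query
    }
}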

To improve this solution we also have the concurrent collections, which allow different threads to use the same objects without concurrency problems.
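
For instance, a minimal sketch with ConcurrentHashMap from the standard library; the lookup helper, the key name and expensiveLoad are hypothetical stand-ins for a real extraction step:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentCacheExample {
    // A single map shared by every thread; ConcurrentHashMap does the locking for us.
    private static final Map<String, String> CACHE = new ConcurrentHashMap<>();

    static String lookup(String key) {
        // computeIfAbsent runs the expensive extraction at most once per key,
        // even when several threads ask for the same data at the same time.
        return CACHE.computeIfAbsent(key, ConcurrentCacheExample::expensiveLoad);
    }

    private static String expensiveLoad(String key) {
        return "value-for-" + key;   // placeholder for a costly database read
    }

    public static void main(String[] args) {
        Runnable reader = () ->
                System.out.println(Thread.currentThread().getName() + " -> " + lookup("config"));
        new Thread(reader).start();
        new Thread(reader).start();
    }
}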

There are more ways to handle high concurrency (without going into optimizations of the code itself), but the ones described here are usually a good starting point.


Author: María Arias de Reyna Domínguez

