SynchronousQueue is a very special kind of queue: behind the Queue interface it implements a rendezvous approach, in which the producer waits until a consumer is ready and the consumer waits until a producer is ready, whereas with an ordinary blocking queue the producer does not wait for a consumer and the consumer does not wait for a producer. In this article, we look at the concept of SynchronousQueue and how it differs from a blocking queue, with examples.
Synchronous Queue vs Blocking Queue
Before looking into SynchronousQueue, let's see how a blocking queue works and how its approach differs from a synchronous queue.
Blocking queues are queues that additionally support operations that wait for the queue to become non-empty when retrieving an element, and wait for space to become available in the queue when storing an element.
Blocking queues support the producer-consumer design pattern, which:
- separates the identification of work to be done from the execution of that work by placing work items on a queue for later processing,
- removes code dependencies between producer and consumer classes, and
- simplifies workload management by decoupling activities that may produce or consume data at different or variable rates.
In a producer-consumer design built around a blocking queue, producers place data onto the queue as it becomes available, and consumers retrieve data from the queue when they are ready to take the appropriate action. Producers don't need to know anything about the identity or number of consumers, or even whether they are the only producer; all they have to do is place data items on the queue. Similarly, consumers need not know who the producers are or where the work came from.
Example: Two people washing dishes is an example of a producer-consumer design.
One person washes the dishes and places them in the dish rack, and the other person retrieves the dishes from the rack and dries them. In this scenario, the dish rack acts as a blocking queue; if there are no dishes in the rack, the consumer waits until there are dishes to dry, and if the rack fills up, the producer has to stop washing until there is more space.
(Image: Blocking Queue - the dish rack)
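To make the dish-rack analogy concrete, here is a minimal sketch of a bounded blocking queue shared by a washer thread and a dryer thread. The class and variable names (DishRackDemo, rack, and so on) are illustrative assumptions, not taken from any library or listing.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class DishRackDemo {

    public static void main(String[] args) {
        // The "dish rack": holds at most 5 dishes, so the washer blocks when it is full.
        BlockingQueue<Integer> rack = new ArrayBlockingQueue<>(5);

        Thread washer = new Thread(() -> {
            try {
                for (int dish = 1; dish <= 10; dish++) {
                    rack.put(dish);              // blocks while the rack is full
                    System.out.println("Washed dish " + dish);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Washer");

        Thread dryer = new Thread(() -> {
            try {
                for (int i = 0; i < 10; i++) {
                    int dish = rack.take();      // blocks while the rack is empty
                    System.out.println("Dried dish " + dish);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Dryer");

        washer.start();
        dryer.start();
    }
}
```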
Blocking queues simplify the coding of consumers, since take blocks until data is available. If the producers don't generate work fast enough to keep the consumers busy, the consumers just wait until more work is available. Sometimes this is perfectly acceptable (as in a server application when no client is requesting service), and sometimes it indicates that the ratio of producer threads to consumer threads should be adjusted to achieve better utilization.
If the producers consistently generate work faster than the consumers can process it, eventually the application will run out of memory because work items will queue up without bound.
- The blocking nature of put greatly simplifies coding of producers; if we use a bounded queue, then when the queue fills up the producers block, giving the consumers time to catch up because a blocked producer cannot generate more work.
- Blocking queues also provide an offer method, which returns a failure status if the item cannot be enqueued. This enables you to create more flexible policies for dealing with overload, such as shedding load, serializing excess work items and writing them to disk, reducing the number of producer threads, or throttling producers in some other manner.
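As a sketch of one such overload policy, the hypothetical helper below uses the timed variant of offer and falls back to shedding the item when the queue stays full; the names ThrottlingProducer, submit, and handleOverflow are made up for illustration.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

class ThrottlingProducer<T> {
    private final BlockingQueue<T> queue;

    ThrottlingProducer(BlockingQueue<T> queue) {
        this.queue = queue;
    }

    void submit(T item) throws InterruptedException {
        // Wait up to 100 ms for space; if the queue is still full, apply the overload policy
        // instead of blocking indefinitely.
        if (!queue.offer(item, 100, TimeUnit.MILLISECONDS)) {
            handleOverflow(item);
        }
    }

    private void handleOverflow(T item) {
        // Placeholder policy: shed the item. Alternatives include writing it to disk
        // or slowing down the producer.
        System.out.println("Queue full, shedding: " + item);
    }
}
```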
Synchronous Queue
SynchronousQueue is not really a queue at all, in that it maintains no storage space for queued elements. Instead, it maintains a list of queued threads waiting to enqueue or dequeue an element.
In the dish-washing analogy, this would be like having no dish rack, but instead handing the washed dishes directly to the next available dryer.
(Image: Synchronous Queue - no dish rack)
- A synchronous queue does not have any internal capacity, not even a capacity of one.
- Cannot peek at a synchronous queue because an element is only present when you try to remove it.
- Cannot insert an element (using any method) unless another thread is trying to remove it.
- Cannot iterate, as there is nothing to iterate.
- Does not permit null elements.
- By default, ordering of waiting producer and consumer threads is not guaranteed.
- It is a member of the Java Collections Framework.
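These properties can be checked directly with a few calls on an empty SynchronousQueue; the snippet below is a small illustrative sketch (the class name is assumed).

```java
import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueProperties {
    public static void main(String[] args) {
        SynchronousQueue<String> q = new SynchronousQueue<>();

        System.out.println(q.size());                // 0 - no internal capacity
        System.out.println(q.isEmpty());             // true - always empty
        System.out.println(q.peek());                // null - nothing to peek at
        System.out.println(q.offer("dish"));         // false - no thread is waiting to take
        System.out.println(q.iterator().hasNext());  // false - nothing to iterate
    }
}
```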
While this may seem a strange way to implement a queue, it reduces the latency associated with moving data from producer to consumer because the work can be handed off directly. In a traditional queue, the enqueue and dequeue operations must complete sequentially before a unit of work can be handed off.
The direct handoff also feeds back more information about the state of the task to the producer; when the handoff is accepted, it knows a consumer has taken responsibility for it, rather than simply letting it sit on a queue somewhere, much like the difference between handing a task to a colleague and merely putting it in her mailbox and hoping she gets to it soon.
Since a SynchronousQueue has no storage capacity, put and take will block unless another thread is already waiting to participate in the handoff. Synchronous queues are generally suitable only when there are enough consumers that there will nearly always be one ready to take the handoff. They are well suited for handoff designs, in which an object running in one thread must sync up with an object running in another thread in order to hand it some information, event, or task. This is related to Exchanger, which lets two threads perform a similar kind of rendezvous.
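A minimal sketch of such a handoff in the dish-washing analogy might look like the following; the thread names (Washer, Dryer), the class name, and the print format are assumptions for illustration. The washer's put blocks until the dryer calls take, so the dish is handed over directly.

```java
import java.util.concurrent.SynchronousQueue;

public class SynchronousQueueDemo {

    public static void main(String[] args) {
        SynchronousQueue<String> handoff = new SynchronousQueue<>();

        Thread washer = new Thread(() -> {
            try {
                // put() blocks until the dryer is ready to take the dish
                handoff.put(Thread.currentThread().getName());
                System.out.println("Washing dish number : " + Thread.currentThread().getName());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Washer");

        Thread dryer = new Thread(() -> {
            try {
                // take() blocks until the washer hands a dish over
                String dish = handoff.take();
                System.out.println("Drying dish number : " + dish);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }, "Dryer");

        washer.start();
        dryer.start();
    }
}
```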
Output:
Washing dish number : Washer
Drying dish number : Washer
Here the output can start with either the washing line or the drying line, because by default the order of waiting threads is not guaranteed.
If you know anyone who has started learning Java, why not help them out! Just share this post with them. Thanks for studying today!...