final class ConcurrentQueue[F[_], A] extends Serializable

A high-performance, back-pressured, generic concurrent queue implementation.

This is the pure and generic version of monix.execution.AsyncQueue.

Example

import cats.implicits._
import cats.effect._
import monix.catnap.{ConcurrentQueue, SchedulerEffect}
import monix.execution.Scheduler.global

// For being able to do IO.start
implicit val cs: ContextShift[IO] = SchedulerEffect.contextShift[IO](global)
// We need a `Timer` for this to work
implicit val timer: Timer[IO] = SchedulerEffect.timer[IO](global)

def consumer(queue: ConcurrentQueue[IO, Int], index: Int): IO[Unit] =
  queue.poll.flatMap { a =>
    IO(println(s"Worker $index: $a")) >>
      consumer(queue, index)
  }

for {
  queue     <- ConcurrentQueue[IO].bounded[Int](capacity = 32)
  consumer1 <- consumer(queue, 1).start
  consumer2 <- consumer(queue, 2).start
  // Pushing some samples
  _         <- queue.offer(1)
  _         <- queue.offer(2)
  _         <- queue.offer(3)
  // Stopping the consumer loops
  _         <- consumer1.cancel
  _         <- consumer2.cancel
} yield ()

Back-Pressuring and the Polling Model

The initialized queue can be limited to a maximum buffer size, a size that may be rounded up to a power of 2, so the exact capacity cannot be relied upon. Such a bounded queue can be initialized via ConcurrentQueue.bounded. Also see BufferCapacity, the configuration parameter that can be passed to the ConcurrentQueue.withConfig builder.
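
For illustration, here is a minimal sketch of the builders, reusing the imports and implicits from the example above; bounded and withConfig are the builders documented on this page, while the unbounded builder shown is an assumption about the companion's API:

// Sketch only: `unbounded` is assumed to mirror the other builders
val boundedQueue: IO[ConcurrentQueue[IO, Int]] =
  ConcurrentQueue[IO].bounded[Int](capacity = 100)

val unboundedQueue: IO[ConcurrentQueue[IO, Int]] =
  ConcurrentQueue[IO].unbounded[Int]()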

On offer, when the queue is full, the implementation back-pressures until there is room again in the internal buffer, the returned task completing only when the value has been pushed successfully. Similarly, poll awaits until the queue has items available. This works for both bounded and unbounded queues.

For both offer and poll, whenever waiting is needed, it happens asynchronously, without blocking any threads.
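
To make this concrete, here is a rough sketch, assuming the setup and imports from the example at the top of this page: a producer fiber offering more elements than the capacity simply suspends until the consumer makes room, and no thread is blocked while it waits.

// Sketch: the producer fiber suspends asynchronously whenever the
// bounded buffer fills up, and resumes as `poll` frees up space
val backPressureDemo: IO[List[Int]] =
  for {
    queue    <- ConcurrentQueue[IO].bounded[Int](capacity = 16)
    producer <- queue.offerMany(1 to 100).start
    items    <- (1 to 100).toList.traverse(_ => queue.poll)
    _        <- producer.join
  } yield items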

Multi-threading Scenario

This queue supports a ChannelType configuration, for fine-tuning based on the expected multi-threading scenario; choosing the right one can yield better performance:

  • MPMC: multi-producer, multi-consumer
  • MPSC: multi-producer, single-consumer
  • SPMC: single-producer, multi-consumer
  • SPSC: single-producer, single-consumer

The default is MPMC, because that's the safest scenario.

import monix.execution.ChannelType.MPSC
import monix.execution.BufferCapacity.Bounded

val queue = ConcurrentQueue[IO].withConfig[Int](
  capacity = Bounded(128),
  channelType = MPSC
)

WARNING: the default is MPMC; any other scenario implies a relaxation of the internal synchronization between threads.

This means that using the wrong scenario can lead to severe concurrency bugs. If you're not sure what multi-threading scenario you have, then just stick with the default MPMC.

Source
ConcurrentQueue.scala
Linear Supertypes
Serializable, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clear: F[Unit]

    Removes all items from the queue.

    Called from the consumer thread, subject to the restrictions appropriate to the implementation indicated by ChannelType.

    WARNING: the clear operation should be done on the consumer side, so it must be called from the same thread(s) that call poll.

  6. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @native()
  7. def drain(minLength: Int, maxLength: Int): F[Seq[A]]

    Fetches multiple elements from the queue, if available.

    This operation back-pressures until the minLength requirement is achieved.

    minLength

    specifies the minimum length of the returned sequence; the operation back-pressures until this length is satisfied

    maxLength

    is the maximum length of the returned sequence, and the capacity of the buffer used to accumulate the elements

    returns

    a task that completes with a sequence of length between minLength and maxLength; it can also be cancelled, interrupting the wait
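
    For illustration, a small sketch of batched consumption via drain, reusing the setup from the example at the top of the page:

    // Sketch: wait for at least 10 elements, but take at most 256 in one batch
    val batchConsumer: IO[Unit] =
      for {
        queue <- ConcurrentQueue[IO].bounded[Int](capacity = 64)
        _     <- queue.offerMany(1 to 20)
        batch <- queue.drain(minLength = 10, maxLength = 256)
        _     <- IO(println(s"Drained ${batch.length} elements"))
      } yield ()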

  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef → Any
  10. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable])
  11. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  12. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  17. def offer(a: A): F[Unit]

    Pushes a value in the queue, or if the queue is full, then repeats the operation until it succeeds.

    returns

    a task that completes once the value has been successfully pushed into the queue, back-pressuring (asynchronously) while the queue is full

  18. def offerMany(seq: Iterable[A]): F[Unit]

    Pushes multiple values in the queue. Back-pressures if the queue is full.

    returns

    a task that will eventually complete when the push has succeeded; it can also be cancelled, interrupting the waiting

  19. def poll: F[A]

    Fetches a value from the queue, or if the queue is empty it awaits asynchronously until a value is made available.

    returns

    a task that, when evaluated, completes with a value, or waits asynchronously until a value becomes available

  20. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  21. def toString(): String
    Definition Classes
    AnyRef → Any
  22. def tryOffer(a: A): F[Boolean]

    Try pushing a value to the queue.

    The protocol is unsafe because usage of the "try*" methods implies an understanding of concurrency; otherwise the code can be very fragile and buggy.

    a

    is the value to push into the queue

    returns

    true if the operation succeeded, or false if the queue is full and cannot accept any more elements

    Annotations
    @UnsafeProtocol()
  23. def tryPoll: F[Option[A]]

    Try pulling a value out of the queue.

    The protocol is unsafe because usage of the "try*" methods implies an understanding of concurrency; otherwise the code can be very fragile and buggy.

    returns

    Some(a) in case a value was successfully retrieved from the queue, or None in case the queue is empty

    Annotations
    @UnsafeProtocol()
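
    For illustration, a sketch covering both tryOffer (above) and tryPoll, reusing the setup from the example at the top of the page; note the immediate Boolean / Option results instead of asynchronous waiting:

    // Sketch: non-suspending attempts; `false` / `None` signal full / empty
    val tryDemo: IO[Unit] =
      for {
        queue   <- ConcurrentQueue[IO].bounded[Int](capacity = 16)
        offered <- queue.tryOffer(1) // true, since there is room
        first   <- queue.tryPoll     // Some(1)
        second  <- queue.tryPoll     // None, the queue is now empty
        _       <- IO(println(s"$offered, $first, $second"))
      } yield ()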
  24. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
