<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml"><head><title>Picos_std_sync (picos_std.Picos_std_sync)</title><meta charset="utf-8"/><link rel="stylesheet" href="../../_odoc-theme/odoc.css"/><meta name="generator" content="odoc 3.1.0"/><meta name="viewport" content="width=device-width,initial-scale=1.0"/><script src="../../highlight.pack.js"></script><script>hljs.initHighlightingOnLoad();</script></head><body class="odoc"><nav class="odoc-nav"><a href="../index.html">Up</a> – <a href="../../index.html">Index</a> » <a href="../index.html">picos_std</a> » Picos_std_sync</nav><header class="odoc-preamble"><h1>Module <code><span>Picos_std_sync</span></code></h1><p>Basic communication and synchronization primitives for <a href="../../picos/Picos/index.html"><code>Picos</code></a>.</p><p>This library essentially provides a conventional set of communication and synchronization primitives for concurrent programming with any Picos compatible scheduler.</p><p>For the <a href="#examples" title="examples">examples</a> we open some modules:</p><pre class="language-ocaml"><code> open Picos_std_structured
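(* [Flock] and [Control], used in the examples below, come from
   Picos_std_structured. *)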
open Picos_std_sync</code></pre></header><div class="odoc-tocs"><nav class="odoc-toc odoc-local-toc"><ul><li><a href="#modules">Modules</a></li><li><a href="#examples">Examples</a><ul><li><a href="#a-simple-bounded-queue">A simple bounded queue</a></li></ul></li><li><a href="#conventions">Conventions</a></li></ul></nav></div><div class="odoc-content"><h2 id="modules"><a href="#modules" class="anchor"></a>Modules</h2><div class="odoc-spec"><div class="spec module anchored" id="module-Mutex"><a href="#module-Mutex" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Mutex/index.html">Mutex</a></span><span> : <span class="keyword">sig</span> ... <span class="keyword">end</span></span></code></div><div class="spec-doc"><p>A mutual-exclusion lock or mutex.</p></div></div><div class="odoc-spec"><div class="spec module anchored" id="module-Condition"><a href="#module-Condition" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Condition/index.html">Condition</a></span><span> : <span class="keyword">sig</span> ... <span class="keyword">end</span></span></code></div><div class="spec-doc"><p>A condition variable.</p></div></div><div class="odoc-spec"><div class="spec module anchored" id="module-Semaphore"><a href="#module-Semaphore" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Semaphore/index.html">Semaphore</a></span><span> : <span class="keyword">sig</span> ... <span class="keyword">end</span></span></code></div><div class="spec-doc"><p><a href="Semaphore/Counting/index.html"><code>Counting</code></a> and <a href="Semaphore/Binary/index.html"><code>Binary</code></a> semaphores.</p></div></div><div class="odoc-spec"><div class="spec module anchored" id="module-Lazy"><a href="#module-Lazy" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Lazy/index.html">Lazy</a></span><span> : <span class="keyword">sig</span> ... <span class="keyword">end</span></span></code></div><div class="spec-doc"><p>A lazy suspension.</p></div></div><div class="odoc-spec"><div class="spec module anchored" id="module-Latch"><a href="#module-Latch" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Latch/index.html">Latch</a></span><span> : <span class="keyword">sig</span> ... <span class="keyword">end</span></span></code></div><div class="spec-doc"><p>A dynamic single-use countdown latch.</p></div></div><div class="odoc-spec"><div class="spec module anchored" id="module-Ivar"><a href="#module-Ivar" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Ivar/index.html">Ivar</a></span><span> : <span class="keyword">sig</span> ... <span class="keyword">end</span></span></code></div><div class="spec-doc"><p>An incremental or single-assignment poisonable variable.</p></div></div><div class="odoc-spec"><div class="spec module anchored" id="module-Stream"><a href="#module-Stream" class="anchor"></a><code><span><span class="keyword">module</span> <a href="Stream/index.html">Stream</a></span><span> : <span class="keyword">sig</span> ... 
<span class="keyword">end</span></span></code></div><div class="spec-doc"><p>A lock-free, poisonable, many-to-many, stream.</p></div></div><h2 id="examples"><a href="#examples" class="anchor"></a>Examples</h2><h3 id="a-simple-bounded-queue"><a href="#a-simple-bounded-queue" class="anchor"></a>A simple bounded queue</h3><p>Here is an example of a simple bounded (blocking) queue using a mutex and condition variables:</p><pre class="language-ocaml"><code> module Bounded_q : sig
  type 'a t
  val create : capacity:int -> 'a t
  val push : 'a t -> 'a -> unit
  val pop : 'a t -> 'a
end = struct
  type 'a t = {
    mutex : Mutex.t;
    queue : 'a Queue.t;
    capacity : int;
    not_empty : Condition.t;
    not_full : Condition.t;
  }

  let create ~capacity =
    if capacity < 0 then
      invalid_arg "negative capacity"
    else {
      mutex = Mutex.create ();
      queue = Queue.create ();
      capacity;
      not_empty = Condition.create ();
      not_full = Condition.create ();
    }

  let is_full_unsafe t =
    t.capacity <= Queue.length t.queue
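
  (* [push] and [pop] below broadcast a condition only when the queue
     actually changes state (becomes non-empty, or drops below capacity),
     so sleeping fibers are not woken up needlessly on every operation. *)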
  let push t x =
    let was_empty =
      Mutex.protect t.mutex @@ fun () ->
      while is_full_unsafe t do
        Condition.wait t.not_full t.mutex
      done;
      Queue.push x t.queue;
      Queue.length t.queue = 1
    in
    if was_empty then
      Condition.broadcast t.not_empty

  let pop t =
    let elem, was_full =
      Mutex.protect t.mutex @@ fun () ->
      while Queue.length t.queue = 0 do
        Condition.wait
          t.not_empty t.mutex
      done;
      let was_full = is_full_unsafe t in
      Queue.pop t.queue, was_full
    in
    if was_full then
      Condition.broadcast t.not_full;
    elem
end</code></pre><p>The above is certainly neither the fastest nor the most scalable bounded queue, but we can now demonstrate it with the cooperative <code>Picos_mux_fifo</code> scheduler:</p><pre class="language-ocaml"><code># Picos_mux_fifo.run @@ fun () ->
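  (* A consumer fiber forked into the flock repeatedly pops and prints
     elements, while the main fiber pushes five elements and then yields
     once.  When the [join_after] block returns, [~on_return:`Terminate]
     cancels the consumer, after which the queue is still usable. *)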

  let bq =
    Bounded_q.create ~capacity:3
  in

  Flock.join_after ~on_return:`Terminate begin fun () ->
    Flock.fork begin fun () ->
      while true do
        Printf.printf "Popped %d\n%!"
          (Bounded_q.pop bq)
      done
    end;

    for i=1 to 5 do
      Printf.printf "Pushing %d\n%!" i;
      Bounded_q.push bq i
    done;

    Printf.printf "All done?\n%!";

    Control.yield ();
  end;

  Printf.printf "Pushing %d\n%!" 101;
  Bounded_q.push bq 101;

  Printf.printf "Popped %d\n%!"
    (Bounded_q.pop bq)
Pushing 1
Pushing 2
Pushing 3
Pushing 4
Popped 1
Popped 2
Popped 3
Pushing 5
All done?
Popped 4
Popped 5
Pushing 101
Popped 101
- : unit = ()</code></pre><p>Notice how the producer was able to push three elements to the queue, after which the fourth push blocked and the consumer was started. Also, after canceling the consumer, the queue could still be used just fine.</p><h2 id="conventions"><a href="#conventions" class="anchor"></a>Conventions</h2><p>The optional <code>padded</code> argument taken by several constructor functions, e.g. <a href="Latch/index.html#val-create"><code>Latch.create</code></a>, <a href="Mutex/index.html#val-create"><code>Mutex.create</code></a>, <a href="Condition/index.html#val-create"><code>Condition.create</code></a>, <a href="Semaphore/Counting/index.html#val-make"><code>Semaphore.Counting.make</code></a>, and <a href="Semaphore/Binary/index.html#val-make"><code>Semaphore.Binary.make</code></a>, defaults to <code>false</code>. When explicitly specified as <code>~padded:true</code>, the object is allocated in a way that avoids <a href="https://en.wikipedia.org/wiki/False_sharing">false sharing</a>. For relatively long-lived objects this can improve performance and make performance more stable at the cost of using more memory. It is not recommended to use <code>~padded:true</code> for short-lived objects.</p><p>The primitives provided by this library are generally optimized for low-contention scenarios and small size. Generally speaking, for best performance and scalability, you should try to avoid high-contention scenarios by architecting your program to distribute processing such that sequential bottlenecks are avoided. If high contention is unavoidable, then other communication and synchronization primitive implementations may provide better performance.</p></div></body></html>