- for `delete_all`, this is a bugfix
  (physical equality was documented but not implemented)
- `delete_one` is unchanged: it already had complexity O(n)
  and ensured physical equality
Committing to these complexities in the documentation does not
constrain the representation of heaps, because they are achieved by
every well-known representation (for some of them, in amortized time):
https://en.wikipedia.org/wiki/Template:Heap_Running_Times
- `find_min`: O(1)
- `take`: O(log n)
- `insert`: O(log n)
- `merge`: O(log(m+n)) (except for binary heaps, which only achieve O(m+n))
- `add_seq`: O(n log(m+n)) (trivially, by repeated insertion)
  + this can be improved to O(log(m) + n), regardless of the
    representation of heaps (to be done in a later commit; see the
    sketch below)
- `of_seq`: O(n log n) (ditto: can be improved to O(n))
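
For concreteness, here is a minimal leftist-heap sketch that meets all
of the bounds above. It is illustrative only, not the library's actual
code: `_make_node` and `merge` mirror the identifiers discussed below,
but the type, the use of polymorphic comparison, and the bodies of
`of_seq`/`add_seq` are assumptions of the sketch.

    (* Minimal leftist heap (illustration only): rank, value, left, right. *)
    type 'a t = E | N of int * 'a * 'a t * 'a t

    let rank = function E -> 0 | N (r, _, _, _) -> r

    (* O(1): rebuild a node, keeping the higher-rank child on the left. *)
    let _make_node x a b =
      if rank a >= rank b then N (rank b + 1, x, a, b)
      else N (rank a + 1, x, b, a)

    (* O(log(m+n)): a merge only walks the right spines of both heaps. *)
    let rec merge h1 h2 =
      match h1, h2 with
      | E, h | h, E -> h
      | N (_, x, l, r), N (_, y, _, _) ->
        if x <= y then _make_node x l (merge r h2)
        else merge h2 h1

    let find_min = function E -> None | N (_, x, _, _) -> Some x  (* O(1) *)

    let insert x h = merge (N (1, x, E, E)) h  (* O(log n) *)

    let take = function  (* O(log n) *)
      | E -> None
      | N (_, x, l, r) -> Some (merge l r, x)

    (* O(n) of_seq: make a singleton heap per element, then merge the
       heaps pairwise in rounds; round k performs about n/2^k merges
       costing O(k) each, so the total is O(n). *)
    let of_seq s =
      let rec pairwise = function
        | h1 :: h2 :: tl -> merge h1 h2 :: pairwise tl
        | l -> l
      in
      let rec rounds = function
        | [] -> E
        | [h] -> h
        | hs -> rounds (pairwise hs)
      in
      rounds (Seq.fold_left (fun acc x -> N (1, x, E, E) :: acc) [] s)

    (* O(log(m) + n) add_seq: heapify the n new elements in O(n), then
       do a single O(log(m) + log(n)) merge. *)
    let add_seq h s = merge h (of_seq s)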
Less trivial:
- `filter`, `delete_{one,all}`:
  + O(n) can be achieved for any reasonable representation of heaps,
    by going through `of_seq` and `to_seq`, which, as said, can always
    be made O(n) (a sketch follows at the end of this section).
  + With the current implementation (also sketched at the end), it is
    not obvious, but the worst-case complexity of `filter` and
    `delete_all` is Θ(n log n), and that of `delete_one` is O(n).
    Indeed, rebuilding a node with `_make_node` is O(1), a merge is
    O(log n), and every element deletion induces one merge, which
    gives the O(n log n) upper bound; and there are heap instances
    that achieve the matching worst case Ω(n log n), for instance:
        x
       / \
      x   y
     / \
   ...   y
    /
   x
  / \
 h   y
with n/3 occurrences of x, n/3 occurrences of y, and a sub-heap h of
n/3 elements, where y is greater than all elements of h; then,
deleting all occurrences of x performs the following computation:
    merge (merge (merge (merge h y) …) y) y
where each `merge` takes time Θ(log n).
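
For the first point, under the assumptions of the earlier sketch (its
O(n) `of_seq`, plus a plain O(n) traversal), the serialize-and-rebuild
strategy takes a few lines; `elems` is a hypothetical helper, not an
existing function of the library:

    (* O(n): collect all elements by a plain tree walk (order irrelevant). *)
    let rec elems acc = function
      | E -> acc
      | N (_, x, l, r) -> elems (elems (x :: acc) l) r

    (* O(n) filter and delete_all: serialize, filter, rebuild with the
       O(n) of_seq from the earlier sketch. *)
    let filter p h = of_seq (List.to_seq (List.filter p (elems [] h)))
    let delete_all eq x h = filter (fun y -> not (eq x y)) h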
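For the second point, a sketch of the merge-based strategy that the
analysis above describes, again on the assumed leftist representation
and ignoring the physical-equality guarantee: kept nodes are rebuilt
in O(1), while each deleted element triggers one O(log n) merge, which
is exactly what the x/y instance exploits.

    (* Merge-based delete_all, as analyzed above (sketch; the real code
       also preserves physical equality when nothing is deleted). *)
    let rec delete_all eq x = function
      | E -> E
      | N (_, y, l, r) ->
        let l' = delete_all eq x l in
        let r' = delete_all eq x r in
        if eq x y
        then merge l' r'         (* one O(log n) merge per deleted element *)
        else _make_node y l' r'  (* O(1) node rebuilding *)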