Soft heap

In computer science, a soft heap is a variant on the simple heap data structure that has constant amortized time complexity for five types of operations. This is achieved by carefully "corrupting" (increasing) the keys of at most a fixed fraction of the values inserted into the heap.

Definition and performance

The constant-time operations, illustrated by the interface sketch after the list, are:[1][2]

  • create(S): Create a new soft heap
  • insert(S, x): Insert an element into a soft heap
  • meld(S, S' ): Combine the contents of two soft heaps into one, destroying both
  • delete(S, x): Delete an element from a soft heap
  • findmin(S): Get the element with minimum key in the soft heap
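
A minimal Python sketch of this interface follows; the class name, method names, and signatures are illustrative assumptions rather than an API taken from the references.

```python
from abc import ABC, abstractmethod
from typing import Any


class SoftHeap(ABC):
    """Hypothetical interface for the five soft-heap operations listed above.
    create(S) corresponds to constructing an instance; eps is the error
    parameter discussed below."""

    def __init__(self, eps: float):
        # eps is the error rate, a constant chosen between 0 and 1/2.
        self.eps = eps

    @abstractmethod
    def insert(self, x: Any, key: float) -> None:
        """insert(S, x): add element x with the given key.
        Amortized O(log 1/eps) time, i.e. constant for any fixed eps."""

    @abstractmethod
    def meld(self, other: "SoftHeap") -> "SoftHeap":
        """meld(S, S'): combine two soft heaps into one, destroying both.
        Amortized constant time."""

    @abstractmethod
    def delete(self, x: Any) -> None:
        """delete(S, x): remove a previously inserted element.
        Amortized constant time."""

    @abstractmethod
    def findmin(self) -> Any:
        """findmin(S): return an element whose current (possibly corrupted)
        key is minimum. Amortized constant time."""
```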

Other heaps such as Fibonacci heaps achieve most of these bounds without any corruption, but cannot provide a constant-time bound on the critical delete operation. The amount of corruption can be controlled by the choice of a parameter ε, but the lower this is set, the more time insertions require: expressed using Big-O notation, the amortized time per insertion will be O(log 1/ε) for an error rate of ε.[1][2] Some versions of soft heaps allow the create, insert, and meld operations to take constant time in the worst case, producing amortized rather than worst-case performance only for findmin and delete.[3] As with comparison sort, these algorithms access the keys only by comparisons; if arithmetic operations on integer keys are allowed, the time dependence on ε can be reduced further, and reduced still more with randomization.[4]

More precisely, the error guarantee offered by the soft heap is the following: each soft heap is initialized with a parameter ε, chosen between 0 and 1/2. Then at any point in time it will contain at most εn corrupted keys, where n is the number of elements inserted so far. Note that this does not guarantee that only a fixed percentage of the keys currently in the heap are corrupted: in an unlucky sequence of insertions and deletions, it can happen that all elements in the heap will have corrupted keys. Similarly, there is no guarantee that in a sequence of elements extracted from the heap with findmin and delete, only a fixed percentage will have corrupted keys: in an unlucky scenario only corrupted elements are extracted from the heap. When a key is corrupted, the value stored for it in the soft heap is higher than its initially given value; corruption can never decrease the value of any key. The findmin operation finds the minimum value among the currently stored keys, including the corrupted ones.[1][2]
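
The distinction between the number of insertions and the current contents of the heap can be made concrete with a small arithmetic example; the figures below are chosen purely for illustration and do not come from the references.

```python
# The corruption guarantee bounds corrupted keys by eps * n, where n counts
# all insert operations performed so far, not the elements currently stored.
eps = 0.25        # error parameter chosen when the heap is created
inserted = 1000   # total number of insertions so far
deleted = 950     # elements that have since been removed again

current_size = inserted - deleted     # 50 elements still in the heap
corruption_budget = eps * inserted    # at most 250 keys may be corrupted

# Because the budget (250) exceeds the current size (50), an unlucky sequence
# of operations can leave every remaining element with a corrupted key,
# exactly as described above.
print(current_size, corruption_budget)
```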

The soft heap was designed by Bernard Chazelle in 2000. The term "corruption" in the structure is the result of what Chazelle called "carpooling" in a soft heap. Each node in the soft heap contains a linked list of keys and one common key. The common key is an upper bound on the values of the keys in the linked list. Once a key is added to the linked list, it is considered corrupted because its value is never again relevant in any of the soft heap operations: only the common keys are compared. This is what makes soft heaps "soft"; one cannot be sure whether any particular value put into it will be corrupted. The purpose of these corruptions is effectively to lower the information entropy of the data, enabling the data structure to break through information-theoretic barriers regarding heaps.[1]
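
A rough sketch of such a node is given below, assuming a simplified binary layout and illustrative field names; Chazelle's actual structure is built from binomial trees with rank and "sift" machinery that is omitted here.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SoftHeapNode:
    """One node under the "carpooling" idea: many keys share one common key."""
    ckey: float                     # common key, an upper bound on every key in item_list
    item_list: List[float] = field(default_factory=list)  # original keys, now treated as corrupted
    left: Optional["SoftHeapNode"] = None
    right: Optional["SoftHeapNode"] = None

    def absorb(self, key: float) -> None:
        """Move a key into this node's list. From now on only ckey is ever
        compared, so the key is effectively raised to ckey (corrupted)."""
        self.item_list.append(key)
        self.ckey = max(self.ckey, key)   # keep ckey an upper bound
```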

Applications

Despite their limitations and unpredictable nature, soft heaps are useful in the design of deterministic algorithms. For instance, they have been used to achieve the best complexity to date for finding a minimum spanning tree.[5] Other problems whose efficient solution has been simplified using soft heaps include finding the kth smallest element in several classes of structured sets of values, including heap-ordered trees, sorted matrices, and sumsets.[6]

Another simple example is a selection algorithm, to find the kth smallest of a group of n numbers (a code sketch follows the analysis below):[1]

  • Initialize a soft heap with error rate ε = 1/3, allowing at most 33% of its inserted keys to be corrupted.
  • Insert all elements into the heap.
  • Repeat n/3 times: perform a findmin operation and delete the key that it returns.
  • Let L be the deleted element whose correct key is largest.
  • Compare L to all of the given elements and partition them into the subset less than L and the subset greater than L, placing elements that are equal to L into the first subset when they were deleted and into the second subset otherwise.
  • Recursively call the same selection algorithm in the subset that contains the kth smallest element.

After the comparison step of the algorithm, the first of the two subsets contains all of the deleted keys, so it includes at least n/3 elements. Among the elements that were not deleted, at most n/3 are corrupted, so at least n/3 are uncorrupted. These uncorrupted and undeleted elements must all belong to the second subset, because they are greater than or equal to the soft heap's (possibly corrupted) value of L, which is in turn at least the true value of L. Thus, each of the two subsets contains between one third and two thirds of the elements. Because each level of recursion reduces the problem size by a constant factor, the total time of the algorithm can be bounded by a geometric series, showing that it is O(n).[1]
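
The steps above can be combined as in the following Python sketch. Here SoftHeap stands for a hypothetical soft-heap class (not defined here) assumed to provide insert(x) for a bare numeric key and extract_min() returning the original value of the element it removes, even when that element's key was corrupted inside the heap; these names and that behaviour are assumptions made for illustration rather than the interface used in the references.

```python
def soft_heap_select(values, k, SoftHeap):
    """Return the k-th smallest element of values (k is 1-indexed)."""
    n = len(values)
    if n <= 3:                               # small instances: sort directly
        return sorted(values)[k - 1]

    heap = SoftHeap(eps=1/3)                 # at most n/3 keys ever corrupted
    for v in values:
        heap.insert(v)

    # Delete n/3 minima and remember the largest ORIGINAL value deleted.
    deleted = [heap.extract_min() for _ in range(n // 3)]
    pivot = max(deleted)

    # Partition around the pivot.  Elements equal to the pivot go into the
    # first subset only as often as they appear among the deleted elements.
    ties_to_first = sum(1 for v in deleted if v == pivot)
    smaller, larger = [], []
    for v in values:
        if v < pivot:
            smaller.append(v)
        elif v == pivot and ties_to_first > 0:
            smaller.append(v)
            ties_to_first -= 1
        else:
            larger.append(v)

    # Recurse in whichever subset contains the k-th smallest element.
    if k <= len(smaller):
        return soft_heap_select(smaller, k, SoftHeap)
    return soft_heap_select(larger, k - len(smaller), SoftHeap)
```

Substituting an exact priority queue for SoftHeap leaves the procedure correct, since the argument above only needs the partition around the pivot; the point of the soft heap is that every heap operation takes constant amortized time, so each level of the recursion runs in linear time.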

References

  1. Chazelle, Bernard (November 2000). "The soft heap: an approximate priority queue with optimal error rate" (PDF). Journal of the ACM. 47 (6): 1012–1027. CiteSeerX 10.1.1.5.9705. doi:10.1145/355541.355554. S2CID 12556140.
  2. Kaplan, Haim; Zwick, Uri (2009). "A simpler implementation and analysis of Chazelle's soft heaps". Proceedings of the Nineteenth Annual ACM–SIAM Symposium on Discrete Algorithms. Society for Industrial and Applied Mathematics. pp. 477–485. CiteSeerX 10.1.1.215.6250. doi:10.1137/1.9781611973068.53. ISBN 978-0-89871-680-1.
  3. Kaplan, Haim; Tarjan, Robert E.; Zwick, Uri (2013). "Soft heaps simplified". SIAM Journal on Computing. 42 (4): 1660–1673. doi:10.1137/120880185. MR 3084181.
  4. Thorup, Mikkel; Zamir, Or; Zwick, Uri (2019). "Dynamic ordered sets with approximate queries, approximate heaps and soft heaps". In Baier, Christel; Chatzigiannakis, Ioannis; Flocchini, Paola; Leonardi, Stefano (eds.). 46th International Colloquium on Automata, Languages, and Programming, ICALP 2019, July 9–12, 2019, Patras, Greece. LIPIcs. Vol. 132. Schloss Dagstuhl – Leibniz-Zentrum für Informatik. pp. 95:1–95:13. doi:10.4230/LIPICS.ICALP.2019.95.
  5. Chazelle, Bernard (2000). "A minimum spanning tree algorithm with inverse-Ackermann type complexity". Journal of the ACM. 47 (6): 1028–1047. doi:10.1145/355541.355562. MR 1866456.
  6. Kaplan, Haim; Kozma, László; Zamir, Or; Zwick, Uri (2019). "Selection from heaps, row-sorted matrices, and X+Y using soft heaps". In Fineman, Jeremy T.; Mitzenmacher, Michael (eds.). 2nd Symposium on Simplicity in Algorithms, SOSA 2019, January 8–9, 2019, San Diego, CA, USA. OASIcs. Vol. 69. Schloss Dagstuhl – Leibniz-Zentrum für Informatik. pp. 5:1–5:21. doi:10.4230/OASICS.SOSA.2019.5.