Worst case performance and "impossibly unlikely" conditions
Posted Jan 12, 2012 3:42 UTC (Thu) by rgmoore (✭ supporter ✭, #75)
In reply to: Worst case performance and "impossibly unlikely" conditions by rbrito
Parent article: Denial of service via hash collisions
I think I may have inadvertently confused you. The sort implementation I'm talking about is a different example from the hash implementation mentioned in the parent article, where the performance regression is only likely to matter in the context of a deliberate DoS attack. Perl sorting is just a real-world example of an algorithm encountering pathological data far more often than you might naively expect. Practical experience with pathological data forced the Perl developers to start considering worst-case performance rather than just average case.
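If you want to poke at this yourself, the sort pragma in the Perls of this era lets you pick the underlying implementation explicitly, so you can time the old quicksort against the mergesort default on whatever data you suspect is pathological. A rough sketch, assuming the '_quicksort' and '_mergesort' subpragmas documented in "perldoc sort" are available in your Perl; the data pattern here (lots of duplicate keys) is just one plausible stress case, not necessarily the exact one that bit Perl in practice:

    #!/usr/bin/perl
    # Rough sketch: force one of Perl's sort implementations and time it
    # on data with heavy key repetition. The '_quicksort'/'_mergesort'
    # subpragmas are documented in "perldoc sort" for Perls of this era;
    # availability in newer releases is not guaranteed.
    use strict;
    use warnings;
    use Time::HiRes qw(time);

    use sort '_quicksort';    # swap for '_mergesort' to compare

    my @data = map { int rand 100 } 1 .. 500_000;   # many duplicate keys

    my $t0 = time();
    my @sorted = sort { $a <=> $b } @data;
    printf "sorted %d elements in %.3f s\n", scalar @sorted, time() - $t0;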
FWIW, the Perl documentation also points out that merge sort uses fewer comparisons than quicksort in the typical case (and no more in the worst case), so it may be faster when the comparisons themselves are expensive. It's one more example of why it's important to have several algorithms with different and well-understood performance characteristics available for important tasks.
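You can check the comparison counts yourself by counting calls to the comparator; a minimal sketch along the same lines as above (again assuming the sort pragma's subpragmas are available):

    #!/usr/bin/perl
    # Sketch: count how many times the comparator runs under each algorithm.
    # When each comparison is expensive (long strings, locale-aware collation,
    # etc.), this count tends to dominate the total sorting time.
    use strict;
    use warnings;

    use sort '_mergesort';    # swap for '_quicksort' and compare the counts

    my @data = map { int rand 1_000 } 1 .. 10_000;

    my $comparisons = 0;
    my @sorted = sort { $comparisons++; $a <=> $b } @data;
    printf "%d comparisons for %d elements\n", $comparisons, scalar @data;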