Worst case performance and "impossibly unlikely" conditions
Posted Jan 11, 2012 22:16 UTC (Wed) by rgmoore
In reply to: Worst case performance and "impossibly unlikely" conditions
Parent article: Denial of service via hash collisions
> In programming we rarely worry about worst-case performance of an algorithm. We are happy to take the code that has good average-case performance over the code which is usually ten times slower but has fewer pathological cases.
Not always. A good example is Perl's decision to switch its default sorting algorithm from quicksort to merge sort. Quicksort is O(N log N) in the general case but O(N**2) for some pathological inputs, while the merge sort they chose has a worst case of O(N log N) but worse benchmark performance on non-pathological data. In that case, it turned out that the pathological case (IIRC it was pre-sorted data) was common enough that it was worth paying a little bit more in the average case to avoid it. OTOH, that was just changing the default; you can still force it to use quicksort if you're confident your data won't be pathological.
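To see why pre-sorted input is pathological for a naive quicksort, count the comparisons each algorithm makes. This is a sketch, not Perl's actual implementation: it assumes a textbook quicksort that always picks the first element as the pivot (real implementations usually pick pivots more carefully, which pushes the bad case elsewhere rather than eliminating it).

```python
def quicksort(xs):
    """Naive quicksort, first element as pivot.
    Returns (sorted_list, comparison_count)."""
    if len(xs) <= 1:
        return list(xs), 0
    pivot, rest = xs[0], xs[1:]
    less = [x for x in rest if x < pivot]
    more = [x for x in rest if x >= pivot]
    ls, lc = quicksort(less)
    ms, mc = quicksort(more)
    # Every element in rest was compared against the pivot once.
    return ls + [pivot] + ms, lc + mc + len(rest)

def mergesort(xs):
    """Top-down merge sort.
    Returns (sorted_list, comparison_count)."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, lc = mergesort(xs[:mid])
    right, rc = mergesort(xs[mid:])
    merged, c, i, j = [], 0, 0, 0
    while i < len(left) and j < len(right):
        c += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, lc + rc + c

data = list(range(500))          # pre-sorted: quicksort's bad case
_, qc = quicksort(data)
_, mc = mergesort(data)
print(qc, mc)
```

On pre-sorted input every pivot is the minimum, so each partition strips off a single element and the naive quicksort performs exactly N*(N-1)/2 comparisons (124,750 for N = 500), while merge sort stays near N log N (a few thousand). That quadratic blowup on a common input shape is the trade-off the Perl developers decided against.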