Worst case performance and "impossibly unlikely" conditions
Posted Jan 13, 2012 20:49 UTC (Fri) by joey (guest, #328)
In reply to: Worst case performance and "impossibly unlikely" conditions by rgmoore
Parent article: Denial of service via hash collisions
To test this, I used code like the following, which exercises the instability of the sort:
perl -MData::Dumper -e 'print Dumper sort { $a->[0] <=> $b->[1] } ([1,1],[2,3],[1,2],[2,1],[2,4])'
perl -le 'sub c { $a->[0] <=> $b->[1] } @l=([1,1],[2,3],[1,2],[2,1],[2,4]); print join(",", map "@$_", sort c @l) eq join(",", map "@$_", sort c @l) ? "same" : "differ"'
Worst case performance and "impossibly unlikely" conditions
Posted Jan 13, 2012 23:20 UTC (Fri) by rgmoore (✭ supporter ✭, #75)
That's because they changed the default sort to merge sort, which is stable, starting in 5.8. If you want to test whether their quicksort is stable from run to run over the same data, you'd need to
use sort '_quicksort'; no sort 'stable';
or the like to force it to use quicksort.
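A minimal standalone sketch of such a test might look like the following (the script and the first-field comparator are illustrative additions, not code from the thread; note that later perls deprecate the sort subpragmas, so the pragma may be a no-op there):

#!/usr/bin/perl
use strict;
use warnings;
# Force perl's quicksort and drop the stability guarantee, then print
# the sorted order so repeated runs over the same data can be compared.
use sort '_quicksort';
no sort 'stable';

my @l = ([1,1], [2,3], [1,2], [2,1], [2,4]);

# Sort on the first field only; the second field just tags each element,
# so any reshuffling of equal keys between runs shows up in the output.
print join(' ', map { "[@$_]" } sort { $a->[0] <=> $b->[0] } @l), "\n";

Running it several times and comparing the output lines would show whether the [1,*] and [2,*] groups keep their input order; with the default stable mergesort they always should.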