These are the efficient comparison-based sorts that run in O(n log n) time. They use divide-and-conquer or heap data structures to achieve better performance than O(n²) algorithms.
Merge Sort
Idea: Divide the array in half, recursively sort each half, then merge the sorted halves together.
Time: O(n log n) in all cases
Space: O(n) for the temporary arrays
Stable: Yes
Best for: Linked lists, external sorting, when stability is required
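As a concrete reference, here is a minimal top-down merge sort sketch in Python (the `merge_sort`/`merge` names are illustrative; this version returns a new sorted list rather than sorting in place):

```python
def merge_sort(arr):
    """Recursively split the list in half, sort each half, and merge them."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists; <= keeps equal elements in order, which is what makes this stable."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one of these two is already empty
    merged.extend(right[j:])
    return merged
```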
Quick Sort
Idea: Pick a pivot element, partition the array so elements less than pivot are on the left and greater are on the right, then recursively sort each partition.
Time: O(n log n) average, O(n²) worst (e.g., already-sorted input with a first- or last-element pivot)
Space: O(log n) on average for the recursion stack
Stable: No
Best for: General purpose, in-place sorting, cache efficiency
Pivot selection matters: a random pivot or median-of-three pick avoids the O(n²) blow-up on already-sorted input.
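A minimal in-place sketch in Python using a random pivot with the Lomuto partition scheme (the `quick_sort`/`partition` names are illustrative):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """Sort arr in place between indices lo and hi."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort(arr, lo, p - 1)
        quick_sort(arr, p + 1, hi)

def partition(arr, lo, hi):
    """Move elements <= pivot to the left of it and return the pivot's final index."""
    pivot_idx = random.randint(lo, hi)            # random pivot guards against sorted input
    arr[pivot_idx], arr[hi] = arr[hi], arr[pivot_idx]
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]             # put the pivot in its final position
    return i
```

Calling quick_sort(data) sorts the list in place; nothing is returned.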
Heap Sort
Idea: Build a max-heap from the array, then repeatedly extract the maximum element and place it at the end.
Time: O(n log n) in all cases
Space: O(1) - truly in-place
Stable: No
Best for: Memory-constrained environments, guaranteed O(n log n)
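A minimal in-place sketch in Python (the `heap_sort`/`sift_down` names are illustrative):

```python
def heap_sort(arr):
    """In-place heap sort: build a max-heap, then repeatedly move the max to the end."""
    n = len(arr)
    # Build the max-heap bottom-up, starting from the last internal node.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(arr, i, n)
    # Swap the max (root) with the last unsorted element, shrink the heap, repeat.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(arr, 0, end)

def sift_down(arr, root, size):
    """Restore the max-heap property for the subtree rooted at `root` within arr[:size]."""
    while True:
        child = 2 * root + 1
        if child >= size:
            return
        if child + 1 < size and arr[child + 1] > arr[child]:
            child += 1                      # pick the larger child
        if arr[root] >= arr[child]:
            return
        arr[root], arr[child] = arr[child], arr[root]
        root = child
```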
Comparison

| Algorithm  | Average    | Worst      | Space    | Stable |
|------------|------------|------------|----------|--------|
| Merge Sort | O(n log n) | O(n log n) | O(n)     | Yes    |
| Quick Sort | O(n log n) | O(n²)      | O(log n) | No     |
| Heap Sort  | O(n log n) | O(n log n) | O(1)     | No     |
When to Use Each

| Situation             | Best Choice     | Why                                          |
|-----------------------|-----------------|----------------------------------------------|
| General purpose       | Quick Sort      | Fastest in practice, cache efficient         |
| Need stability        | Merge Sort      | The only stable O(n log n) sort of the three |
| Linked lists          | Merge Sort      | No random access needed                      |
| Memory constrained    | Heap Sort       | O(1) extra space                             |
| Guaranteed worst case | Merge/Heap Sort | Always O(n log n)                            |
| External sorting      | Merge Sort      | Sequential access pattern                    |
Interview Tips
Know Quick Sort's partition: The partition function is reused by QuickSelect (finding the kth smallest element in expected O(n)); see the sketch after these tips.
Merge Sort for linked lists: If asked to sort a linked list, merge sort is almost always the answer.
Space matters: If asked for an in-place O(n log n) sort, heap sort is the standard answer.
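To make the first tip concrete, here is a QuickSelect sketch in Python built on the same partition idea (the `quickselect` name and 0-indexed `k` are assumptions of this sketch; it modifies the list in place and runs in expected O(n)):

```python
import random

def quickselect(arr, k):
    """Return the k-th smallest element (0-indexed) of arr; modifies arr in place."""
    lo, hi = 0, len(arr) - 1
    while True:
        # Same random-pivot Lomuto partition as in quicksort.
        pivot_idx = random.randint(lo, hi)
        arr[pivot_idx], arr[hi] = arr[hi], arr[pivot_idx]
        pivot = arr[hi]
        i = lo
        for j in range(lo, hi):
            if arr[j] <= pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]
        # Unlike quicksort, recurse into only one side of the partition.
        if i == k:
            return arr[i]
        elif i < k:
            lo = i + 1
        else:
            hi = i - 1
```

Because only one side of each partition is explored, the expected work is n + n/2 + n/4 + ... = O(n), rather than the O(n log n) of a full sort.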