Complexity Analysis
Merge sort has consistent, predictable performance.
Time Complexity
All Cases: O(n log n)
- Best case: O(n log n)
- Average case: O(n log n)
- Worst case: O(n log n)
This is because the array is always split in half and every element is compared or copied during each merge, regardless of how the input is ordered, so the best, average, and worst cases are identical.
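For reference, here is a minimal top-down sketch in Python (an illustration only; it may differ from the implementation used earlier in this course). The comments mark where the halving and the linear merge work happen:

```python
def merge(left, right):
    # Merge two sorted lists into one sorted list.
    # Every element is copied exactly once: O(len(left) + len(right)).
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # '<=' keeps equal elements in order (stable)
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2                   # always split in half,
    left = merge_sort(arr[:mid])          # no matter how the input is ordered...
    right = merge_sort(arr[mid:])
    return merge(left, right)             # ...and always merge all n elements
```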
Space Complexity
O(n) - Linear Space
- Needs an auxiliary array of size n for merging: O(n)
- Recursion stack depth: O(log n)
- Total: O(n) + O(log n) = O(n)
This is the main disadvantage compared to in-place sorts.
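One quick way to see both terms is to instrument the sort. The sketch below (illustrative only, not the course's reference code) records the deepest recursion level reached and the largest temporary list built while merging:

```python
def traced_merge_sort(arr, depth=0, stats=None):
    # Same slicing-based merge sort as above, but it tracks the recursion
    # depth and the size of the largest merge buffer it ever allocates.
    if stats is None:
        stats = {"max_depth": 0, "max_buffer": 0}
    stats["max_depth"] = max(stats["max_depth"], depth)
    if len(arr) <= 1:
        return arr, stats
    mid = len(arr) // 2
    left, _ = traced_merge_sort(arr[:mid], depth + 1, stats)
    right, _ = traced_merge_sort(arr[mid:], depth + 1, stats)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):        # standard merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    stats["max_buffer"] = max(stats["max_buffer"], len(merged))
    return merged, stats

_, stats = traced_merge_sort(list(range(1024, 0, -1)))
print(stats["max_depth"])   # 10 == log2(1024): the recursion stack is O(log n)
print(stats["max_buffer"])  # 1024 == n: the merge buffer dominates, so O(n) total
```

For n = 1024 the recursion stack peaks at depth 10 while the top-level merge buffer holds all 1024 elements, which is why the O(n) buffer dominates.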
Why O(n log n)?
- Divide: halving the array repeatedly produces O(log n) levels of recursion
- Merge: each level performs O(n) total work across all of its merges
- Total: O(n) work per level × O(log n) levels = O(n log n)
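Written as a recurrence, the cost is T(n) = 2T(n/2) + n, assuming one unit of work per element merged. A small numeric check (a sketch, not a proof) shows it matches n · log2(n) exactly when n is a power of two:

```python
import math

def T(n):
    # Recurrence for merge sort: two half-size subproblems plus a linear merge.
    if n <= 1:
        return 0
    return 2 * T(n // 2) + n

for n in (16, 1024, 65536):
    print(n, T(n), int(n * math.log2(n)))
# 16 -> 64 vs 64, 1024 -> 10240 vs 10240, 65536 -> 1048576 vs 1048576
```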
Comparison with Other Algorithms
Merge Sort
**Advantages:**
• Guaranteed O(n log n)
• Stable: equal elements keep their relative order (see the sketch after this comparison)
• Predictable performance
• Easy to parallelize
**Disadvantages:**
• O(n) extra space
• Not in-place
• Larger constant factors than quicksort in practice
Quick Sort
**Advantages:**
• O(n log n) average
• In-place (O(log n) space)
• Fast in practice
• Good cache performance
**Disadvantages:**
• O(n²) worst case
• Not stable
• Performance depends on pivot choice
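To make the stability claim above concrete, here is a small key-aware variant (a sketch; merge_sort_by is just an illustrative name, not a function from this course). Records that compare equal on the key keep their original order:

```python
def merge_sort_by(arr, key):
    # Merge sort that compares elements only through key(...).
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort_by(arr[:mid], key)
    right = merge_sort_by(arr[mid:], key)
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if key(left[i]) <= key(right[j]):   # '<=' is what makes the sort stable
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    return result + left[i:] + right[j:]

people = [("Bea", 30), ("Ann", 25), ("Cal", 30), ("Dee", 25)]
print(merge_sort_by(people, key=lambda p: p[1]))
# [('Ann', 25), ('Dee', 25), ('Bea', 30), ('Cal', 30)]
# Ann stays before Dee and Bea stays before Cal: equal keys keep their order.
```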
Optimizations
- Use insertion sort for small subarrays: it has lower overhead and beats merge sort for small n (see the hybrid sketch after this list)
- Avoid copying: pass lo/hi index bounds into one shared auxiliary array instead of slicing
- Bottom-up merge sort: an iterative version that removes recursion overhead entirely
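The sketch below combines all three ideas; the function names and the cutoff of 16 are assumptions for illustration, not a reference implementation:

```python
def insertion_sort(a, lo, hi):
    # Sort a[lo..hi] in place; very fast for small ranges.
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def merge_range(a, aux, lo, mid, hi):
    # Merge the sorted halves a[lo..mid] and a[mid+1..hi] through one shared buffer.
    aux[lo:hi + 1] = a[lo:hi + 1]
    i, j = lo, mid + 1
    for k in range(lo, hi + 1):
        if i > mid:                    # left half exhausted
            a[k] = aux[j]; j += 1
        elif j > hi:                   # right half exhausted
            a[k] = aux[i]; i += 1
        elif aux[j] < aux[i]:          # strict '<' keeps the sort stable
            a[k] = aux[j]; j += 1
        else:
            a[k] = aux[i]; i += 1

def hybrid_merge_sort(a, lo=0, hi=None, aux=None, cutoff=16):
    # Top-down merge sort that avoids slicing (index bounds only) and
    # switches to insertion sort below the cutoff (16 is an assumed value).
    if hi is None:
        hi = len(a) - 1
        aux = a[:]                     # one O(n) auxiliary array, reused by every merge
    if hi - lo + 1 <= cutoff:
        insertion_sort(a, lo, hi)
        return a
    mid = (lo + hi) // 2
    hybrid_merge_sort(a, lo, mid, aux, cutoff)
    hybrid_merge_sort(a, mid + 1, hi, aux, cutoff)
    merge_range(a, aux, lo, mid, hi)
    return a

def bottom_up_merge_sort(a):
    # Iterative variant: merge runs of width 1, 2, 4, ... with no recursion at all.
    n = len(a)
    aux = a[:]
    width = 1
    while width < n:
        for lo in range(0, n - width, 2 * width):
            mid = lo + width - 1
            hi = min(lo + 2 * width - 1, n - 1)
            merge_range(a, aux, lo, mid, hi)
        width *= 2
    return a
```

hybrid_merge_sort(data) sorts the list it is given using one shared O(n) buffer; bottom_up_merge_sort(data) does the same without any recursion.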
When to Use
Merge sort is ideal when:
- You need guaranteed O(n log n) performance
- Stability is required
- You have extra memory available
- Sorting linked lists, where merging just relinks nodes and needs no extra array (see the sketch after this list)
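For the linked-list case, here is a minimal sketch assuming a simple singly linked Node class (hypothetical, not defined in this course). Merging relinks existing nodes, so no auxiliary array is needed and the extra space drops to the O(log n) recursion stack:

```python
class Node:
    # A hypothetical singly linked list node, just for this sketch.
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def merge_sort_list(head):
    if head is None or head.next is None:
        return head
    # Find the middle with slow/fast pointers, then cut the list in two.
    slow, fast = head, head.next
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    mid = slow.next
    slow.next = None
    left = merge_sort_list(head)
    right = merge_sort_list(mid)
    # Merge by relinking existing nodes: no extra array is allocated.
    dummy = tail = Node(None)
    while left and right:
        if left.value <= right.value:  # '<=' keeps the sort stable
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left or right
    return dummy.next
```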
What’s Next?
Let’s test your understanding with practice problems and a quiz.