# Complete Sorting Algorithms Guide 2023

Sorting is a fundamental concept in computer science, used in applications ranging from simple tasks like arranging a list of numbers in ascending or descending order to complex tasks like searching and indexing databases. Understanding and implementing various sorting algorithms is essential for any computer science enthusiast or software developer. This comprehensive guide will help you understand the most popular sorting algorithms, their implementation, and their complexity. We will also cover some frequently asked questions related to sorting algorithms.

## Bubble Sort

Bubble Sort is a simple sorting algorithm that works by repeatedly stepping through the list, comparing each pair of adjacent items, and swapping them if they are in the wrong order. The algorithm continues to pass through the list until no swaps are needed, indicating that the list is sorted. The algorithm gets its name because, with each pass, the larger elements “bubble” up to the end of the list.

### Algorithm

- Iterate through the list.
- Compare each pair of adjacent elements.
- If the elements are in the wrong order, swap them.
- Continue iterating through the list until no more swaps are needed.

### Example

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            # No swaps on this pass: the list is already sorted,
            # which gives the O(n) best case on sorted input.
            break

arr = [64, 34, 25, 12, 22, 11, 90]
bubble_sort(arr)
print("Sorted array is:", arr)
```

### Time Complexity

- Best Case: O(n) – When the input list is already sorted.
- Average Case: O(n^2) – When the input list is in random order.
- Worst Case: O(n^2) – When the input list is in reverse order.

### Space Complexity

- O(1) – Bubble Sort is an in-place sorting algorithm, requiring only a constant amount of additional memory.

## Selection Sort

Selection Sort is another simple sorting algorithm that works by dividing the input list into two parts: the sorted part and the unsorted part. Initially, the sorted part is empty, and the unsorted part contains all the elements. The algorithm repeatedly selects the smallest (or largest) element from the unsorted part and moves it to the end of the sorted part. This process continues until the unsorted part becomes empty, and the list is sorted.

### Algorithm

- Find the minimum (or maximum) element in the unsorted part of the list and swap it with the first unsorted element.
- Move the boundary between the sorted and unsorted parts one element to the right.
- Repeat steps 1 and 2 until the entire list is sorted.

### Example

```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]

arr = [64, 34, 25, 12, 22, 11, 90]
selection_sort(arr)
print("Sorted array is:", arr)
```

### Time Complexity

- Best Case: O(n^2) – Regardless of the input order, the algorithm always performs the same number of comparisons.
- Average Case: O(n^2) – Same as the best case.
- Worst Case: O(n^2) – Same as the best case.

### Space Complexity

- O(1) – Selection Sort is an in-place sorting algorithm.

## Insertion Sort

Insertion Sort is another simple sorting algorithm that builds the sorted list one element at a time. It is much less efficient on large lists than more advanced algorithms like Quick Sort, Heap Sort, or Merge Sort. However, Insertion Sort provides several advantages, such as its simplicity, efficiency for small datasets, and ability to sort a list as it is received.

### Algorithm

- Iterate through the list from the second element to the last element.
- For each element, compare it with the elements to its left.
- If the current element is smaller than the left element, swap them and continue comparing with the previous elements until the correct position is found or the beginning of the list is reached.

### Example

```python
def insertion_sort(arr):
    n = len(arr)
    for i in range(1, n):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key

arr = [64, 34, 25, 12, 22, 11, 90]
insertion_sort(arr)
print("Sorted array is:", arr)
```

### Time Complexity

- Best Case: O(n) – When the input list is already sorted.
- Average Case: O(n^2) – When the input list is in random order.
- Worst Case: O(n^2) – When the input list is in reverse order.

### Space Complexity

- O(1) – Insertion Sort is an in-place sorting algorithm.

## Merge Sort

Merge Sort is a divide-and-conquer sorting algorithm that works by recursively dividing the list into two halves, sorting them, and then merging the sorted halves to produce the final sorted list. Merge Sort is a stable sort, which means that the relative order of equal elements is preserved.

### Algorithm

- If the list has only one element, it is already sorted. Return the list.
- Split the list into two halves.
- Recursively sort each half.
- Merge the two sorted halves to produce the final sorted list.

### Example

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        # <= (not <) takes the left element first on ties,
        # which is what makes the merge stable.
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result += left[i:]
    result += right[j:]
    return result

arr = [64, 34, 25, 12, 22, 11, 90]
sorted_arr = merge_sort(arr)
print("Sorted array is:", sorted_arr)
```

### Time Complexity

- Best Case: O(n * log(n)) – Regardless of the input order, the algorithm always has the same performance.
- Average Case: O(n * log(n)) – Same as the best case.
- Worst Case: O(n * log(n)) – Same as the best case.

### Space Complexity

- O(n) – Merge Sort requires additional memory for temporary storage during the merge operation.

## Quick Sort

Quick Sort is another divide-and-conquer sorting algorithm that works by selecting a ‘pivot’ element from the list and partitioning the other elements into two groups: those less than the pivot and those greater than the pivot. The algorithm then recursively sorts the two groups. Quick Sort is an efficient, unstable sorting algorithm that is typically implemented in place, although the simple example below builds new lists for clarity.

### Algorithm

- Choose a pivot element from the list.
- Partition the list into two groups: elements less than the pivot and elements greater than the pivot.
- Recursively sort the two groups.
- Combine the sorted groups and the pivot element to produce the final sorted list.

### Example

```python
def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

arr = [64, 34, 25, 12, 22, 11, 90]
sorted_arr = quick_sort(arr)
print("Sorted array is:", sorted_arr)
```

### Time Complexity

- Best Case: O(n * log(n)) – When the pivot element is always chosen as the median.
- Average Case: O(n * log(n)) – When the input list is in random order.
- Worst Case: O(n^2) – When the pivot element is always chosen as the smallest or largest element.

### Space Complexity

- O(log(n)) – An in-place Quick Sort needs no extra storage for the data itself, but the recursion stack uses O(log(n)) memory on average. (The simple list-building version above uses O(n) additional space.)

## Heap Sort

Heap Sort is a comparison-based sorting algorithm that works by dividing the input into a sorted and an unsorted region, and iteratively extracting the largest (or smallest) element from the unsorted region and moving it to the beginning of the sorted region. Heap Sort uses a binary heap data structure to efficiently find the largest (or smallest) element in the unsorted region.

### Algorithm

- Build a max-heap (or min-heap) from the input list.
- Swap the root element of the heap with the last element in the unsorted region.
- Decrease the size of the unsorted region by one.
- Restore the max-heap (or min-heap) property.
- Repeat steps 2-4 until the entire list is sorted.

### Example

```python
def heapify(arr, n, i):
    # Sift the element at index i down until the subtree rooted at i
    # satisfies the max-heap property (n is the current heap size).
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2
    if left < n and arr[largest] < arr[left]:
        largest = left
    if right < n and arr[largest] < arr[right]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    # Build a max-heap from the input list.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the root (maximum) to the end and shrink the heap.
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)

arr = [64, 34, 25, 12, 22, 11, 90]
heap_sort(arr)
print("Sorted array is:", arr)
```

### Time Complexity

- Best Case: O(n * log(n)) – Regardless of the input order, the algorithm always has the same performance.
- Average Case: O(n * log(n)) – Same as the best case.
- Worst Case: O(n * log(n)) – Same as the best case.

### Space Complexity

- O(1) – Heap Sort is an in-place sorting algorithm.

## FAQ

**1. What are the factors to consider when choosing a sorting algorithm?**

When choosing a sorting algorithm, consider the following factors:

- Time complexity: The efficiency of the algorithm with respect to the size of the input list.
- Space complexity: The amount of additional memory the algorithm requires.
- Stability: Whether the relative order of equal elements is preserved.
- Adaptivity: Whether the algorithm’s performance improves when given partially sorted input.
- In-place: Whether the algorithm sorts the input list without using additional memory.

**2. What is the difference between stable and unstable sorting algorithms?**

A stable sorting algorithm preserves the relative order of equal elements in the sorted list, whereas an unstable sorting algorithm does not guarantee the preservation of the relative order of equal elements. Stability is essential in applications where the input list contains multiple keys, and the output must maintain the original order of equal keys.
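Stability is easiest to see with records that share a sort key. The snippet below uses Python's built-in `sorted`, which is guaranteed stable, on a hypothetical list of (name, grade) pairs: students with equal grades keep their original relative order.

```python
# Hypothetical records: two students share grade 90, two share grade 85.
students = [("alice", 90), ("bob", 85), ("carol", 90), ("dave", 85)]

# Python's sorted() is stable: ties keep their input order,
# so alice still precedes carol, and bob still precedes dave.
by_grade = sorted(students, key=lambda s: s[1])
print(by_grade)
# → [('bob', 85), ('dave', 85), ('alice', 90), ('carol', 90)]
```

An unstable algorithm (such as a typical in-place Quick Sort or Heap Sort) could legally return the 90s as `carol, alice`, which matters whenever the input was already ordered by a secondary key.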

**3. Which sorting algorithm should I use in real-world applications?**

In real-world applications, it’s common to use the following sorting algorithms:

- For small datasets or when simplicity is preferred, use Insertion Sort.
- For medium-sized datasets or when a stable sort is required, use Merge Sort.
- For large datasets or when an in-place sort is required, use Quick Sort.
- In some specific cases, like when working with integers or strings, you can use specialized sorting algorithms like Radix Sort or Counting Sort.

Keep in mind that the choice of a sorting algorithm largely depends on the specific requirements and constraints of your application.
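To illustrate the specialized-algorithm point, here is a minimal Counting Sort sketch. It assumes non-negative integers with a known, reasonably small maximum value `k`, which is what lets it run in O(n + k) without any comparisons.

```python
def counting_sort(arr):
    """Sort non-negative integers in O(n + k) time, where k = max value."""
    if not arr:
        return []
    k = max(arr)
    counts = [0] * (k + 1)
    for x in arr:          # Tally how many times each value occurs.
        counts[x] += 1
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # Emit each value count times.
    return result

print(counting_sort([64, 34, 25, 12, 22, 11, 90]))
```

Note that the O(k) space makes this a poor fit when values are large or sparse; that trade-off is why comparison sorts remain the general-purpose default.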

**4. How do I choose the best sorting algorithm for my specific use case?**

To choose the best sorting algorithm for your use case, consider the following questions:

- What is the size of the dataset you need to sort?
- Do you require a stable sorting algorithm?
- Is the input list partially sorted or in random order?
- Do you need an in-place sorting algorithm?
- What are the performance requirements of your application?

Answering these questions will help you narrow down the options and choose the most appropriate sorting algorithm for your needs.

**5. What is the difference between internal and external sorting?**

Internal sorting refers to sorting algorithms that work on data that is entirely stored in the main memory (RAM) of the computer. Examples include Bubble Sort, Insertion Sort, and Quick Sort. External sorting refers to sorting algorithms that work on data that is too large to fit in the main memory and must be stored on external storage devices like hard disks or tapes. Examples of external sorting algorithms include External Merge Sort and Polyphase Merge Sort.
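As a rough sketch of the external-sorting idea, the two phases can be simulated in Python: sort fixed-size chunks in memory, spill each sorted run to a temporary file, then k-way merge the runs with `heapq.merge`. The chunk size and the one-integer-per-line file format here are illustrative assumptions, not part of any standard algorithm.

```python
import heapq
import os
import tempfile

def external_sort(input_path, output_path, chunk_size=1000):
    """Sketch of External Merge Sort for a file with one integer per line."""
    # Phase 1: read chunk_size lines at a time, sort each chunk in memory,
    # and write it out as a sorted "run" in a temporary file.
    run_paths = []
    with open(input_path) as f:
        while True:
            chunk = [int(line) for _, line in zip(range(chunk_size), f)]
            if not chunk:
                break
            chunk.sort()
            fd, path = tempfile.mkstemp()
            with os.fdopen(fd, "w") as run:
                run.writelines(f"{x}\n" for x in chunk)
            run_paths.append(path)

    def read_run(path):
        with open(path) as run:
            for line in run:
                yield int(line)

    # Phase 2: k-way merge all sorted runs; heapq.merge streams the runs,
    # so only one element per run is in memory at a time.
    with open(output_path, "w") as out:
        for x in heapq.merge(*(read_run(p) for p in run_paths)):
            out.write(f"{x}\n")

    for p in run_paths:
        os.remove(p)
```

Real external sorts add refinements such as replacement selection for longer runs and multi-pass merging when the number of runs exceeds the available file handles.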

### Sharing is caring

Did you like what Mehul Mohan wrote? Thank them for their work by sharing it on social media.
