Sorting an Array in C (with examples)

Sorting arrays is a fundamental aspect of programming, pivotal in organizing data efficiently for further operations like search and optimization tasks. It’s an area that demands understanding of both basic and complex algorithmic strategies to enhance the performance and efficiency of software solutions.

Introduction

In computer science, sorting is not merely an academic exercise but a critical operation with extensive practical applications. It involves arranging the elements of an array or list in a certain order, typically ascending or descending. The importance of sorting spans various types of algorithms, each with unique principles and efficiencies, including but not limited to Bubble Sort, Selection Sort, Insertion Sort, Counting Sort, and Radix Sort.

Why Sorting is Important

Sorting plays a vital role in real-world applications, from the organization of large databases to the functioning of search engines and the optimization of algorithms. Efficient sorting reduces complexity, improves the speed of data retrieval, and significantly enhances the performance of software applications. It’s a cornerstone in the development of efficient, scalable, and performant software systems.

Basics of Arrays in C

In C, an array is a collection of elements of the same type stored at contiguous memory locations and accessed by index. An array can be declared with syntax like int arr[10];, which declares an array named arr of type int with space for 10 integers. An array can also be initialized at declaration, e.g., int arr[] = {1, 2, 3, 4, 5};, and its elements are accessed with index notation such as arr[0], arr[1], and so on, with indices starting at 0.
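
For a concrete illustration, here is a minimal self-contained snippet (the names arr and n are arbitrary, not from any particular program) that declares, initializes, modifies, and prints an array:

#include <stdio.h>

int main() {
    int arr[] = {1, 2, 3, 4, 5};              /* size inferred from the initializer */
    int n = sizeof(arr) / sizeof(arr[0]);     /* number of elements */

    arr[0] = 10;                              /* indices start at 0 */
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);                /* prints: 10 2 3 4 5 */
    printf("\n");
    return 0;
}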

Sorting Algorithms Overview

Sorting algorithms are primarily categorized into comparison-based and non-comparison-based algorithms. Comparison sorts, like Bubble, Selection, and Insertion Sort, function by comparing elements and deciding their order based on their values. Non-comparison sorts, such as Counting and Radix Sort, do not compare elements directly but use their characteristics to sort. The choice of algorithm affects performance, with time complexities ranging from O(n^2) for simple comparison sorts to O(n + k) for more sophisticated non-comparison sorts.

Comparison-based Sorting Algorithms

  • Bubble Sort: A simple algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order.
  • Insertion Sort: Builds the final sorted array one item at a time, with each new item being inserted into its correct position within the sorted part.
  • Selection Sort: Divides the array into a sorted and an unsorted region. It repeatedly selects the minimum (or maximum) from the unsorted region and moves it to the sorted region.

Non-Comparison Sorting Algorithms

  • Counting Sort: Counts the occurrences of each element within a range, then calculates the position of each element in the sorted array.
  • Radix Sort: Sorts the numbers digit by digit, starting from the least significant digit to the most significant digit, using a stable sort as a subroutine to sort each digit.

Bubble Sort

Explanation

Bubble Sort is a straightforward algorithm where each pair of adjacent elements is compared, and the elements are swapped if they are not in order. This process is repeated until no swaps are needed, indicating that the array is sorted.

Example

Consider an array arr[] = {5, 1, 4, 2, 8}. The algorithm repeatedly passes through the array, compares each pair of adjacent elements, and swaps them if necessary. After the first pass, the largest element is bubbled to the end. This process repeats, with each pass reducing the number of elements to be compared by one.

Implementation in C

/* helper that exchanges two integers through pointers */
void swap(int *a, int *b) {
    int t = *a;
    *a = *b;
    *b = t;
}

void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++)           /* each pass bubbles the largest remaining element to the end */
        for (int j = 0; j < n - i - 1; j++)
            if (arr[j] > arr[j + 1])
                swap(&arr[j], &arr[j + 1]);
}
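
As a usage sketch (assuming the swap and bubbleSort definitions above, and not part of the original listing), a minimal driver for the example array could look like this:

#include <stdio.h>

int main() {
    int arr[] = {5, 1, 4, 2, 8};
    int n = sizeof(arr) / sizeof(arr[0]);

    bubbleSort(arr, n);

    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);                /* prints: 1 2 4 5 8 */
    printf("\n");
    return 0;
}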

Complexity Analysis

The time complexity of Bubble Sort is O(n^2) in the worst and average cases. With the common optimization of stopping once a pass makes no swaps, the best case drops to O(n) when the array is already sorted. The space complexity is O(1).

Selection Sort

Explanation

Selection Sort reduces the number of swaps compared with Bubble Sort: it selects the minimum element from the unsorted portion and swaps it with the leftmost unsorted element, moving the boundary of the unsorted portion one element to the right.

Example

Given arr[] = {64, 25, 12, 22, 11}, the algorithm first finds the minimum and swaps it with the element in the first position, then moves to the second position and repeats the process for the remaining unsorted elements.

Implementation in C

void selectionSort(int arr[], int n) {
    /* uses the swap helper shown in the Bubble Sort section */
    for (int i = 0; i < n - 1; i++) {
        int min_idx = i;                      /* index of the smallest element in the unsorted part */
        for (int j = i + 1; j < n; j++)
            if (arr[j] < arr[min_idx])
                min_idx = j;
        swap(&arr[min_idx], &arr[i]);
    }
}

Complexity Analysis

The time complexity of Selection Sort is O(n^2), making it inefficient on large lists, and like Bubble Sort, its space complexity is O(1).

Insertion Sort

Explanation

Insertion Sort builds the final sorted array (or list) one item at a time. It is much less efficient on large lists than more advanced algorithms like quicksort, heapsort, or merge sort.

Example

Starting with the second element of the array (the key), compare it with its predecessor. If the key is smaller than its predecessor, keep comparing it with the elements before it, shifting each greater element one position to the right, and finally insert the key into the gap that opens up.

Implementation in C

void insertionSort(int arr[], int n) {
    int i, key, j;
    for (i = 1; i < n; i++) {
        key = arr[i];
        j = i - 1;
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j = j - 1;
        }
        arr[j + 1] = key;
    }
}

Complexity Analysis

Insertion Sort has a time complexity of O(n^2) in the worst case, but it can perform significantly better on partially sorted arrays, with a best-case time complexity of O(n). Its space complexity is O(1), making it advantageous for small datasets or as a part of more complex algorithms.

Merge Sort

Explanation

Merge Sort is a highly efficient, comparison-based, divide and conquer sorting algorithm. The main idea behind Merge Sort is to divide the array into smaller arrays, sort those arrays (if they are not already sorted), and then merge them together into a single sorted array. This approach makes Merge Sort a recursive algorithm. The process of dividing the array continues until we have arrays of single elements, which are, by definition, sorted. The merging process takes care of combining these sorted arrays into larger sorted arrays until the whole dataset is assembled back into a single sorted array.

Example

Consider an array A with the elements [38, 27, 43, 3, 9, 82, 10]. The Merge Sort algorithm would work on this array as follows:

  1. Divide the array into two halves: [38, 27, 43, 3] and [9, 82, 10].
  2. Continue dividing the arrays until you have arrays that cannot be divided further, e.g., [38], [27], [43], [3], [9], [82], [10].
  3. Start merging these arrays together by comparing their elements and assembling them into sorted arrays: [27, 38], [3, 43], [9, 82], [10].
  4. Continue the merging process: [3, 27, 38, 43], [9, 10, 82].
  5. Merge the final arrays into one sorted array: [3, 9, 10, 27, 38, 43, 82].

Implementation in C

#include <stdio.h>
#include <stdlib.h>

void merge(int arr[], int l, int m, int r) {
    int i, j, k;
    int n1 = m - l + 1;
    int n2 = r - m;

    int L[n1], R[n2];

    for (i = 0; i < n1; i++)
        L[i] = arr[l + i];
    for (j = 0; j < n2; j++)
        R[j] = arr[m + 1 + j];

    i = 0;
    j = 0;
    k = l;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j]) {
            arr[k] = L[i];
            i++;
        } else {
            arr[k] = R[j];
            j++;
        }
        k++;
    }

    while (i < n1) {
        arr[k] = L[i];
        i++;
        k++;
    }

    while (j < n2) {
        arr[k] = R[j];
        j++;
        k++;
    }
}

void mergeSort(int arr[], int l, int r) {
    if (l < r) {
        int m = l + (r - l) / 2;
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);
        merge(arr, l, m, r);
    }
}

int main() {
    int arr[] = {38, 27, 43, 3, 9, 82, 10};
    int arr_size = sizeof(arr) / sizeof(arr[0]);

    mergeSort(arr, 0, arr_size - 1);

    printf("Sorted array is \n");
    for (int i = 0; i < arr_size; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Complexity Analysis

The time complexity of Merge Sort is O(n log n) in all cases (worst, average, and best) as the array is always divided into two halves and the merge process takes linear time. The space complexity is O(n) due to the temporary arrays used for merging.

Quick Sort

Explanation

Quick Sort is another divide and conquer algorithm, much like Merge Sort, but with a different approach to dividing the array. It selects an element as a pivot and partitions the given array around the chosen pivot. There are many versions of Quick Sort that pick the pivot in different ways:

  1. Always pick the first element as pivot.
  2. Always pick the last element as pivot.
  3. Pick a random element as pivot.
  4. Pick median as pivot.

The key process in Quick Sort is the partitioning process, which rearranges the array so that all elements smaller than the pivot come before the pivot, while all elements greater than the pivot come after it. After the partitioning, the pivot is in its final position. This process is then applied recursively to the sub-arrays formed by splitting the array around the pivot.

Example

Given the array [10, 80, 30, 90, 40, 50, 70], and choosing the last element as pivot:

  1. Initially, the pivot is 70. After partitioning, the array looks like [10, 30, 40, 50, 70, 90, 80].
  2. Apply Quick Sort recursively on the sub-array to the left of 70 and the sub-array to the right of 70.

Implementation in C

#include <stdio.h>

void swap(int* a, int* b) {
    int t = *a;
    *a = *b;
    *b = t;
}

int partition(int arr[], int low, int high) {
    int pivot = arr[high];
    int i = (low - 1);

    for (int j = low; j <= high - 1; j++) {
        if (arr[j] < pivot) {
            i++;
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i + 1], &arr[high]);
    return (i + 1);
}

void quickSort(int arr[], int low, int high) {
    if (low < high) {
        int pi = partition(arr, low, high);

        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

int main() {
    int arr[] = {10, 80, 30, 90, 40, 50, 70};
    int n = sizeof(arr) / sizeof(arr[0]);
    quickSort(arr, 0, n - 1);
    printf("Sorted array: \n");
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Complexity Analysis

The average and best-case time complexity of Quick Sort is O(n log n), while its worst-case time complexity is O(n^2), which occurs when the smallest or largest element is always chosen as the pivot. This worst case can largely be avoided with techniques such as randomized pivot selection. The space complexity of Quick Sort is O(log n) on average, due to the recursion stack.
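
To illustrate the randomization idea mentioned above, here is a minimal sketch (not from the original post) that reuses the swap and partition functions from the listing; the names randomizedPartition and randomizedQuickSort are illustrative:

#include <stdlib.h>   /* for rand(); optionally call srand() once in main */

/* move a randomly chosen element to the end, then reuse the Lomuto partition shown above */
int randomizedPartition(int arr[], int low, int high) {
    int r = low + rand() % (high - low + 1);
    swap(&arr[r], &arr[high]);
    return partition(arr, low, high);
}

void randomizedQuickSort(int arr[], int low, int high) {
    if (low < high) {
        int pi = randomizedPartition(arr, low, high);
        randomizedQuickSort(arr, low, pi - 1);
        randomizedQuickSort(arr, pi + 1, high);
    }
}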

Heap Sort

Explanation

Heap Sort is a comparison-based sorting algorithm that uses a binary heap data structure. The idea is to build a Max Heap from the given data and then repeatedly remove the maximum element from the heap and add it to the end of the sorted array. This process continues until all elements are removed from the heap and are added to the array, resulting in a sorted array.

Example

Given an array [4, 10, 3, 5, 1], Heap Sort would start by building a max heap from the array, resulting in [10, 5, 3, 4, 1]. The largest element (10) is then moved to the end of the array, and the heap is adjusted to form [5, 4, 3, 1] with 10 in the correct position in the array. This process repeats until the array is sorted.

Implementation in C

#include <stdio.h>

void swap(int *a, int *b) {
    int temp = *a;
    *a = *b;
    *b = temp;
}

void heapify(int arr[], int n, int i) {
    int largest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;

    if (left < n && arr[left] > arr[largest])
        largest = left;

    if (right < n && arr[right] > arr[largest])
        largest = right;

    if (largest != i) {
        swap(&arr[i], &arr[largest]);
        heapify(arr, n, largest);
    }
}

void heapSort(int arr[], int n) {
    for (int i = n / 2 - 1; i >= 0; i--)
        heapify(arr, n, i);

    for (int i = n - 1; i > 0; i--) {
        swap(&arr[0], &arr[i]);
        heapify(arr, i, 0);
    }
}

int main() {
    int arr[] = {4, 10, 3, 5, 1};
    int n = sizeof(arr) / sizeof(arr[0]);

    heapSort(arr, n);

    printf("Sorted array is \n");
    for (int i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Complexity Analysis

The time complexity of Heap Sort is O(n log n) in all cases: building the heap takes O(n) time, and each of the n extractions of the maximum takes O(log n) time to restore the heap. The space complexity is O(1), as the sorting is done in place.

Shell Sort

Explanation

Shell Sort is mainly a variation of Insertion Sort. In Insertion Sort, elements move only one position at a time; Shell Sort allows the exchange of items that are far apart, which significantly reduces the number of shifts required. The idea is to sort elements separated by a certain ‘gap’, and as the algorithm progresses, the gap is reduced. The process continues until the gap is 1, at which point the array is sorted with plain insertion sort; by this time the array is already almost sorted, which is the ideal scenario for insertion sort.

Example

Given an array [35, 33, 42, 10, 14, 19, 27, 44], choosing an initial gap of 4, the array is divided into sub-arrays: [35, 14], [33, 19], [42, 27], [10, 44] which are individually sorted. The gap is then reduced, and the process is repeated until the gap is 1, and the entire array is sorted using insertion sort.

Implementation in C

#include <stdio.h>

void shellSort(int arr[], int n) {
    for (int gap = n / 2; gap > 0; gap /= 2) {
        for (int i = gap; i < n; i += 1) {
            int temp = arr[i];
            int j;
            for (j = i; j >= gap && arr[j - gap] > temp; j -= gap)
                arr[j] = arr[j - gap];
            arr[j] = temp;
        }
    }
}

int main() {
    int arr[] = {35, 33, 42, 10, 14, 19, 27, 44};
    int n = sizeof(arr) / sizeof(arr[0]);
    shellSort(arr, n);
    printf("Sorted array: \n");
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Complexity Analysis

The time complexity of Shell Sort depends on the gap sequence it uses. With the simple n/2, n/4, ..., 1 sequence shown above, the worst case is O(n^2), while better-studied gap sequences bring the worst case down to around O(n (log n)^2). The space complexity is O(1), as it sorts in place without using any additional storage.
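
As an illustration of how the gap sequence can be swapped out, here is a sketch using Knuth's 1, 4, 13, 40, ... sequence; the function name shellSortKnuth is illustrative, and the inner loops mirror the listing above:

void shellSortKnuth(int arr[], int n) {
    int gap = 1;
    while (gap < n / 3)
        gap = 3 * gap + 1;                    /* 1, 4, 13, 40, ... */

    for (; gap > 0; gap /= 3) {               /* step back down the sequence */
        for (int i = gap; i < n; i++) {
            int temp = arr[i];
            int j;
            for (j = i; j >= gap && arr[j - gap] > temp; j -= gap)
                arr[j] = arr[j - gap];
            arr[j] = temp;
        }
    }
}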

Counting Sort

Explanation

Counting Sort is an integer sorting algorithm that operates by counting the number of objects that have each distinct key value. Instead of making comparisons, this algorithm uses key values as indexes in an auxiliary array, thus increasing the count for each key value. The algorithm then iterates over the counts, producing a sorted array. Counting Sort is efficient when the range of input data (k) is not significantly greater than the number of elements in the input array (n).

Example

Given an array [4, 2, 2, 8, 3, 3, 1], Counting Sort would first count the occurrences of each number, resulting in counts [1, 2, 2, 1, 1] for numbers [1, 2, 3, 4, 8]. It then uses these counts to place numbers in the correct position in the sorted array.

Implementation in C

#include <stdio.h>
#include <string.h>
#define RANGE 255

void countingSort(int arr[], int n) {
    int output[n];
    int count[RANGE + 1], i;
    memset(count, 0, sizeof(count));

    /* count the occurrences of each value */
    for (i = 0; i < n; ++i)
        ++count[arr[i]];

    /* prefix sums: count[v] now holds the position just past the last v */
    for (i = 1; i <= RANGE; ++i)
        count[i] += count[i - 1];

    /* place elements into their sorted positions (backwards keeps the sort stable) */
    for (i = n - 1; i >= 0; --i) {
        output[count[arr[i]] - 1] = arr[i];
        --count[arr[i]];
    }

    for (i = 0; i < n; ++i)
        arr[i] = output[i];
}

int main() {
    int arr[] = {4, 2, 2, 8, 3, 3, 1};
    int n = sizeof(arr) / sizeof(arr[0]);
    countingSort(arr, n);
    printf("Sorted array is: \n");
    for (int i = 0; i < n; ++i)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Complexity Analysis

The time complexity of Counting Sort is O(n+k) where n is the number of elements in the input array and k is the range of the input. The space complexity is O(k), due to the auxiliary count array used.

Radix Sort

Explanation

Radix Sort is a non-comparative integer sorting algorithm that sorts data with integer keys by grouping keys by the individual digits that share the same significant position and value. It requires a positional notation, but because strings of characters and specially formatted floating-point numbers can be encoded as integer keys, Radix Sort is not limited to plain integers. It processes the digits of the numbers starting from the least significant digit (LSD) to the most significant digit (MSD).

Example

Given an array [170, 45, 75, 90, 802, 24, 2, 66], Radix Sort would first sort the elements based on the least significant digit (units place), then based on the next significant digit (tens place), and so on until the most significant digit.

Implementation in C

#include <stdio.h>
#define MAX 10

int getMax(int arr[], int n) {
    int mx = arr[0];
    for (int i = 1; i < n; i++)
        if (arr[i] > mx)
            mx = arr[i];
    return mx;
}

void countSort(int arr[], int n, int exp) {
    int output[n];
    int i, count[MAX] = {0};

    for (i = 0; i < n; i++)
        count[(arr[i] / exp) % 10]++;

    for (i = 1; i < MAX; i++)
        count[i] += count[i - 1];

    for (i = n - 1; i >= 0; i--) {
        output[count[(arr[i] / exp) % 10] - 1] = arr[i];
        count[(arr[i] / exp) % 10]--;
    }

    for (i = 0; i < n; i++)
        arr[i] = output[i];
}

void radixSort(int arr[], int n) {
    int m = getMax(arr, n);

    for (int exp = 1; m / exp > 0; exp *= 10)
        countSort(arr, n, exp);
}

int main() {
    int arr[] = {170, 45, 75, 90, 802, 24, 2, 66};
    int n = sizeof(arr) / sizeof(arr[0]);
    radixSort(arr, n);
    printf("Sorted array is: \n");
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Complexity Analysis

The time complexity of Radix Sort is O((n + b) * log_b(k)), where n is the number of elements, b is the base used to represent the numbers (10 for the decimal system), and k is the maximum value in the array; log_b(k) is simply the number of digit passes. The space complexity is O(n + b), due to the output array and the per-digit count array.

Comparing Sorting Algorithms

Sorting algorithms have different characteristics, and choosing the right one depends on the context. Below is a comparative analysis of when each sorting algorithm is best used, including a table of time and space complexities:

Algorithm      Best Case   Average Case  Worst Case      Space Complexity  In-place  Stable
Merge Sort     O(n log n)  O(n log n)    O(n log n)      O(n)              No        Yes
Quick Sort     O(n log n)  O(n log n)    O(n^2)          O(log n)          Yes       No
Heap Sort      O(n log n)  O(n log n)    O(n log n)      O(1)              Yes       No
Shell Sort     O(n log n)  Varies        O(n (log n)^2)  O(1)              Yes       No
Counting Sort  O(n+k)      O(n+k)        O(n+k)          O(k)              No        Yes
Radix Sort     O(nk)       O(nk)         O(nk)           O(n+k)            No        Yes
  • Merge Sort is best used when memory space is not a constraint, and a stable sort is needed.
  • Quick Sort is preferred for arrays and when average performance matters more than the worst-case scenario.
  • Heap Sort is suitable for in-place sorting without additional memory use.
  • Shell Sort is used when a lighter-weight sorting is needed, and slightly less efficiency than the O(n log n) algorithms is acceptable.
  • Counting Sort excels when the range of input data (k) is not significantly greater than the number of elements in the input array.

Hope this blog post helped you understand sorting. All the best!
