Sorting in Data Structure: Categories & Types [With Examples]
By Rohit Sharma
Updated on Mar 13, 2025 | 22 min read | 142.9k views
Sorting in data structures is the process of arranging data in a specific order, usually ascending or descending. Sorting makes data easier to navigate and improves the efficiency of tasks like searching and analysis. Think of a dictionary—without alphabetical order, finding a word would be difficult and time-consuming. For data professionals, sorting organizes raw data into a structured format, making it manageable and accessible.
Without sorting, handling large datasets becomes challenging, slowing down essential operations. Sorting brings clarity and structure, allowing faster access to key information.
In this post, we’ll explore the main categories and types of sorting algorithms in data structures, starting with what a sorting algorithm is and its role in data organization and management.
Sorting makes life easier. Explore our data science courses to learn more about data science algorithms. Check out now!
Sorting in data structure is fundamental because it directly impacts the performance of search and indexing operations. In a sorted dataset, algorithms like binary search can be applied, reducing search time complexity from O(n) to O(log n). Sorting in data structure is also essential for efficient data processing in areas like ranking, priority-based retrieval, and merging data.
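For instance, once a dataset is sorted, a binary search can replace a linear scan. Here is a minimal sketch using Python’s built-in bisect module; the list of order IDs is purely illustrative:

import bisect

# Hypothetical example data: a sorted list of order IDs
order_ids = [1001, 1004, 1010, 1027, 1042, 1099]

target = 1027
pos = bisect.bisect_left(order_ids, target)  # binary search, O(log n)
found = pos < len(order_ids) and order_ids[pos] == target
print("Found:", found, "at index", pos)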
Check out: Types of Binary Tree
Key Technical Benefits:
- Faster search: sorted data enables binary search, cutting lookup time from O(n) to O(log n).
- Efficient processing: ranking, merging, and priority-based retrieval all rely on ordered data.
- Clear organization: sorted data stays accessible and efficient for further operations in data processing and analysis.
Sorting algorithms are classified into internal sorting and external sorting in data structure based on data size and memory constraints.
| Sorting Type | Definition | Optimal Use Case | Common Algorithms |
| --- | --- | --- | --- |
| Internal Sorting | Sorting done entirely in RAM. | Suitable for small to medium datasets. | Quick Sort, Merge Sort, Heap Sort |
| External Sorting | Sorting that requires external storage, usually disk-based. | Used for large datasets that exceed memory. | External Merge Sort, Multi-way Merge Sort |
Internal sorting in data structure is used when the entire dataset can fit within main memory (RAM), allowing for faster processing without requiring external storage access.
Typical Use: Internal sorting is used in applications where data fits in memory, such as sorting lists or arrays in moderate-sized datasets.
External sorting in data structure is required when the dataset is too large to fit into main memory and must use external storage for sorting operations.
Typical Use: Used in systems managing large data files, such as database systems, where sorting needs exceed available RAM.
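To illustrate the idea behind external sorting, the sketch below sorts memory-sized chunks of a list, spills each sorted chunk to a temporary file, and then performs a k-way merge with Python’s heapq.merge. The function name, chunk size, and tiny input are assumptions for demonstration, not a production implementation:

import heapq
import tempfile

def external_sort(numbers, chunk_size=3):
    # Sort memory-sized chunks and spill each one to a temporary file
    chunk_files = []
    for start in range(0, len(numbers), chunk_size):
        chunk = sorted(numbers[start:start + chunk_size])  # internal sort of one chunk
        f = tempfile.TemporaryFile(mode="w+t")
        f.writelines(f"{x}\n" for x in chunk)
        f.seek(0)
        chunk_files.append(f)
    # Lazily merge the sorted chunk files (k-way merge)
    streams = [(int(line) for line in f) for f in chunk_files]
    return list(heapq.merge(*streams))

# Example: a small list stands in for data too large to fit in RAM
print(external_sort([42, 7, 99, 3, 15, 8, 23, 1]))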
Sorting algorithms are categorized into comparison-based and non-comparison-based types, each with specific technical characteristics suited for different data structures and application requirements.
| Category | Definition | Examples | Optimal Use Case |
| --- | --- | --- | --- |
| Comparison-based Sorting | Uses direct element comparisons, O(n log n) limit. | Bubble Sort, Quick Sort, Merge Sort, Heap Sort | Flexible, general-purpose sorting on datasets where comparisons are feasible. |
| Non-Comparison-based Sorting | Bypasses comparisons, uses other properties for sorting, achieving O(n) in constrained cases. | Counting Sort, Radix Sort, Bucket Sort | Ideal for large datasets with known structures, achieving linear time in specific scenarios. |
Comparison-based sorting algorithms determine order by comparing elements directly, and no such algorithm can beat the O(n log n) lower bound on comparisons in the general case.
Use Cases: General-purpose sorting of any data with a defined ordering, such as records sorted by name, date, or numeric value.
Non-comparison sorting algorithms bypass element comparisons, using techniques like digit processing or counting to achieve linear time complexity under specific conditions.
Use Cases: Large datasets with a known, constrained structure, such as integers in a fixed range or keys with a fixed number of digits.
For each algorithm, you’ll find a clear explanation, practical examples, and Python code to see them in action.
Bubble Sort
How it Works:
Bubble Sort is a simple, comparison-based algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. Each pass through the list places the next-largest element in its correct position, “bubbling” the largest unsorted elements to the end of the list.
Example Use Case:
Bubble Sort is often used in educational contexts to demonstrate sorting fundamentals. In practice, it may be used in scenarios with small, nearly sorted datasets where simplicity is prioritized over efficiency.
Step-by-Step Example: Given an input array [6, 3, 7, 1, 2, 4]:
- Pass 1: adjacent swaps bubble the largest value, 7, to the end: [3, 6, 1, 2, 4, 7]
- Pass 2: 6 moves into position: [3, 1, 2, 4, 6, 7]
- Pass 3: 3 settles into place: [1, 2, 3, 4, 6, 7]
- The remaining passes make no swaps, and the array is fully sorted.
Python Code:
def bubbleSort(arr):
    n = len(arr)
    for i in range(n):
        # After each pass, the last i elements are already in place
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                # Swap adjacent elements that are out of order
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

# Example
arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array is:", bubbleSort(arr))
Quick Sort
How it Works:
Quick Sort is a divide-and-conquer algorithm that selects a “pivot” element and partitions the array around the pivot. Elements smaller than the pivot are placed on its left, and elements greater than the pivot on its right. This partitioning continues recursively until the array is fully sorted.
Example Use Case:
Quick Sort is ideal for large datasets and is commonly used in applications like search engines and database sorting where fast sorting is required.
Step-by-Step Example: For an array [6, 3, 7, 1, 2, 4], using 4 as the pivot:
- Partition around 4: the smaller elements [3, 1, 2] go to its left and the larger elements [6, 7] to its right.
- Recursively sort each side: [3, 1, 2] becomes [1, 2, 3], and [6, 7] is already sorted.
- Combine the parts: [1, 2, 3] + [4] + [6, 7] = [1, 2, 3, 4, 6, 7].
Python Code:
def quickSort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]      # elements smaller than the pivot
    middle = [x for x in arr if x == pivot]   # elements equal to the pivot
    right = [x for x in arr if x > pivot]     # elements greater than the pivot
    return quickSort(left) + middle + quickSort(right)

# Example
arr = [3, 6, 8, 10, 1, 2, 1]
print("Sorted array is:", quickSort(arr))
Merge Sort
How it Works:
Merge Sort is a stable, divide-and-conquer algorithm that recursively divides the array into two halves, sorts each half, and merges the sorted halves back together. It’s highly efficient for sorting linked lists or external sorting in database systems, as it maintains stable ordering.
Example Use Case:
Merge Sort is preferred for large data stored on external devices, such as in database systems where merging sorted chunks of data is required.
Step-by-Step Example: For an array [6, 3, 7, 1, 2, 4]:
- Divide the array into two halves: [6, 3, 7] and [1, 2, 4].
- Recursively split and sort each half: [6, 3, 7] becomes [3, 6, 7], and [1, 2, 4] is already sorted.
- Merge the sorted halves into [1, 2, 3, 4, 6, 7].
Python Code:
def mergeSort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        L = arr[:mid]
        R = arr[mid:]
        # Recursively sort both halves
        mergeSort(L)
        mergeSort(R)
        # Merge the sorted halves back into arr
        i = j = k = 0
        while i < len(L) and j < len(R):
            if L[i] < R[j]:
                arr[k] = L[i]
                i += 1
            else:
                arr[k] = R[j]
                j += 1
            k += 1
        # Copy any remaining elements
        while i < len(L):
            arr[k] = L[i]
            i += 1
            k += 1
        while j < len(R):
            arr[k] = R[j]
            j += 1
            k += 1

# Example
arr = [12, 11, 13, 5, 6, 7]
mergeSort(arr)
print("Sorted array is:", arr)
Heap Sort
How it Works:
Heap Sort is an efficient, comparison-based algorithm that builds a max heap from the input data. It repeatedly removes the root (maximum) element from the heap, places it at the end of the array, and “heapifies” the remaining elements. This process continues until the array is fully sorted.
Python Code:
def heapify(arr, n, i):
    # Restore the max-heap property for the subtree rooted at index i
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2
    if left < n and arr[i] < arr[left]:
        largest = left
    if right < n and arr[largest] < arr[right]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heapSort(arr):
    n = len(arr)
    # Build a max heap
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the root (maximum) to the end and re-heapify
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)

# Example
arr = [4, 10, 3, 5, 1]
heapSort(arr)
print("Sorted array is:", arr)
Counting Sort
How it Works:
Counting Sort works by counting the occurrences of each distinct element in the input array and storing these counts in an auxiliary array. The sorted array is then constructed by using the cumulative counts to place each element in its correct position.
Python Code:
def countingSort(arr):
    max_val = max(arr)
    count = [0] * (max_val + 1)
    output = [0] * len(arr)
    # Count occurrences of each value
    for num in arr:
        count[num] += 1
    # Convert counts to cumulative counts (final positions)
    for i in range(1, len(count)):
        count[i] += count[i - 1]
    # Place elements in sorted order; traversing in reverse keeps the sort stable
    for num in reversed(arr):
        output[count[num] - 1] = num
        count[num] -= 1
    return output

# Example
arr = [4, 2, 2, 8, 3, 3, 1]
sorted_arr = countingSort(arr)
print("Sorted array is:", sorted_arr)
Radix Sort
How it Works:
Radix Sort sorts integers by processing each digit individually, starting from the least significant digit to the most significant digit. It uses Counting Sort as a subroutine to sort numbers based on each digit, one place at a time.
Python Code:
def countingSortForRadix(arr, exp):
    n = len(arr)
    output = [0] * n
    count = [0] * 10
    # Count digits at the current place value (exp)
    for i in range(n):
        index = arr[i] // exp
        count[index % 10] += 1
    # Cumulative counts give final positions
    for i in range(1, 10):
        count[i] += count[i - 1]
    # Build the output array; traversing in reverse keeps the sort stable
    for i in range(n - 1, -1, -1):
        index = arr[i] // exp
        output[count[index % 10] - 1] = arr[i]
        count[index % 10] -= 1
    for i in range(n):
        arr[i] = output[i]

def radixSort(arr):
    max_val = max(arr)
    exp = 1
    # Sort by each digit, from least significant to most significant
    while max_val // exp > 0:
        countingSortForRadix(arr, exp)
        exp *= 10

# Example
arr = [170, 45, 75, 90, 802, 24, 2, 66]
radixSort(arr)
print("Sorted array is:", arr)
Selection Sort
How it Works:
Selection Sort divides the array into a sorted and an unsorted region. It repeatedly searches for the smallest element in the unsorted portion and swaps it with the leftmost unsorted element, gradually building a sorted list on the left side of the array.
Python Code:
def selectionSort(arr):
    for i in range(len(arr)):
        # Find the smallest element in the unsorted portion
        min_idx = i
        for j in range(i + 1, len(arr)):
            if arr[min_idx] > arr[j]:
                min_idx = j
        # Swap it with the leftmost unsorted element
        arr[i], arr[min_idx] = arr[min_idx], arr[i]

# Example
arr = [6, 25, 10, 28, 11]
selectionSort(arr)
print("Sorted array is:", arr)
Insertion Sort
How it Works:
Insertion Sort is an adaptive, comparison-based algorithm that builds the sorted array one element at a time by inserting each new element into its correct position within the sorted portion. This is similar to the way people organize playing cards by picking one card at a time and placing it in the correct order with the already-sorted cards.
Python Code:
def insertionSort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements of the sorted portion one position right
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        # Insert the key into its correct position
        arr[j + 1] = key

# Example
arr = [12, 11, 13, 5, 6]
insertionSort(arr)
print("Sorted array is:", arr)
Bucket Sort
How it Works:
Bucket Sort works by dividing the dataset into a set number of “buckets” based on a specific range. Each bucket holds elements that fall within a particular range. After distributing elements into buckets, each bucket is sorted individually, often using another sorting algorithm like Insertion Sort. Finally, the sorted buckets are combined to form the complete sorted array.
Python Code:
def bucketSort(arr):
    bucket_count = 10
    max_value = max(arr)
    size = max_value / bucket_count
    buckets = [[] for _ in range(bucket_count)]
    # Distribute elements into buckets by range
    for num in arr:
        bucket_index = int(num / size)
        if bucket_index != bucket_count:
            buckets[bucket_index].append(num)
        else:
            # The maximum value goes into the last bucket
            buckets[bucket_count - 1].append(num)
    # Sort each bucket individually
    for bucket in buckets:
        bucket.sort()
    # Concatenate the sorted buckets
    sorted_arr = []
    for bucket in buckets:
        sorted_arr.extend(bucket)
    return sorted_arr

# Example
arr = [0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12, 0.23, 0.68]
sorted_arr = bucketSort(arr)
print("Sorted array is:", sorted_arr)
Shell Sort
How it Works:
Shell Sort is an enhanced version of Insertion Sort that improves sorting efficiency by first sorting elements far apart, gradually reducing the gap until it becomes 1 (like Insertion Sort). By using a decreasing gap sequence, Shell Sort allows elements to be moved closer to their correct position faster, reducing the total number of shifts needed.
Python Code:
def shellSort(arr):
    n = len(arr)
    gap = n // 2
    while gap > 0:
        # Perform a gapped insertion sort for the current gap
        for i in range(gap, n):
            temp = arr[i]
            j = i
            while j >= gap and arr[j - gap] > temp:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = temp
        gap //= 2

# Example
arr = [12, 34, 54, 2, 3]
shellSort(arr)
print("Sorted array is:", arr)
Curious about Data Frames in Python? Check out our in-depth tutorial and level up your Python skills today!
Sorting in data structures comes with challenges that affect speed, efficiency, and memory use, especially with large datasets.
Time Complexity is a measure of the amount of time an algorithm takes to complete as a function of the input size (n). It reflects the number of operations an algorithm performs and is generally expressed in Big O notation. Time complexity helps in understanding the efficiency of an algorithm across different scenarios:
Space Complexity is a measure of the amount of memory an algorithm requires during execution. This includes memory needed for input, auxiliary variables, and any additional data structures. Like time complexity, space complexity is usually expressed in Big O notation. It helps in evaluating how memory-efficient an algorithm is, especially when working with memory constraints.
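As a rough illustration of time complexity in practice, the sketch below counts the comparisons Bubble Sort makes as the input size doubles; the helper function and random data are illustrative assumptions. Because Bubble Sort is O(n²), doubling n roughly quadruples the count:

import random

def bubble_comparisons(arr):
    # Count the comparisons Bubble Sort performs on a copy of arr
    arr = arr[:]
    count = 0
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            count += 1
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return count

for n in (100, 200, 400):
    data = [random.randint(0, 1000) for _ in range(n)]
    print(n, "elements:", bubble_comparisons(data), "comparisons")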
This table provides a comparison of time complexity (best, worst, and average cases) and space complexity for each commonly used sorting algorithm. Examples are included to highlight scenarios where each algorithm performs best based on data size and memory considerations.
| Sorting Algorithm | Best Case | Average Case | Worst Case | Space Complexity | Example Use Cases |
| --- | --- | --- | --- | --- | --- |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Small datasets or nearly sorted data where simplicity is prioritized over speed. |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | Small datasets where memory is limited but time efficiency is not critical. |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Nearly sorted or small datasets where adaptive performance is needed. |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Large datasets, especially for external sorting (e.g., databases) requiring stable sorting. |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | Large, unsorted datasets where speed is crucial, such as in search engines. |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | Priority queues or large datasets where in-place sorting is required. |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(k) | Fixed-range integer data, such as grading or categorizing data by ranges. |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n + k) | Sorting large numbers or data with a fixed digit length. |
| Bucket Sort | O(n) | O(n) | O(n²) | O(n) | Uniformly distributed data, like decimal numbers between 0 and 1. |
| Shell Sort | O(n log n) | Varies (depends on gap sequence) | Varies (depends on gap sequence) | O(1) | Medium-sized datasets that are already partially sorted. |
Properties such as stability, in-place sorting, and adaptiveness also help in choosing the right sorting algorithm for your data requirements.
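For example, stability matters when records are re-sorted by a second key without losing their existing order. Here is a brief sketch using Python’s built-in sorted(), which is stable; the student records are made up for illustration:

# Hypothetical records: (name, grade); two students share each grade
students = [("Asha", "B"), ("Ravi", "A"), ("Meera", "B"), ("Kiran", "A")]

# A stable sort keeps Asha before Meera and Ravi before Kiran,
# because they compare equal on the sort key (grade).
by_grade = sorted(students, key=lambda s: s[1])
print(by_grade)
# [('Ravi', 'A'), ('Kiran', 'A'), ('Asha', 'B'), ('Meera', 'B')]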
Sorting algorithms are everywhere in real life and help organize data for faster access and better management: ranking search results, ordering database records, maintaining priority queues, and arranging contacts alphabetically are all everyday examples. For most of these needs, a handful of algorithms such as Quick Sort, Merge Sort, Heap Sort, and Counting Sort cover the common cases.
Sorting algorithms help organize and process data efficiently. upGrad’s data science courses can teach you to apply them in real-world projects.
Check them out to build hands-on skills for your career.