Sorting in Data Structure: Categories & Types [With Examples]

Updated on 30 October, 2024


Sorting in data structures is the process of arranging data in a specific order, usually ascending or descending. Sorting makes data easier to navigate and improves the efficiency of tasks like searching and analysis. Think of a dictionary—without alphabetical order, finding a word would be difficult and time-consuming. For data professionals, sorting organizes raw data into a structured format, making it manageable and accessible.

Without sorting, handling large datasets becomes challenging, slowing down essential operations. Sorting brings clarity and structure, allowing faster access to key information.

In this post, we’ll explore the main types of sorting algorithms in data structures. First, let’s look at what a sorting algorithm is and its role in data organization.

  • What is sorting in data structure? A sorting algorithm is a set of steps that arranges elements in a specific order, starting with an unsorted array and producing an ordered sequence.
  • Example: Input array: [h, j, k, i, n, m, o, l]
    Sorted output: [h, i, j, k, l, m, n, o] (a minimal Python illustration follows below)
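
As a minimal illustration (using Python's built-in sorted(), which any of the algorithms covered in this post could stand in for), the example above looks like this in code:

# Python's built-in sorted() returns a new list in ascending order
unsorted = ['h', 'j', 'k', 'i', 'n', 'm', 'o', 'l']
print(sorted(unsorted))   # ['h', 'i', 'j', 'k', 'l', 'm', 'n', 'o']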

Continue reading as we cover the types of sorting algorithms, categories, and their role in data structure management.

Sorting makes life easier. Explore our data science courses to learn more about data science algorithms. Check out now!

 

Why is Sorting Important in Data Structures?

Sorting in data structure is fundamental because it directly impacts the performance of search and indexing operations. In a sorted dataset, algorithms like binary search can be applied, reducing search time complexity from O(n) to O(log n). Sorting is also essential for efficient data processing in areas like ranking, priority-based retrieval, and merging data.

Check out: Types of Binary Tree

Key Technical Benefits:

  • Enhanced Search Efficiency: Sorted data enables binary search, significantly reducing search time (see the sketch below).
  • Improved Data Access: Operations like finding the minimum, maximum, or median are faster with sorted data.
  • Basis for Complex Algorithms: Sorting is required for several advanced techniques, such as divide-and-conquer strategies and tree-based structures, where ordered data is necessary.

Sorting ensures that data is organized, making it accessible and efficient for further operations in data processing and analysis.
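
As a quick illustration of that search benefit, here is a minimal sketch of binary search (assuming a plain Python list of comparable elements); it only returns correct results because the data is already sorted:

def binary_search(sorted_arr, target):
    # Return the index of target in sorted_arr, or -1 if absent (requires sorted input)
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1   # target lies in the right half
        else:
            hi = mid - 1   # target lies in the left half
    return -1

# Example: O(log n) lookups once the data is sorted
data = sorted([42, 7, 19, 3, 25])   # [3, 7, 19, 25, 42]
print(binary_search(data, 25))       # 3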

Sorting Categories: Internal vs External Sorting

Sorting algorithms are classified into internal sorting and external sorting based on data size and memory constraints.

| Sorting Type | Definition | Optimal Use Case | Common Algorithms |
| --- | --- | --- | --- |
| Internal Sorting | Sorting done entirely in RAM. | Suitable for small to medium datasets. | Quick Sort, Merge Sort, Heap Sort |
| External Sorting | Sorting that requires external storage, usually disk-based. | Used for large datasets that exceed memory. | External Merge Sort, Multi-way Merge Sort |

Internal Sorting

Internal sorting in data structure is used when the entire dataset can fit within main memory (RAM), allowing for faster processing without requiring external storage access.

  • Characteristics:
    • Memory-Based Processing: Works entirely in RAM, which is faster than accessing external storage.
    • Best for Small to Medium Datasets: Suitable when data size does not exceed available memory.
  • Common Algorithms:
    • Quick Sort: An in-place algorithm with average-case complexity of O(n log n). Effective but has a worst-case time complexity of O(n²) if the pivot selection is poor.
    • Merge Sort: A stable algorithm with O(n log n) complexity, requiring additional memory for merging.
    • Heap Sort: Uses a binary heap structure to achieve O(n log n) complexity, often used in priority-based sorting tasks.

Typical Use: Internal sorting is used in applications where data fits in memory, such as sorting lists or arrays in moderate-sized datasets.

External Sorting

External sorting in data structure is required when the dataset is too large to fit into main memory and must use external storage for sorting operations.

  • Characteristics:
    • Disk-Based Processing: Requires data to be read from and written to external storage, impacting performance.
    • Best for Large Datasets: Necessary when data exceeds memory capacity, such as in large-scale data processing.
  • Common Algorithms:
    • External Merge Sort: Splits data into manageable chunks, sorts each chunk in memory, and merges them. Efficient for large datasets, with a time complexity of O(n log n).
    • Multi-way Merge Sort: Extends merge sort to sort and merge multiple chunks simultaneously, reducing the number of I/O operations.

Typical Use: Used in systems managing large data files, such as database systems, where sorting needs exceed available RAM.
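
To make the two-phase idea concrete, here is a minimal sketch of External Merge Sort (assuming the input is a text file of integers, one per line, and using Python's tempfile and heapq.merge; the file paths and chunk size are illustrative):

import heapq
import tempfile

def external_sort(input_path, output_path, chunk_size=100_000):
    # Phase 1: read the file in chunks, sort each chunk in RAM, spill sorted runs to temp files
    runs = []
    with open(input_path) as src:
        while True:
            chunk = [int(line) for _, line in zip(range(chunk_size), src)]
            if not chunk:
                break
            chunk.sort()
            run = tempfile.TemporaryFile(mode="w+")
            run.writelines(f"{n}\n" for n in chunk)
            run.seek(0)
            runs.append(run)
    # Phase 2: k-way merge of the sorted runs using a heap
    with open(output_path, "w") as dst:
        streams = ((int(line) for line in run) for run in runs)
        for value in heapq.merge(*streams):
            dst.write(f"{value}\n")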

Categories of Sorting Algorithms: Comparison vs Non-Comparison Sorting

Sorting algorithms are categorized into comparison-based and non-comparison-based types, each with specific technical characteristics suited for different data structures and application requirements.

| Category | Definition | Examples | Optimal Use Case |
| --- | --- | --- | --- |
| Comparison-based Sorting | Uses direct element comparisons; O(n log n) lower bound. | Bubble Sort, Quick Sort, Merge Sort, Heap Sort | Flexible, general-purpose sorting on datasets where comparisons are feasible. |
| Non-Comparison-based Sorting | Bypasses comparisons, using other properties of the data; achieves O(n) in constrained cases. | Counting Sort, Radix Sort, Bucket Sort | Ideal for large datasets with known structure, achieving linear time in specific scenarios. |

Comparison-based Sorting Algorithms

Comparison-based sorting algorithms determine order by comparing elements directly; any such algorithm needs at least O(n log n) comparisons in the general case, which sets its lower-bound time complexity.

  • Examples:
    • Bubble Sort: Uses adjacent comparisons and swaps, O(n²) time complexity, inefficient on large datasets but stable.
    • Quick Sort: Recursive, in-place, average O(n log n) with O(n²) worst case; relies on pivot selection for optimal performance.
    • Merge Sort: Stable, O(n log n) complexity; needs extra memory when merging arrays, though it adapts well to linked lists, where merging requires no additional space.
    • Heap Sort: Constructs a max-heap, sorts in O(n log n), suitable for priority sorting without additional memory.

Use Cases:

  • General-purpose sorting where elements can be directly compared.
  • Effective on data types with defined comparison operators.
  • Ideal for in-memory datasets with moderate size.

Non-Comparison-based Sorting Algorithms

Non-comparison sorting algorithms bypass element comparisons, using techniques like digit processing or counting, achieving linear time complexity under specific conditions.

  • Examples:
    • Counting Sort: Uses frequency counts; O(n + k) complexity, where k is value range, best for small integer ranges.
    • Radix Sort: Processes digits from least to most significant; O(d * (n + b)) complexity, where d is digit count and b is base, ideal for fixed-length integers or strings.
    • Bucket Sort: Divides data into "buckets"; O(n) complexity in the best case, efficient for uniformly distributed data.

Use Cases:

  • Optimal for data with known characteristics like limited ranges or digit-based values.
  • Achieves linear time for specific cases (e.g., integer or string sorting).
  • Commonly used for large datasets with restricted value ranges.

Top 10 Sorting Algorithms in Data Structures Explained

For each algorithm, you’ll find a clear explanation, practical examples, and Python code to see them in action.

1. Bubble Sort Algorithm: Step-by-Step Process 

How it Works:
Bubble Sort is a simple, comparison-based algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. Each pass through the list places the next-largest element in its correct position, “bubbling” the largest unsorted elements to the end of the list.

  • Time Complexity: O(n²) in the average and worst cases, due to nested loops comparing and swapping each element.
  • Space Complexity: O(1), as Bubble Sort operates in place without requiring additional memory.
  • Characteristics: Stable, meaning it preserves the order of duplicate elements, but inefficient for large datasets due to quadratic time complexity.

Example Use Case:
Bubble Sort is often used in educational contexts to demonstrate sorting fundamentals. In practice, it may be used in scenarios with small, nearly sorted datasets where simplicity is prioritized over efficiency.

Step-by-Step Example: Given an input array [6, 3, 7, 1, 2, 4]:

  1. First Pass:
    • Compare 6 and 3; swap → [3, 6, 7, 1, 2, 4]
    • Compare 6 and 7; no swap → [3, 6, 7, 1, 2, 4]
    • Compare 7 and 1; swap → [3, 6, 1, 7, 2, 4]
    • Continue until the largest element “bubbles” to the end.

Python Code:

def bubbleSort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

# Example
arr = [64, 34, 25, 12, 22, 11, 90]
print("Sorted array is:", bubbleSort(arr))

2. Quick Sort: Fastest Sorting for Large Datasets

How it Works:
Quick Sort is a divide-and-conquer algorithm that selects a “pivot” element and partitions the array around the pivot. Elements smaller than the pivot are placed on its left, and elements greater than the pivot on its right. This partitioning continues recursively until the array is fully sorted.

  • Time Complexity: O(n log n) on average. Worst case is O(n²), occurring when the smallest or largest element is always chosen as the pivot.
  • Space Complexity: O(log n) due to recursive stack calls in an ideal scenario.
  • Characteristics: Quick Sort is not stable but is highly efficient for large datasets due to its low average complexity and in-place sorting.

Example Use Case:
Quick Sort is ideal for large datasets and is commonly used in applications like search engines and database sorting where fast sorting is required.

Step-by-Step Example: For an array [6, 3, 7, 1, 2, 4], using 4 as the pivot:

  1. Partitioning: Elements less than 4 ([3, 1, 2]) and elements greater than 4 ([6, 7]).
  2. Recursively apply Quick Sort on each partition until the entire array is sorted.

Python Code:

def quickSort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quickSort(left) + middle + quickSort(right)

# Note: this readable version builds new lists; the classic Quick Sort partitions the array in place.

# Example
arr = [3, 6, 8, 10, 1, 2, 1]
print("Sorted array is:", quickSort(arr))

3. Merge Sort: Efficient for External Sorting

How it Works:
Merge Sort is a stable, divide-and-conquer algorithm that recursively divides the array into two halves, sorts each half, and merges the sorted halves back together. It’s highly efficient for sorting linked lists or external sorting in database systems, as it maintains stable ordering.

  • Time Complexity: O(n log n) in all cases due to consistent halving and merging operations.
  • Space Complexity: O(n), as it requires additional memory for merging sorted halves.
  • Characteristics: Stable sorting, which maintains the relative order of duplicate elements, making it suitable for datasets where order preservation is important.

Example Use Case:
Merge Sort is preferred for large data stored on external devices, such as in database systems where merging sorted chunks of data is required.

Step-by-Step Example: For an array [6, 3, 7, 1, 2, 4]:

  1. Split into [6, 3, 7] and [1, 2, 4].
  2. Sort each half recursively, then merge the sorted halves into [1, 2, 3, 4, 6, 7].

Python Code:

def mergeSort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        L = arr[:mid]
        R = arr[mid:]
        mergeSort(L)
        mergeSort(R)
        i = j = k = 0
        while i < len(L) and j < len(R):
            if L[i] < R[j]:
                arr[k] = L[i]
                i += 1
            else:
                arr[k] = R[j]
                j += 1
            k += 1
        while i < len(L):
            arr[k] = L[i]
            i += 1
            k += 1
        while j < len(R):
            arr[k] = R[j]
            j += 1
            k += 1

# Example
arr = [12, 11, 13, 5, 6, 7]
mergeSort(arr)
print("Sorted array is:", arr)

4. Heap Sort: Memory-Efficient Sorting

How it Works:
Heap Sort is an efficient, comparison-based algorithm that builds a max heap from the input data. It repeatedly removes the root (maximum) element from the heap, places it at the end of the array, and “heapifies” the remaining elements. This process continues until the array is fully sorted.

  • Steps:
    1. Build a Max Heap from the input data.
    2. Extract the Maximum Element (root) and place it at the end of the array.
    3. Heapify the Remaining Elements to maintain the max heap structure.
    4. Repeat until all elements are sorted.
  • Time Complexity: O(n log n) for all cases, as heapifying each element takes log n time and is repeated for each element.
  • Space Complexity: O(1), as Heap Sort is an in-place sorting algorithm and doesn’t require additional memory.
  • Example: Used in applications like priority queue management, where items need to be retrieved based on priority order.

Python Code:

def heapify(arr, n, i):
    largest = i
    left = 2 * i + 1
    right = 2 * i + 2
    if left < n and arr[i] < arr[left]:
        largest = left
    if right < n and arr[largest] < arr[right]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heapSort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]
        heapify(arr, i, 0)

# Example
arr = [4, 10, 3, 5, 1]
heapSort(arr)
print("Sorted array is:", arr)

5. Counting Sort: Best for Limited Range Data

How it Works:
Counting Sort works by counting the occurrences of each distinct element in the input array and storing these counts in an auxiliary array. The sorted array is then constructed by using the cumulative counts to place each element in its correct position.

  • Steps:
    1. Count Occurrences: Count each element and store it in the count array.
    2. Calculate Cumulative Counts: Modify the count array to reflect the positions of each element.
    3. Build the Sorted Array: Use the cumulative count array to place elements in the sorted order.
  • Time Complexity: O(n + k), where n is the number of elements and k is the range of the input.
  • Space Complexity: O(k), as additional memory is needed for the count array.
  • Example: Suitable for applications with fixed, limited-range data, such as sorting grades, where the data values fall within a specific range.

Python Code:

def countingSort(arr):
    max_val = max(arr)
    count = [0] * (max_val + 1)
    output = [0] * len(arr)
    for num in arr:
        count[num] += 1
    for i in range(1, len(count)):
        count[i] += count[i - 1]
    for num in reversed(arr):
        output[count[num] - 1] = num
        count[num] -= 1
    return output

# Example
arr = [4, 2, 2, 8, 3, 3, 1]
sorted_arr = countingSort(arr)
print("Sorted array is:", sorted_arr)

6. Radix Sort: Sorting Numbers by Their Digits

How it Works:
Radix Sort sorts integers by processing each digit individually, starting from the least significant digit to the most significant digit. It uses Counting Sort as a subroutine to sort numbers based on each digit, one place at a time.

  • Steps:
    1. Sort by Least Significant Digit: Sort the numbers based on their least significant digit (units).
    2. Proceed to the Next Digit: Move to the next digit and repeat the sorting process.
    3. Repeat for All Digits: Continue until all digit places are processed, resulting in a fully sorted array.
  • Time Complexity: O(nk), where n is the number of elements and k is the maximum number of digits.
  • Space Complexity: O(n + k), due to the additional memory used in Counting Sort for each digit.
  • Example: Ideal for sorting large numbers or strings with a fixed digit length, often used in applications needing digit-based sorting.

Python Code:

def countingSortForRadix(arr, exp):
    n = len(arr)
    output = [0] * n
    count = [0] * 10
    for i in range(n):
        index = arr[i] // exp
        count[index % 10] += 1
    for i in range(1, 10):
        count[i] += count[i - 1]
    for i in range(n - 1, -1, -1):
        index = arr[i] // exp
        output[count[index % 10] - 1] = arr[i]
        count[index % 10] -= 1
    for i in range(n):
        arr[i] = output[i]

def radixSort(arr):
    max_val = max(arr)
    exp = 1
    while max_val // exp > 0:
        countingSortForRadix(arr, exp)
        exp *= 10

# Example
arr = [170, 45, 75, 90, 802, 24, 2, 66]
radixSort(arr)
print("Sorted array is:", arr)

7. Selection Sort: Simple but Inefficient

How it Works:
Selection Sort divides the array into a sorted and an unsorted region. It repeatedly searches for the smallest element in the unsorted portion and swaps it with the leftmost unsorted element, gradually building a sorted list on the left side of the array.

  • Steps:
    1. Find the Minimum Element in the unsorted part of the array.
    2. Swap the minimum element with the leftmost unsorted element.
    3. Move the Boundary between sorted and unsorted sections by one position to the right.
    4. Repeat until the entire array is sorted.
  • Time Complexity: O(n²) in all cases, as it performs O(n) searches for each element.
  • Space Complexity: O(1), as it is an in-place algorithm requiring no additional memory.
  • Example: Selection Sort is best suited for small datasets where memory efficiency is critical but time efficiency is not a primary concern.

Python Code:

def selectionSort(arr):
    for i in range(len(arr)):
        min_idx = i
        for j in range(i + 1, len(arr)):
            if arr[min_idx] > arr[j]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]

# Example
arr = [6, 25, 10, 28, 11]
selectionSort(arr)
print("Sorted array is:", arr)

8. Insertion Sort: Best for Nearly Sorted Data

How it Works:
Insertion Sort is an adaptive, comparison-based algorithm that builds the sorted array one element at a time by inserting each new element into its correct position within the sorted portion. This is similar to the way people organize playing cards by picking one card at a time and placing it in the correct order with the already-sorted cards.

  • Steps:
    1. Start from the second element and treat the first element as sorted.
    2. Compare the current element with the elements in the sorted part of the array, moving from right to left.
    3. Shift Elements to the right to make space for the current element if it’s smaller.
    4. Insert the current element in its correct position within the sorted portion.
    5. Repeat until all elements are placed in order.
  • Time Complexity: O(n²) in the average and worst cases, but O(n) in the best case for nearly sorted data.
  • Space Complexity: O(1), as it is an in-place algorithm.
  • Example: Insertion Sort is ideal for nearly sorted data or small datasets where fast insertion is needed. Its adaptive nature makes it efficient for datasets that are already close to being sorted.

Python Code:

def insertionSort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and key < arr[j]:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key

# Example
arr = [12, 11, 13, 5, 6]
insertionSort(arr)
print("Sorted array is:", arr)

9. Bucket Sort: Grouping Data into Buckets

How it Works:
Bucket Sort works by dividing the dataset into a set number of “buckets” based on a specific range. Each bucket holds elements that fall within a particular range. After distributing elements into buckets, each bucket is sorted individually, often using another sorting algorithm like Insertion Sort. Finally, the sorted buckets are combined to form the complete sorted array.

  • Steps:
    1. Create Buckets: Initialize an empty array of buckets, each representing a specific range.
    2. Distribute Elements: Place each element into the appropriate bucket based on its value.
    3. Sort Individual Buckets: Sort each bucket using an appropriate sorting algorithm (usually Insertion Sort).
    4. Concatenate Buckets: Combine the sorted elements from each bucket into the final sorted array.
  • Time Complexity: O(n + k), where n is the number of elements and k is the number of buckets. The complexity depends on how evenly distributed the data is and the sorting method within each bucket.
  • Space Complexity: O(n), as it requires additional memory for the buckets.
  • Example: Bucket Sort is commonly used for sorting uniformly distributed data, such as floating-point numbers between 0 and 1 or data with values that fit within a known range.

Python Code:

def bucketSort(arr):
    bucket_count = 10
    max_value = max(arr)
    size = max_value / bucket_count
    buckets = [[] for _ in range(bucket_count)]
    # Distribute elements into buckets based on their value range
    for num in arr:
        bucket_index = int(num / size)
        if bucket_index != bucket_count:
            buckets[bucket_index].append(num)
        else:
            buckets[bucket_count - 1].append(num)  # the maximum value goes into the last bucket
    # Sort each bucket individually, then concatenate the results
    for bucket in buckets:
        bucket.sort()
    sorted_arr = []
    for bucket in buckets:
        sorted_arr.extend(bucket)
    return sorted_arr

# Example
arr = [0.78, 0.17, 0.39, 0.26, 0.72, 0.94, 0.21, 0.12, 0.23, 0.68]
sorted_arr = bucketSort(arr)
print("Sorted array is:", sorted_arr)

10. Shell Sort: Improved Insertion Sort

How it Works:
Shell Sort is an enhanced version of Insertion Sort that improves sorting efficiency by first sorting elements far apart, gradually reducing the gap until it becomes 1 (like Insertion Sort). By using a decreasing gap sequence, Shell Sort allows elements to be moved closer to their correct position faster, reducing the total number of shifts needed.

  • Steps:
    1. Choose a Gap Sequence: Determine an initial gap, typically half the array length, and reduce it progressively.
    2. Sort with the Current Gap: For each gap, compare and sort elements separated by this distance.
    3. Reduce the Gap: Continue reducing the gap until it reaches 1, performing Insertion Sort in the final iteration.
  • Time Complexity: O(n log n) in the best case, though performance varies with the gap sequence. Common sequences include Shell's original sequence and Hibbard's sequence.
  • Space Complexity: O(1), as Shell Sort is an in-place algorithm.
  • Example: Shell Sort is effective for medium-sized datasets and is particularly useful when dealing with datasets that are already partially sorted.

Python Code:

def shellSort(arr):
    n = len(arr)
    gap = n // 2
    while gap > 0:
        for i in range(gap, n):
            temp = arr[i]
            j = i
            while j >= gap and arr[j - gap] > temp:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = temp
        gap //= 2

# Example
arr = [12, 34, 54, 2, 3]
shellSort(arr)
print("Sorted array is:", arr)

Curious about Data Frames in Python? Check out our in-depth tutorial and level up your Python skills today!

 

Time and Space Complexity of Sorting Algorithms

Every sorting algorithm trades off speed and memory differently, and those trade-offs matter most when datasets grow large.

What is Time Complexity?

Time Complexity is a measure of the amount of time an algorithm takes to complete as a function of the input size (n). It reflects the number of operations an algorithm performs and is generally expressed in Big O notation. Time complexity helps in understanding the efficiency of an algorithm across different scenarios:

  • Best Case: The minimum time the algorithm requires (often with optimally arranged input).
  • Average Case: The expected time for a typical input, showing general performance.
  • Worst Case: The maximum time the algorithm may take, which provides an upper bound for performance (the sketch below contrasts the best and worst cases).
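
For a concrete feel for these cases, here is a small illustrative sketch (not from the original article; it uses Insertion Sort because its behavior differs sharply across cases) that counts element shifts on already-sorted versus reverse-sorted input:

def insertion_sort_shift_count(arr):
    # Sort a copy of arr with Insertion Sort and return how many element shifts occurred
    a = list(arr)
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and key < a[j]:
            a[j + 1] = a[j]
            shifts += 1
            j -= 1
        a[j + 1] = key
    return shifts

n = 1000
print(insertion_sort_shift_count(list(range(n))))         # already sorted (best case): 0 shifts
print(insertion_sort_shift_count(list(range(n, 0, -1))))  # reverse sorted (worst case): n*(n-1)/2 = 499500 shifts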

What is Space Complexity?

Space Complexity is a measure of the amount of memory an algorithm requires during execution. This includes memory needed for input, auxiliary variables, and any additional data structures. Like time complexity, space complexity is usually expressed in Big O notation. It helps in evaluating how memory-efficient an algorithm is, especially when working with memory constraints.

This table provides a comparison of time complexity (best, worst, and average cases) and space complexity for each commonly used sorting algorithm. Examples are included to highlight scenarios where each algorithm performs best based on data size and memory considerations.

| Sorting Algorithm | Best Case | Average Case | Worst Case | Space Complexity | Example Use Cases |
| --- | --- | --- | --- | --- | --- |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Small datasets or nearly sorted data where simplicity is prioritized over speed. |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | Small datasets where memory is limited but time efficiency is not critical. |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Nearly sorted or small datasets where adaptive performance is needed. |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Large datasets, especially for external sorting (e.g., databases) requiring stable sorting. |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | Large, unsorted datasets where speed is crucial, such as in search engines. |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | Priority queues or large datasets where in-place sorting is required. |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(k) | Fixed-range integer data, such as grading or categorizing data by ranges. |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n + k) | Sorting large numbers or data with a fixed digit length. |
| Bucket Sort | O(n) | O(n) | O(n²) | O(n) | Uniformly distributed data, like decimal numbers between 0 and 1. |
| Shell Sort | O(n log n) | Varies (depends on gap sequence) | Varies (depends on gap sequence) | O(1) | Medium-sized datasets that are already partially sorted. |

Sorting Algorithm Characteristics: Stability, In-Place, and Adaptiveness

Stability, in-place sorting, and adaptiveness can help choose the right sorting algorithm based on data requirements.

  • Stability: A stable sorting algorithm preserves the relative order of elements with equal values. Stability is particularly important when sorting data with multiple keys, as it ensures that previous orderings are retained (a short illustration follows this list).
    • Stable Algorithms: Bubble Sort, Insertion Sort, Merge Sort, Counting Sort, Radix Sort.
    • Non-Stable Algorithms: Quick Sort, Heap Sort, Selection Sort.
  • In-Place Sorting: An in-place algorithm sorts the array without requiring additional memory proportional to the input size. In-place sorting is memory-efficient and preferred when working within strict memory constraints.
    • In-Place Algorithms: Quick Sort, Heap Sort, Bubble Sort, Selection Sort, Insertion Sort.
    • Not In-Place Algorithms: Merge Sort, Counting Sort, Radix Sort, Bucket Sort (require extra memory).
  • Adaptive Sorting: An adaptive algorithm improves its performance when the data is already partially sorted. This characteristic reduces the time complexity in nearly sorted arrays, making these algorithms efficient in such cases.
    • Adaptive Algorithms: Bubble Sort, Insertion Sort, and (depending on gap sequence) Shell Sort.
    • Non-Adaptive Algorithms: Merge Sort, Quick Sort, Heap Sort, Selection Sort.
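
As a quick illustration of stability (using Python's built-in sort, which is stable, and a hypothetical list of student records), sorting by grade keeps students with equal grades in their original order:

# Records are (name, grade); two students share grade 'B'
students = [("Asha", "B"), ("Ravi", "A"), ("Meena", "B"), ("Kiran", "A")]

# Python's built-in sort (Timsort) is stable: Asha still comes before Meena after sorting by grade
students.sort(key=lambda s: s[1])
print(students)
# [('Ravi', 'A'), ('Kiran', 'A'), ('Asha', 'B'), ('Meena', 'B')]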

Practical Applications of Sorting Algorithms in Real Life

Sorting algorithms are everywhere in real life, helping organize data for faster access and better management. Here are a few examples:

  • Quick Sort and Merge Sort help search engines quickly organize and retrieve vast amounts of data. Quick Sort is fast for partitioning data, while Merge Sort is great for handling large volumes of information reliably.
  • Sorting is essential for organizing transactions by time, amount, or account number. It helps banks and financial services keep records in order, track trades, and create time-based reports.
  • Databases use sorting to speed up data access by organizing records for faster searches. Sorting data in databases is essential for smooth, quick queries and easy access to stored information.
  • Sorting helps inventory systems and online stores organize products by price, popularity, or category. This makes it easier for customers to find what they need quickly.
  • Sorting keeps data packets organized in network systems and files in operating systems, making data access faster and more efficient.

How to Choose the Right Sorting in Data Structure Algorithm

Here’s a quick guide:

  • Dataset Size:
    • Small Datasets: Insertion Sort or Selection Sort work well for small datasets because they’re simple and don’t require much memory.
    • Large Datasets: Quick Sort and Merge Sort are better for large datasets. Quick Sort is fast and efficient for in-place sorting, while Merge Sort is a good choice for large datasets that need stable sorting.
  • Time Constraints:
    • Real-Time Needs: Quick Sort is a great choice when results are needed quickly, as it’s one of the fastest sorting methods.
    • Consistency: Merge Sort is more predictable, making it ideal when you need reliable sorting times.
  • Memory Availability:
    • Limited Memory: Heap Sort or in-place Quick Sort are best when memory is tight, as they don’t need much extra space.
    • More Memory Available: Merge Sort and Counting Sort are ideal when extra memory is available, as they use this to make sorting more stable and efficient.

Examples:

  • Quick Sort is ideal for search engines or apps that need fast sorting.
  • Merge Sort works well in e-commerce or databases where stable sorting is important.

Sorting Algorithm Cheat Sheet for Data Science

Sorting algorithms help organize and process data efficiently. upGrad’s data science courses can teach you to apply them in real-world projects. 

Check them out to build hands-on skills for your career.

Quick Sort

  • Best For: Large datasets needing quick sorting.
  • Key Feature: Uses a pivot for fast, efficient sorting.
  • upGrad Advantage: Learn Quick Sort in real data projects with upGrad’s data science courses.

Merge Sort

  • Best For: Large datasets where stability matters.
  • Key Feature: Uses a divide-and-conquer method for consistent results.
  • upGrad Advantage: Master Merge Sort with hands-on projects on upGrad.

Heap Sort

  • Best For: Situations with limited memory.
  • Key Feature: Memory-efficient sorting using binary heaps.
  • upGrad Advantage: Get guidance on Heap Sort in memory-constrained settings through upGrad’s expert-led sessions.

Counting Sort

  • Best For: Fixed-range data like grades or IDs.
  • Key Feature: Non-comparison sorting that handles limited values efficiently.
  • upGrad Advantage: Explore non-comparative sorting with upGrad’s data science modules.

Build your skills with upGrad’s data science courses. Learn essential sorting algorithms and gain real-world experience. Start today!

Ready to become a data expert? Check out our Online Data Science Courses!

Learn from top industry professionals and see your career progress in this high-demand field. Start your journey today!

 

Enhance your expertise with our Data Science Courses. Explore the programs below to find your perfect fit.

Elevate your sought-after data science skills with our leading programs. Explore the perfect course for your goals below.

Discover trending articles on data science to deepen your knowledge. Browse the resources below to find your ideal match.

Frequently Asked Questions (FAQs)

1. What is the most efficient sorting algorithm for large datasets?

Quick Sort and Merge Sort are top choices for large datasets. Quick Sort is generally faster and works directly on the data, saving memory, but Merge Sort is reliable since it maintains a consistent runtime even in the worst case.

2. When should I use Merge Sort instead of Quick Sort?

Use Merge Sort if you need stability—keeping equal elements in their original order—or if you’re working with really large datasets that might need external storage. Merge Sort also works well when you want a predictable, steady runtime.

3. What are in-place sorting algorithms?

In-place sorting algorithms don’t need extra memory to sort the data; they rearrange the data right in the original array. Examples include Quick Sort, Selection Sort, and Heap Sort. These algorithms are useful when memory is tight.

4. Which sorting algorithm is best for memory-limited environments?

Heap Sort and in-place Quick Sort are both good options when you’re low on memory, as they don’t need extra storage like Merge Sort or Counting Sort.

5. What does it mean for a sorting algorithm to be stable?

A stable sorting algorithm keeps equal elements in the same order they were in initially. This is helpful when sorting by multiple criteria, ensuring earlier sorted orders are preserved.

6. Can sorting algorithms be used in real-time systems?

Yes, sorting algorithms like Quick Sort and Radix Sort can work in real-time systems because they’re fast. But if you need more predictable performance, Merge Sort or Heap Sort are good choices as they give consistent sorting times.

7. Why is Bubble Sort rarely used in real-world applications?

Bubble Sort is pretty slow for large datasets, with a time complexity of O(n²). It’s mainly used in classrooms to explain sorting concepts because much faster sorting methods are available for practical use.

8. How does sorting affect search algorithms like binary search?

Sorting is a must for search algorithms like binary search. Binary search works by repeatedly dividing the search range in half, which only makes sense if the data is in order.

9. What is the difference between adaptive and non-adaptive sorting algorithms?

Adaptive sorting algorithms adjust based on how sorted the data is to begin with. For example, Insertion Sort is faster when the data is almost sorted. Non-adaptive algorithms, like Quick Sort, don’t adapt and treat all data similarly, whether it’s sorted or not.

10. How can I practice implementing sorting algorithms?

You can practice sorting algorithms on coding platforms like LeetCode, HackerRank, or CodeSignal. To get a good grasp on each one, try coding each algorithm from scratch, testing with different datasets, and analyzing how they perform.

11. What are the benefits of non-comparison sorting algorithms like Counting Sort?

Non-comparison algorithms like Counting Sort can sort in linear time (O(n)) by counting the occurrences of each value, making them very efficient for data with a limited range. They’re great for organizing data quickly without the usual comparisons in other sorting methods.