Time and Space Complexity in Data Structures: A Detailed Guide

Updated on 23/08/2024

Introduction

Are you struggling to optimize your algorithms? The world of time and space complexity in data structures opens the door to optimized algorithms, essential for providing effective solutions to complex computational problems. In this comprehensive tutorial, we will dive deep into the subject to understand how it quantifies execution time and memory usage and how it shapes algorithmic performance.

What Is Time and Space Complexity in Data Structures?

Time complexity measures how an algorithm's execution time grows relative to the size of its input, while space complexity measures how much memory the algorithm requires during execution. Analyzing these complexities shows how an algorithm's runtime and memory usage change as the input size increases, and helps developers choose the most efficient algorithm for a specific task.
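To make the space side of the definition concrete, here is a small illustrative sketch (the function names are ours, not from the tutorial) contrasting constant and linear extra space:

```python
def constant_space_sum(arr):
    # O(1) extra space: a single accumulator, regardless of input size
    total = 0
    for num in arr:
        total += num
    return total

def linear_space_squares(arr):
    # O(n) extra space: builds a new list as large as the input
    return [num * num for num in arr]

arr = [1, 2, 3, 4, 5]
print(constant_space_sum(arr))    # Output: 15
print(linear_space_squares(arr))  # Output: [1, 4, 9, 16, 25]
```

Both functions run in O(n) time, but their memory footprints differ: the first keeps only one extra variable, while the second allocates memory proportional to the input.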

Types of Time Complexity

Constant time complexity (O(1))

Algorithms with constant time complexity have a runtime that stays the same no matter how large the input is. The amount of time required to execute the algorithm does not grow as the size of the input increases.

Let’s look at an example to elaborate on the concept:

def constant_time_example(arr):
    return arr[0]

# Example usage
arr = [1, 2, 3, 4, 5]
result = constant_time_example(arr)
print(result)  # Output: 1

In this instance, the function ‘constant_time_example’ returns the first element of the input array ‘arr’. Accessing the first element takes the same amount of time regardless of the array's size, which is exactly what constant time complexity describes.

Linear time complexity (O(n))

In linear time complexity, an algorithm's runtime grows linearly with the size of the input, so the execution time increases proportionally as the input size increases.

Consider the following illustration:

def linear_time_example(arr):
    total = 0
    for num in arr:
        total += num
    return total

# Example usage
arr = [1, 2, 3, 4, 5]
result = linear_time_example(arr)
print(result)  # Output: 15

As we can see, the function ‘linear_time_example’ computes the total sum of the elements in the input array arr. As the size of the array grows, the number of loop iterations grows proportionally, giving the function linear time complexity.

Logarithmic time complexity (O(log n))

Algorithms with logarithmic time complexity have a runtime that grows logarithmically with the input size: doubling the input adds only a constant amount of work, so the running time increases slowly as the input expands.

The following example explains this further:

def binary_search(arr, target):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# Example usage
arr = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
target = 5
result = binary_search(arr, target)
print(result)  # Output: 4 (index of target element)

In this example, the function ‘binary_search’ searches the sorted input array arr for the index of the target element. Each iteration of the while loop halves the search space, resulting in logarithmic time complexity.

Quadratic Time Complexity (O(n^2))

Algorithms with quadratic time complexity have a runtime that grows with the square of the input size, so doubling the input roughly quadruples the execution time. Bubble sort is a classic example:

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]

# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
bubble_sort(arr)
print(arr)  # Output: [11, 12, 22, 25, 34, 64, 90]

The above example shows that the ‘bubble_sort’ function applies the bubble sort algorithm to arrange the input array arr in ascending order. Utilizing nested loops to iterate through the array, the algorithm checks adjacent elements and exchanges them if necessary, leading to a quadratic time complexity.

Time and Space Complexity of Sorting Algorithms

Sorting algorithms are one of the core concepts of computer science. To select the most efficient algorithm for a given task, it’s imperative to analyze the time and space complexities of sorting algorithms.

Let’s look at the foundational elements of sorting algorithms:

Bubble Sort

Bubble sort repeatedly compares adjacent elements and swaps them if they are in the wrong order, passing through the array until it is fully sorted. It runs in O(n^2) time in the worst and average cases and uses only O(1) extra space.

The following Python illustration elucidates the concept further:

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                arr[j], arr[j+1] = arr[j+1], arr[j]

# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
bubble_sort(arr)
print(arr)  # Output: [11, 12, 22, 25, 34, 64, 90]

As this example shows, bubble sort iterates through the list, comparing adjacent elements and swapping them when they are out of order, and repeats this process until the entire array is sorted.

Merge Sort

Merge sort partitions the array into two halves, sorts each half independently, and then merges them into a single sorted array. It runs in O(n log n) time in every case and uses O(n) auxiliary space for the temporary halves.

Let’s look at a simple Python example:

def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]

        merge_sort(left_half)
        merge_sort(right_half)

        i = j = k = 0
        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1

        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1

        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1

# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
merge_sort(arr)
print(arr)  # Output: [11, 12, 22, 25, 34, 64, 90]

The above example shows that merge sort involves a divide-and-conquer approach where the input array is recursively divided into halves until each sub-array has only one element. It then combines these sub-arrays in a sorted manner to generate the ultimate sorted array.

Quick Sort

Quick sort follows a three-part process: it selects a pivot element, partitions the array around that pivot, and recursively sorts the partitions. On average it runs in O(n log n) time, degrading to O(n^2) in the worst case; the simple version below also uses O(n) extra space for the partition lists.

Let’s see how it’s done.

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
sorted_arr = quick_sort(arr)
print(sorted_arr)  # Output: [11, 12, 22, 25, 34, 64, 90]

Quick sort is a divide-and-conquer algorithm: a pivot element is selected from the array, the remaining elements are partitioned into those smaller and those larger than the pivot, the two partitions are sorted recursively, and the results are concatenated with the pivot to produce the sorted array.

Difference Between Time and Space Complexity

  • Definition: Time complexity measures the computing time needed to execute an algorithm, while space complexity measures the memory space needed during execution. 
    • Example: A sorting algorithm may have O(n log n) time and O(n) space complexity. Despite its rapid execution time, the method consumes more memory as the input size increases.
  • Focus: Time complexity refers to an algorithm's computational efficiency or how fast it processes input data. Space complexity focuses on the algorithm's memory usage and storage needs.
    • Example: A low-time complexity algorithm may execute quickly but consume a lot of memory, affecting system performance. An algorithm designed for space complexity may use the least amount of memory but take longer to execute.
  • Impact: Time complexity affects how fast and efficiently an algorithm performs in real-time applications like interactive software or time-sensitive tasks. Space complexity affects algorithm scalability and resource utilization, especially in low-memory contexts.
    • Example: Prioritize space-efficient algorithms for embedded systems and mobile apps with limited memory; this prevents memory exhaustion and improves system stability.
  • Assessment: Big O notation is used to analyze an algorithm's worst-case, best-case, and average-case runtime performance. Space complexity analysis uses the same notation to study how memory use grows with input size.
    • Example: A software engineer may find that an algorithm's runtime grows quadratically with input size while its memory usage grows only linearly or logarithmically.
  • Optimization: The techniques for optimizing time complexity and space complexity differ, and software development offers several methods for each.
    • Example: Use quick sort or merge sort for good time complexity; use hash tables or binary search trees to organize data for efficient access.
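To make the time-space trade-off concrete, here is a small sketch (our own illustrative example, not from the tutorial) that spends extra memory on a cache to turn an exponential-time Fibonacci computation into a linear-time one:

```python
from functools import lru_cache

def fib_slow(n):
    # O(2^n) time, O(n) stack space: recomputes the same subproblems repeatedly
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    # O(n) time, O(n) cache space: each subproblem is computed exactly once
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(30))  # Output: 832040, computed far faster than fib_slow(30)
```

Both functions return the same values; the memoized version trades O(n) extra memory for a dramatic reduction in running time.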

Real-World Use Cases of Time and Space Complexity with Examples

  • Database indexing: Time complexity plays a vital role in database indexing structures like B-trees, which retrieve key records efficiently in logarithmic time.
  • Network routing: Time complexity analysis underpins routing algorithms such as Dijkstra’s algorithm, which finds the shortest path between network nodes quickly and accurately.
  • Image processing: Time complexity is pivotal in image processing algorithms, such as convolution and edge detection, enabling fast manipulation of pixel data.
  • Genomic alignment: Efficient, low-complexity algorithms make it possible to identify sequence similarities quickly in genomic sequencing.
  • Machine learning: Time complexity analysis determines the computational cost of training ML models and of running inference with them.
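The growth rates behind these use cases can be observed directly by counting operations. The sketch below (our own illustration, not from the tutorial) compares a linear scan with binary search over the same sorted data:

```python
def linear_search_steps(arr, target):
    # Counts comparisons: O(n) in the worst case
    steps = 0
    for i, x in enumerate(arr):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps

def binary_search_steps(arr, target):
    # Counts comparisons: O(log n) in the worst case
    low, high, steps = 0, len(arr) - 1, 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, steps

arr = list(range(1_000_000))
_, lin_steps = linear_search_steps(arr, 999_999)
_, bin_steps = binary_search_steps(arr, 999_999)
print(lin_steps, bin_steps)  # roughly 1,000,000 vs about 20 comparisons
```

This is the same effect that makes B-tree index lookups practical on tables with millions of rows.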

Conclusion

If you are a developer keen on designing efficient algorithms and systems, you need to hone your skills in time and space complexity in data structures. In this in-depth guide, we examined the various types of time and space complexity with examples, explored the difference between the two, and reviewed some real-life applications.

We will now explore some time and space complexity questions.

FAQs

1) What is the time and space complexity of a set?

Generally, the time complexity of a set is O(log n) for tree-based use cases and O(1) for hash-based implementations. The space complexity, on the other hand, is O(n) for tree-based implementations and varies according to load factor and hash-table overheads for hash-based deployments.
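Python's built-in set is hash-based, so this difference can be observed directly. As an illustrative sketch (our own, not from the tutorial), membership tests on a set are O(1) on average, while the same test on a list scans elements in O(n):

```python
import timeit

data = list(range(100_000))
as_list = data
as_set = set(data)  # O(n) space, built once

# Time 100 membership tests for an element near the end of the data
list_time = timeit.timeit(lambda: 99_999 in as_list, number=100)
set_time = timeit.timeit(lambda: 99_999 in as_set, number=100)
print(set_time < list_time)  # typically True: the hash lookup is far faster
```

Timings vary by machine, but the list lookup must scan up to 100,000 elements per test while the set lookup hashes the key once.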

2) What is time and space trade-off in data structure?

The time-space trade-off is the balance between the memory required to store data and the computational time required to perform an operation on it.

3) What is space and time trade-off with example?

Consider choosing between linked lists and arrays to implement a stack: arrays provide constant-time access but may reserve unused space, while linked lists use only as much space as they need but take linear time to access arbitrary elements.
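Another classic form of the trade-off is precomputing a lookup table. In this sketch (our own illustrative example, with hypothetical names), we spend O(n) memory once so that each later query costs O(1) time instead of a fresh computation:

```python
import math

# O(n) space, built once: a table of square roots for 0..999
LIMIT = 1_000
SQRT_TABLE = [math.sqrt(i) for i in range(LIMIT)]

def sqrt_lookup(i):
    # O(1) time per query: a single list index instead of a computation
    return SQRT_TABLE[i]

print(sqrt_lookup(81))  # Output: 9.0
```

Whether the table is worth its memory depends on how often the values are queried, which is exactly the kind of judgment the time-space trade-off describes.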

Rohan Vats

Passionate about building large scale web apps with delightful experiences. In pursuit of transforming engineers into leaders.
