Data Structures & Algorithms - Asymptotic Analysis
Asymptotic Analysis - Definition
Asymptotic analysis helps us evaluate an algorithm’s efficiency by analyzing its growth rate as the input size (n) increases. It provides insights into an algorithm’s performance without considering hardware constraints.
Common Asymptotic Notations
- Big O (O) → Upper bound (most often used to describe the worst case)
- Big Omega (Ω) → Lower bound (most often used to describe the best case)
- Big Theta (Θ) → Tight bound (the upper and lower bounds match)
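To make the Big O definition concrete, here is a small sketch in Python. It uses a made-up cost function f(n) = 3n + 5 (not an algorithm from this page) and checks that f(n) stays below c·n for the witness constants c = 4 and n₀ = 5, which is exactly what "f(n) is O(n)" claims.

```python
# Hypothetical cost function: f(n) = 3n + 5 "steps".
# Claim: f(n) is O(n), witnessed by c = 4 and n0 = 5,
# i.e. f(n) <= 4 * n for every n >= 5.

def f(n):
    return 3 * n + 5

c, n0 = 4, 5
# Spot-check the inequality over a large range of n.
assert all(f(n) <= c * n for n in range(n0, 10**6, 997))
print("f(n) = 3n + 5 is O(n) with c = 4, n0 = 5")
```

The constants c and n₀ are not unique; any pair that makes the inequality hold for all sufficiently large n works.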
1. O(1) - Constant Time Complexity
An algorithm runs in O(1) time if it takes the same amount of time, regardless of input size.
🔹 Example: Accessing an array element
#include <stdio.h>

int main() {
    int arr[] = {1, 2, 3, 4, 5};
    printf("%d\n", arr[2]); // Output: 3
    return 0;
}
Output
3
#include <iostream>
using namespace std;

int main() {
    int arr[] = {1, 2, 3, 4, 5};
    cout << arr[2] << endl; // Output: 3
    return 0;
}
Output
3
class ConstantTime {
    public static void main(String[] args) {
        int[] arr = {1, 2, 3, 4, 5};
        System.out.println(arr[2]); // Output: 3
    }
}
Output
3
arr = [1, 2, 3, 4, 5]
print(arr[2]) # Output: 3
Output
3
✅ Time Complexity: O(1) (Independent of input size)
2. O(log n) - Logarithmic Time Complexity
An algorithm runs in O(log n) time if it reduces the problem size by a constant factor (typically halving it) at each step.
🔹 Example: Binary Search
#include <stdio.h>

int binarySearch(int arr[], int size, int target) {
    int left = 0, right = size - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            left = mid + 1;
        else
            right = mid - 1;
    }
    return -1; // Element not found
}

int main() {
    int arr[] = {1, 3, 5, 7, 9, 11, 13};
    int size = sizeof(arr) / sizeof(arr[0]);
    int target;
    printf("Enter element to search: ");
    scanf("%d", &target);
    int result = binarySearch(arr, size, target);
    if (result != -1)
        printf("Element found at index %d\n", result);
    else
        printf("Element not found\n");
    return 0;
}
Output
Enter element to search: 7
Element found at index 3
#include <iostream>
using namespace std;

int binarySearch(int arr[], int size, int target) {
    int left = 0, right = size - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            left = mid + 1;
        else
            right = mid - 1;
    }
    return -1; // Element not found
}

int main() {
    int arr[] = {1, 3, 5, 7, 9, 11, 13};
    int size = sizeof(arr) / sizeof(arr[0]);
    int target;
    cout << "Enter element to search: ";
    cin >> target;
    int result = binarySearch(arr, size, target);
    if (result != -1)
        cout << "Element found at index " << result << endl;
    else
        cout << "Element not found" << endl;
    return 0;
}
Output
Enter element to search: 7
Element found at index 3
import java.util.Scanner;

class BinarySearch {
    public static int binarySearch(int arr[], int target) {
        int left = 0, right = arr.length - 1;
        while (left <= right) {
            int mid = left + (right - left) / 2;
            if (arr[mid] == target)
                return mid;
            else if (arr[mid] < target)
                left = mid + 1;
            else
                right = mid - 1;
        }
        return -1; // Element not found
    }

    public static void main(String[] args) {
        int arr[] = {1, 3, 5, 7, 9, 11, 13};
        Scanner scanner = new Scanner(System.in);
        System.out.print("Enter element to search: ");
        int target = scanner.nextInt();
        int result = binarySearch(arr, target);
        if (result != -1)
            System.out.println("Element found at index " + result);
        else
            System.out.println("Element not found");
        scanner.close();
    }
}
Output
Enter element to search: 7
Element found at index 3
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = left + (right - left) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1  # Element not found

arr = [1, 3, 5, 7, 9, 11, 13]
target = int(input("Enter element to search: "))
result = binary_search(arr, target)
if result != -1:
    print(f"Element found at index {result}")
else:
    print("Element not found")
Output
Enter element to search: 7
Element found at index 3
✅ Time Complexity: O(log n) (Halves input size in each step)
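The "halves the input in each step" claim can be checked empirically. The sketch below (an illustrative Python counter, separate from the examples above) counts binary search's loop iterations for two input sizes and compares them with log₂(n): growing n by a factor of 1000 only roughly doubles the step count.

```python
# Count loop iterations of binary search over a sorted range 0..n-1.
# The iteration count tracks log2(n), not n.
import math

def binary_search_steps(n, target):
    """Return the number of loop iterations needed to find target."""
    left, right, steps = 0, n - 1, 0
    while left <= right:
        steps += 1
        mid = left + (right - left) // 2
        if mid == target:
            return steps
        elif mid < target:
            left = mid + 1
        else:
            right = mid - 1
    return steps

for n in (1_000, 1_000_000):
    # Searching near the end of the range is close to the worst case.
    print(n, binary_search_steps(n, n - 1), math.ceil(math.log2(n)))
```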
3. O(n) - Linear Time Complexity
An algorithm runs in O(n) time if it processes all elements once.
🔹 Example: Finding Maximum Element in an Array
#include <stdio.h>

int findMax(int arr[], int size) {
    int max = arr[0]; // Initialize max with first element
    for (int i = 1; i < size; i++) {
        if (arr[i] > max)
            max = arr[i];
    }
    return max;
}

int main() {
    int arr[] = {12, 45, 67, 89, 23, 90, 34};
    int size = sizeof(arr) / sizeof(arr[0]);
    printf("Maximum element: %d\n", findMax(arr, size));
    return 0;
}
Output
Maximum element: 90
#include <iostream>
using namespace std;

int findMax(int arr[], int size) {
    int max = arr[0]; // Initialize max with first element
    for (int i = 1; i < size; i++) {
        if (arr[i] > max)
            max = arr[i];
    }
    return max;
}

int main() {
    int arr[] = {12, 45, 67, 89, 23, 90, 34};
    int size = sizeof(arr) / sizeof(arr[0]);
    cout << "Maximum element: " << findMax(arr, size) << endl;
    return 0;
}
Output
Maximum element: 90
class FindMax {
    public static int findMax(int arr[]) {
        int max = arr[0]; // Initialize max with first element
        for (int i = 1; i < arr.length; i++) {
            if (arr[i] > max)
                max = arr[i];
        }
        return max;
    }

    public static void main(String[] args) {
        int arr[] = {12, 45, 67, 89, 23, 90, 34};
        System.out.println("Maximum element: " + findMax(arr));
    }
}
Output
Maximum element: 90
def find_max(arr):
    max_element = arr[0]  # Initialize max with first element
    for num in arr:
        if num > max_element:
            max_element = num
    return max_element

arr = [12, 45, 67, 89, 23, 90, 34]
print("Maximum element:", find_max(arr))
Output
Maximum element: 90
4. O(n log n) - Log-Linear Time Complexity
Efficient sorting algorithms such as Merge Sort and Heap Sort run in O(n log n) time (Quick Sort does too on average, though its worst case is O(n²)).
- These algorithms divide the problem into smaller parts (log n divisions) and process each part in O(n) time.
- Common O(n log n) Algorithms:
- Merge Sort 🏆
- Quick Sort
- Heap Sort
- Binary Search Tree Operations
🔹 Example: Merge Sort (O(n log n))
Merge Sort follows Divide and Conquer:
- Divide: Split the array into two halves.
- Sort: Recursively sort both halves.
- Merge: Combine the sorted halves.
🔹 Example: Merge Sort
#include <stdio.h>

void merge(int arr[], int left, int mid, int right) {
    int n1 = mid - left + 1;
    int n2 = right - mid;
    int L[n1], R[n2];

    for (int i = 0; i < n1; i++)
        L[i] = arr[left + i];
    for (int j = 0; j < n2; j++)
        R[j] = arr[mid + 1 + j];

    int i = 0, j = 0, k = left;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j])
            arr[k++] = L[i++];
        else
            arr[k++] = R[j++];
    }
    while (i < n1)
        arr[k++] = L[i++];
    while (j < n2)
        arr[k++] = R[j++];
}

void mergeSort(int arr[], int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        mergeSort(arr, left, mid);
        mergeSort(arr, mid + 1, right);
        merge(arr, left, mid, right);
    }
}

void printArray(int arr[], int size) {
    for (int i = 0; i < size; i++)
        printf("%d ", arr[i]);
    printf("\n");
}

int main() {
    int arr[] = {12, 11, 13, 5, 6, 7};
    int size = sizeof(arr) / sizeof(arr[0]);
    printf("Original array: ");
    printArray(arr, size);
    mergeSort(arr, 0, size - 1);
    printf("Sorted array: ");
    printArray(arr, size);
    return 0;
}
Output
Original array: 12 11 13 5 6 7
Sorted array: 5 6 7 11 12 13
#include <iostream>
using namespace std;

void merge(int arr[], int left, int mid, int right) {
    int n1 = mid - left + 1;
    int n2 = right - mid;
    int L[n1], R[n2];

    for (int i = 0; i < n1; i++)
        L[i] = arr[left + i];
    for (int j = 0; j < n2; j++)
        R[j] = arr[mid + 1 + j];

    int i = 0, j = 0, k = left;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j])
            arr[k++] = L[i++];
        else
            arr[k++] = R[j++];
    }
    while (i < n1)
        arr[k++] = L[i++];
    while (j < n2)
        arr[k++] = R[j++];
}

void mergeSort(int arr[], int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        mergeSort(arr, left, mid);
        mergeSort(arr, mid + 1, right);
        merge(arr, left, mid, right);
    }
}

void printArray(int arr[], int size) {
    for (int i = 0; i < size; i++)
        cout << arr[i] << " ";
    cout << endl;
}

int main() {
    int arr[] = {12, 11, 13, 5, 6, 7};
    int size = sizeof(arr) / sizeof(arr[0]);
    cout << "Original array: ";
    printArray(arr, size);
    mergeSort(arr, 0, size - 1);
    cout << "Sorted array: ";
    printArray(arr, size);
    return 0;
}
Output
Original array: 12 11 13 5 6 7
Sorted array: 5 6 7 11 12 13
import java.util.Arrays;

class MergeSort {
    static void merge(int arr[], int left, int mid, int right) {
        int n1 = mid - left + 1;
        int n2 = right - mid;
        int L[] = new int[n1];
        int R[] = new int[n2];

        for (int i = 0; i < n1; i++)
            L[i] = arr[left + i];
        for (int j = 0; j < n2; j++)
            R[j] = arr[mid + 1 + j];

        int i = 0, j = 0, k = left;
        while (i < n1 && j < n2) {
            if (L[i] <= R[j])
                arr[k++] = L[i++];
            else
                arr[k++] = R[j++];
        }
        while (i < n1)
            arr[k++] = L[i++];
        while (j < n2)
            arr[k++] = R[j++];
    }

    static void mergeSort(int arr[], int left, int right) {
        if (left < right) {
            int mid = left + (right - left) / 2;
            mergeSort(arr, left, mid);
            mergeSort(arr, mid + 1, right);
            merge(arr, left, mid, right);
        }
    }

    public static void main(String args[]) {
        int arr[] = {12, 11, 13, 5, 6, 7};
        System.out.println("Original array: " + Arrays.toString(arr));
        mergeSort(arr, 0, arr.length - 1);
        System.out.println("Sorted array: " + Arrays.toString(arr));
    }
}
Output
Original array: 12 11 13 5 6 7
Sorted array: 5 6 7 11 12 13
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]

        merge_sort(left_half)
        merge_sort(right_half)

        i = j = k = 0
        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1
        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1
        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1

arr = [12, 11, 13, 5, 6, 7]
print("Original array:", arr)
merge_sort(arr)
print("Sorted array:", arr)
Output
Original array: 12 11 13 5 6 7
Sorted array: 5 6 7 11 12 13
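As a rough empirical check of the O(n log n) claim, the following Python sketch (separate from the examples above) counts the element comparisons merge sort makes and prints them next to n·log₂(n); the counter-passing style is just for illustration.

```python
# Count element comparisons in merge sort and compare against n*log2(n).
import math
import random

def merge_sort(arr, counter):
    """Return a sorted copy of arr, tallying comparisons in counter[0]."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid], counter)
    right = merge_sort(arr[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1  # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

random.seed(0)
for n in (1_000, 10_000):
    counter = [0]
    data = random.sample(range(n * 10), n)
    assert merge_sort(data, counter) == sorted(data)
    print(f"n={n}: {counter[0]} comparisons, n*log2(n) = {n * math.log2(n):.0f}")
```

The measured count stays below n·log₂(n), consistent with the analysis.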
5. O(n²) - Quadratic Time Complexity
🔹 What is O(n²) Complexity?
- O(n²) complexity means the execution time increases quadratically with the input size.
- It typically occurs in nested loops, where each loop runs n times.
- Brute-force solutions often run in O(n²) time.
- Common O(n²) Algorithms:
- Bubble Sort 🏆
- Insertion Sort
- Selection Sort
- Matrix Multiplication
🔹 Example: Bubble Sort (O(n²))
Bubble Sort works as follows:
- Compare adjacent elements and swap if they are in the wrong order.
- Repeat this process until the array is sorted.
🔹 Example: Bubble Sort
#include <stdio.h>

void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                // Swap arr[j] and arr[j+1]
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}

void printArray(int arr[], int n) {
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
}

int main() {
    int arr[] = {64, 34, 25, 12, 22, 11, 90};
    int n = sizeof(arr) / sizeof(arr[0]);
    printf("Original array: ");
    printArray(arr, n);
    bubbleSort(arr, n);
    printf("Sorted array: ");
    printArray(arr, n);
    return 0;
}
Output
Original array: 64 34 25 12 22 11 90
Sorted array: 11 12 22 25 34 64 90
#include <iostream>
using namespace std;

void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                swap(arr[j], arr[j + 1]);
            }
        }
    }
}

void printArray(int arr[], int n) {
    for (int i = 0; i < n; i++)
        cout << arr[i] << " ";
    cout << endl;
}

int main() {
    int arr[] = {64, 34, 25, 12, 22, 11, 90};
    int n = sizeof(arr) / sizeof(arr[0]);
    cout << "Original array: ";
    printArray(arr, n);
    bubbleSort(arr, n);
    cout << "Sorted array: ";
    printArray(arr, n);
    return 0;
}
Output
Original array: 64 34 25 12 22 11 90
Sorted array: 11 12 22 25 34 64 90
import java.util.Arrays;

class BubbleSort {
    static void bubbleSort(int arr[]) {
        int n = arr.length;
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - i - 1; j++) {
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                }
            }
        }
    }

    public static void main(String args[]) {
        int arr[] = {64, 34, 25, 12, 22, 11, 90};
        System.out.println("Original array: " + Arrays.toString(arr));
        bubbleSort(arr);
        System.out.println("Sorted array: " + Arrays.toString(arr));
    }
}
Output
Original array: 64 34 25 12 22 11 90
Sorted array: 11 12 22 25 34 64 90
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # Swap

arr = [64, 34, 25, 12, 22, 11, 90]
print("Original array:", arr)
bubble_sort(arr)
print("Sorted array:", arr)
Output
Original array: 64 34 25 12 22 11 90
Sorted array: 11 12 22 25 34 64 90
6. O(2ⁿ) - Exponential Time Complexity
🔹 What is O(2ⁿ) Complexity?
- O(2ⁿ) complexity means the execution time roughly doubles with each additional input element.
- It typically occurs in problems involving recursion with multiple calls at each step.
- Algorithms that explore all possibilities run in O(2ⁿ) time.
- Common O(2ⁿ) Algorithms:
- Recursive Fibonacci 🏆
- Subset Generation
- Tower of Hanoi
- Brute Force Algorithms (e.g., Traveling Salesman Problem)
🔹 Example: Recursive Fibonacci (O(2ⁿ))
The Fibonacci sequence is defined as:
F(n) = F(n-1) + F(n-2)
where:
- Base case: F(0) = 0, F(1) = 1
- Recursive case: each call spawns two subproblems, leading to O(2ⁿ) complexity.
🔹 Example: Fibonacci (Recursive)
#include <stdio.h>

int fibonacci(int n) {
    if (n <= 1)
        return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}

int main() {
    int n = 10;
    printf("Fibonacci number at position %d: %d\n", n, fibonacci(n));
    return 0;
}
Output
Fibonacci number at position 10: 55
#include <iostream>
using namespace std;

int fibonacci(int n) {
    if (n <= 1)
        return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}

int main() {
    int n = 10;
    cout << "Fibonacci number at position " << n << ": " << fibonacci(n) << endl;
    return 0;
}
Output
Fibonacci number at position 10: 55
class Fibonacci {
    static int fibonacci(int n) {
        if (n <= 1)
            return n;
        return fibonacci(n - 1) + fibonacci(n - 2);
    }

    public static void main(String[] args) {
        int n = 10;
        System.out.println("Fibonacci number at position " + n + ": " + fibonacci(n));
    }
}
Output
Fibonacci number at position 10: 55
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

n = 10
print(f"Fibonacci number at position {n}: {fibonacci(n)}")
Output
Fibonacci number at position 10: 55
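The exponential blowup can be seen directly by counting calls. This Python sketch (an instrumented variant of the example above) tallies how many times the recursive function is invoked; the call count C(n) satisfies C(n) = C(n-1) + C(n-2) + 1, so it grows like the Fibonacci numbers themselves.

```python
# Count invocations of the naive recursive Fibonacci.
# The call count roughly doubles as n grows, illustrating O(2^n).

def fib_calls(n, counter):
    counter[0] += 1
    if n <= 1:
        return n
    return fib_calls(n - 1, counter) + fib_calls(n - 2, counter)

for n in (10, 20, 25):
    counter = [0]
    value = fib_calls(n, counter)
    print(f"fib({n}) = {value}, computed with {counter[0]} calls")
# fib(10) = 55 takes 177 calls; fib(20) already takes 21,891.
```

Memoizing the function (caching each F(k) after it is first computed) collapses this to O(n) calls, which is why naive recursion is avoided in practice.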
7. O(n!) - Factorial Time Complexity
🔹 What is O(n!) Complexity?
- O(n!) complexity means the execution time grows factorially with the input size.
- It occurs in brute-force algorithms that generate all possible permutations or combinations.
- Brute-force permutations run in O(n!) time.
- Common O(n!) Algorithms:
- Traveling Salesman Problem (TSP) 🏆
- Generating All Permutations of a String/Array
- Backtracking Problems (e.g., N-Queens, Hamiltonian Path)
🔹 Example: Generating All Permutations (O(n!))
A permutation of a set is an arrangement of its elements in any order.
For n elements, there are n! possible permutations.
Example:
For {1, 2, 3}, the 3! = 6 possible permutations are:
{1, 2, 3}, {1, 3, 2}, {2, 1, 3}, {2, 3, 1}, {3, 1, 2}, {3, 2, 1}
This algorithm has O(n!) complexity because:
- For n elements, we pick 1 element, then recursively permute the (n-1) remaining.
- This process leads to n! (n × (n-1) × (n-2) ... × 1) calls.
🔹 Example: Generating Permutations
#include <stdio.h>

void swap(int *x, int *y) {
    int temp = *x;
    *x = *y;
    *y = temp;
}

void permute(int arr[], int l, int r) {
    if (l == r) {
        for (int i = 0; i <= r; i++)
            printf("%d ", arr[i]);
        printf("\n");
    } else {
        for (int i = l; i <= r; i++) {
            swap(&arr[l], &arr[i]);
            permute(arr, l + 1, r);
            swap(&arr[l], &arr[i]); // Backtrack
        }
    }
}

int main() {
    int arr[] = {1, 2, 3};
    int n = sizeof(arr) / sizeof(arr[0]);
    printf("All permutations:\n");
    permute(arr, 0, n - 1);
    return 0;
}
Output
All permutations:
1 2 3
1 3 2
2 1 3
2 3 1
3 1 2
3 2 1
#include <iostream>
#include <vector>
#include <algorithm>
using namespace std;

void permute(vector<int>& arr, int l, int r) {
    if (l == r) {
        for (int num : arr)
            cout << num << " ";
        cout << endl;
    } else {
        for (int i = l; i <= r; i++) {
            swap(arr[l], arr[i]);
            permute(arr, l + 1, r);
            swap(arr[l], arr[i]); // Backtrack
        }
    }
}

int main() {
    vector<int> arr = {1, 2, 3};
    cout << "All permutations:\n";
    permute(arr, 0, (int)arr.size() - 1);
    return 0;
}
Output
All permutations:
1 2 3
1 3 2
2 1 3
2 3 1
3 1 2
3 2 1
import java.util.Arrays;

class Permutations {
    static void permute(int[] arr, int l, int r) {
        if (l == r) {
            System.out.println(Arrays.toString(arr));
        } else {
            for (int i = l; i <= r; i++) {
                swap(arr, l, i);
                permute(arr, l + 1, r);
                swap(arr, l, i); // Backtrack
            }
        }
    }

    static void swap(int[] arr, int i, int j) {
        int temp = arr[i];
        arr[i] = arr[j];
        arr[j] = temp;
    }

    public static void main(String[] args) {
        int[] arr = {1, 2, 3};
        System.out.println("All permutations:");
        permute(arr, 0, arr.length - 1);
    }
}
Output
All permutations:
1 2 3
1 3 2
2 1 3
2 3 1
3 1 2
3 2 1
from itertools import permutations

arr = [1, 2, 3]
print("All permutations:")
for perm in permutations(arr):
    print(*perm)  # unpack the tuple so output matches the other examples
Output
All permutations:
1 2 3
1 3 2
2 1 3
2 3 1
3 1 2
3 2 1
Comparison of Complexities
| Complexity | Growth Rate |
|---|---|
| O(1) | Constant |
| O(log n) | Logarithmic |
| O(n) | Linear |
| O(n log n) | Log-Linear |
| O(n²) | Quadratic |
| O(2ⁿ) | Exponential |
| O(n!) | Factorial |
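The table above can be made concrete by evaluating each growth rate at a couple of input sizes (a Python sketch; constant factors are ignored, so the numbers are relative, not actual running times).

```python
# Evaluate each growth rate from the table at n = 10 and n = 20.
import math

rates = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n ** 2),
    ("O(2^n)",     lambda n: 2 ** n),
    ("O(n!)",      lambda n: math.factorial(n)),
]

for name, f in rates:
    print(f"{name:>10}: n=10 -> {f(10):>12,.0f}   n=20 -> {f(20):>22,.0f}")
```

Note how the exponential and factorial rows dwarf the others already at n = 20, which is the whole argument for preferring the upper rows of the table.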
Conclusion
- Asymptotic analysis helps compare algorithms without hardware dependency.
- Efficient algorithms (O(log n), O(n), O(n log n)) are preferred for large inputs.
- Exponential (O(2ⁿ)) and factorial (O(n!)) algorithms are impractical for all but very small inputs and should be avoided when possible.
