# Algorithm Examples & Complexity Analysis

Learn from real-world examples of common algorithms and their Big O complexities.
## Searching Algorithms

### Linear Search
Time: O(n) Space: O(1)
Linear search checks every element in the array sequentially until it finds the target or reaches the end.
```python
def linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1

# Example usage
numbers = [64, 34, 25, 12, 22, 11, 90]
print(linear_search(numbers, 22))  # 4
```
Analysis: In the worst case, we need to check all n elements, giving us O(n) time complexity. We only use a constant amount of extra space for the loop variable.
Use case: Best for small arrays or unsorted data where you can't use more efficient methods.
### Binary Search
Time: O(log n) Space: O(1)
Binary search works on sorted arrays by repeatedly dividing the search interval in half.
```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

# Example usage
numbers = [11, 12, 22, 25, 34, 64, 90]  # Must be sorted!
print(binary_search(numbers, 25))  # 3
```
Analysis: Each iteration eliminates half of the remaining elements. With n elements, we need at most log₂(n) iterations, giving us O(log n) time complexity.
Use case: Extremely efficient for searching in sorted arrays. Much faster than linear search for large datasets.
## Sorting Algorithms

### Bubble Sort
Time: O(n²) Space: O(1)
Bubble sort repeatedly steps through the list, compares adjacent elements, and swaps them if they're in the wrong order.
```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        # Flag to optimize: stop early if no swaps occur
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr

# Example usage
numbers = [64, 34, 25, 12, 22, 11, 90]
print(bubble_sort(numbers))  # [11, 12, 22, 25, 34, 64, 90]
```
Analysis: The nested loops give us O(n²) time complexity: the outer loop runs up to n times, and the inner loop makes n - i - 1 comparisons on pass i. Thanks to the swapped flag, an already-sorted array finishes after a single pass, giving an O(n) best case.
Use case: Educational purposes and very small datasets. Not recommended for production use.
### Merge Sort
Time: O(n log n) Space: O(n)
Merge sort is a divide-and-conquer algorithm that divides the array into halves, sorts them, and merges them back.
```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

# Example usage
numbers = [64, 34, 25, 12, 22, 11, 90]
print(merge_sort(numbers))  # [11, 12, 22, 25, 34, 64, 90]
```
Analysis: The array is divided log n times (divide phase), and merging takes O(n) time at each level, giving us O(n log n) total time complexity.
Use case: Excellent for large datasets. Guaranteed O(n log n) performance, but requires extra space.
### Quick Sort
Average: O(n log n) Worst: O(n²) Space: O(log n)
Quick sort picks a pivot element and partitions the array around it, then recursively sorts the partitions.
```python
def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

# Example usage
numbers = [64, 34, 25, 12, 22, 11, 90]
print(quick_sort(numbers))  # [11, 12, 22, 25, 34, 64, 90]
```
Analysis: Average case is O(n log n) with good pivot selection. Worst case O(n²) occurs when the pivot is always the smallest or largest element (e.g., an already-sorted array with a poor pivot choice). Note that the list-comprehension version above allocates new lists and so uses O(n) extra space; the O(log n) space figure applies to the in-place partitioning variant sketched below.
Use case: Often the fastest sorting algorithm in practice. Used in many standard libraries.
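The version above favors readability. To actually realize the O(log n) space bound, quick sort is typically implemented in place; here is a minimal sketch using the Lomuto partition scheme (the names quick_sort_inplace and partition are our own):

```python
def quick_sort_inplace(arr, low=0, high=None):
    # Sorts arr in place; only the recursion stack uses extra space
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort_inplace(arr, low, p - 1)
        quick_sort_inplace(arr, p + 1, high)
    return arr

def partition(arr, low, high):
    # Lomuto partition: use the last element as the pivot
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

print(quick_sort_inplace([64, 34, 25, 12, 22, 11, 90]))  # [11, 12, 22, 25, 34, 64, 90]
```

The recursion depth is O(log n) on average but can still reach O(n) with consistently bad pivots, which is why production implementations often use tricks like median-of-three pivot selection.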
## Data Structure Operations

### Array Operations
| Operation | Time Complexity | Description |
|---|---|---|
| Access by index | O(1) | Direct memory access |
| Search | O(n) | Must check each element |
| Insert at end | O(1) | Amortized constant time |
| Insert at beginning | O(n) | Must shift all elements |
| Delete | O(n) | May need to shift elements |
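To make these costs concrete, here is a short sketch using a Python list (Python lists are dynamic arrays, so the table above applies directly):

```python
arr = [10, 20, 30, 40]

print(arr[2])     # O(1): access by index -> 30
arr.append(50)    # O(1) amortized: insert at end
arr.insert(0, 5)  # O(n): insert at beginning shifts every element
arr.remove(20)    # O(n): find the value, then shift elements left
print(30 in arr)  # O(n): linear search -> True
```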
### Hash Table Operations
| Operation | Average Case | Worst Case |
|---|---|---|
| Search | O(1) | O(n) |
| Insert | O(1) | O(n) |
| Delete | O(1) | O(n) |
Note: Worst case occurs with many hash collisions. Good hash functions make this rare.
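Python's built-in dict is a hash table, so a minimal sketch of these operations looks like this:

```python
counts = {}
counts["apple"] = 3         # insert: O(1) average
counts["apple"] += 1        # update: O(1) average
print("apple" in counts)    # search: O(1) average -> True
del counts["apple"]         # delete: O(1) average
```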
### Binary Search Tree Operations
| Operation | Average Case | Worst Case |
|---|---|---|
| Search | O(log n) | O(n) |
| Insert | O(log n) | O(n) |
| Delete | O(log n) | O(n) |
Note: Worst case occurs when the tree becomes unbalanced (like a linked list). Balanced trees (AVL, Red-Black) guarantee O(log n) for all operations.
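To make the average-vs-worst distinction concrete, here is a minimal unbalanced BST sketch (the Node class and function names are our own; no rebalancing is performed):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    # O(log n) average; O(n) if the tree degenerates into a chain
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def search(root, value):
    if root is None:
        return False
    if value == root.value:
        return True
    if value < root.value:
        return search(root.left, value)
    return search(root.right, value)

root = None
for v in [25, 12, 34, 11, 22, 64, 90]:
    root = insert(root, v)
print(search(root, 22))  # True

# Inserting keys in sorted order (1, 2, 3, ...) produces a right-leaning
# chain -- the O(n) worst case the table refers to.
```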
## String Algorithms

### Palindrome Check
Time: O(n) Space: O(n)
```python
def is_palindrome(s):
    # Remove non-alphanumeric characters and convert to lowercase
    s = ''.join(c.lower() for c in s if c.isalnum())
    left, right = 0, len(s) - 1
    while left < right:
        if s[left] != s[right]:
            return False
        left += 1
        right -= 1
    return True

# Example usage
print(is_palindrome("A man, a plan, a canal: Panama"))  # True
```
Analysis: We iterate through the string once to clean it (O(n)), then compare characters from both ends moving toward the center (O(n/2) = O(n)). Building the cleaned copy is what costs O(n) space; a two-pointer variant that skips non-alphanumeric characters in place achieves O(1) space, as sketched below.
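If the cleanup copy matters, the two pointers can skip non-alphanumeric characters directly; a minimal sketch (the name is_palindrome_o1 is our own):

```python
def is_palindrome_o1(s):
    left, right = 0, len(s) - 1
    while left < right:
        # Skip characters that don't count, without building a new string
        if not s[left].isalnum():
            left += 1
        elif not s[right].isalnum():
            right -= 1
        elif s[left].lower() != s[right].lower():
            return False
        else:
            left += 1
            right -= 1
    return True

print(is_palindrome_o1("A man, a plan, a canal: Panama"))  # True
```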
### Anagram Check
Time: O(n) Space: O(n)
```python
def are_anagrams(s1, s2):
    if len(s1) != len(s2):
        return False
    # Count character frequencies
    char_count = {}
    for char in s1:
        char_count[char] = char_count.get(char, 0) + 1
    for char in s2:
        if char not in char_count:
            return False
        char_count[char] -= 1
        if char_count[char] < 0:
            return False
    return True

# Example usage
print(are_anagrams("listen", "silent"))  # True
```
Analysis: We iterate through both strings once (O(n) each), and use a hash table to store character counts (O(n) space).
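The same check can be written more compactly with collections.Counter from the standard library, with the same O(n) time and space:

```python
from collections import Counter

def are_anagrams_counter(s1, s2):
    # Two strings are anagrams iff their character counts match
    return Counter(s1) == Counter(s2)

print(are_anagrams_counter("listen", "silent"))  # True
```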
## Dynamic Programming Examples

### Fibonacci (Naive Recursion)
Time: O(2ⁿ) Space: O(n)
```python
def fibonacci_naive(n):
    if n <= 1:
        return n
    return fibonacci_naive(n - 1) + fibonacci_naive(n - 2)

# This is VERY slow for large n!
```
Analysis: Each call spawns two more calls, creating an exponential tree of recursive calls in which the same subproblems (e.g., fibonacci_naive(n - 2)) are recomputed over and over.
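Memoization, the top-down flavor of dynamic programming, fixes this by caching each result the first time it is computed; a minimal sketch using functools.lru_cache from the standard library:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fibonacci_memo(n):
    # Each distinct n is computed once; repeat calls hit the cache in O(1)
    if n <= 1:
        return n
    return fibonacci_memo(n - 1) + fibonacci_memo(n - 2)

print(fibonacci_memo(50))  # 12586269025
```

The bottom-up (tabulation) version below achieves the same O(n) time without recursion.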
### Fibonacci (Dynamic Programming)
Time: O(n) Space: O(n)
```python
def fibonacci_dp(n):
    if n <= 1:
        return n
    # Tabulation: build up computed values from the bottom
    dp = [0] * (n + 1)
    dp[1] = 1
    for i in range(2, n + 1):
        dp[i] = dp[i - 1] + dp[i - 2]
    return dp[n]

# Much faster!
```
Analysis: We compute each Fibonacci number exactly once and store it, reducing exponential time to linear time at the cost of O(n) space.
### Fibonacci (Space Optimized)
Time: O(n) Space: O(1)
```python
def fibonacci_optimized(n):
    if n <= 1:
        return n
    prev2, prev1 = 0, 1
    for _ in range(2, n + 1):
        current = prev1 + prev2
        prev2, prev1 = prev1, current
    return prev1

# Fast and memory efficient!
```
Analysis: We only need to remember the last two values, reducing space complexity to O(1) while maintaining O(n) time complexity.
## Try These Examples
Want to verify these complexity analyses? Copy any of the code examples above and paste them into our Big O Calculator to see the AI's analysis. You can also modify the code and see how the complexity changes!
For more in-depth explanations of Big O notation concepts, visit our comprehensive guide. If you have questions, check out our FAQ page.