Introduction
Searching and sorting form the backbone of efficient algorithmic programming. Whether you're preparing for coding interviews, building scalable systems, or optimizing query performance, a solid understanding of these fundamentals is essential. This article presents a comprehensive look at the most widely used searching and sorting algorithms in data structures and algorithms (DSA), covering the logic, complexity, code, practical uses, and optimizations of each.
1. Searching Algorithms
1.1 Linear Search
Linear search, also known as sequential search, checks each element in a list one by one until the target is found or the end is reached.
function linearSearch(arr, target) {
for (let i = 0; i < arr.length; i++) {
if (arr[i] === target) return i;
}
return -1;
}
- Time Complexity: O(n)
- Space Complexity: O(1)
- Use Case: Unsorted arrays or lists; small data sets
1.2 Binary Search
Binary search works on sorted arrays. It divides the search interval in half each time, discarding the half that cannot contain the target.
function binarySearch(arr, target) {
let low = 0;
let high = arr.length - 1;
while (low <= high) {
const mid = Math.floor((low + high) / 2);
if (arr[mid] === target) return mid;
else if (arr[mid] < target) low = mid + 1;
else high = mid - 1;
}
return -1;
}
- Time Complexity: O(log n)
- Space Complexity: O(1) iterative, or O(log n) recursive
- Use Case: Large sorted arrays or datasets
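The space figures above mention a recursive variant; a minimal sketch of it (the name binarySearchRecursive is our own) could look like:

```javascript
// Recursive binary search: same halving logic as the iterative version,
// but each narrowing of the interval costs one stack frame, O(log n) total.
function binarySearchRecursive(arr, target, low = 0, high = arr.length - 1) {
  if (low > high) return -1; // empty interval: target not present
  const mid = Math.floor((low + high) / 2);
  if (arr[mid] === target) return mid;
  if (arr[mid] < target) return binarySearchRecursive(arr, target, mid + 1, high);
  return binarySearchRecursive(arr, target, low, mid - 1);
}
```

The iterative form is usually preferred in practice, since it avoids the stack cost for the same result.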
1.3 Interpolation Search (Advanced)
When data is uniformly distributed, interpolation search can improve over binary search by estimating the likely position of the target.
function interpolationSearch(arr, target) {
let low = 0, high = arr.length - 1;
while (low <= high && target >= arr[low] && target <= arr[high]) {
// Guard: when the range holds a single repeated value, the position
// formula below would divide by zero
if (arr[high] === arr[low]) return arr[low] === target ? low : -1;
let pos = low + Math.floor(((high - low) / (arr[high] - arr[low])) * (target - arr[low]));
if (arr[pos] === target) return pos;
if (arr[pos] < target) low = pos + 1;
else high = pos - 1;
}
return -1;
}
- Average Time Complexity: O(log log n)
- Worst-case: O(n)
- Use Case: Uniformly distributed sorted data
1.4 Hashing (Constant‑Time Search)
A hash table allows average O(1) search time by mapping keys to indices. Collisions are handled via chaining or open addressing.
class HashMap {
constructor(size = 1000) {
this.buckets = Array(size).fill(null).map(() => []);
}
set(key, value) {
const idx = this._hash(key);
const bucket = this.buckets[idx];
// Update in place if the key already exists, instead of storing duplicates
for (const entry of bucket) {
if (entry[0] === key) {
entry[1] = value;
return;
}
}
bucket.push([key, value]);
}
get(key) {
const idx = this._hash(key);
for (const [k, v] of this.buckets[idx]) {
if (k === key) return v;
}
return null;
}
_hash(key) {
let hash = 0;
const str = String(key);
for (let char of str) {
hash = (hash << 5) - hash + char.charCodeAt(0);
hash |= 0;
}
return Math.abs(hash) % this.buckets.length;
}
}
- Average Time Complexity: O(1)
- Worst-case: O(n) (all keys collide into one bucket)
- Use Case: Fast lookup, dictionary, caches
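In application code, JavaScript's built-in Map gives the same average O(1) set/get behavior without hand-rolled hashing; the cache name and key/value pairs here are illustrative:

```javascript
// The built-in Map handles hashing and collisions internally,
// with average O(1) set, get, and has.
const cache = new Map();
cache.set('alice', 31);
cache.set('bob', 27);
cache.set('alice', 32); // updates the existing entry, no duplicate key

console.log(cache.get('alice')); // 32
console.log(cache.has('carol')); // false
```

A hand-rolled table like the class above is mainly useful for learning how hashing works under the hood.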
2. Sorting Algorithms
2.1 Bubble Sort
A simple comparison-based algorithm that repeatedly steps through the list, compares adjacent elements and swaps them if they are in the wrong order.
function bubbleSort(arr) {
let n = arr.length;
for (let i = 0; i < n - 1; i++) {
for (let j = 0; j < n - 1 - i; j++) {
if (arr[j] > arr[j + 1]) {
[arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
}
}
}
return arr;
}
- Time Complexity: O(n²)
- Space Complexity: O(1)
- Use Case: Educational, tiny datasets
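A common refinement, assuming the same array-of-numbers input (the name bubbleSortOptimized is our own), stops early once a full pass makes no swaps:

```javascript
// Bubble sort with an early exit: if a pass makes no swaps the array
// is already sorted, so nearly sorted input finishes in O(n).
function bubbleSortOptimized(arr) {
  for (let i = 0; i < arr.length - 1; i++) {
    let swapped = false;
    for (let j = 0; j < arr.length - 1 - i; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
        swapped = true;
      }
    }
    if (!swapped) break; // no swaps: remaining passes are unnecessary
  }
  return arr;
}
```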
2.2 Selection Sort
Repeatedly selects the minimum element from the unsorted part and appends it to the sorted portion.
function selectionSort(arr) {
let n = arr.length;
for (let i = 0; i < n - 1; i++) {
let minIdx = i;
for (let j = i + 1; j < n; j++) {
if (arr[j] < arr[minIdx]) minIdx = j;
}
[arr[i], arr[minIdx]] = [arr[minIdx], arr[i]];
}
return arr;
}
- Time Complexity: O(n²)
- Space Complexity: O(1)
- Use Case: When swap operations are expensive, since it performs at most n − 1 swaps; also easy to implement
2.3 Insertion Sort
Builds the sorted array one item at a time by repeatedly inserting the next element into the right position.
function insertionSort(arr) {
for (let i = 1; i < arr.length; i++) {
let key = arr[i];
let j = i - 1;
while (j >= 0 && arr[j] > key) {
arr[j + 1] = arr[j];
j--;
}
arr[j + 1] = key;
}
return arr;
}
- Time Complexity: O(n²); better for nearly sorted data
- Space Complexity: O(1)
- Use Case: Small or mostly sorted lists
2.4 Merge Sort
A divide‑and‑conquer algorithm that splits the array in halves, sorts each half recursively, and merges them.
function merge(left, right) {
const result = [];
let i = 0, j = 0;
while (i < left.length && j < right.length) {
if (left[i] < right[j]) result.push(left[i++]);
else result.push(right[j++]);
}
return result.concat(left.slice(i)).concat(right.slice(j));
}
function mergeSort(arr) {
if (arr.length <= 1) return arr;
const mid = Math.floor(arr.length / 2);
const left = mergeSort(arr.slice(0, mid));
const right = mergeSort(arr.slice(mid));
return merge(left, right);
}
- Time Complexity: O(n log n)
- Space Complexity: O(n)
- Use Case: Reliable, stable sorting and large data sets
2.5 Quick Sort
Pick a pivot, partition the array around the pivot so that elements on the left are ≤ pivot and those on the right are ≥ pivot, then recurse on both sides.
function quickSort(arr) {
if (arr.length <= 1) return arr;
const pivot = arr[arr.length >> 1];
const left = [], middle = [], right = [];
for (let x of arr) {
if (x < pivot) left.push(x);
else if (x > pivot) right.push(x);
else middle.push(x);
}
return quickSort(left).concat(middle, quickSort(right));
}
- Average Time Complexity: O(n log n)
- Worst‑case: O(n²) (consistently poor pivot choice)
- Space Complexity: O(log n) recursion stack for an in‑place partition; the version shown above allocates O(n) auxiliary arrays
- Use Case: Fast general‑purpose sorting; a random pivot avoids the worst case on adversarial input
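To actually achieve the O(log n) space bound, the partition must happen in place. A sketch using Lomuto partitioning with a random pivot (the names quickSortInPlace and partition are our own):

```javascript
// In-place quick sort: Lomuto partition with a random pivot.
// Only the recursion stack uses extra space, O(log n) on average.
function partition(arr, lo, hi) {
  // Random pivot guards against the O(n^2) worst case on sorted input
  const p = lo + Math.floor(Math.random() * (hi - lo + 1));
  [arr[p], arr[hi]] = [arr[hi], arr[p]];
  const pivot = arr[hi];
  let i = lo; // next slot for an element smaller than the pivot
  for (let j = lo; j < hi; j++) {
    if (arr[j] < pivot) {
      [arr[i], arr[j]] = [arr[j], arr[i]];
      i++;
    }
  }
  [arr[i], arr[hi]] = [arr[hi], arr[i]]; // pivot into its final position
  return i;
}

function quickSortInPlace(arr, lo = 0, hi = arr.length - 1) {
  if (lo < hi) {
    const p = partition(arr, lo, hi);
    quickSortInPlace(arr, lo, p - 1);
    quickSortInPlace(arr, p + 1, hi);
  }
  return arr;
}
```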
2.6 Heap Sort
Transforms the array into a max-heap, then repeatedly extracts the maximum and rebuilds the heap.
function heapify(arr, n, i) {
let largest = i;
let l = 2*i + 1, r = 2*i + 2;
if (l < n && arr[l] > arr[largest]) largest = l;
if (r < n && arr[r] > arr[largest]) largest = r;
if (largest !== i) {
[arr[i], arr[largest]] = [arr[largest], arr[i]];
heapify(arr, n, largest);
}
}
function heapSort(arr) {
let n = arr.length;
for (let i = Math.floor(n/2) - 1; i >= 0; i--) {
heapify(arr, n, i);
}
for (let i = n - 1; i > 0; i--) {
[arr[0], arr[i]] = [arr[i], arr[0]];
heapify(arr, i, 0);
}
return arr;
}
- Time Complexity: O(n log n)
- Space Complexity: O(1)
- Use Case: In-place sorting, memory-constrained environments
2.7 Counting Sort & Radix Sort (Non‑Comparison Sorting)
Counting Sort works for integers in a limited range: it counts occurrences of each value and reconstructs the sorted array from the counts.
function countingSort(arr, maxVal) {
const count = Array(maxVal + 1).fill(0);
for (let x of arr) count[x]++;
for (let i = 1; i < count.length; i++) count[i] += count[i - 1];
const output = Array(arr.length);
for (let i = arr.length - 1; i >= 0; i--) {
output[count[arr[i]] - 1] = arr[i];
count[arr[i]]--;
}
return output;
}
- Time Complexity: O(n + k), where k is range size
- Space Complexity: O(n + k)
- Use Case: Integers with known small range
Radix Sort sorts by processing digits (least significant to most) using stable sub-sorts like counting sort.
function radixSort(arr, maxDigits) {
let place = 1;
for (let d = 0; d < maxDigits; d++) {
const buckets = Array(10).fill().map(() => []);
for (let num of arr) {
const digit = Math.floor((num / place) % 10);
buckets[digit].push(num);
}
arr = [].concat(...buckets);
place *= 10;
}
return arr;
}
- Time Complexity: O(n · k) where k is number of digits
- Space Complexity: O(n + k)
- Use Case: Large lists of integers, fixed digit length
3. Comparison: Searching vs Sorting Algorithms
Algorithm | Time Complexity | Space | Stable? | When to Use
--- | --- | --- | --- | ---
Linear Search | O(n) | O(1) | — | Unsorted, small lists
Binary Search | O(log n) | O(1) | — | Sorted arrays
Bubble / Selection / Insertion Sort | O(n²) | O(1) | Bubble, Insertion: yes; Selection: no | Small inputs, teaching
Merge Sort | O(n log n) | O(n) | Yes | Stable sorting, large data
Quick Sort | O(n log n) avg, O(n²) worst | O(log n) | No | Fast average-case sorting
Heap Sort | O(n log n) | O(1) | No | Memory‑limited settings
Counting / Radix Sort | O(n + k) / O(n · k) | O(n + k) | Yes, when implemented stably | Integer keys in a limited range
Choosing the Right Algorithm
Consider data size, order, stability requirements, memory availability, and whether data is numeric or custom objects. For example:
- Small unsorted list: Insertion or selection sort.
- Need stability with large data: Merge sort.
- Memory constrained: Heap sort in place.
- Integer keys in small fixed range: Counting or Radix sort.
4. Real‑World Use Cases & Optimization Tips
Use Case: Search in Databases & REST APIs
APIs and database indexes often implement B‑trees or hash‑based indexing to support fast lookups (logarithmic or constant time). Linear scan is inefficient at scale.
Use Case: Sorting in UI & Reporting
User‑facing tables typically use stable sorts (merge or insertion sort) to keep relative order of items while ordering by key.
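Stability can be demonstrated with a merge sort adapted to take a key function: rows that compare equal keep their original relative order. A sketch (stableSortBy and the sample rows are our own):

```javascript
// Stable merge sort over objects with a key function: rows with equal
// keys keep their original relative order.
function stableSortBy(rows, key) {
  if (rows.length <= 1) return rows;
  const mid = Math.floor(rows.length / 2);
  const left = stableSortBy(rows.slice(0, mid), key);
  const right = stableSortBy(rows.slice(mid), key);
  const out = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    // <= keeps equal keys in left-before-right order, which is what
    // makes the sort stable
    if (key(left[i]) <= key(right[j])) out.push(left[i++]);
    else out.push(right[j++]);
  }
  return out.concat(left.slice(i), right.slice(j));
}

// Rows already ordered by name; sorting by age keeps the name order
// within each equal age
const rows = [
  { name: 'Ana', age: 30 },
  { name: 'Bo', age: 25 },
  { name: 'Cy', age: 30 },
];
const byAge = stableSortBy(rows, r => r.age);
// Ana still precedes Cy among the age-30 rows
```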
Performance Optimizations
- Break early in bubble sort if no swaps in a pass
- Choose median-of-three or random pivot in quick sort to avoid worst-case
- Use insertion sort for small subarrays within merge or quick sort (hybrid sort, e.g. Timsort)
- In practice, modern languages ship hybrid sorts: Python uses Timsort (stable, O(n log n)), and Java’s Arrays.sort uses dual‑pivot quick sort for primitives and Timsort for objects.
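The hybrid idea can be sketched as a merge sort that falls back to insertion sort below a small cutoff; the names hybridMergeSort and insertionSortRange and the cutoff of 16 are our own choices (real libraries tune the cutoff empirically):

```javascript
// Hybrid merge sort: below CUTOFF elements, insertion sort wins in
// practice because of its low constant factors.
const CUTOFF = 16;

function insertionSortRange(arr) {
  for (let i = 1; i < arr.length; i++) {
    const key = arr[i];
    let j = i - 1;
    while (j >= 0 && arr[j] > key) {
      arr[j + 1] = arr[j];
      j--;
    }
    arr[j + 1] = key;
  }
  return arr;
}

function hybridMergeSort(arr) {
  if (arr.length <= CUTOFF) return insertionSortRange(arr.slice());
  const mid = Math.floor(arr.length / 2);
  const left = hybridMergeSort(arr.slice(0, mid));
  const right = hybridMergeSort(arr.slice(mid));
  const out = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    // <= keeps the merge stable, as in plain merge sort
    out.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return out.concat(left.slice(i), right.slice(j));
}
```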
Memory and Stability Tradeoffs
Stable sorts maintain relative order of equal elements and are important when sorting by multiple keys. Merge sort and insertion sort can be stable, while quick and heap sort are not inherently stable but are space-efficient.
5. Conclusion
Understanding searching and sorting algorithms is key to writing efficient code. From simple linear and binary search to advanced sorts like quick sort and radix sort, each has trade‑offs. In interviews and real‑world code, choose the algorithm based on data size, memory constraints, stability requirements, and type of keys.
Practice with coding challenges, visualize each algorithm's steps, and work through the complexity analysis by hand. Familiarity with built‑in library sort implementations (like Timsort) also helps when optimizing real code.
Whether you’re preparing for technical interviews or building scalable systems, the mastery of searching and sorting algorithms empowers you to write faster and cleaner code.