
Bubble Sort

Last Updated: March 31, 2026

Ashish Pratap Singh

Bubble Sort is one of the simplest sorting algorithms and often the first one people learn. The idea is straightforward: repeatedly compare adjacent elements and swap them if they are in the wrong order. With each pass, the largest unsorted element “bubbles up” to its correct position.

While Bubble Sort is not efficient for large inputs, it plays an important role in building intuition. It helps you understand how sorting works step by step, how swaps affect order, and how repeated passes gradually lead to a sorted array.

In this chapter, you will learn how Bubble Sort works, how to implement it, and how small optimizations can improve its performance. More importantly, you will use it as a stepping stone to understand why more advanced sorting algorithms are needed.

What Is Bubble Sort?

Bubble sort is a comparison-based sorting algorithm that works by repeatedly stepping through the array, comparing adjacent elements, and swapping them if they are in the wrong order. Each pass through the array moves the largest unsorted element to its correct position at the end, much like a bubble rising to the surface of water. That is where the name comes from.

Here is the core intuition: after the first pass, the largest element is guaranteed to be at the last position. After the second pass, the second-largest element is in its correct spot. After n-1 passes, the entire array is sorted.

Example:

Consider the array [5, 3, 8, 4, 2] (the same array traced in the walkthrough later in this chapter). The element 8, being the largest, bubbles to the far right after the first pass. In the next pass, 5 bubbles to its correct position (second from the right), and so on. Each pass guarantees that one more element is sorted.
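To make the first pass concrete, here is a minimal sketch of a single bubble pass (the helper name `bubble_pass` is mine, not from the chapter's implementations):

```python
def bubble_pass(arr):
    """Perform one bubble sort pass: compare each adjacent pair, swap if out of order."""
    for j in range(len(arr) - 1):
        if arr[j] > arr[j + 1]:
            arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

# One pass over [5, 3, 8, 4, 2]:
# compare 5,3 -> swap -> [3, 5, 8, 4, 2]
# compare 5,8 -> no swap
# compare 8,4 -> swap -> [3, 5, 4, 8, 2]
# compare 8,2 -> swap -> [3, 5, 4, 2, 8]
print(bubble_pass([5, 3, 8, 4, 2]))  # [3, 5, 4, 2, 8]
```

Note that after this single pass only 8 is guaranteed to be in place; the rest of the array is still unsorted.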

Why "bubble"?

The analogy is simple. In a carbonated drink, larger bubbles rise to the surface faster. In bubble sort, larger values rise (move toward the end of the array) faster because they keep getting swapped forward with every comparison.

How It Works

Let us break down the algorithm step by step.

  1. Start at the beginning of the array.
  2. Compare the current element with the next element.
  3. If the current element is greater than the next element, swap them.
  4. Move to the next pair and repeat steps 2-3.
  5. After completing one full pass, the largest unsorted element is now at the end.
  6. Repeat the process for the remaining unsorted portion of the array.
  7. Stop when no swaps are made during a pass (this means the array is sorted).

Step 7 is the optimization that makes bubble sort actually practical on nearly sorted data. Without it, you always run n-1 passes even if the array becomes sorted after the second pass. With early termination, if a pass completes with zero swaps, you know the array is already in order and can stop immediately.

Here is the algorithm as a flowchart:

There are two things worth noting in this flowchart. First, the inner loop range shrinks by one after each pass (n-i-2 instead of n-2), because the last i elements are already sorted. Second, the swapped flag is reset at the start of every pass, so it tracks whether any work was done in the current pass only.

Code Implementation

All six implementations follow the same structure: an outer loop that counts passes, an inner loop that performs comparisons and swaps within a pass, and an early-termination check at the end of each pass. The only differences are in language syntax.
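The individual language versions are not reproduced here, but as a sketch of that shared structure, a Python version with the early-termination flag looks like this:

```python
def bubble_sort(arr):
    """Sort arr in place using bubble sort with early termination."""
    n = len(arr)
    for i in range(n - 1):               # at most n-1 passes
        swapped = False                  # reset at the start of every pass
        for j in range(n - i - 1):       # last i elements are already sorted
            if arr[j] > arr[j + 1]:      # strictly greater: keeps the sort stable
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:                  # a pass with zero swaps means we are done
            break
    return arr

print(bubble_sort([5, 3, 8, 4, 2]))  # [2, 3, 4, 5, 8]
```

The strict `>` comparison is what preserves stability, and the shrinking inner range (`n - i - 1`) is what skips the already-sorted tail.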

Example Walkthrough

Let us trace through bubble sort with the array [5, 3, 8, 4, 2]. We will track every comparison and swap so you can see exactly how the algorithm progresses.
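Since the full step-by-step table is not reproduced here, a quick way to see each pass is to instrument the sort to print the array after every pass (the helper name `bubble_sort_verbose` is mine):

```python
def bubble_sort_verbose(arr):
    """Bubble sort that prints the array after every pass."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        print(f"after pass {i + 1}: {arr}")
        if not swapped:
            break
    return arr

bubble_sort_verbose([5, 3, 8, 4, 2])
# after pass 1: [3, 5, 4, 2, 8]
# after pass 2: [3, 4, 2, 5, 8]
# after pass 3: [3, 2, 4, 5, 8]
# after pass 4: [2, 3, 4, 5, 8]
```

Each pass pins one more element (8, then 5, then 4, then 3) to its final position on the right.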

Notice how the inner loop range shrinks with each pass. In pass 1 we made 4 comparisons, in pass 2 we made 3, then 2, then 1. The total number of comparisons is 4 + 3 + 2 + 1 = 10, which is n(n-1)/2 for n=5. This is exactly where the O(n^2) comes from.

Now let us look at the best-case scenario. If the input were already sorted, say [1, 2, 3, 4, 5]:

This is why the swapped flag matters so much. Without it, bubble sort would still run all n-1 passes, doing n(n-1)/2 comparisons for an already sorted array. With it, the best case drops to O(n).
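Both behaviors can be checked by counting comparisons directly (a sketch; `count_comparisons` is my name for the instrumented version, not the chapter's):

```python
def count_comparisons(arr):
    """Bubble sort (in place) that returns how many comparisons it made."""
    n = len(arr)
    comparisons = 0
    for i in range(n - 1):
        swapped = False
        for j in range(n - i - 1):
            comparisons += 1
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:          # early termination
            break
    return comparisons

print(count_comparisons([5, 3, 8, 4, 2]))  # 10  (4 + 3 + 2 + 1)
print(count_comparisons([1, 2, 3, 4, 5]))  # 4   (one pass, no swaps, then stop)
```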

Complexity Analysis

| Case | Time Complexity | Explanation |
| --- | --- | --- |
| Best | O(n) | Array is already sorted. One pass with no swaps triggers early termination. We make n-1 comparisons and zero swaps. |
| Average | O(n^2) | Elements are in random order. On average, about half the pairs need swapping, but the nested loop structure still gives us quadratic time. |
| Worst | O(n^2) | Array is sorted in reverse order. Every pair in every pass requires a swap. We make exactly n(n-1)/2 comparisons and swaps. |
| Space | O(1) | Bubble sort is in-place. It only uses a constant amount of extra memory for the temporary swap variable and the swapped flag. |

Stability: Bubble sort is a stable sorting algorithm. Two elements with equal values are never swapped (because the condition is strictly greater than, not greater-than-or-equal). This means equal elements maintain their original relative order after sorting.
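Stability is easy to observe by sorting (key, label) pairs on the key alone. This sketch uses a hypothetical `bubble_sort_by_key` helper to make the equal-key case visible:

```python
def bubble_sort_by_key(pairs):
    """Bubble sort a list of (key, label) pairs, comparing keys only."""
    n = len(pairs)
    for i in range(n - 1):
        swapped = False
        for j in range(n - i - 1):
            if pairs[j][0] > pairs[j + 1][0]:   # strict >: equal keys never swap
                pairs[j], pairs[j + 1] = pairs[j + 1], pairs[j]
                swapped = True
        if not swapped:
            break
    return pairs

print(bubble_sort_by_key([(2, "a"), (1, "b"), (2, "c")]))
# [(1, 'b'), (2, 'a'), (2, 'c')]  -- (2, 'a') still precedes (2, 'c')
```

If the comparison were `>=` instead of `>`, equal elements would be swapped past each other and stability would be lost.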

In-place: Bubble sort does not require any additional arrays or data structures. All swaps happen within the original array, making the space complexity O(1).

To put the numbers in perspective, for an array of 10,000 elements:

  • Best case (already sorted): roughly 10,000 comparisons
  • Worst case (reverse sorted): roughly 50,000,000 comparisons

That is a 5,000x difference, which is why the early termination optimization is not just a nice-to-have. It fundamentally changes the algorithm's behavior on nearly sorted data.
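The arithmetic behind those figures is just the two comparison-count formulas from the table above:

```python
n = 10_000
best = n - 1                # one pass with no swaps, then early termination
worst = n * (n - 1) // 2    # n(n-1)/2 comparisons for reverse-sorted input

print(best)          # 9999        (roughly 10,000)
print(worst)         # 49995000    (roughly 50,000,000)
print(worst // best) # 5000        (the ~5,000x gap)
```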

When to Use and When Not to Use

Bubble sort is rarely the right choice for production code, but it does have legitimate use cases.

Good use cases:

  • Nearly sorted data. If the array is almost sorted (only a few elements are out of place), the early termination optimization makes bubble sort run close to O(n). This is one of the few scenarios where bubble sort can compete with more advanced algorithms.
  • Small arrays. For arrays with fewer than 20-30 elements, the overhead of more complex algorithms like merge sort or quicksort is not worth it. The constant factors matter, and bubble sort's simplicity becomes an advantage.
  • Detecting if an array is sorted. One pass of bubble sort with the swapped flag tells you whether the array is already in order. This costs O(n) and uses O(1) space.
  • Teaching and learning. Bubble sort is the simplest comparison-based sorting algorithm. It is the perfect starting point for understanding sorting concepts, loop invariants, and algorithm analysis.
  • Memory-constrained environments. Since bubble sort is in-place and uses O(1) extra space, it works in situations where you cannot afford to allocate additional arrays.
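The "detecting if an array is sorted" idea above amounts to a single comparison pass with no swapping at all. A minimal sketch (the `is_sorted` name is mine):

```python
def is_sorted(arr):
    """One bubble-sort-style pass: O(n) time, O(1) space.
    Returns True if no adjacent pair is out of order."""
    for j in range(len(arr) - 1):
        if arr[j] > arr[j + 1]:
            return False
    return True

print(is_sorted([1, 2, 3, 4, 5]))  # True
print(is_sorted([5, 3, 8, 4, 2]))  # False
```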

Bad use cases:

  • Large datasets. O(n^2) is simply too slow for thousands of elements or more. Merge sort (O(n log n)) or quicksort (O(n log n) average) are dramatically better choices.
  • Performance-critical applications. Even on medium-sized arrays, bubble sort performs noticeably more comparisons and element writes than alternatives like insertion sort. If every millisecond matters, you should not be using bubble sort.
  • Random or reverse-sorted data. Bubble sort has no meaningful advantage over other quadratic algorithms on random data, and insertion sort typically outperforms it in practice because it does fewer comparisons and moves elements with cheap shifts rather than full swaps.

A useful rule of thumb: if someone asks you about bubble sort in an interview, they are probably testing whether you understand its properties (stability, in-place, best case O(n)) and whether you can articulate why better alternatives exist. They are not asking you to advocate for using it in production.