
Introduction to Queues


Ashish Pratap Singh

A queue is a fundamental data structure in computer science that follows the FIFO principle: First In, First Out.

Front → 1 | 3 | 5 | 7 | 9 ← Rear

That means the first element that goes into the queue is also the first one to come out.

This simple rule makes queues extremely powerful, and you’ll find them everywhere in real-world systems like:

  • Task scheduling in operating systems
  • Handling requests in web servers, ensuring they’re processed in the order they arrive
  • Implementing graph algorithms like breadth-first search (BFS)

And beyond that, queues are also an important topic for coding interviews. Many tricky problems become much simpler once you recognize that a queue is the right data structure to use.

In this chapter, we’ll cover:

  • What a queue is and how it works
  • Core operations and their complexities
  • How to implement them in code
  • Different types of queues

Let's get started.

What is a Queue?

So, what exactly is a queue?

A queue is a linear data structure that processes elements in the exact order they arrive — following the classic FIFO principle: First In, First Out.

That means:

  • The first element added is the first one to be removed.
  • New elements always join from the back, and removals always happen from the front.

A simple way to think about queues is people standing in line at a movie theater:

  • The first person to join the line is the first person to get a ticket.
  • New people always join at the back, and service always happens at the front.

Core Queue Operations

Now that you understand what a queue is, let's walk through the common queue operations you will use, with a simple example.

1. Enqueue

This operation adds an element to the back of the queue.

Starting with an empty queue.

We enqueue 3 values one by one:

Enqueue(10):

Front → 10 ← Rear

Enqueue(20):

Front → 10 | 20 ← Rear

Enqueue(30):

Front → 10 | 20 | 30 ← Rear

2. Dequeue

This removes the front-most element from the queue.

For our example, after Dequeue(), we remove 10 since it was the first element that got added:

Dequeue():

Front → 20 | 30 ← Rear

3. Peek (or Front)

This lets you look at the front element without removing it.

Calling Peek() on our current example gives you 20, but the queue stays unchanged.

4. IsEmpty

Finally, isEmpty() checks whether the queue contains any elements.

For our current queue, it returns false. But if we dequeue both 20 and 30, isEmpty() will return true.

All of these operations run in O(1) constant time, making queues a very efficient data structure for many real-world systems.
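To tie the four operations together, here is the running example expressed in Python, using the built-in collections.deque as the container:

```python
from collections import deque

q = deque()

# Enqueue: add elements to the rear
q.append(10)
q.append(20)
q.append(30)

# Dequeue: remove the front-most element
front = q.popleft()   # -> 10

# Peek: look at the front element without removing it
peek = q[0]           # -> 20

# IsEmpty: check whether any elements remain
q.popleft()           # removes 20
q.popleft()           # removes 30
empty = len(q) == 0   # -> True
```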

Queue Implementations

Now that you understand what a queue is and how it works, let’s talk about how to actually implement one.

There are multiple ways to implement a queue. Let’s explore the two most common approaches:

1. Using Arrays

This is the simplest and most intuitive way to build a queue.

  • You enqueue elements at the end (rear).
  • You dequeue elements from the front (front).

But there’s a problem.

If you use a simple array, removing the front element means shifting all the other elements one position to the left. That takes O(n) time — which is inefficient for large queues.

To avoid shifting, we use a circular array (or circular buffer).

  • When the rear pointer reaches the end of the array, it wraps around to the front (if space is available).
  • This gives us constant time operations for both enqueue and dequeue — O(1).

Here’s how we can implement this in code:

  • We use an array to store the queue elements, along with two pointers: front and rear — these track the start and end of the queue.
  • When we enqueue, we move the rear pointer forward, wrapping it around using % capacity if needed, and insert the new element at that position.
  • When we dequeue, we read the element at the front, then move the front pointer forward, again wrapping it around circularly.

This circular movement ensures that we can reuse slots freed up by dequeue operations — preventing wasted space at the beginning of the array.

By wrapping around both front and rear using % capacity, we keep the queue compact and efficient, without needing to shift elements.
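Here is a minimal Python sketch of this circular-array approach. The class name, the fixed capacity, and the choice to track size rather than a separate rear pointer are illustrative assumptions, not the only way to do it:

```python
class CircularQueue:
    """Fixed-capacity queue backed by a circular array (sketch)."""

    def __init__(self, capacity):
        self.data = [None] * capacity
        self.capacity = capacity
        self.front = 0   # index of the front element
        self.size = 0    # number of stored elements

    def enqueue(self, value):
        if self.size == self.capacity:
            raise OverflowError("queue is full")
        # The rear index wraps around using % capacity
        rear = (self.front + self.size) % self.capacity
        self.data[rear] = value
        self.size += 1

    def dequeue(self):
        if self.size == 0:
            raise IndexError("queue is empty")
        value = self.data[self.front]
        self.data[self.front] = None   # free the slot for reuse
        self.front = (self.front + 1) % self.capacity
        self.size -= 1
        return value

    def peek(self):
        if self.size == 0:
            raise IndexError("queue is empty")
        return self.data[self.front]

    def is_empty(self):
        return self.size == 0
```

Because both front and rear wrap with % capacity, a slot freed by dequeue can be reused by a later enqueue without shifting any elements.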

2. Using Linked Lists

Another popular and efficient way to implement a queue is using a linked list.

A linked list grows and shrinks dynamically, with no need to shift elements. You get O(1) time for both enqueue and dequeue operations without the complexity of circular indexing.

Here’s how it works:

We maintain two pointers:

  • head: points to the front of the queue (used for dequeue)
  • tail: points to the rear of the queue (used for enqueue)
  • For enqueue, we create a new node with the given value. If the queue isn't empty, we link the current tail's next pointer to the new node, adding it to the end of the queue, and then update tail to point to it. If the queue was empty (both head and tail were null), the new node becomes both the head and the tail.
  • For dequeue, if head is null, the queue is empty, so we throw an error. Otherwise, we extract the value from the front node and update head to point to the next node, removing the front element. If the queue becomes empty after removal (head is null), we also set tail = null to avoid a stale pointer. Finally, we return the dequeued value.
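The steps above can be sketched in Python like this (class names are illustrative):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedListQueue:
    def __init__(self):
        self.head = None   # front of the queue (used for dequeue)
        self.tail = None   # rear of the queue (used for enqueue)

    def enqueue(self, value):
        node = Node(value)
        if self.tail is not None:
            self.tail.next = node   # link new node after the current rear
        else:
            self.head = node        # queue was empty: node is also the front
        self.tail = node

    def dequeue(self):
        if self.head is None:
            raise IndexError("queue is empty")
        value = self.head.value
        self.head = self.head.next
        if self.head is None:       # queue became empty
            self.tail = None        # clear the stale tail pointer
        return value
```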

3. Using Built-in Libraries

In most coding interviews — and almost all real-world projects — you don’t need to build a queue from scratch.

Because modern programming languages already provide efficient, well-tested, and optimized queue implementations in their standard libraries.

Java

Java provides a Queue interface, with popular implementations like LinkedList and ArrayDeque.

Python

Python doesn't have a queue type built into the core language, but the collections module provides deque, which works perfectly as a queue. (The standard library also has queue.Queue, but that class is designed for thread-safe communication between threads.)
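For example:

```python
from collections import deque

queue = deque()
queue.append("task1")     # enqueue at the rear: O(1)
queue.append("task2")
first = queue.popleft()   # dequeue from the front: O(1) -> "task1"
```

A plain Python list also works, but list.pop(0) shifts every remaining element and costs O(n), while deque.popleft() is O(1).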

C++

In C++, the Standard Template Library (STL) provides a queue container adapter built on top of other containers like deque or list.

These libraries are optimized and ready to use, so unless the interviewer specifically asks you to implement a queue from scratch, it’s best to rely on them.

Types of Queues

So far, we've covered the basic idea of a queue — a First In, First Out (FIFO) data structure.

But there are actually several variations of queues, each designed for different use cases.

Let’s go through the most important ones.

1. Simple Queue

This is the most basic form of a queue and we have already covered it in detail.

  • Insertion (enqueue) happens at the rear.
  • Removal (dequeue) happens at the front.
  • It strictly follows the FIFO order — the element that enters first is the one that leaves first.

However, when implemented using a simple array, it can lead to wasted space. After several dequeues, the front keeps moving forward, leaving unused slots at the beginning of the array.

To fix this, we use a smarter version — the circular queue.

2. Circular Queue

A circular queue eliminates the wasted-space problem by treating the array as circular.

  • When the rear pointer reaches the end of the array, it wraps around to the front.
  • This allows the queue to reuse freed-up slots after elements are dequeued.

Circular queues are especially useful in memory-constrained environments, like buffering systems or resource scheduling.

3. Deque (Double-Ended Queue)

A deque, short for double-ended queue, is a more flexible version of a queue that allows insertion and deletion from both ends:

  • You can enqueue or dequeue at the front.
  • You can enqueue or dequeue at the rear.

This makes deques extremely versatile.
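In Python, for example, collections.deque supports all four of these end operations in O(1):

```python
from collections import deque

d = deque([2, 3])
d.appendleft(1)      # insert at the front -> [1, 2, 3]
d.append(4)          # insert at the rear  -> [1, 2, 3, 4]
front = d.popleft()  # remove from the front -> 1
rear = d.pop()       # remove from the rear  -> 4
```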

4. Priority Queue

A priority queue doesn’t follow the traditional FIFO order. Instead:

  • Each element is associated with a priority.
  • The element with the highest priority is dequeued first, regardless of its insertion time.
  • If multiple elements have the same priority, then FIFO order is used among them.

A classic real-world example is CPU scheduling in operating systems:

The OS maintains a queue of tasks, each with a priority level. Tasks with higher priority are executed before those with lower priority.
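This behavior can be sketched with Python's standard heapq module. heapq is a min-heap, so here a smaller number means a higher priority; an insertion counter breaks ties so that equal-priority elements come out in FIFO order:

```python
import heapq

# Each heap entry is a tuple: (priority, insertion_order, task).
# Tuples compare element by element, so the counter only matters
# when two priorities are equal, which preserves FIFO among ties.
pq = []
counter = 0
for priority, task in [(3, "low"), (1, "urgent"), (3, "low2"), (2, "medium")]:
    heapq.heappush(pq, (priority, counter, task))
    counter += 1

order = [heapq.heappop(pq)[2] for _ in range(len(pq))]
# order -> ["urgent", "medium", "low", "low2"]
```

Note that "urgent" is dequeued first even though it was inserted second, and the two priority-3 tasks keep their insertion order.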