If stacks process the most recent element first, queues process the oldest one. That single difference — first in, first out — makes queues the backbone of BFS, task scheduling, and anything that needs to respect arrival order.
This post breaks down what queues are, how they work under the hood, and why using the wrong implementation can silently destroy your performance.
What is a queue?
A queue is a collection where the first element added is the first one removed.
This is called FIFO — First In, First Out.
You add to the back and remove from the front.
Some everyday examples:
A line at a coffee shop — first in line, first served
A print queue — documents print in the order they were submitted
A support ticket system — oldest tickets get handled first
Three operations cover everything: enqueue (add to the back), dequeue (remove from the front), and peek (look at the front without removing).
Why a plain array is a bad queue
Your first instinct might be to use an array. Push to the end, remove from the front.
The problem? Removing from the front of an array is O(n).
Every remaining element has to shift one position left to fill the gap. You already know this from studying arrays.
The O(n) trap:

```python
# DON'T do this
queue = [1, 2, 3, 4, 5]
queue.pop(0)  # O(n) — shifts everything left
```
This means every dequeue costs O(n). For a queue that processes thousands of items, that adds up fast.
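You can feel this cost directly by draining a list versus a deque. A rough benchmark sketch (the helper names and sizes here are illustrative, and exact timings vary by machine, but the gap grows with n):

```python
from collections import deque
from timeit import timeit

def drain_list(n):
    q = list(range(n))
    while q:
        q.pop(0)       # O(n) per dequeue -> O(n^2) total

def drain_deque(n):
    q = deque(range(n))
    while q:
        q.popleft()    # O(1) per dequeue -> O(n) total

t_list = timeit(lambda: drain_list(50_000), number=1)
t_deque = timeit(lambda: drain_deque(50_000), number=1)
print(t_list, t_deque)  # the list version is dramatically slower
```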
Use a deque — O(1) from both ends
The right implementation for a queue is a double-ended queue (deque).
A deque is optimized for fast appends and removals from both ends.
No shifting. The deque simply advances its front pointer.
This is the entire reason deque exists. If dequeue were O(n), the queue would be useless at scale.
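A minimal sketch of deque as a queue (values here are arbitrary):

```python
from collections import deque

queue = deque()
queue.append(10)         # enqueue at the back, O(1)
queue.append(20)
queue.append(30)
first = queue.popleft()  # dequeue from the front, O(1) — returns 10
```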
Peek is O(1)
Looking at the front element without removing it is also constant time.
Constant-time peek (with a small example queue for context):

```python
from collections import deque

queue = deque([30, 40, 50])
queue[0]  # → 30, O(1)
```
In interviews, you'll peek to inspect the next item to process before deciding whether to dequeue.
There is no random access
Just like stacks, queues are designed for sequential processing, not arbitrary lookups.
You don't access the 5th element of a queue. You don't search through it. You process items in the order they arrived.
The constraint is the feature. FIFO ordering eliminates ambiguity about what to process next.
The answer is always: the one that's been waiting the longest.
Searching a queue is O(n)
If you need to find a specific value in a queue, you'd have to examine each element from front to back.
That's O(n) time.
If you're searching, you probably want a hash set alongside your queue — not a different data structure entirely, but a companion for fast lookups.
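One way to sketch that companion pattern (the `seen` set and `enqueue_once` helper are hypothetical names, not a standard API):

```python
from collections import deque

queue = deque()
seen = set()  # companion set for O(1) membership tests

def enqueue_once(x):
    # Hypothetical helper: enqueue x only if it was never enqueued before.
    if x not in seen:      # O(1) lookup instead of an O(n) scan of the queue
        seen.add(x)
        queue.append(x)

enqueue_once(1)
enqueue_once(2)
enqueue_once(1)  # duplicate, ignored
```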
BFS — queues are the engine of breadth-first search
BFS explores nodes level by level. It visits all neighbors of the current node before moving deeper.
That's FIFO. Process the oldest discovered node first.
BFS uses a queue (with a `visited` set added so the traversal terminates on graphs with cycles):

```python
from collections import deque

def bfs(root):
    queue = deque([root])
    visited = {root}
    while queue:
        node = queue.popleft()           # process oldest first
        for neighbor in node.neighbors:
            if neighbor not in visited:  # skip already-discovered nodes
                visited.add(neighbor)
                queue.append(neighbor)   # discover new nodes
```
This is the most important use of queues in interviews. If BFS uses a stack, it becomes DFS. The data structure defines the traversal order.
Stack → DFS (most recent first, go deep)
Queue → BFS (oldest first, go wide)
That distinction is worth memorizing.
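You can see the distinction on one tiny graph by swapping only the pop direction. A sketch (the graph and `traverse` function are made up for illustration; popping newest-first gives one depth-first variant, not the classic recursive order):

```python
from collections import deque

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def traverse(start, fifo):
    order, seen, frontier = [], {start}, deque([start])
    while frontier:
        # popleft = oldest first (queue/BFS); pop = newest first (stack/DFS-like)
        node = frontier.popleft() if fifo else frontier.pop()
        order.append(node)
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                frontier.append(nbr)
    return order

traverse("A", fifo=True)   # → ['A', 'B', 'C', 'D'] — level by level
traverse("A", fifo=False)  # → ['A', 'C', 'D', 'B'] — goes deep first
```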
Level-order traversal is BFS on a tree
One of the most common interview patterns: process a tree level by level.
Level-order traversal:

```python
from collections import deque

def level_order(root):
    if not root:
        return []
    result = []
    queue = deque([root])
    while queue:
        level_size = len(queue)
        level = []
        for _ in range(level_size):
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        result.append(level)
    return result
```
The key trick: snapshot len(queue) at the start of each level. That tells you exactly how many nodes belong to the current level before any children get added.
When a problem is secretly about queues
Queues don't always announce themselves. Here are the signals:
"Level by level" or "layer by layer" — BFS with a queue.
"Shortest path" in an unweighted graph — BFS guarantees shortest path. Queue required.
"Process in order received" — any FIFO processing. Task scheduling, message ordering.
"Sliding window" with eviction — sometimes a deque is used to maintain a window of elements, removing expired items from the front.
"First K" or "stream of data" — processing elements in arrival order.
When you recognize FIFO semantics, the queue becomes obvious.
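As one concrete sliding-window example, here is a moving-average sketch (a hypothetical `moving_averages` helper, not from the original post) where expired items are evicted from the front of a deque:

```python
from collections import deque

def moving_averages(stream, k):
    window = deque()
    total = 0
    out = []
    for x in stream:
        window.append(x)           # newest item enters at the back
        total += x
        if len(window) > k:
            total -= window.popleft()  # evict the expired item from the front
        if len(window) == k:
            out.append(total / k)
    return out

moving_averages([1, 2, 3, 4, 5], 3)  # → [2.0, 3.0, 4.0]
```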
Know the common queue operations in your language
For Python (using collections.deque):
queue.append(x) → O(1) (enqueue)
queue.popleft() → O(1) (dequeue)
queue[0] → O(1) (peek front)
len(queue) → O(1) (size)
len(queue) == 0 → O(1) (is empty)
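All five operations in one short sketch (values are arbitrary):

```python
from collections import deque

queue = deque()
queue.append("a")        # enqueue, O(1)
queue.append("b")
front = queue[0]         # peek front → "a", O(1)
item = queue.popleft()   # dequeue → "a", O(1)
size = len(queue)        # → 1, O(1)
empty = len(queue) == 0  # is empty? → False, O(1)
```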
Never use list.pop(0) as a queue operation. It's O(n) and will cause performance issues in any non-trivial problem.
Knowing these costs lets you reason about performance before you write code.
Final takeaway
Queues enforce one rule: process the oldest item first. That's FIFO.
Understanding:
why the right implementation matters (deque vs list)
why all core operations are O(1)
that BFS is fundamentally a queue algorithm
the interview signals that point to a queue solution
will make you confident every time you see level-order traversal, shortest path, or FIFO processing. When the problem cares about the element that's been waiting the longest — reach for a queue.