DSA

Linked Lists

January 28, 2026 · 5 min read

Linked Lists are one of those data structures everyone learns, everyone complains about, and that still trips up a good number of candidates.

Linked Lists are rarely used directly. Instead, they show up as building blocks for higher‑level data structures like stacks, queues, and deques. In this guide, we’ll teach you everything you need to know about linked lists.


What Is a Linked List?

A Linked List is a data structure that represents a sequence of nodes.

Each node:

  • Stores data
  • Stores a pointer to another node

Unlike arrays, a linked list:

  • ❌ Does not allocate a contiguous block of memory
  • ❌ Does not support direct indexing

That one fact — no contiguous memory — explains almost every property of linked lists.

There are two main variations you’ll see:

  • Singly Linked List – each node points to the next
  • Doubly Linked List – each node points to both the next and previous node

How to Visualize a Linked List

The cleanest way to think about linked lists is with sentinel (dummy) nodes.

We usually add:

  • A sentinel head
  • A sentinel tail

These nodes don’t store real data. Their job is to simplify logic by removing edge cases like:

  • “What if the list is empty?”
  • “What if I’m inserting the first element?”
  • “What if I’m deleting the last element?”

With sentinels, every real node always has a node before it and a node after it, so insertion and deletion never need to special‑case the ends of the list.

A basic singly linked list looks like:
[HEAD] -> [node] -> [node] -> [node] -> [TAIL]

Each node contains:

  • A value
  • A pointer to the next node
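To make the sentinel idea concrete, here’s a minimal sketch in Python. The `Node` class mirrors the definition below; the helper names (`new_list`, `to_values`) are illustrative, not a standard API:

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

def new_list():
    # Sentinel nodes hold no real data; they just bound the list.
    # An empty list is simply HEAD -> TAIL.
    tail = Node()
    head = Node(next=tail)
    return head, tail

def to_values(head, tail):
    # Collect the real values sitting between the two sentinels.
    out = []
    node = head.next
    while node is not tail:
        out.append(node.val)
        node = node.next
    return out

head, tail = new_list()
print(to_values(head, tail))  # [] — no real nodes yet
```

Notice that even the empty list has two nodes. That’s the price of sentinels, and it buys you edge‑case‑free pointer logic everywhere else.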

Defining a Node

Here’s the simplest possible node definition:
1class Node:2    def __init__(self, val=None, next=None):3        self.val = val4        self.next = next

Nothing fancy. Just data + a pointer.


Adding to the Head — O(1)

Adding to the front of a linked list is one of the few operations that’s truly constant time.

Steps:

  1. Create a new node
  2. Store head.next in a temporary variable
  3. Point head.next to the new node
  4. Point new_node.next to the old first node
  5. Done

Why is this O(1)?

Because you’re not traversing anything. You’re just rearranging pointers.
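The steps above can be sketched directly in Python. This assumes the sentinel head/tail setup from earlier; `add_to_head` is an illustrative name:

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

# Sentinel setup: the list is empty when head.next is tail.
tail = Node()
head = Node(next=tail)

def add_to_head(head, val):
    # O(1): no traversal, just pointer rearrangement.
    new_node = Node(val)        # step 1: create the new node
    old_first = head.next       # step 2: remember the old first node
    head.next = new_node        # step 3: head now points at the new node
    new_node.next = old_first   # step 4: new node points at the old first

add_to_head(head, 2)
add_to_head(head, 1)
# List is now: HEAD -> 1 -> 2 -> TAIL
```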


Removing from the Head — O(1)

Removing the first real node is also constant time. Steps:

  1. If head.next is tail, the list is empty → error
  2. Set head.next = head.next.next
  3. Done

Again: no traversal, just pointer updates.
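In code, the sketch looks like this (same sentinel setup as before; raising `IndexError` on an empty list is one reasonable convention, not the only one):

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

tail = Node()
head = Node(next=Node(1, Node(2, tail)))  # HEAD -> 1 -> 2 -> TAIL

def remove_from_head(head, tail):
    # Step 1: an empty list is head pointing straight at tail.
    if head.next is tail:
        raise IndexError("remove from empty list")
    removed = head.next
    head.next = removed.next   # step 2: skip over the first real node
    return removed.val

print(remove_from_head(head, tail))  # 1
```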


Adding to the Tail — O(n)

Here’s where linked lists start to feel slower than arrays.

If you don’t store a tail pointer, adding to the end requires:

  • Starting at the head
  • Walking node by node until you reach the end

That traversal costs O(n).

In a doubly linked list with a tail pointer, this can be O(1). But in a basic singly linked list, it’s O(n).
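Here’s a sketch of the singly‑linked O(n) version, where the cost is the walk to the end (`add_to_tail` is an illustrative name):

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

tail = Node()
head = Node(next=tail)

def add_to_tail(head, tail, val):
    # O(n): walk until we reach the node just before the sentinel tail.
    node = head
    while node.next is not tail:
        node = node.next
    node.next = Node(val, tail)   # splice the new node in before tail

add_to_tail(head, tail, 1)
add_to_tail(head, tail, 2)
# HEAD -> 1 -> 2 -> TAIL
```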


Removing from the Tail — O(n)

Removing the last node is expensive for the same reason.

To remove the tail, you need the node before it — which means traversing the list.

Steps (conceptually):

  1. If the list is empty → error
  2. Traverse until you reach the second‑to‑last node
  3. Set its next pointer to the sentinel tail

Traversal dominates, so this is O(n).
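A sketch of those steps, again assuming the sentinel setup:

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

tail = Node()
head = Node(next=Node(1, Node(2, tail)))  # HEAD -> 1 -> 2 -> TAIL

def remove_from_tail(head, tail):
    if head.next is tail:                 # step 1: empty list
        raise IndexError("remove from empty list")
    node = head
    while node.next.next is not tail:     # step 2: find the second-to-last node
        node = node.next
    removed = node.next
    node.next = tail                      # step 3: unlink the last real node
    return removed.val

print(remove_from_tail(head, tail))  # 2
```

The `node.next.next` check is the telltale sign of a singly linked list: to unlink a node, you must be standing on the node before it.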


Peeking

  • Peek head → O(1)
  • Peek tail → O(n) (unless you maintain a tail pointer)

Peeking just means looking at the value, not removing it.


Accessing the Middle — O(n)

This is the most important mental shift:

We don’t talk about indices in linked lists. We talk about nodes.

There is no list[3].

To access the “3rd node,” you must:

  1. Start at the head
  2. Traverse node by node

That traversal costs O(n).

This is the direct consequence of not having contiguous memory.
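A sketch of what “access by position” actually costs (the `get_node` helper is illustrative; positions here count real nodes from 0):

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

tail = Node()
head = Node(next=Node('a', Node('b', Node('c', tail))))

def get_node(head, tail, n):
    # O(n): there is no indexing, so we hop node by node from the head.
    node = head.next
    for _ in range(n):
        if node is tail:
            raise IndexError("position out of range")
        node = node.next
    if node is tail:
        raise IndexError("position out of range")
    return node

print(get_node(head, tail, 2).val)  # 'c'
```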


Searching by Value — O(n)

Searching works exactly how you’d expect:

  • Start at the head
  • Walk node by node
  • Stop when you find the value or hit the tail

Worst case? You scan the entire list → O(n).
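The scan is a few lines of Python (the `contains` helper is illustrative):

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

tail = Node()
head = Node(next=Node(10, Node(20, Node(30, tail))))

def contains(head, tail, target):
    # O(n) worst case: we may visit every node before hitting the tail.
    node = head.next
    while node is not tail:
        if node.val == target:
            return True
        node = node.next
    return False

print(contains(head, tail, 20))  # True
print(contains(head, tail, 99))  # False
```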


Deleting at a Position — O(n)

Deleting a node at a specific position also requires traversal.

You must:

  1. Traverse to the node (or the node before it)
  2. Rearrange pointers to skip it

Traversal dominates → O(n).
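Sketched out (the `delete_at` helper is illustrative; as with access, positions count real nodes from 0):

```python
class Node:
    def __init__(self, val=None, next=None):
        self.val = val
        self.next = next

tail = Node()
head = Node(next=Node('a', Node('b', Node('c', tail))))

def delete_at(head, tail, n):
    # O(n): traverse to the node *before* position n, then skip over it.
    prev = head
    for _ in range(n):
        if prev.next is tail:
            raise IndexError("position out of range")
        prev = prev.next
    if prev.next is tail:
        raise IndexError("position out of range")
    prev.next = prev.next.next   # rearrange pointers to skip the node

delete_at(head, tail, 1)  # removes 'b'
# HEAD -> 'a' -> 'c' -> TAIL
```

In a singly linked list you always delete “through” the previous node, which is why the traversal stops one node early.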


Deleting a Node Once You Have It — O(1)

This is a subtle but powerful idea.

If you already have a reference to a node, deletion is constant time.

You simply:

  • Rewire the surrounding pointers

No traversal needed.

This pattern is the foundation of structures like a Linked HashMap, where:

  • A hashmap gives O(1) access to nodes
  • A linked list maintains ordering
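A sketch of the idea with a doubly linked node, where having a direct reference makes deletion pure pointer rewiring. The `DNode`/`unlink` names and the dict standing in for the hashmap are illustrative:

```python
class DNode:
    # Doubly linked node: each node knows both neighbours.
    def __init__(self, val=None):
        self.val = val
        self.prev = None
        self.next = None

def link(a, b):
    a.next, b.prev = b, a

# HEAD <-> 'x' <-> 'y' <-> TAIL, plus a dict giving O(1) access to nodes —
# the two halves of a Linked HashMap.
head, tail = DNode(), DNode()
x, y = DNode('x'), DNode('y')
link(head, x); link(x, y); link(y, tail)
index = {'x': x, 'y': y}

def unlink(node):
    # O(1): rewire the neighbours around the node; no traversal needed.
    node.prev.next = node.next
    node.next.prev = node.prev

unlink(index['x'])   # O(1) lookup + O(1) unlink
# HEAD <-> 'y' <-> TAIL
```

This is essentially how an LRU cache evicts entries: the hashmap finds the node instantly, and the list delete is constant time.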

Final Takeaway

Linked Lists trade random access for cheap insertions and deletions at known locations.

They’re not about speed everywhere — they’re about predictable pointer manipulation.

Once you internalize:

  • No contiguous memory
  • No indexing
  • Traversal is everything

Linked lists stop feeling magical and start feeling obvious.

And that’s exactly where you want to be in interviews.