If hash maps are about looking up exact values, tries are about looking up prefixes. They're the data structure behind autocomplete, spell checkers, and any problem that cares about what a string starts with.
This post breaks down what tries are, how they work under the hood, and why their performance characteristics make them irreplaceable for prefix-based problems.
What is a trie?
A trie (pronounced "try") is a tree where each path from the root to a node represents a prefix of some string. The root is empty. Each edge represents a character.
Some everyday examples:
Your phone's autocomplete — type "hel" and it suggests "hello", "help", "helmet"
Spell checkers — is this sequence of characters a real word?
Search engines — suggest queries as you type
IP routing tables — longest prefix matching
The name comes from retrieval. It's a tree optimized for retrieving strings by their prefixes.
A trie is a tree of characters
Under the hood, a trie is a tree where:
The root is an empty node
Each node has a dictionary of children: children[char] → TrieNode
Each path from root to a node spells out a prefix
Some nodes are marked as end-of-word — meaning a complete string ends here
Inserting 'cat', 'car', and 'do' into a trie:

```
   (root)
   /    \
  c      d
  |      |
  a      o*
 / \
t*   r*

* = end of word
```
The key insight: strings that share a prefix share nodes. "cat" and "car" share the c → a path. This is what makes tries space-efficient for dictionaries with overlapping prefixes.
Insert is O(m)
Inserting a word of length m means walking m levels down the tree, creating nodes as needed, and marking the last node as end-of-word.
Insert walks one node per character:

```python
def insert(self, word):
    node = self.root
    for ch in word:
        if ch not in node.children:
            node.children[ch] = TrieNode()
        node = node.children[ch]
    node.end = True
```
No hashing, no collisions. Each character takes you one level deeper. Total work: O(m) where m is the word length.
Search is O(m)
To check if a word exists, follow the path character by character. If you reach the end and the node is marked as end-of-word, the word exists.
Search follows the path and checks end-of-word:

```python
def search(self, word):
    node = self.root
    for ch in word:
        if ch not in node.children:
            return False
        node = node.children[ch]
    return node.end  # must be a complete word, not just a prefix
```
This is also O(m). Compare that to a hash map lookup which is O(m) too (because hashing the string takes O(m)). The trie matches hash map speed for exact lookups — but can do things hash maps can't.
Prefix query is O(m) — and this is where tries shine
Checking "does any word start with this prefix?" is the same as search, but you don't need the end-of-word check. If you can follow the path, the prefix exists.
Prefix check — just follow the path:

```python
def starts_with(self, prefix):
    node = self.root
    for ch in prefix:
        if ch not in node.children:
            return False
        node = node.children[ch]
    return True  # prefix exists
```
A hash map cannot do this efficiently. To check if any key starts with "hel", you'd have to scan every key. A trie does it in O(m).
This is the reason tries exist.
Delete is O(m)
To delete a word, unmark its end-of-word flag. Optionally, clean up nodes that are no longer part of any word (no children, not end-of-word).
In practice, most interview problems only need insert, search, and starts_with. Delete is rare.
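For completeness, here is one way the cleanup could look. This is a minimal sketch, not the post's canonical implementation: it reuses the `TrieNode` layout from above and adds a recursive `delete` that unmarks the flag and prunes dead branches on the way back up.

```python
class TrieNode:
    def __init__(self):
        self.children = {}  # char -> TrieNode
        self.end = False


class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.end = True

    def delete(self, word):
        """Unmark end-of-word; prune nodes with no children and no end flag."""
        def _delete(node, i):
            if i == len(word):
                if not node.end:
                    return False  # word was never stored
                node.end = False
                return True
            child = node.children.get(word[i])
            if child is None:
                return False
            found = _delete(child, i + 1)
            # prune the child if it no longer serves any word
            if found and not child.end and not child.children:
                del node.children[word[i]]
            return found
        return _delete(self.root, 0)
```

Deleting "cat" from a trie that also holds "car" removes only the `t` node: the shared `c → a` path still serves "car", so the prune condition leaves it alone.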
There is no random access
You cannot "get the 5th word" from a trie. Tries are not ordered by insertion time — they're ordered by lexicographic structure.
If you need all words in sorted order, a DFS traversal gives them to you for free (tries store strings in sorted order by nature).
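That traversal can be sketched as a small generator. This assumes the node layout used throughout this post (`children` dict, `end` flag) and visits children in sorted key order so the output comes out lexicographic:

```python
class TrieNode:
    def __init__(self):
        self.children = {}  # char -> TrieNode
        self.end = False


def insert(root, word):
    node = root
    for ch in word:
        node = node.children.setdefault(ch, TrieNode())
    node.end = True


def all_words(node, prefix=""):
    # DFS in sorted child order yields every stored word lexicographically
    if node.end:
        yield prefix
    for ch in sorted(node.children):
        yield from all_words(node.children[ch], prefix + ch)


root = TrieNode()
for w in ["car", "do", "cat", "cab"]:
    insert(root, w)
print(list(all_words(root)))  # ['cab', 'car', 'cat', 'do']
```

Note the words come back sorted even though they were inserted out of order — the sorting falls out of the tree's structure, not any extra work.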
Memory usage is the tradeoff
Each node has a dictionary of children. For a trie storing lowercase English words, each node can have up to 26 children. In practice, most nodes are sparse.
For interview problems, memory is rarely the bottleneck. But in production, if you're storing millions of strings, a compressed trie (radix tree) or a hash map might be more practical.
When a problem is secretly about tries
"Find all words that start with..." — if the problem involves prefix matching across a dictionary, that's a trie.
"Word search on a board" — Word Search II on LeetCode. Build a trie of dictionary words, then DFS the grid. The trie lets you prune branches where no dictionary word could possibly match.
"Longest common prefix" — insert all strings into a trie, then walk down the single-child path from the root.
"Autocomplete / type-ahead" — classic trie application. Walk to the prefix node, then collect all words below it.
"Count words with a given prefix" — store a count at each node that tracks how many words pass through it.
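A sketch of that last idea — the class name `CountingTrie` is illustrative, not from any library. Each node keeps a counter that is incremented by every insert whose word passes through it, so a prefix count is just a walk to the prefix node:

```python
class CountingTrie:
    def __init__(self):
        self.root = {"count": 0, "children": {}}

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node["children"].setdefault(ch, {"count": 0, "children": {}})
            node["count"] += 1  # one more word passes through this node

    def count_prefix(self, prefix):
        node = self.root
        for ch in prefix:
            if ch not in node["children"]:
                return 0
            node = node["children"][ch]
        return node["count"]


ct = CountingTrie()
for w in ["hello", "help", "helmet"]:
    ct.insert(w)
print(ct.count_prefix("hel"))   # 3
print(ct.count_prefix("help"))  # 1
```

Both operations stay O(m); the counter just piggybacks on the insert walk.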
The full implementation
Complete Trie class:

```python
class TrieNode:
    def __init__(self):
        self.children = {}  # char -> TrieNode
        self.end = False    # True if a word ends here


class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            if ch not in node.children:
                node.children[ch] = TrieNode()
            node = node.children[ch]
        node.end = True

    def search(self, word):
        node = self.root
        for ch in word:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.end

    def starts_with(self, prefix):
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return True
```
Know the common trie operations
For the implementation above:
trie.insert(word) → O(m)
trie.search(word) → O(m)
trie.starts_with(prefix) → O(m)
Where m is the length of the word or prefix. No operation depends on how many words are in the trie — only on the length of the query.
Final takeaway
Tries are the answer when hash maps aren't enough.
Understanding:
that a trie is a tree of characters, where shared prefixes share nodes
that insert, search, and prefix queries are all O(m)
that the killer feature is prefix lookup — something hash maps can't do efficiently
the interview cues: "prefix", "starts with", "autocomplete", "word search"
will let you recognize trie problems instantly and implement them cleanly.