A step-by-step approach to technical interviews — from understanding the problem to writing the code, plus data structures and algorithm patterns.
Restate the question in your own words to confirm understanding.
Clarify input size, structure, edge cases, and any limitations.
Try working through a small example to understand the problem better.
Start with a brute-force solution and discuss it with the interviewer; ask "What do you think?" before optimizing.
Once brute force is clear, think about how to optimize time/space complexity.
Don't just state the solution; walk through it step by step, and explicitly check that the interviewer is following along.
Write comments outlining each section of your solution before writing actual code.
Implement the solution, referring to your comments as a guide.
Walk through your code with a couple of examples to verify correctness and catch edge cases.
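The workflow above can be sketched end to end on a small illustrative problem. The "valid anagram" task and all names below are our example, not from the source: outline the plan in comments first, then fill in the code, then verify with examples.

```python
from collections import Counter

def is_anagram(s: str, t: str) -> bool:
    """Return True if t is an anagram of s."""
    # 1. Different lengths can never be anagrams -> early exit.
    if len(s) != len(t):
        return False
    # 2. Count character frequencies in both strings.
    # 3. Anagrams have identical frequency tables.
    return Counter(s) == Counter(t)

# Final step: walk through a couple of examples to verify correctness.
print(is_anagram("listen", "silent"))  # True
print(is_anagram("rat", "car"))        # False
```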
If: You need faster lookups, in O(1)
→ Try using a Set or a Map.
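A minimal sketch of the Set idea, on an illustrative "contains duplicate" problem of our choosing:

```python
def contains_duplicate(nums: list[int]) -> bool:
    # A set gives average O(1) membership checks, turning a
    # nested-loop O(N^2) scan into a single O(N) pass.
    seen = set()
    for x in nums:
        if x in seen:
            return True
        seen.add(x)
    return False

print(contains_duplicate([1, 2, 3, 1]))  # True
print(contains_duplicate([1, 2, 3, 4]))  # False
```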
If: Finding top/bottom/max/min/closest/farthest K elements among N elements
→ Try using a Heap.
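For top-K, a common sketch is a min-heap capped at size K (the function name and sample values here are ours):

```python
import heapq

def k_largest(nums: list[int], k: int) -> list[int]:
    # Keep a min-heap of size k: the root is the smallest of the
    # current top k, so each of the N elements costs O(log K).
    heap: list[int] = []
    for x in nums:
        heapq.heappush(heap, x)
        if len(heap) > k:
            heapq.heappop(heap)
    return sorted(heap, reverse=True)

print(k_largest([3, 1, 5, 12, 2, 11], 3))  # [12, 11, 5]
```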
If: Input is a sorted Array, List, or Matrix
→ Try Two Pointer strategy or Binary Search.
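As a two-pointer sketch on sorted input (an illustrative "pair with target sum" problem, not from the source):

```python
def pair_with_sum(nums: list[int], target: int):
    # Two pointers on a sorted array: move inward based on the sum,
    # so the scan is O(N) instead of checking every pair.
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return nums[lo], nums[hi]
        if s < target:
            lo += 1
        else:
            hi -= 1
    return None

print(pair_with_sum([1, 2, 4, 7, 11], 9))  # (2, 7)
```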
If: You need to try all permutations or combinations
→ Use Backtracking or Breadth First Search.
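A minimal backtracking sketch for generating permutations (names are illustrative):

```python
def permutations(nums: list[int]) -> list[list[int]]:
    # Backtracking: extend the current candidate one element at a
    # time; undo (pop) after exploring each branch.
    result: list[list[int]] = []

    def backtrack(path: list[int], remaining: list[int]) -> None:
        if not remaining:
            result.append(path[:])
            return
        for i, x in enumerate(remaining):
            path.append(x)
            backtrack(path, remaining[:i] + remaining[i + 1:])
            path.pop()  # abandon this branch and try the next one

    backtrack([], nums)
    return result

print(len(permutations([1, 2, 3])))  # 6
```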
If: Input is a Tree or Graph
→ Apply Tree Traversals or Graph Traversals (BFS and DFS).
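One common traversal, BFS level order on a binary tree, can be sketched like this (the tiny `Node` class and sample tree are our own scaffolding):

```python
from collections import deque

class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def level_order(root):
    # BFS with a queue: drain one level per outer iteration.
    levels: list[list[int]] = []
    queue = deque([root] if root else [])
    while queue:
        level = []
        for _ in range(len(queue)):
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        levels.append(level)
    return levels

tree = Node(1, Node(2, Node(4)), Node(3))
print(level_order(tree))  # [[1], [2, 3], [4]]
```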
If: Singly Linked List traversal
→ Try Two Pointers or Slow/Fast Pointers.
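A slow/fast-pointer sketch, here Floyd's cycle detection (the node class and test list are illustrative):

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head):
    # Slow advances 1 step, fast advances 2: if there is a cycle,
    # fast eventually laps slow and they meet.
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False

a, b, c = ListNode(1), ListNode(2), ListNode(3)
a.next, b.next, c.next = b, c, a  # 1 -> 2 -> 3 -> back to 1
print(has_cycle(a))  # True
```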
If: Recursive solution is hard to visualize/code
→ Try using a Stack data structure with a loop.
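For instance, a recursive pre-order traversal becomes a loop with an explicit stack (the `Node` class here is our own minimal scaffolding):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder_iterative(root):
    # Replace the recursive call stack with an explicit stack;
    # push right before left so left is processed first.
    result, stack = [], [root] if root else []
    while stack:
        node = stack.pop()
        result.append(node.val)
        if node.right:
            stack.append(node.right)
        if node.left:
            stack.append(node.left)
    return result

tree = Node(1, Node(2), Node(3))
print(preorder_iterative(tree))  # [1, 2, 3]
```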
If: A nested-loop array scan takes O(N²) time, O(1) space
→ Try using a HashMap/HashSet → O(N) time, O(N) space.
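The classic instance of this trade is two-sum with a value-to-index map (names and sample input are ours):

```python
def two_sum(nums: list[int], target: int):
    # One pass with a value -> index map: O(N) time and O(N) space
    # instead of the O(N^2) nested-loop scan.
    seen: dict[int, int] = {}
    for i, x in enumerate(nums):
        if target - x in seen:
            return seen[target - x], i
        seen[x] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # (0, 1)
```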
If: A nested-loop array scan takes O(N²) time, O(1) space (alternative)
→ Try sorting the array → O(N log N) time, O(1) space.
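A sketch of the sorting trade-off on the same kind of problem (illustrative names; we treat the sort as in-place with O(1) extra space, as the rule above does):

```python
def has_duplicate_sorted(nums: list[int]) -> bool:
    # After sorting, equal elements are adjacent, so one linear
    # scan suffices: O(N log N) time, no extra hash structure.
    nums.sort()
    return any(nums[i] == nums[i + 1] for i in range(len(nums) - 1))

print(has_duplicate_sorted([3, 1, 4, 1, 5]))  # True
print(has_duplicate_sorted([3, 1, 4]))        # False
```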
If: Recursive solution needs optimization
→ Consider Dynamic Programming.
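A minimal memoization sketch on the illustrative "climbing stairs" recurrence (not taken from the source):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def climb_stairs(n: int) -> int:
    # Naive recursion is O(2^n); caching the overlapping
    # subproblems brings it down to O(n).
    if n <= 2:
        return n
    return climb_stairs(n - 1) + climb_stairs(n - 2)

print(climb_stairs(10))  # 89
```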
If: Group of strings or substring manipulation/find/store
→ Try using Tries or HashMap.
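A bare-bones Trie sketch, supporting insert, exact search, and prefix lookup (our own minimal design, not a reference implementation):

```python
class Trie:
    def __init__(self):
        self.children: dict[str, "Trie"] = {}
        self.is_word = False

    def insert(self, word: str) -> None:
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.is_word = True  # mark end of a complete word

    def search(self, word: str) -> bool:
        node = self._walk(word)
        return node is not None and node.is_word

    def starts_with(self, prefix: str) -> bool:
        return self._walk(prefix) is not None

    def _walk(self, s: str):
        node = self
        for ch in s:
            node = node.children.get(ch)
            if node is None:
                return None
        return node

t = Trie()
t.insert("apple")
print(t.search("apple"), t.search("app"), t.starts_with("app"))  # True False True
```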
Key data structures and algorithm patterns for coding interviews.
Arrays: Contiguous memory storage. Master traversal, manipulation, and in-place operations.
Complexity: Access O(1), Search O(n), Insert O(n)
Hash Maps: Key-value storage with constant-time lookups. Essential for frequency counting and deduplication.
Complexity: Insert O(1), Lookup O(1), Delete O(1) avg
Linked Lists: Sequential nodes connected by pointers. Practice reversal, cycle detection, and merge operations.
Complexity: Access O(n), Insert O(1), Delete O(1)
Stacks & Queues: LIFO and FIFO structures. Used for parsing, BFS, undo operations, and monotonic stack patterns.
Complexity: Push/Pop O(1), Enqueue/Dequeue O(1)
Trees: Hierarchical structures with traversal patterns (in-order, pre-order, post-order, level-order).
Complexity: Search O(log n), Insert O(log n) avg (balanced BST)
Graphs: Nodes and edges representing relationships. Master adjacency lists, BFS, DFS, and topological sort.
Complexity: BFS/DFS O(V + E)
Heaps: Tree-based structure for efficiently finding min/max. Used for top-K problems and scheduling.
Complexity: Insert O(log n), Extract O(log n), Peek O(1)
Tries: Prefix trees for efficient string storage and lookup. Used for autocomplete and spell checking.
Complexity: Insert O(m), Search O(m), where m = key length
Two Pointers: Use two indices to traverse from different positions. Great for sorted arrays and linked lists.
Sliding Window: Maintain a window of elements for subarray/substring problems. Fixed or variable window size.
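A fixed-window sketch, maximum sum of any K consecutive elements (names and values are illustrative):

```python
def max_window_sum(nums: list[int], k: int) -> int:
    # Slide the window by adding the entering element and removing
    # the leaving one, instead of re-summing each window: O(N).
    window = sum(nums[:k])
    best = window
    for i in range(k, len(nums)):
        window += nums[i] - nums[i - k]
        best = max(best, window)
    return best

print(max_window_sum([2, 1, 5, 1, 3, 2], 3))  # 9
```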
Binary Search: Divide and conquer on sorted data. Applies beyond arrays: search spaces, rotated arrays, etc.
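The basic form on a sorted array, as a quick sketch:

```python
def binary_search(nums: list[int], target: int) -> int:
    # Halve the search space each step: O(log N) on sorted input.
    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```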
BFS & DFS: Graph/tree traversal strategies. BFS for shortest path (unweighted), DFS for exhaustive exploration.
Dynamic Programming: Break problems into overlapping subproblems. Master top-down (memoization) and bottom-up (tabulation).
Backtracking: Explore all possibilities by building candidates incrementally and abandoning invalid paths early.
Greedy: Make locally optimal choices at each step. Works when a local optimum leads to the global optimum.
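A classic case where the locally optimal choice is provably globally optimal is interval scheduling, sketched here with illustrative names:

```python
def max_non_overlapping(intervals: list[tuple[int, int]]) -> int:
    # Greedy rule: always keep the interval that ends earliest;
    # it leaves the most room for the remaining intervals.
    count, last_end = 0, float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_end:
            count += 1
            last_end = end
    return count

print(max_non_overlapping([(1, 3), (2, 4), (3, 5)]))  # 2
```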
Sorting: Fundamental algorithms such as quicksort, mergesort, and counting sort. Know the time/space trade-offs.