What is Recursion?
Recursion is a fundamental computational concept where a function solves a problem by calling itself on smaller instances of the same problem.
Every recursive solution requires two ingredients:
- Base Case: The simplest input we know how to answer directly, which stops the recursion.
- Recursive Relation: The rule that reduces the current problem into smaller, solvable sub-problems.
Basic Examples
Fibonacci
- Idea: split F(n) into the two smaller subproblems F(n-1) and F(n-2).
- Caveat: exponential time without memoization.
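A minimal sketch of the naive version (the memoized fix is covered in the Dynamic Programming section below):

```python
def fib(n):
    # Base cases: F(0) = 0, F(1) = 1
    if n < 2:
        return n
    # Recursive relation: F(n) = F(n-1) + F(n-2)
    # Caveat: without memoization this recomputes subproblems
    # exponentially many times.
    return fib(n - 1) + fib(n - 2)
```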
Countdown
- A classic linear recursion with a single subproblem (n-1) and a clear base case (n == 0).
- Complexity: time O(n), stack space O(n), as the stack depth is exactly n. If n can be huge, consider an iterative loop to avoid stack overflow.
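A sketch of the countdown pattern (returning the sequence rather than printing, so the shape of the recursion is easy to see):

```python
def countdown(n):
    # Base case: nothing left to count
    if n == 0:
        return []
    # Single subproblem of size n - 1; prepend the current number
    return [n] + countdown(n - 1)
```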
How to Design a Recursion Solution
- Define the state: what parameters uniquely describe a subproblem?
- Write the base case(s).
- Write the recurrence: solve smaller states, combine.
- Prove termination: each call moves toward a base case.
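As a tiny worked example of these four steps (digit sum is my illustration, not from the article):

```python
def digit_sum(n):
    # State: the remaining non-negative integer n
    # Base case: a single digit answers itself
    if n < 10:
        return n
    # Recurrence: last digit + subproblem on the remaining digits
    # Termination: n // 10 < n, so every call moves toward the base case
    return n % 10 + digit_sum(n // 10)
```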
Bottom-Up vs. Top-Down
“Bottom-Up” or “Top-Down” is about when a recursive routine does its work and how information flows between calls. This isn’t a hard partition: many problems can be solved either way or mix both styles.
- Bottom-Up (postorder / do work on return): solve all dependent subcalls, then aggregate their results to produce the current result.
Examples: subtree statistics, expression evaluation, divide-and-conquer merges, tree/DAG DP.
- Top-Down (preorder / do work on entry): pass context downward — prefix state, limits, partial answers — making decisions as you descend.
Examples: backtracking with constraints (length, sums, same-level dedup), path construction, early pruning.
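Hedged sketches of both styles on a small n-ary tree (the `Node` class and function names are my assumptions):

```python
class Node:
    def __init__(self, val, children=None):
        self.val = val
        self.children = children or []

def subtree_sum(node):
    # Bottom-up: recurse into children first, then aggregate
    # their results on the way back up.
    return node.val + sum(subtree_sum(c) for c in node.children)

def collect_paths(node, prefix, out):
    # Top-down: pass the prefix (context) downward, extending it
    # at each step as we descend.
    prefix = prefix + [node.val]
    if not node.children:        # leaf: the path is complete
        out.append(prefix)
        return
    for c in node.children:
        collect_paths(c, prefix, out)
```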

Depth-First Search (DFS)
DFS explores one path to the end before backtracking, ensuring thorough search with potential to prune or cut off paths early.
DFS’s recursive nature leverages the call stack to remember the path, making it intuitive to implement and useful for problems involving hierarchical or nested structures. It serves many purposes in graphs and trees, such as traversal and enumeration, connected components, cycle detection, and path finding.
- Tree/graph DFS with visited:
- Backtracking (“choose / explore / unchoose”):
- Grid DFS with bounds & pruning:
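Hedged sketches of the three patterns above (function names and the 0/1 grid encoding, where 1 marks an obstacle, are my assumptions):

```python
def dfs(graph, node, visited):
    # Tree/graph DFS with a visited set to avoid revisiting nodes
    visited.add(node)
    for nxt in graph.get(node, []):
        if nxt not in visited:
            dfs(graph, nxt, visited)

def subsets(nums):
    # Backtracking: choose / explore / unchoose
    res, path = [], []
    def backtrack(i):
        if i == len(nums):
            res.append(path[:])   # record a finished choice sequence
            return
        backtrack(i + 1)          # skip nums[i]
        path.append(nums[i])      # choose
        backtrack(i + 1)          # explore
        path.pop()                # unchoose
    backtrack(0)
    return res

def grid_dfs(grid, r, c, seen):
    # Grid DFS: prune out-of-bounds cells, obstacles (1), and
    # already-seen cells; count reachable cells.
    if (r < 0 or r >= len(grid) or c < 0 or c >= len(grid[0])
            or grid[r][c] == 1 or (r, c) in seen):
        return 0
    seen.add((r, c))
    return 1 + sum(grid_dfs(grid, r + dr, c + dc, seen)
                   for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)))
```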
- Complexity: O(V+E) for graphs (each edge/vertex visited once) and O(R*C) for grids.
- Stack space equals recursion depth (tree height / path length), which can be optimized with iterative stacks if needed.
Dynamic Programming
Dynamic Programming (DP) is essentially recursion enhanced with memory. When a recursive function repeatedly solves the same subproblems, storing these results in a cache (or memoization table) avoids redundant computations and transforms exponential time algorithms into efficient polynomial time solutions.
Top-Down DP (Memoized Recursion) aligns naturally with the recursive mindset: you define the problem recursively while adding a cache check at the start to immediately return previously computed results.
| Step | Description |
| --- | --- |
| Recurrence | Write the function using the base case and recursive relation. |
| Memo Check | Add a check at the start: if the current state (defined by the arguments) is in the cache, return the cached result immediately. |
| Cache Result | Before returning the result of the recursive calls, store it in the memo dictionary. |
The key is properly defining the state, the base cases, and the recursive relation (transition logic), then ensuring all computed results are cached so recursion runs efficiently.
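A sketch of the three steps from the table applied to Fibonacci (the explicit memo dictionary is one common choice; Python's `functools.lru_cache` achieves the same effect):

```python
def fib_memo(n, memo=None):
    # Same recurrence as plain fib, plus a cache keyed by the state n
    if memo is None:
        memo = {}
    if n in memo:                  # Memo Check
        return memo[n]
    if n < 2:                      # base cases
        return n
    # Recurrence, with the result cached before returning
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]
```

With the cache, each state is computed once, so the exponential naive version becomes O(n).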
Divide & Conquer
Divide and Conquer solves problems by dividing the original input into smaller, independent subproblems, conquering each subproblem, and then combining their results to produce the final solution.
This approach marries well with recursion, where the recursive calls solve smaller subproblems, accompanied by a merge or combine step to construct the final solution.
- Divide: Break the problem into smaller subproblems that are similar to the original but simpler to solve. For example, in Merge Sort, the array is split into two halves.
- Conquer: Solve each subproblem recursively. If a subproblem is small enough, solve it directly (base case).
- Combine: Merge or combine the solutions of the subproblems to form the solution of the original problem, such as merging two sorted halves in Merge Sort.
Examples: Merge Sort, Quick Sort, Binary Search Tree operations, Maximum Subarray (divide into left / right / cross).
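A sketch of the divide / conquer / combine steps in Merge Sort:

```python
def merge_sort(arr):
    # Base case: lists of length <= 1 are already sorted
    if len(arr) <= 1:
        return arr
    # Divide: split the array into two halves
    mid = len(arr) // 2
    # Conquer: sort each half recursively
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    # Combine: merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```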
Practical Notes
- Global updates in recursion: for immutable scalars (e.g., int), use return values or wrap the value in a mutable container; in nested functions, use nonlocal to modify an outer variable.
- nonlocal works only inside nested functions, to reference a variable in the nearest enclosing (non-global) scope.
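A sketch of the nonlocal pattern (the `TreeNode` class and `count_nodes` name are my assumptions for illustration):

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def count_nodes(root):
    count = 0  # immutable int in the enclosing (non-global) scope

    def dfs(node):
        nonlocal count          # rebind the outer variable
        if node is None:        # common base case: empty subtree
            return
        count += 1
        dfs(node.left)
        dfs(node.right)

    dfs(root)
    return count
```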
- Common base cases:
  - Tree/Graph: node is None; found target; visited detection (for cycles).
  - Grid: out of bounds; obstacle; reached goal.
  - Array: empty, length ≤ k (minimal size).
Final thoughts
Recursion is a thinking tool: it turns big problems into small ones with clarity. It offers a distinct perspective on problem-solving that complements iterative approaches.
When combined with memoization or dynamic programming, recursion transforms problems with overlapping subproblems from intractable to efficiently solvable within polynomial time. Moreover, it underlies algorithms in diverse domains, such as tree/graph traversals, backtracking, and divide-and-conquer strategies like merge sort.
- Author: Fan Luo
- URL: https://fanluo.me/article/understanding-recursion-functions-that-call-themselves
- Copyright: All articles in this blog adopt the BY-NC-SA agreement. Please indicate the source!
