Last Updated: April 2, 2026
We have a 2D grid of integers, and we need to find the longest path where each step moves to a strictly larger value. Movement is restricted to the four cardinal directions (up, down, left, right), no diagonals.
A few things to notice right away. First, the path must be strictly increasing, which means we can never revisit a cell (since a cell's value cannot be strictly greater than itself). That rules out cycles entirely, which is a big deal. It means the implicit graph formed by "cell A has an edge to cell B if B is an adjacent cell with a larger value" is a Directed Acyclic Graph (DAG). And finding the longest path in a DAG is a classic problem that can be solved efficiently with dynamic programming or topological ordering.
Second, the path can start from any cell and end at any cell. So we need to consider every cell as a potential starting point and find the maximum across all of them.
1 <= m, n <= 200 -> The matrix has at most 200 x 200 = 40,000 cells. This means O(m n) and even O(m n log(m n)) solutions are comfortable.

0 <= matrix[i][j] <= 2^31 - 1 -> Values can be very large, but that does not affect complexity; we just need to handle the full int range.

The most natural approach is to try every cell as a starting point and explore all possible increasing paths from it using DFS. From each cell, we look at all four neighbors and recursively explore any neighbor with a strictly larger value. The length of the longest path starting from a cell is 1 (for the cell itself) plus the maximum path length among all valid neighbors.
Without any caching, this means we recompute the same subproblems over and over. If cell (0, 0) leads to cell (1, 0), and cell (0, 1) also leads to cell (1, 0), we would explore the entire subtree rooted at (1, 0) twice. In the worst case, this leads to exponential time.
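To make the exponential blow-up concrete, here is a minimal sketch of the uncached brute force (function and variable names are my own, not from the original):

```python
def longest_increasing_path_brute(matrix):
    """Brute-force DFS: no caching, so shared subtrees are re-explored.
    Exponential time in the worst case."""
    if not matrix or not matrix[0]:
        return 0
    m, n = len(matrix), len(matrix[0])

    def dfs(i, j):
        best = 1  # the cell itself
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            # Only step to an in-bounds neighbor with a strictly larger value.
            if 0 <= ni < m and 0 <= nj < n and matrix[ni][nj] > matrix[i][j]:
                best = max(best, 1 + dfs(ni, nj))
        return best

    # The path can start anywhere, so take the max over all cells.
    return max(dfs(i, j) for i in range(m) for j in range(n))
```

Note how `dfs(1, 0)` in the example above would be recomputed from scratch each time a different predecessor reaches it.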
The brute force recomputes the same cell's longest path every time it is reached from a different predecessor. What if we cached each cell's result the first time we computed it?
The brute force wastes time recomputing paths from cells we have already fully explored. The fix is straightforward: add a cache. Define dp[i][j] as the length of the longest increasing path starting from cell (i, j). The first time we compute it via DFS, we store it. Every future call for the same cell returns the cached value immediately.
After memoization, each cell is computed exactly once. The DFS from each cell visits at most 4 neighbors, and each of those either returns a cached value in O(1) or triggers a fresh computation that itself only happens once. So the total work across all DFS calls is O(m * n).
The strictly increasing constraint eliminates cycles. If you could ever return to a cell you already visited, that would mean you found a path where values go up and then come back to the same value, which contradicts "strictly increasing." Since there are no cycles, the DFS from any cell will always terminate and produce a consistent result regardless of which cells called it. This makes memoization safe and correct.
We keep a dp table of size m x n, initialized to 0 (0 means "not yet computed"); the answer is the maximum value in dp. The DFS with memoization is already optimal in time, but the recursion stack can be as deep as m * n. What if we could process cells iteratively using topological ordering?
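The memoized version differs from the brute force only by the dp table. A sketch (again with illustrative names):

```python
def longest_increasing_path(matrix):
    """DFS + memoization: each cell's longest path is computed exactly once.
    O(m*n) time, O(m*n) space (dp table plus recursion stack)."""
    if not matrix or not matrix[0]:
        return 0
    m, n = len(matrix), len(matrix[0])
    dp = [[0] * n for _ in range(m)]  # 0 means "not yet computed"

    def dfs(i, j):
        if dp[i][j]:
            return dp[i][j]  # cached: O(1) on every call after the first
        best = 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < m and 0 <= nj < n and matrix[ni][nj] > matrix[i][j]:
                best = max(best, 1 + dfs(ni, nj))
        dp[i][j] = best
        return best

    return max(dfs(i, j) for i in range(m) for j in range(n))
```

Because the graph is a DAG, the recursion can never revisit a cell on the current path, so no "visiting" flag is needed, only the cache.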
Instead of DFS, we can use BFS with topological ordering (Kahn's algorithm). Compute the "in-degree" of each cell, where in-degree is the count of adjacent cells with strictly smaller values. Cells with in-degree 0 are local minima and form the first BFS layer. We process layer by layer, decrementing in-degrees of larger neighbors. When a neighbor's in-degree drops to 0, it joins the next layer. The total number of BFS layers equals the longest increasing path.
Each BFS layer peels off cells whose predecessors have all been processed. The first layer contains local minima (path length 1). The second layer contains cells reachable from minima in one step (path length 2). A cell joins layer k only after all cells with smaller values that are adjacent to it have been processed, meaning all shorter paths feeding into it have been accounted for. The number of layers equals the length of the longest chain in the DAG.
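The layer-by-layer peeling might be sketched as follows (a Kahn-style BFS; helper names are my own):

```python
from collections import deque

def longest_increasing_path_bfs(matrix):
    """Topological BFS: start from local minima (in-degree 0) and peel
    one layer per step. The number of layers is the longest path length."""
    if not matrix or not matrix[0]:
        return 0
    m, n = len(matrix), len(matrix[0])
    dirs = ((1, 0), (-1, 0), (0, 1), (0, -1))

    # In-degree of a cell = count of adjacent cells with strictly smaller values.
    indeg = [[0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            for di, dj in dirs:
                ni, nj = i + di, j + dj
                if 0 <= ni < m and 0 <= nj < n and matrix[ni][nj] < matrix[i][j]:
                    indeg[i][j] += 1

    # First layer: local minima, i.e. cells with no smaller neighbor.
    queue = deque((i, j) for i in range(m) for j in range(n) if indeg[i][j] == 0)
    layers = 0
    while queue:
        layers += 1
        for _ in range(len(queue)):  # process exactly one layer
            i, j = queue.popleft()
            for di, dj in dirs:
                ni, nj = i + di, j + dj
                if 0 <= ni < m and 0 <= nj < n and matrix[ni][nj] > matrix[i][j]:
                    indeg[ni][nj] -= 1
                    if indeg[ni][nj] == 0:  # all smaller neighbors processed
                        queue.append((ni, nj))
    return layers
```

This trades the recursion stack for an explicit queue: same O(m * n) time, but no risk of deep recursion on a 200 x 200 grid.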