Last Updated: April 4, 2026
We need to slide a window of size k across the array and compute the median of each window. The median is simple to define: sort the window, pick the middle element (odd k) or average the two middle elements (even k).
The challenge is efficiency. Sorting each window from scratch costs O(k log k) per position, and with n - k + 1 windows, that becomes O(n * k log k). For n and k both near 10^5, that is far too slow.
Here is the key observation: when the window slides by one position, only one element leaves and one element enters. Can we maintain a sorted structure incrementally, updating it in O(log k) per slide instead of rebuilding from scratch?
1 <= k <= nums.length <= 10^5 - With n up to 100,000, we need O(n log k) or better. Brute force O(n * k log k) is too slow for large k.

-2^31 <= nums[i] <= 2^31 - 1 - Full 32-bit range. When averaging two elements for the even-k median, their sum can overflow. Use a/2.0 + b/2.0 instead of (a+b)/2.0.

Precision tolerance of 10^-5 - Standard doubles are sufficient.

The most direct approach: for every window position, extract the k elements, sort them, and pick the median. No data structure tricks, just a faithful translation of the problem statement.
This works correctly but re-sorts from scratch on every slide, throwing away all the sorted order from the previous window. For small k, it is perfectly fine. For large k, it becomes the bottleneck.
There are n - k + 1 window positions. For each i from 0 to n - k:

a. Copy the subarray nums[i..i+k-1] into a temporary array.
b. Sort the temporary array.
c. If k is odd, the median is window[k/2]. If k is even, the median is window[k/2 - 1] / 2.0 + window[k/2] / 2.0 (halving each term avoids 32-bit overflow).
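The steps above translate directly into code. A sketch in Python (the function name is illustrative, not from the original):

```python
def median_sliding_window_brute(nums, k):
    """Brute force: re-sort every window from scratch. O(n * k log k)."""
    result = []
    for i in range(len(nums) - k + 1):
        window = sorted(nums[i:i + k])  # copy and sort the k elements
        if k % 2 == 1:
            result.append(float(window[k // 2]))
        else:
            # Halve each term before adding: in fixed-width languages the
            # sum of two 32-bit values can overflow (Python ints cannot).
            result.append(window[k // 2 - 1] / 2.0 + window[k // 2] / 2.0)
    return result
```

For example, with nums = [1, 3, -1, -3, 5, 3, 6, 7] and k = 3, the medians are [1.0, -1.0, -1.0, 3.0, 5.0, 6.0].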
This approach works correctly but re-sorts from scratch every time the window moves. Since k - 1 elements stay the same between slides, we are wasting O(k log k) work on data that barely changed. What if we maintained the sorted order incrementally?
To find the median efficiently, we need quick access to the middle element(s). The classic technique is splitting elements into two heaps: a max-heap small holding the lower half and a min-heap large holding the upper half. The median lives at the boundary between them: either the top of small (odd k) or the average of both tops (even k).
When the window slides, we add the incoming element and remove the outgoing one. But here is the catch: heaps do not support efficient removal of arbitrary elements. If the element to remove is buried deep in the heap, there is no way to extract it in O(log k).
This is where lazy deletion saves us. Instead of physically removing the outgoing element from the heap, we record it in a hash map of "pending deletions." The element stays in the heap as a ghost. When it eventually bubbles to the top of its heap, we pop and discard it. This way, the heap tops are always valid (we prune ghosts before reading them), and we avoid the O(k) cost of searching for an element.
The two-heap approach works because the median is always at the boundary between the smaller and larger halves of the sorted data. By keeping the lower half in a max-heap and the upper half in a min-heap, the median is always accessible at the heap tops in O(1).
Lazy deletion is safe because we only care about the tops of the heaps (that is where the median lives). Ghost elements buried deep in the heap do not affect the median. When a ghost eventually reaches the top, we discard it before using the heap top for anything. The balance invariant guarantees that small always has either the same count as large (even k) or exactly one more (odd k).
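The lazy-deletion idea can be shown in isolation before wiring it into the two heaps. A minimal sketch, assuming a single min-heap (the names erase and top are illustrative):

```python
import heapq
from collections import Counter

# Instead of searching the heap for an element to delete, record it as
# pending and discard it only when it surfaces at the top.
heap = [2, 7, 5, 9]
heapq.heapify(heap)
delayed = Counter()  # value -> number of pending deletions

def erase(x):
    delayed[x] += 1          # O(1): mark x as a ghost, leave it in the heap

def top():
    while delayed[heap[0]]:  # prune ghosts before reading the top
        delayed[heap[0]] -= 1
        heapq.heappop(heap)
    return heap[0]

erase(2)
print(top())  # 5: the ghost 2 is discarded the moment it reaches the top
```

Note that erasing a buried element (say 7) changes nothing visible: top() still answers correctly, and the ghost is cleaned up lazily whenever it later bubbles up.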
Maintain a max-heap small for the lower half and a min-heap large for the upper half. Use a hash map delayed to track pending deletions. Insert the first k elements and record the first median. Then, for each i from k to n - 1:

a. The outgoing element is nums[i - k]. Add it to delayed.
b. Determine which heap the outgoing element belongs to and adjust the balance.
c. Insert the incoming element nums[i] into the appropriate heap.
d. Rebalance the heaps if needed.
e. Prune ghost elements from the tops of both heaps.
f. Compute the median and add to result.
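Putting steps a-f together, here is one possible sketch in Python (the driver name is my own; small, large, and delayed follow the text, and heapq's min-heap stores negated values to act as the max-heap):

```python
import heapq
from collections import defaultdict

def median_sliding_window(nums, k):
    small, large = [], []        # max-heap (negated) / min-heap
    delayed = defaultdict(int)   # value -> pending deletion count
    small_size = large_size = 0  # counts of valid (non-ghost) elements

    def prune(heap):
        # Step e: pop ghosts while they sit at the top of a heap.
        while heap:
            v = -heap[0] if heap is small else heap[0]
            if not delayed[v]:
                break
            delayed[v] -= 1
            heapq.heappop(heap)

    def rebalance():
        # Steps b/d: restore |small| == |large| or |large| + 1.
        nonlocal small_size, large_size
        if small_size > large_size + 1:
            heapq.heappush(large, -heapq.heappop(small))
            small_size, large_size = small_size - 1, large_size + 1
            prune(small)
        elif small_size < large_size:
            heapq.heappush(small, -heapq.heappop(large))
            small_size, large_size = small_size + 1, large_size - 1
            prune(large)

    def insert(num):
        # Step c: incoming element joins whichever half it belongs in.
        nonlocal small_size, large_size
        if not small or num <= -small[0]:
            heapq.heappush(small, -num)
            small_size += 1
        else:
            heapq.heappush(large, num)
            large_size += 1
        rebalance()

    def erase(num):
        # Steps a/b: mark the outgoing element as a ghost, fix the counts.
        nonlocal small_size, large_size
        delayed[num] += 1
        if num <= -small[0]:
            small_size -= 1
            if num == -small[0]:
                prune(small)
        else:
            large_size -= 1
            if num == large[0]:
                prune(large)
        rebalance()

    def median():
        # Step f: read the heap tops; halve each term to dodge overflow
        # in fixed-width languages (Python ints cannot overflow).
        return float(-small[0]) if k % 2 else -small[0] / 2.0 + large[0] / 2.0

    for i in range(k):
        insert(nums[i])
    result = [median()]
    for i in range(k, len(nums)):
        erase(nums[i - k])
        insert(nums[i])
        result.append(median())
    return result
```

Each slide costs O(log n) amortized for the heap pushes, and every element is popped at most once over the whole run, so the total is O(n log n) time and O(n) space (ghosts can accumulate until they reach a heap top).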