Jan 13, 2021

# LC 239: Sliding Window Maximum

**Goal:** Given an integer array and a window size k, find the maximum value of every window of size k as it slides across the array. Return the array of window maxima.

**Intuition:** For the window of size k ending at index i, the max of this window must be ≥ nums[i], and its index must lie within the window borders. Any number in the window that is less than nums[i] is irrelevant: it can never be a window max while nums[i] is present.

**Thought Process:** The naive solution computes the max of each window of size k independently. That takes O(n·k), which is not good enough here: with input size up to 10⁵, we are aiming for a linear solution. Notice that the naive solution repeats a lot of work: when the previous window's max is still inside the current window's borders, it is often the current window's max as well, so a fresh findmax() over the whole window is unnecessary.
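For reference, a minimal sketch of the naive O(n·k) baseline described above (the function name is my own):

```python
from typing import List

def max_sliding_window_naive(nums: List[int], k: int) -> List[int]:
    """Recompute the max for every window independently: O(n*k)."""
    return [max(nums[i:i + k]) for i in range(len(nums) - k + 1)]
```

This passes correctness checks but times out on large inputs, which motivates the deque approach below.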

So the key takeaway from the analysis above is that we want to replace the O(k) find-max over each window with a constant-time check on whether a previously recorded candidate is still within range.

We can maintain a non-increasing sequence of earlier numbers as we scan up to the current number nums[i]. Why? We only care about numbers greater than nums[i]; anything smaller can never be the max of a window containing nums[i]. So before adding nums[i], we discard all smaller candidates, keeping the stored sequence non-increasing, with the largest candidate at the left end. Then, if the index of that leftmost candidate is ≥ i - k + 1, it lies inside the current window, and we have found the max of the window ending at i; otherwise it has slid out of the window and must be removed from the left.

As discussed so far, we need a linear data structure that is efficient at adding and removing from both ends, so a double-ended queue (deque) is the natural choice.
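Putting the pieces together, here is a sketch of the deque-based solution described above (variable and function names are my own):

```python
from collections import deque
from typing import List

def max_sliding_window(nums: List[int], k: int) -> List[int]:
    """O(n) sliding-window maximum using a deque of indices.

    The deque stores indices whose values form a non-increasing
    sequence, so the leftmost index always points at the current
    window's max.
    """
    dq = deque()   # candidate indices; nums[dq[0]] >= ... >= nums[dq[-1]]
    result = []
    for i, x in enumerate(nums):
        # Drop candidates smaller than nums[i]; they can never be a max.
        while dq and nums[dq[-1]] < x:
            dq.pop()
        dq.append(i)
        # Drop the leftmost candidate if it has slid out of the window.
        while dq[0] < i - k + 1:
            dq.popleft()
        # Start recording once the first full window ends at i.
        if i - k + 1 >= 0:
            result.append(nums[dq[0]])
    return result
```

Each index is appended and popped at most once, which is where the linear time bound comes from. The "Additional" notes below refer to the `i - k + 1 >= 0` check and the two while loops in this sketch.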

**Additional:**

The check `i - k + 1 >= 0` makes sure we start recording max values only once the first full window of size k has formed.

The order of the two while loops does not matter because, as mentioned in the Intuition section, both conditions must hold in the end: a candidate's value must be ≥ nums[i], and its index must be ≥ i - k + 1.

**Time: O(n)**, where n is the size of the input, since each index enters and leaves the deque at most once.