Problem Link

Description


Given an integer array nums and an integer k, return the maximum sum of a non-empty subsequence of that array such that for every two consecutive integers in the subsequence, nums[i] and nums[j], where i < j, the condition j - i <= k is satisfied.

A subsequence of an array is obtained by deleting some number of elements (can be zero) from the array, leaving the remaining elements in their original order.

 

Example 1:

Input: nums = [10,2,-10,5,20], k = 2
Output: 37
Explanation: The subsequence is [10, 2, 5, 20].

Example 2:

Input: nums = [-1,-2,-3], k = 1
Output: -1
Explanation: The subsequence must be non-empty, so we choose the largest number.

Example 3:

Input: nums = [10,-2,-10,-5,20], k = 2
Output: 23
Explanation: The subsequence is [10, -2, -5, 20].

 

Constraints:

  • 1 <= k <= nums.length <= 10^5
  • -10^4 <= nums[i] <= 10^4

Solution
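
The recurrence is dp[i] = nums[i] + max(0, dp[i - 1], ..., dp[i - k]): the best sum ending at index i either starts fresh at nums[i] or extends the best positive sum ending within the previous k indices. Computing the window maximum by scanning costs O(k) per index, as in this straightforward O(N * k) sketch (the helper name constrained_subset_sum_naive is ours, not part of the submission):

from typing import List


def constrained_subset_sum_naive(nums: List[int], k: int) -> int:
    # dp[i]: maximum sum of a valid subsequence that ends at index i.
    n = len(nums)
    dp = [0] * n
    for i in range(n):
        # Best sum ending within the previous k indices, or 0 to start
        # a new subsequence at nums[i].
        best_prev = max((dp[j] for j in range(max(0, i - k), i)), default=0)
        dp[i] = nums[i] + max(0, best_prev)
    return max(dp)

With nums.length up to 10^5, the inner scan is too slow in the worst case. The accepted solution below replaces it with a monotonic queue, so each index is pushed and popped at most once, giving O(N) overall.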


Python3

from collections import deque
from math import inf
from typing import List


class Solution:
    def constrainedSubsetSum(self, nums: List[int], k: int) -> int:
        # dp[i] = nums[i] + max(0, dp[i - 1], dp[i - 2], ..., dp[i - k])
        # O(N) solution: keep a monotonically decreasing queue of
        # (dp value, index) pairs whose indices lie in the last k positions.
        queue = deque()
        res = -inf

        for i, x in enumerate(nums):
            # The front of the queue holds the largest dp value in the window.
            curr = x
            if queue:
                curr += max(0, queue[0][0])

            # Pop smaller dp values from the back; they can never be the
            # window maximum for any future index.
            while queue and curr > queue[-1][0]:
                queue.pop()

            # Only positive dp values can improve a future sum.
            if curr > 0:
                queue.append((curr, i))

            # Evict the front once it is k positions behind i, so it is
            # already out of range for index i + 1.
            if queue and i - queue[0][1] == k:
                queue.popleft()

            res = max(res, curr)

        return res
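
A quick sanity check against the examples above (a throwaway driver, not part of the LeetCode submission):

s = Solution()
print(s.constrainedSubsetSum([10, 2, -10, 5, 20], 2))    # 37
print(s.constrainedSubsetSum([-1, -2, -3], 1))           # -1
print(s.constrainedSubsetSum([10, -2, -10, -5, 20], 2))  # 23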