Can we implement recursive bottom up dynamic programming solutions using Python's @cache?


I was solving the LeetCode climbing stairs problem:

You are given an integer array cost where cost[i] is the cost of ith step on a staircase. Once you pay the cost, you can either climb one or two steps. You can either start from the step with index 0, or the step with index 1. Return the minimum cost to reach the top of the floor.

I solved it with a top down dynamic programming approach, memoizing via Python's @cache, as follows:

from functools import cache

class Solution:
        
    def minCostClimbingStairs(self, cost) -> int:

        @cache
        def minCostAux(curStep):
                       
            if curStep == -1: return 0 
            if curStep == 0: return cost[0]
            return min(minCostAux(curStep-2) + cost[curStep], minCostAux(curStep-1) + cost[curStep])

        cost.append(0)
        return minCostAux(len(cost)-1)

I love how using @cache makes the solution look minimal.
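
As an aside, @cache is just lru_cache(maxsize=None) under the hood, so on Python versions before 3.9 the same top down solution should work with only the decorator swapped (a minimal variant of the code above, otherwise unchanged):

from functools import lru_cache

class Solution:

    def minCostClimbingStairs(self, cost) -> int:

        @lru_cache(maxsize=None)  # equivalent to @cache on 3.9+
        def minCostAux(curStep):

            if curStep == -1: return 0
            if curStep == 0: return cost[0]
            return min(minCostAux(curStep-2) + cost[curStep], minCostAux(curStep-1) + cost[curStep])

        cost.append(0)
        return minCostAux(len(cost)-1)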

The bottom up recursive solution without @cache looks like this:

class Solution:    
            
    def minCostClimbingStairs(self, cost) -> int:

        def minCostAux(step_i):
            # ===== return memoized case =====
            if minCosts[step_i] != -1:
                return minCosts[step_i]
                
            # ===== memoize =====
            minCosts[step_i] = min(minCostAux(step_i-1)
                                 , minCostAux(step_i-2)) + cost[step_i]

            # ===== return base CEIL case =====
            if step_i == len(cost)-1: return minCosts[step_i]

            # ===== recurse bottom up =====
            return minCostAux(step_i + 1)

        cost.append(0) # this results in one extra step to get the final answer (check Note above)

        # ===== initialize memory =====
        minCosts = [-1] * len(cost)

        # ===== memoize base cases =====
        minCosts[0] = cost[0]
        minCosts[1] = min(cost[0] + cost[1], cost[1])
        
        # ===== recurse bottom up =====
        return minCostAux(2)

I was curious whether we can implement a bottom up recursive solution using @cache. It could have looked something like this:

from functools import cache
class Solution:    
                
    def minCostClimbingStairs(self, cost) -> int:
        @cache
        def minCostAux(step_i):

            if step_i == len(cost)-1: return minCosts[step_i]

            minCosts[step_i] = min(minCostAux(step_i-1)
                                 , minCostAux(step_i-2)) + cost[step_i] # ***

            return minCostAux(step_i + 1) # ***

        cost.append(0)
        return minCostAux(2)

This doesn't work, and I am not able to figure out how to appropriately combine the lines commented with # ***. In top down, we memoize by returning the current problem's solution, which is calculated by recursing into subproblems. But in bottom up we also have to recurse towards the bigger problem: the +1 in minCostAux(step_i + 1). Is this doable with @cache? I am just curious whether there is any way to do it!
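
For illustration, one possible way to reconcile the two roles is to keep the @cache-decorated function as a pure memoized recurrence and move the "recurse to the next bigger problem" step into a thin, uncached driver (a sketch, not necessarily the only shape such a solution could take):

from functools import cache

class Solution:

    def minCostClimbingStairs(self, cost) -> int:

        @cache
        def minCostAux(step_i):
            # pure memoized recurrence: min cost to stand on step_i
            if step_i == -1: return 0
            if step_i == 0: return cost[0]
            return min(minCostAux(step_i-1), minCostAux(step_i-2)) + cost[step_i]

        def driver(step_i):
            # uncached bottom up walk: fill the cache for step_i,
            # then move on to the next bigger subproblem until the ceiling
            if step_i == len(cost)-1:
                return minCostAux(step_i)
            minCostAux(step_i)
            return driver(step_i + 1)

        cost.append(0)
        return driver(0)

Whether this still counts as "bottom up recursive" is debatable, since the upward recursion lives in driver rather than in the cached function; but because driver fills the cache from the smallest index upward, each cached call only ever looks one or two steps back.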

PS 1

I know we can do it by explicitly maintaining a cache dictionary, but that again adds some code lines, and I want to keep it as minimal as possible.
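
For comparison, the explicit-dictionary variant hinted at above could look roughly like this (a sketch mirroring the list-based bottom up version, with memo standing in for @cache):

class Solution:

    def minCostClimbingStairs(self, cost) -> int:
        memo = {}  # explicit cache dictionary instead of @cache

        def minCostAux(step_i):
            if step_i in memo:
                return memo[step_i]

            memo[step_i] = min(minCostAux(step_i-1),
                               minCostAux(step_i-2)) + cost[step_i]

            if step_i == len(cost)-1:
                return memo[step_i]

            return minCostAux(step_i + 1)

        cost.append(0)
        memo[0] = cost[0]
        memo[1] = min(cost[0] + cost[1], cost[1])
        return minCostAux(2)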

PS 2

Sample runs for testing:

print(Solution().minCostClimbingStairs([10,15,20])) # 15
print(Solution().minCostClimbingStairs([1,100,1,1,1,100,1,1,100,1])) # 6

CodePudding user response:

This is another DP bottom up version, but it does NOT require @cache. I find this approach more intuitive...

It's inspired by @Kelly though.


from typing import List

class Solution:

    def minCostClimbingStairs(self, cost: List[int]) -> int:
        N = len(cost)

        # dp[i] = minimum cost to reach position i (position N is the top of the floor)
        dp = [0] * (N + 1)
        dp[0] = dp[1] = 0  # you may start from step 0 or step 1 for free

        for i in range(2, N + 1):
            dp[i] = min(dp[i-1] + cost[i-1], dp[i-2] + cost[i-2])

        # 'last index + 1' should be the answer...
        return dp[-1]
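
The sample runs from the question give the expected outputs with this version as well:

print(Solution().minCostClimbingStairs([10,15,20]))                  # 15
print(Solution().minCostClimbingStairs([1,100,1,1,1,100,1,1,100,1])) # 6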
    

CodePudding user response:

A bottom up recursive solution using @cache:

from functools import cache  # cache is only available in Python 3.9+; earlier, use lru_cache

class Solution(object):
    def minCostClimbingStairs(self, cost):
        """
        :type cost: List[int]
        :rtype: int
        """
        def get_val(index):
            ' get value from cost array (handling index limit) '
            try:
                return cost[index]
            except IndexError:
                return 0
        
        @cache   # note: same as  lru_cache(maxsize=None) if using lru_cache
        def minCostAux(curStep):
            if curStep >= len(cost):
                return 0               # past the end of the cost array
            
            return min(
                minCostAux(curStep + 1) + get_val(curStep + 1),   # min forward by 1 step
                minCostAux(curStep + 2) + get_val(curStep + 2))   # min forward by 2 steps
        
        return minCostAux(-1)
    
print(Solution().minCostClimbingStairs([10, 15, 20]))                # Output: 15
print(Solution().minCostClimbingStairs([1,100,1,1,1,100,1,1,100,1])) # Output: 6

Performance

cost = list(range(400, 0, -1))
%timeit Solution().minCostClimbingStairs(cost)
568 µs ± 8.29 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
            