There are slices of cake, and the consumer gets a utility of log(1 + j)·beta^(t − 1) from eating j slices on day t, where days are numbered starting from t = 1. What would be the OPT recurrence to maximize total utility, i.e., to determine how many slices of cake to eat each day, for a given beta?
CodePudding user response:
Assuming that we’re summing those logs, let OPT(n, d) be the maximum utility from eating n slices over the first d days. Then
OPT(n, 0) = 0
OPT(n, d) = max over j = 0, …, n of [OPT(n − j, d − 1) + log(1 + j)·beta^(d − 1)]
where j is the number of slices eaten on day d.
Since utility from a given number of slices decreases over time (assuming 0 < beta < 1), in an optimal plan the number of slices eaten per day never increases. With only n slices available, at most n days can have positive consumption, so an n-day horizon suffices. Compute OPT(n, n) and follow the argmaxes backward to recover the per-day plan.
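Here is a minimal sketch of that recurrence in Python; the function name plan_cake and its arguments are illustrative, not from the question.

```python
import math

def plan_cake(n, beta):
    # opt[s][d] = max total utility from eating s slices over the first d days,
    # where eating j slices on day d contributes log(1 + j) * beta**(d - 1).
    # Day 0 contributes nothing, matching OPT(n, 0) = 0.
    opt = [[0.0] * (n + 1) for _ in range(n + 1)]
    choice = [[0] * (n + 1) for _ in range(n + 1)]

    for d in range(1, n + 1):
        discount = beta ** (d - 1)
        for s in range(n + 1):
            best, best_j = float("-inf"), 0
            for j in range(s + 1):
                val = opt[s - j][d - 1] + math.log(1 + j) * discount
                if val > best:
                    best, best_j = val, j
            opt[s][d] = best
            choice[s][d] = best_j

    # Follow the argmaxes backward to recover slices eaten on days 1..n.
    plan, s = [0] * n, n
    for d in range(n, 0, -1):
        j = choice[s][d]
        plan[d - 1] = j
        s -= j
    return opt[n][n], plan

# Example: 5 slices, beta = 0.9
print(plan_cake(5, 0.9))
```

This is O(n^3) time and O(n^2) space, which is fine for small n; since the discount only grows the incentive to eat earlier, the recovered plan is non-increasing over days, consistent with the argument above.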