Optimize two variable function with O(log n) time


Given an integer variable x ranging from 0 to n, we have two functions f(x) and g(x) with the following properties:

  • f(x) is a strictly increasing function: for x1 > x2, we have f(x1) > f(x2)
  • g(x) is a strictly decreasing function: for x1 > x2, we have g(x1) < g(x2)
  • f(x) and g(x) are black-box functions, and each evaluation takes constant time O(1)

The problem is to solve the following optimization problem and determine the optimal x:

minimize f(x) + g(x)

An easy approach is a simple linear scan that tests all x from 0 to n, with time complexity O(n). I am curious whether there is an approach that solves it in O(log n).
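For reference, a minimal sketch of that O(n) baseline, assuming f and g are available as black-box Python callables and n is a known integer:

    def linear_scan_minimize(f, g, n):
        # Brute force: evaluate f(x) + g(x) at every integer x in [0, n]
        # and keep the best candidate seen so far.
        best_x, best_value = 0, f(0) + g(0)
        for x in range(1, n + 1):
            value = f(x) + g(x)
            if value < best_value:
                best_x, best_value = x, value
        return best_x, best_value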

CodePudding user response:

There is no such solution.

Start with f(i) = 2i and g(i) = 2n - 2i. These meet your requirements, and the minimum of f + g is 2n.

Now pick one point k and replace g(k) with 2n - 2k - 1. This still meets your requirements, the minimum is now 2n - 1, and you can only learn this by querying the kth point. Any other query returns the same answer as for the original functions, so there is no way around asking n questions to notice the difference between the modified and original functions.
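To make the argument concrete, here is a small sketch of this construction with arbitrary example values of n and k; the two objectives differ only at the single point k:

    n, k = 10, 7                     # arbitrary example values
    f = lambda i: 2 * i                                           # strictly increasing
    g_original = lambda i: 2 * n - 2 * i                          # strictly decreasing; f + g is 2n everywhere
    g_modified = lambda i: 2 * n - 2 * i - (1 if i == k else 0)   # still strictly decreasing

    # Only x = k distinguishes the two objectives, and only there does the
    # modified version attain its minimum 2n - 1.
    print([i for i in range(n + 1)
           if f(i) + g_original(i) != f(i) + g_modified(i)])      # [7]
    print(min(f(i) + g_modified(i) for i in range(n + 1)))        # 19 == 2n - 1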

CodePudding user response:

I doubt the problem in such a general form has an answer.

Let f(x) = 2x for even x and 2x + 1 for odd x, and g(x) = -2x - 1. Then f + g oscillates between -1 and 0 over the integers, and every even x is a local minimum.

And, similarly to the example by @btilly, a small variation in the definition of g(x) may introduce the global minimum anywhere.
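A quick numerical check of the oscillating example above (n chosen arbitrarily for the sketch):

    n = 8
    f = lambda x: 2 * x if x % 2 == 0 else 2 * x + 1   # strictly increasing
    g = lambda x: -2 * x - 1                           # strictly decreasing

    # f + g alternates between -1 (even x) and 0 (odd x).
    print([f(x) + g(x) for x in range(n + 1)])         # [-1, 0, -1, 0, -1, 0, -1, 0, -1]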

CodePudding user response:

I already marked one response as the solution. In certain special cases you have to brute-force all x to get the optimal value. However, my real intention is to see whether there is an early-stopping criterion when we observe a specific pattern. An example early-stopping rule is as follows.

First we evaluate the boundary values at 0 and n to get f(0), f(n), g(0), and g(n). For any x with 0 < x < n, we have:

f(0) < f(x) < f(n)

g(0) > g(x) > g(n)

Given two trials x and y with y > x, if we observe:

f(y) + g(y) > f(x) + g(x) // the solution at x is better

f(y) - f(x) > g(x) - g(n) // no more room to improve after y

then there is no need to test solutions after y: for any z > y, f(z) + g(z) > f(y) + g(n) > f(x) + g(x).
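A hedged sketch of a scan using this early-stopping rule (f, g, and n are again assumed to be black-box callables and a known bound; the two comparisons above are folded into the single bound f(y) + g(n) versus the best value seen so far, and the worst case remains O(n), in line with the accepted answer):

    def scan_with_early_stop(f, g, n):
        # For any z > y: f(z) + g(z) > f(y) + g(n), since f is strictly
        # increasing and g can never drop below g(n).  Once that lower bound
        # is no better than the best value seen, the scan can stop.
        g_n = g(n)
        best_x, best_value = 0, f(0) + g(0)
        for y in range(1, n + 1):
            f_y = f(y)
            value = f_y + g(y)
            if value < best_value:
                best_x, best_value = y, value
            if f_y + g_n >= best_value:    # no more room to improve after y
                break
        return best_x, best_value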
