_Bishop_'s blog

By _Bishop_, history, 3 years ago, In English

Here is an interesting problem from the Newton School's June coding competition.

Problem: There are $$$x$$$ apples and $$$y$$$ oranges. In one move you pick an apple or an orange equiprobably and convert it into the other. What is the expected number of moves required to make the count of one of them zero? Constraints: $$$0 \leq x, y \leq 10^9$$$.

Intended solution: Let $$$E(x, y)$$$ be the expected number of moves starting with $$$x$$$ apples and $$$y$$$ oranges. Quite obviously (or maybe not), we pick an apple or an orange with probability $$$\frac{1}{2}$$$ each, which gives the transition $$$E(x, y) = 1 + \frac{E(x + 1, y - 1) + E(x - 1, y + 1)}{2}$$$, with base cases $$$E(x, 0) = E(0, y) = 0$$$. With a little guesswork we obtain $$$E(x, y) = xy$$$.
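Indeed, the guess can be checked directly: substituting $$$E(x, y) = xy$$$ into the transition gives

$$$1 + \frac{(x+1)(y-1) + (x-1)(y+1)}{2} = 1 + \frac{2xy - 2}{2} = xy,$$$

and the base cases $$$E(x, 0) = E(0, y) = 0$$$ hold as well.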

A different way: Somehow I interpreted the problem differently and landed in a difficult position. I assumed that we draw the fruits equiprobably, so the probability of picking any particular fruit is $$$\frac{1}{x + y}$$$. This makes the probability of drawing an apple $$$\frac{x}{x + y}$$$ and of drawing an orange $$$\frac{y}{x + y}$$$, which leads to the recurrence $$$E(x, y) = 1 + \frac{x}{x + y}E(x - 1, y + 1) + \frac{y}{x + y}E(x + 1, y - 1)$$$ with $$$E(x, 0) = E(0, y) = 0$$$. Now, I want to know whether this recurrence has a closed-form solution. Are there methods to compute such recursive functions? Maybe some math person knows how to solve it. Also, how does one even approach this problem with smaller constraints (with $$$dp$$$ we run into cycles)?
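(For small $$$x, y$$$ one can at least get numerical values to guess against, e.g. with a rough Monte Carlo simulation of this weighted process. The sketch below is only a sanity check, and the helper name is made up.)

```cpp
#include <bits/stdc++.h>
using namespace std;

// Rough Monte Carlo estimate of E(x, y) for the "draw a fruit uniformly" process:
// each fruit is equally likely, so an apple is drawn with probability x / (x + y).
double simulate(long long x, long long y, int trials = 100000) {
    mt19937_64 rng(12345);
    double total = 0;
    for (int t = 0; t < trials; t++) {
        long long a = x, o = y, moves = 0;
        while (a > 0 && o > 0) {
            long long id = rng() % (a + o);   // pick one of the a + o fruits uniformly
            if (id < a) { a--; o++; }         // picked an apple, convert it to an orange
            else        { a++; o--; }         // picked an orange, convert it to an apple
            moves++;
        }
        total += moves;
    }
    return total / trials;
}

int main() {
    for (int x = 1; x <= 3; x++)
        for (int y = 1; y <= 3; y++)
            printf("E(%d, %d) ~ %.3f\n", x, y, simulate(x, y));
}
```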


»
3 years ago, # | Rev. 5

Adding one more condition: $$$E(x, y) = E(y, x)$$$, by symmetry. Next, if we try finding the values of $$$E(1, n)$$$ for different $$$n$$$, we get $$$E(1, n) = 2^n - 1$$$.

Next we have $$$E(1, n) = 1 + \frac{n}{n+1} E(2, n-1)$$$ and so on, so we can calculate recursively. I haven't simplified the general expression for $$$E(x, y)$$$, but I think we might get some $$$O(1)$$$ formula.

I did the same thing in the original problem and observed that $$$E(x, y) = xy$$$. Also your recurrence is wrong. It should be $$$E(x, y) = 1 + \frac{x}{x+y} E(x-1, y+1) + \frac{y}{x+y} E(x+1, y-1)$$$
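(For tiny inputs the $$$2^n - 1$$$ pattern is easy to confirm by just solving the linear system directly. Below is a rough brute-force sketch with doubles, writing $$$s = x + y$$$ and treating $$$E_k = E(k, s - k)$$$ as the unknowns; the helper name is made up.)

```cpp
#include <bits/stdc++.h>
using namespace std;

// Brute force for small s = x + y: solve the (s - 1) x (s - 1) linear system
//   E_k - (k/s) E_{k-1} - ((s-k)/s) E_{k+1} = 1,   with E_0 = E_s = 0,
// by Gaussian elimination and return E_1 = E(1, s - 1).
double solveE1(int s) {
    int n = s - 1;                       // unknowns E_1 .. E_{s-1}
    vector<vector<double>> A(n, vector<double>(n + 1, 0.0));
    for (int k = 1; k <= n; k++) {
        int r = k - 1;
        A[r][r] = 1.0;
        if (k - 1 >= 1) A[r][k - 2] = -(double)k / s;
        if (k + 1 <= n) A[r][k]     = -(double)(s - k) / s;
        A[r][n] = 1.0;                   // right-hand side
    }
    // plain Gauss-Jordan elimination with partial pivoting
    for (int col = 0; col < n; col++) {
        int piv = col;
        for (int r = col + 1; r < n; r++)
            if (fabs(A[r][col]) > fabs(A[piv][col])) piv = r;
        swap(A[col], A[piv]);
        for (int r = 0; r < n; r++) {
            if (r == col) continue;
            double f = A[r][col] / A[col][col];
            for (int c = col; c <= n; c++) A[r][c] -= f * A[col][c];
        }
    }
    return A[0][n] / A[0][0];            // value of E_1
}

int main() {
    for (int n = 1; n <= 10; n++)        // E(1, n), i.e. s = n + 1
        printf("E(1, %d) = %.3f   (2^%d - 1 = %d)\n",
               n, solveE1(n + 1), n, (1 << n) - 1);
}
```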

»
3 years ago, # | Rev. 2

For simplicity, I'll denote the (constant) sum $$$x + y$$$ as $$$s$$$ and write $$$f(k) = E(k, s - k)$$$; the recurrence you mentioned then becomes:

$$$ f(0) = f(s) = 0,\ f(k) = 1 + \frac{k}{s} f(k-1) + \frac{s-k}{s} f(k+1) $$$

In terms of computation, as you've mentioned, the immediate approach is $$$dp$$$, but since the states form cycles it doesn't quite work. However, the equalities are linear in the values $$$f(k)$$$, so the next idea is to solve this system of linear equations, in time $$$O(s^3)$$$.

Now, I can't quite find a closed form, but I can give an $$$O(s)$$$ algorithm to compute $$$f$$$: first we can reorder the equality in the following way:

$$$ \frac{k}{s} f(k) + \frac{s-k}{s} f(k) = 1 + \frac{k}{s} f(k-1) + \frac{s-k}{s} f(k+1) $$$
$$$ f(k+1) - f(k) = \frac{k}{s-k} (f(k) - f(k-1)) - \frac{s}{s-k} $$$

This is a recurrence on the adjacent differences, with the conditions $$$f(0) = f(s) = 0$$$. If we denote $$$f(1) = c$$$, then all values of $$$f$$$ become functions of $$$c$$$, and the condition $$$f(s) = 0$$$ determines $$$c$$$ uniquely. Specifically, if my calculations are right:

$$$ f(k) = \sum\limits_{m=0}^{k-1} \dfrac{1}{\binom{s-1}{m}} \left( c - \sum\limits_{j=1}^{m} \binom{s}{j} \right) $$$

And then it must hold that $$$f(s) = 0$$$, but this looks disgusting. As for the algorithm, we can define the homogeneous version of the equation above:

$$$ g(k+1) - g(k) = \frac{k}{s-k} (g(k) - g(k-1)) $$$

and with the single condition $$$g(0) = 0$$$, it's easy to find one solution: choose $$$g(1) = 1$$$, and then:

$$$ g(k+1) - g(k) = \dfrac{1}{\binom{s-1}{k}} $$$
$$$ g(k) = \sum\limits_{m=0}^{k-1} \dfrac{1}{\binom{s-1}{m}} $$$

Observe that for any $$$f$$$ which is a solution to the non-homogeneous equation, with the condition $$$f(0) = 0$$$ (but without $$$f(s) = 0$$$), and for any constant $$$c$$$, the function $$$f + cg$$$ is also a solution. So we can find any single $$$f$$$ and then shift it using $$$g$$$ so that $$$f(s) + cg(s) = 0$$$.

To find a single $$$f$$$, just choose $$$f(1) = 0$$$, compute the adjacent differences via the recurrence above in $$$O(s)$$$, and then take prefix sums.
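If I've translated the construction correctly, an implementation might look like the sketch below (doubles for readability; the actual problem would presumably want this computed modulo a prime, and the function name is made up):

```cpp
#include <bits/stdc++.h>
using namespace std;

// O(s) computation of f(k) = E(k, s - k), following the construction above:
// build a particular solution f with f(0) = f(1) = 0, build the homogeneous
// solution g with g(0) = 0, g(1) = 1, then take f + c*g with c = -f(s)/g(s).
vector<double> solveChain(int s) {
    vector<double> df(s), dg(s);          // adjacent differences f(k+1)-f(k), g(k+1)-g(k)
    df[0] = 0.0;                          // f(1) - f(0), with f(1) = 0
    dg[0] = 1.0;                          // g(1) - g(0), with g(1) = 1
    for (int k = 1; k < s; k++) {
        df[k] = (double)k / (s - k) * df[k - 1] - (double)s / (s - k);
        dg[k] = (double)k / (s - k) * dg[k - 1];
    }
    vector<double> f(s + 1, 0.0), g(s + 1, 0.0);
    for (int k = 0; k < s; k++) {         // prefix sums of the differences
        f[k + 1] = f[k] + df[k];
        g[k + 1] = g[k] + dg[k];
    }
    double c = -f[s] / g[s];              // enforce f(s) + c*g(s) = 0
    for (int k = 0; k <= s; k++) f[k] += c * g[k];
    return f;                             // f[k] is now E(k, s - k)
}

int main() {
    vector<double> f = solveChain(6);
    for (int k = 0; k <= 6; k++)
        printf("E(%d, %d) = %.3f\n", k, 6 - k, f[k]);
    // expected: 0, 31, 36, 37, 36, 31, 0  (E(1, 5) = 2^5 - 1 = 31)
}
```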

»
3 years ago, # |

Usually, when your $$$dp$$$ has "cycles" but the relations are linear, you can solve it using Gaussian elimination. However, this has complexity $$$O(n^3)$$$, which is pretty sad. For some special forms of equation systems (in particular, ones with a "small lookahead", where dependencies reach only a tiny bit into the future), you can do a neat trick. I'll illustrate it on this problem, but you can think about how to generalize it.

Let's denote $$$n = x+y$$$ and $$$E_i = E(i, n - i)$$$. We know that $$$E_0 = E_{n} = 0$$$ and that $$$E_i = 1 + \frac{i}{n} E_{i-1} + \frac{n-i}{n} E_{i+1}$$$. This is, as you said, a "cyclic dp". However, let's do the following trick:

First, denote $$$X = E_1$$$ (an indeterminate). Then we can use the equation $$$E_1 = 1 + \frac{1}{n} E_0 + \frac{n-1}{n} E_2$$$ to solve for $$$E_2$$$ as a function of $$$X$$$: $$$E_2 = \frac{n}{n-1}X - \frac{n}{n-1}$$$, if I'm not mistaken. Similarly, we can use the equation for $$$E_2$$$ to solve for $$$E_3$$$, and so on. You can see that doing all this yields equations of the form $$$E_i = a_i X + b_i$$$ (linear functions of $$$X$$$), for all $$$i$$$.

Doing all these steps, you'll obtain $$$E_{n} = a_n X + b_n$$$; at this point you can use the last "extra" equation $$$E_n = 0$$$ to finally solve for $$$X$$$.

This approach is $$$O(n)$$$ if coded directly, but you can see that $$$(a_n, b_n)$$$ comes from a product of $$$2 \times 2$$$ matrices, and if that product can be computed analytically, the approach may admit an even faster solution. I'll try a bit on paper and see if this is the case.
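A minimal sketch of this coefficient propagation, again with doubles and only meant for small $$$n$$$ (a contest version would use modular arithmetic; the function name is made up):

```cpp
#include <bits/stdc++.h>
using namespace std;

// O(n) trick from the comment above: write E_i = a_i * X + b_i with X = E_1,
// propagate (a_i, b_i) using
//   E_{i+1} = (n * E_i - n - i * E_{i-1}) / (n - i),
// then use the extra equation E_n = 0 to solve for X.
double expectedMoves(long long x, long long y) {
    if (x == 0 || y == 0) return 0.0;
    long long n = x + y;
    vector<double> a(n + 1), b(n + 1);
    a[0] = 0; b[0] = 0;       // E_0 = 0
    a[1] = 1; b[1] = 0;       // E_1 = X
    for (long long i = 1; i < n; i++) {
        a[i + 1] = (n * a[i] - i * a[i - 1]) / (double)(n - i);
        b[i + 1] = (n * b[i] - n - i * b[i - 1]) / (double)(n - i);
    }
    double X = -b[n] / a[n];  // solve E_n = a_n * X + b_n = 0
    return a[x] * X + b[x];   // E_x = E(x, y)
}

int main() {
    printf("E(1, 1) = %.3f\n", expectedMoves(1, 1)); // 1
    printf("E(1, 5) = %.3f\n", expectedMoves(1, 5)); // 2^5 - 1 = 31
    printf("E(3, 3) = %.3f\n", expectedMoves(3, 3)); // 37, matches the O(s) method above
}
```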