adamant's blog

By adamant, 2 years ago, In English

Hi everyone!

Today I'd like to write about some polynomials that are invariant under rotation and relabeling of points in Euclidean spaces. The model problems work with points in 3D space; however, both ideas can, to some extent, be generalized to a higher number of dimensions. They may be useful for solving some geometric problems under the right conditions. I used ideas around them in two problems that I set earlier.

Congruence check in random points

You're given two sets of lines in 3D space. The second set of lines was obtained from the first one by a rotation and a relabeling. You're guaranteed that the first set of lines was generated uniformly at random on the sphere. Find the corresponding label permutation.

Actual problem: 102354F - Cosmic Crossroads.

Solution

Let $$$P_4(x, y, z) = \sum\limits_{l=1}^n \left((x-x_l)^2+(y-y_l)^2 + (z-z_l)^2\right)^2$$$. It is a fourth-degree polynomial whose geometric meaning is the sum of distances from $$$(x, y, z)$$$ to all points in the set, each distance raised to the power $$$4$$$. Distances are preserved under rotation, hence this expression is invariant under rotations. On the other hand, it may be rewritten as

$$$P_4(x, y, z) = \sum\limits_{i=0}^4 \sum\limits_{j=0}^4 \sum\limits_{k=0}^4 A_{ijk} x^i y^j z^k,$$$

where each coefficient $$$A_{ijk}$$$ is a sum over all points $$$(x_l,y_l,z_l)$$$ of the initial set, hence it is also invariant under relabeling. To find the permutation, it is enough to calculate $$$P_4$$$ at every point of both sets and then match the points that occupy the same position after sorting by the corresponding $$$P_4$$$ value.
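A minimal sketch of this matching step in Python (the helpers and the point representation are mine, not the exact I/O of 102354F), assuming all $$$P_4$$$ values within each set are pairwise distinct, which holds with high probability for random inputs:

```python
def p4(point, points):
    """Sum of the fourth powers of distances from `point` to all points of the set."""
    x, y, z = point
    return sum(((x - xl) ** 2 + (y - yl) ** 2 + (z - zl) ** 2) ** 2
               for xl, yl, zl in points)

def match_by_p4(a, b):
    """Return perm such that a[i] corresponds to b[perm[i]]."""
    order_a = sorted(range(len(a)), key=lambda i: p4(a[i], a))
    order_b = sorted(range(len(b)), key=lambda i: p4(b[i], b))
    perm = [0] * len(a)
    for ia, ib in zip(order_a, order_b):
        perm[ia] = ib
    return perm
```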

It is tempting to try the same trick with $$$P_2(x, y, z)$$$, but in this specific problem it turns out to be the same for all points of the set:

$$$ \begin{align} P_2(x, y, z) =& \sum\limits_{l=1}^n [(x-x_l)^2+(y-y_l)^2+(z-z_l)^2]\\ =& n \cdot (x^2+y^2+z^2) - 2x \sum\limits_{l=1}^n x_l - 2y \sum\limits_{l=1}^n y_l - 2z \sum\limits_{l=1}^n z_l + \sum\limits_{l=1}^n [x_l^2+y_l^2+z_l^2] \\ =& n \left[\left(x-\bar x\right)^2 + \left(y-\bar y\right)^2 + \left(z-\bar z\right)^2\right] - n(\bar x^2+\bar y^2+\bar z^2) + \sum\limits_{l=1}^n (x_l^2 + y_l^2 + z_l^2), \end{align} $$$

where $$$\bar x$$$, $$$\bar y$$$ and $$$\bar z$$$ are the mean values of $$$x_l$$$, $$$y_l$$$ and $$$z_l$$$ correspondingly. As you can see, the non-constant part here is, up to the factor $$$n$$$, the squared distance from $$$(x, y, z)$$$ to the center of mass of the set. Thus, $$$P_2(x, y, z)$$$ is the same for all points at the same distance from the center of mass, so it is of no use in 102354F - Cosmic Crossroads, as in the input all points have this distance equal to $$$1$$$.

Burunduk1 taught me this trick after the Petrozavodsk camp round which featured the model problem.

Sum of squared distances to the axis passing through the origin

You're given a set of points $$$r_k=(x_k, y_k, z_k)$$$. The torque needed to rotate the system of points around an axis $$$r=(x, y, z)$$$ is proportional to the sum of squared distances from the points to the axis. Find the minimum number of points that have to be added to the set so that the torque needed to rotate it around any axis passing through the origin is exactly the same.

Actual problem: Hackerrank — The Axis of Awesome

Solution

The squared distance from the point $$$r_k$$$ to the axis $$$r$$$ is expressed as

$$$\dfrac{|r_k \times r|^2}{r \cdot r} = \dfrac{(y_k z - z_k y)^2+(x_k z - z_k x)^2+(x_k y - y_k x)^2}{x^2+y^2+z^2}.$$$

The numerator here is a quadratic form in $$$(x, y, z)$$$, hence it can be rewritten as

$$$|r_k \times r|^2 = \begin{pmatrix}x & y & z\end{pmatrix} \begin{pmatrix} y_k^2 + z_k^2 & -x_k y_k & -x_k z_k \\ -x_k y_k & x_k^2 + z_k^2 & -y_k z_k \\ -x_k z_k & -y_k z_k & x_k^2 + y_k^2 \end{pmatrix} \begin{pmatrix}x \\ y \\ z\end{pmatrix}.$$$
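If you'd rather not expand this identity by hand, here is a throwaway numeric check with arbitrary sample vectors (nothing problem-specific):

```python
import numpy as np

rng = np.random.default_rng(0)
rk, r = rng.normal(size=3), rng.normal(size=3)
xk, yk, zk = rk

# The matrix of the quadratic form from the formula above.
M = np.array([[yk**2 + zk**2, -xk*yk,         -xk*zk],
              [-xk*yk,         xk**2 + zk**2, -yk*zk],
              [-xk*zk,        -yk*zk,          xk**2 + yk**2]])

cross = np.cross(rk, r)
assert abs(cross @ cross - r @ M @ r) < 1e-9
```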

Correspondingly, the sum of squared distances for $$$k=1..n$$$ is defined by the quadratic form

$$$I = \sum\limits_{k=1}^n\begin{pmatrix} y_k^2 + z_k^2 & -x_k y_k & -x_k z_k \\ -x_k y_k & x_k^2 + z_k^2 & -y_k z_k \\ -x_k z_k & -y_k z_k & x_k^2 + y_k^2 \end{pmatrix},$$$

known in analytical mechanics as the inertia tensor. Like any tensor, it transforms in a controlled way under rotations: rotating the coordinate system by $$$R$$$ turns $$$I$$$ into $$$R I R^\top$$$, so its eigenvalues (and the value of the quadratic form) are invariant under rotation.

The inertia tensor is a symmetric positive semidefinite quadratic form, hence, by the spectral theorem, there is an orthonormal basis in which it is diagonal:

$$$I = \begin{pmatrix}I_1 & 0 & 0 \\ 0 & I_2 & 0 \\ 0 & 0 & I_3\end{pmatrix}.$$$

Here $$$I_1$$$, $$$I_2$$$ and $$$I_3$$$ are the eigenvalues of $$$I$$$, also called the principal moments of inertia (the corresponding eigenvectors are called the principal axes of inertia). From this representation we deduce that the condition from the statement holds if and only if $$$I_1 = I_2 = I_3$$$.

Adding a single point on a principal axis increases the principal moments about the other two axes only. For example, adding $$$(x, 0, 0)$$$ increases $$$I_2$$$ and $$$I_3$$$ by $$$x^2$$$ and leaves $$$I_1$$$ unchanged. Knowing this, one can prove that the answer to the problem is exactly $$$3-m$$$, where $$$m$$$ is the multiplicity of the smallest eigenvalue of $$$I$$$.
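A minimal sketch of the whole computation, assuming the input is just a list of 3D points and using an ad-hoc tolerance to decide the multiplicity of the smallest eigenvalue:

```python
import numpy as np

def inertia_tensor(points):
    """Sum over all points of |r_k|^2 * Id - r_k r_k^T."""
    pts = np.asarray(points, dtype=float)
    return (pts ** 2).sum() * np.eye(3) - pts.T @ pts

def min_points_to_add(points, eps=1e-9):
    """Answer to the problem: 3 minus the multiplicity of the smallest principal moment."""
    eig = np.linalg.eigvalsh(inertia_tensor(points))  # eigenvalues, sorted ascending
    scale = max(1.0, abs(eig[-1]))
    m = int(np.sum(np.abs(eig - eig[0]) < eps * scale))
    return 3 - m
```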

Applying it to the first problem

Now, another interesting observation about the inertia tensor is that the principal moments of inertia are preserved under rotation, while the principal axes of inertia rotate together with the point set. This means that in the first problem, another possible way to find the rotation and the permutation of points is to compute the principal axes of inertia for both sets and then find the rotation that maps the principal axes of the first set onto the corresponding axes of the second set.

Unfortunately, this method requires the principal moments of inertia to be pairwise distinct (which generally holds for random sets of points); otherwise the eigendecomposition of $$$I$$$ is not unique.
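For random inputs, though, a rough sketch of this alternative approach could look as follows (helper names are mine). Since each principal axis is only determined up to sign, it tries all sign combinations that give a proper rotation and validates each candidate by nearest-point matching:

```python
import numpy as np
from itertools import product

def inertia_tensor(points):
    pts = np.asarray(points, dtype=float)
    return (pts ** 2).sum() * np.eye(3) - pts.T @ pts

def match_by_axes(a, b, eps=1e-6):
    """Find a rotation R and permutation perm with R @ a[i] close to b[perm[i]]."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    _, va = np.linalg.eigh(inertia_tensor(a))  # columns: principal axes of a
    _, vb = np.linalg.eigh(inertia_tensor(b))  # columns: principal axes of b
    for signs in product([1.0, -1.0], repeat=3):
        r = vb @ np.diag(signs) @ va.T
        if np.linalg.det(r) < 0:               # keep proper rotations only
            continue
        rotated = a @ r.T
        # Greedy match: for each rotated point of `a`, take the closest point of `b`.
        perm = [int(np.argmin(((b - p) ** 2).sum(axis=1))) for p in rotated]
        ok = len(set(perm)) == len(perm) and all(
            ((b[perm[i]] - rotated[i]) ** 2).sum() < eps for i in range(len(a)))
        if ok:
            return r, perm
    return None
```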


»
2 years ago, # |

As far as polynomial rotational invariants of $$$k$$$ vectors are concerned, the statement following Theorem 6.1 here reduces the search space to just polynomials in $$$v_i^\intercal v_j$$$ for $$$1 \le i, j \le k$$$. Note that they prove it for $$$k \ge n - 1$$$ vectors, but any invariant $$$f$$$ of $$$k < n - 1$$$ vectors can be extended to an invariant $$$f'$$$ of $$$n - 1$$$ vectors that simply ignores the last $$$n - k - 1$$$ arguments, so that $$$f'(v_1, \ldots, v_k, v_{k + 1}, \ldots, v_{n - 1}) = f'(v_1, \ldots, v_k, 0, \ldots, 0)$$$; since all rotations preserve $$$0$$$, we can set the last $$$n - k - 1$$$ vectors to $$$0$$$, and the conclusion holds for $$$k < n - 1$$$ as well.

  • »
    »
    2 years ago, # ^ |

    Thanks! As I understand it, what I have in my article is actually a family of polynomials

    $$$P_{xyz} (x_1, y_1, z_1, \dots, x_n, y_n, z_n) = P(x, y, z, x_1, y_1, z_1, \dots, x_n, y_n, z_n),$$$

    such that $$$P_{xyz}$$$ is invariant for any specific point $$$(x, y, z)$$$. I also want these polynomials to be invariant under relabeling, so $$$P_{xyz}$$$ should probably be some kind of symmetric polynomial...

    • »
      »
      »
      2 years ago, # ^ |

      Yes, I had mentioned them in the context of coefficients of a general term $$$x^ay^bz^c$$$.

      For the coefficients to be invariant under relabeling, note that for any polynomial $$$P$$$, we can get a corresponding $$$P'(v_1, \ldots, v_k) = \sum_{\pi \in S_k} P(v_{\pi(1)}, \ldots, v_{\pi(k)})$$$, which is invariant under relabeling. Note that any relabeling-invariant polynomial can be written in that form.
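      For concreteness, a tiny illustration of this symmetrization (the helper and the sample polynomial are just for demonstration):

```python
from itertools import permutations

def symmetrize(P, vectors):
    # Sum P over all relabelings of its k vector arguments.
    return sum(P(*perm) for perm in permutations(vectors))

# P(v1, v2) = (v1 . v1) * (v1 . v2) is rotation-invariant but not
# relabeling-invariant; its symmetrization over S_2 is both.
dot = lambda a, b: sum(x * y for x, y in zip(a, b))
P = lambda v1, v2: dot(v1, v1) * dot(v1, v2)
print(symmetrize(P, [(1.0, 0.0, 0.0), (1.0, 2.0, 3.0)]))  # P(a, b) + P(b, a) = 15.0
```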

      Also, just to clarify, the result I mentioned in my previous comment holds only for scalar invariants (rather than tensor invariants). However, any polynomial that is derived from other kinds of invariants in an isometric and symmetric manner should satisfy the claim too.

»
2 years ago, # |

It is tempting to try the same trick with $$$P_2(x,y,z)$$$, but it is the same for all the points in the set.

How is $$$P_2$$$ the same for all points? Is it because the points are generated uniformly at random?

  • »
    »
    2 years ago, # ^ |

    No, it is because all points in the problem lie on the unit sphere around their center of mass, and $$$P_2(x, y, z)$$$ is exactly the squared distance to the center of mass (up to multiplicative and additive constants). I've added some explanation to the blog entry; hope it is a bit clearer now.

    • »
      »
      »
      2 years ago, # ^ |

      Ah nice, that makes sense now, thank you!

      One more thing: how do we know that $$$P_4$$$ is sufficient? That is, if multiple points have the same $$$P_4$$$ value, can we map any permutation of them to the other set and get a valid solution? I guess some symmetry argument could be used here, but I'm unable to find one.

      • »
        »
        »
        »
        2 years ago, # ^ |

        I think it's generally not sufficient. If, for example, the points are located at the vertices of a regular polyhedron, $$$P_4$$$ is probably useless... But when the directions are generated at random, all $$$P_4$$$ values are different with high probability.

        • »
          »
          »
          »
          »
          2 years ago, # ^ |

          That makes sense. Thanks again for the wonderful blog!