
Let's assume $v, w, x_i \in \mathbb{R}^n$ are unknown. Can one compute the dot product $\langle v,w\rangle$ if one has just the numbers $\langle v,x_i\rangle$ and $\langle w,x_i\rangle$ for $n$ random vectors $x_i$?

If $x_i = e_i$, it is quite simple: $$ \langle v,w\rangle = \sum_i \langle v,e_i\rangle \langle w,e_i\rangle.$$

But what if the $x_i$ do not form an orthonormal basis?
If this is not possible in general, can we do something under stronger assumptions, such as all vectors having unit length?
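
A quick NumPy sketch of the issue (illustrative only; the variable names are mine): the coordinate-wise sum above recovers $\langle v,w\rangle$ when the $x_i$ are the standard basis, but not for generic random $x_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
v, w = rng.standard_normal(n), rng.standard_normal(n)

# Standard basis: <v,e_i> is just the i-th coordinate, so the sum equals <v,w>.
E = np.eye(n)
print(np.dot(v, w), sum((E[i] @ v) * (E[i] @ w) for i in range(n)))

# Generic random x_i: the same naive sum is NOT <v,w> anymore.
X = rng.standard_normal((n, n))  # rows are the x_i
print(np.dot(v, w), sum((X[i] @ v) * (X[i] @ w) for i in range(n)))
```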

  • Let $A$ be the matrix whose rows are $x_i$. Then $\langle v, x_i\rangle$ is just the $i$th component of $Av$. So you're trying to compute $v^T w$ from $Av$ and $Aw$. It's obvious this works when $A$ is orthogonal, but for arbitrary $A$ you essentially need to know $A$ up to left-multiplication by a rotation. Of course if $A$ is known then you can just invert it to get $v$ and $w$. – Erick Wong Apr 26 '16 at 08:19
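
A small NumPy sketch of this observation, assuming $A$ is known and invertible (the reformulation via the Gram matrix $AA^T$ in the last line is an equivalent way of writing the same computation):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))   # rows are the x_i, assumed known
v, w = rng.standard_normal(n), rng.standard_normal(n)
Av, Aw = A @ v, A @ w             # the given numbers <v,x_i>, <w,x_i>

# Recover v and w by solving the linear systems, then take the dot product.
v_rec, w_rec = np.linalg.solve(A, Av), np.linalg.solve(A, Aw)
print(np.dot(v, w), np.dot(v_rec, w_rec))

# Equivalently, v^T w = (Av)^T (A A^T)^{-1} (Aw), using only the Gram matrix A A^T.
print(Av @ np.linalg.solve(A @ A.T, Aw))
```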

1 Answer


$\newcommand{\tuple}[1]{\langle #1 \rangle}$The dot product is linear in both of its arguments, so you can carry out Gram-Schmidt orthogonalization purely in terms of inner products and plug the result into $\tuple{v,w}$. For example, for three vectors $x_1,x_2,x_3$ of unit length we have

\begin{align} y_1 &= x_1,\\ y_2 &= x_2 - \tuple{x_2, y_1}y_1,\\ y_3 &= x_3 - \tuple{x_3,y_1}y_1 - \frac{\tuple{x_3,y_2}}{\tuple{y_2,y_2}}y_2 \end{align}

(the division by $\tuple{y_2,y_2}$ is needed because $y_2$, unlike $y_1 = x_1$, is in general no longer of unit length),

so

\begin{align} \tuple{v,y_1} &= \tuple{v,x_1},\\ \tuple{v,y_2} &= \tuple{v,x_2 - \tuple{x_2, y_1}y_1} \\ &= \tuple{v, x_2} - \tuple{x_2,y_1}\tuple{v,y_1},\\ \tuple{v,y_3} &= \tuple{v,x_3 - \tuple{x_3,y_1}y_1 - \tfrac{\tuple{x_3,y_2}}{\tuple{y_2,y_2}}y_2}\\ &= \tuple{v,x_3} - \tuple{x_3,y_1}\tuple{v,y_1} - \frac{\tuple{x_3,y_2}}{\tuple{y_2,y_2}}\tuple{v,y_2} \end{align}

and similarly with $\tuple{w,y_i}$. The coefficients $\tuple{x_i,y_j}$ and the squared norms $\tuple{y_i,y_i} = \tuple{x_i,y_i}$ can be computed from the pairwise inner products $\tuple{x_i,x_j}$ alone, so the $x_i$ (or at least their Gram matrix) must be known. Since the $y_i$ are pairwise orthogonal, we finally get $$\tuple{v,w} = \sum_i \frac{\tuple{v,y_i}\tuple{w,y_i}}{\tuple{y_i,y_i}}.$$ If the $x_i$ don't have unit length, you can just use $\frac{x_i}{\|x_i\|}$ and scale $\tuple{v,x_i}$ and $\tuple{w,x_i}$ accordingly. Of course, this works only if the random vectors span a space containing both $v$ and $w$ (for example, over the field $\mathbb{F}_2$ the probability that $n$ random vectors have full rank does not tend to $1$ as $n \to \infty$; see this question).
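
A minimal NumPy sketch of this procedure (the function and variable names are just for illustration): it runs the Gram-Schmidt recursion above purely on inner products, taking as input the Gram matrix $\tuple{x_i,x_j}$ together with the numbers $\tuple{v,x_i}$ and $\tuple{w,x_i}$, and returns $\tuple{v,w}$.

```python
import numpy as np

def dot_from_inner_products(G, a, b):
    """Recover <v,w> from the Gram matrix G[i,j] = <x_i,x_j> and the vectors
    a[i] = <v,x_i>, b[i] = <w,x_i>, via Gram-Schmidt run purely on inner
    products (no coordinates of v, w, or the x_i are needed)."""
    n = len(a)
    # xy[k, j] = <x_k, y_j>, vy[j] = <v, y_j>, wy[j] = <w, y_j>, yy[j] = <y_j, y_j>
    xy = np.zeros((n, n))
    vy, wy, yy = np.zeros(n), np.zeros(n), np.zeros(n)
    for j in range(n):
        # projection coefficients c[m] = <x_j, y_m> / <y_m, y_m> for m < j
        c = [xy[j, m] / yy[m] for m in range(j)]
        # <x_k, y_j> = <x_k, x_j> - sum_m c[m] <x_k, y_m>
        for k in range(n):
            xy[k, j] = G[k, j] - sum(c[m] * xy[k, m] for m in range(j))
        yy[j] = xy[j, j]                    # <y_j, y_j> = <x_j, y_j>
        vy[j] = a[j] - sum(c[m] * vy[m] for m in range(j))
        wy[j] = b[j] - sum(c[m] * wy[m] for m in range(j))
    # the y_j are orthogonal but not orthonormal, hence the division
    return sum(vy[j] * wy[j] / yy[j] for j in range(n))

rng = np.random.default_rng(2)
n = 6
X = rng.standard_normal((n, n))             # rows are the x_i
v, w = rng.standard_normal(n), rng.standard_normal(n)
print(np.dot(v, w))
print(dot_from_inner_products(X @ X.T, X @ v, X @ w))
```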

I hope this helps $\ddot\smile$

dtldarek