Let $A_k(f_1,\dots,f_N)$ be the $k \times N$ matrix whose rows are the derivatives $(f_1^{(l)},\dots,f_N^{(l)})$, $l=0,\dots,k-1$, of the functions, where as usual the zeroth derivative is the function itself.
The special case $k=N$ gives a square matrix whose determinant is called the Wronskian of the functions. It is obvious that if the functions are linearly dependent then the Wronskian is identically zero, while a fundamental theorem states that if the functions are analytic and linearly independent on some domain, then the Wronskian is not identically zero (though of course it may have zeroes as an analytic function).
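As a quick sanity check of both statements, Wronskians can be computed symbolically; here is a minimal sketch using SymPy's built-in `wronskian` helper (the example functions are my own choices):

```python
# Sanity check of the Wronskian criterion with SymPy's wronskian() helper,
# which takes a list of expressions and the variable to differentiate in.
from sympy import symbols, wronskian, simplify

x = symbols('x')

# Linearly independent functions: the Wronskian is not identically zero.
# The matrix is upper triangular here, so the determinant is just 1*1*2.
w_indep = wronskian([1, x, x**2], x)
print(simplify(w_indep))  # 2

# Linearly dependent functions: the Wronskian vanishes identically.
w_dep = wronskian([x, 2*x], x)
print(simplify(w_dep))    # 0
```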
Note that this result holds only for analytic functions in general: for example, compactly supported smooth functions on the real line with disjoint supports are linearly independent but have identically zero Wronskian.
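This counterexample is easy to probe numerically; a sketch (the particular bump function, grid, and finite-difference step are illustrative choices of mine):

```python
# Numeric illustration (not part of the proof): two smooth bumps with disjoint
# supports are linearly independent, yet their 2x2 Wronskian W = f*g' - f'*g
# vanishes identically, since at every point one of the two bumps vanishes
# together with all of its derivatives.
import numpy as np

def bump(x):
    """Standard smooth bump supported on (0, 1)."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    inside = (x > 0) & (x < 1)
    xi = x[inside]
    out[inside] = np.exp(-1.0 / (xi * (1.0 - xi)))
    return out

f = lambda x: bump(x)          # supported on (0, 1)
g = lambda x: bump(x - 2.0)    # supported on (2, 3)

def deriv(h, x, eps=1e-6):
    # central finite difference, a numerical stand-in for h'
    return (h(x + eps) - h(x - eps)) / (2 * eps)

xs = np.linspace(-1.0, 4.0, 2001)
W = f(xs) * deriv(g, xs) - deriv(f, xs) * g(xs)

print(np.max(np.abs(W)))  # 0.0: the Wronskian vanishes on the whole grid
# ...yet neither function is a scalar multiple of the other:
print(f(np.array([0.5]))[0] > 0, g(np.array([0.5]))[0] == 0)
```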
The proof is not hard. Assume the Wronskian vanishes identically, and pick a maximal subset of the functions whose Wronskian does not vanish identically (such a subset exists: since the functions are linearly independent, none is identically zero, and the Wronskian of a single function is the function itself, so each function has non-vanishing Wronskian on its own). By continuity there is a small open ball on which that Wronskian has no zeroes, and a simple manipulation then shows that every function outside the subset is a linear combination of the subset on that ball, hence everywhere by analytic continuation, contradicting linear independence.
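The "simple manipulation" can be spelled out as follows (a sketch of the standard argument, relabelling the maximal subset as $f_1,\dots,f_m$ with $m<N$). By maximality, $W(f_1,\dots,f_m,f_j)\equiv 0$ for each remaining $f_j$. Expanding this $(m+1)\times(m+1)$ determinant along its last column shows that $y=f_j$ satisfies the linear ODE $$W(f_1,\dots,f_m,y)=W(f_1,\dots,f_m)\,y^{(m)}+\dots=0,$$ whose leading coefficient $W(f_1,\dots,f_m)$ has no zeroes on the small ball. There the solution space of this $m$-th order ODE is $m$-dimensional and already contains $f_1,\dots,f_m$, so $f_j$ is a linear combination of them on the ball.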
Coming back to the problem, we apply $\partial^a$ for $a=0,\dots,k-1$ and $\bar\partial^b$ for $b=0,\dots,l-1$ to the relation $$\sum_{i=1}^N f_i \bar{g}_i = \sum_{i=1}^M h_i \bar{k}_i.$$ Noting that differentiation treats the anti-holomorphic factors $\bar g_j, \bar k_j$ as constants, while conjugate differentiation treats the holomorphic factors $f_j, h_j$ as constants, we get the relation (both sides being $k\times l$ matrices) $$A_k(f_1,\dots,f_N)\,\overline{A^t_l(g_1,\dots,g_N)}=A_k(h_1,\dots,h_M)\,\overline{A^t_l(k_1,\dots,k_M)}$$
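The entrywise content of this matrix identity is just that mixed derivatives of $\sum_i f_i \bar g_i$ factor, which can be sanity-checked symbolically by the usual device of replacing each $\bar g_i$ by a function $G_i(w)$ of an independent variable $w$; a sketch with arbitrary example functions of my choosing:

```python
# Sketch: verify that the (a, b) mixed derivative of sum_i f_i(z) * G_i(w)
# equals sum_i f_i^(a)(z) * G_i^(b)(w), which is exactly the (a, b) entry of
# the matrix product A_k(f) * A_l^t(G). Here G_i(w) stands in for the
# anti-holomorphic factor conj(g_i), treated as a function of an
# independent variable w (the usual Wirtinger-calculus device).
from sympy import symbols, diff, sin, exp, simplify

z, w = symbols('z w')

f = [z**2, sin(z)]   # holomorphic factors (arbitrary choices)
G = [exp(w), w**3]   # stand-ins for the conjugate factors

s = sum(fi * Gi for fi, Gi in zip(f, G))

for a in range(3):
    for b in range(3):
        lhs = diff(diff(s, z, a), w, b)
        rhs = sum(diff(fi, z, a) * diff(Gi, w, b) for fi, Gi in zip(f, G))
        assert simplify(lhs - rhs) == 0
print("entrywise identity checked for a, b = 0, 1, 2")
```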
Now, by the above, the determinants of $A_N(f_1,\dots,f_N)$ and $\overline{A^t_M(k_1,\dots,k_M)}$ do not vanish identically, so we can invert the corresponding matrices (in the realm of meromorphic/conjugate-meromorphic functions, of course). Taking $k=N$ and $l=M$ in the relation above, we get (both sides now being $N \times M$ matrices) $$\overline{A^t_M(g_1,\dots,g_N)\,{A^t_M(k_1,\dots,k_M)}^{-1}}=A_N^{-1}(f_1,\dots,f_N)\,A_N(h_1,\dots,h_M)=B_{N,M}=(a_{nm})$$
Since the LHS is a matrix of conjugate-meromorphic functions and the RHS is a matrix of meromorphic functions, the equality forces $B_{N,M}=(a_{nm})$ to be a constant matrix, and since $$A_N(h_1,\dots,h_M)=A_N(f_1,\dots,f_N)\,B_{N,M}$$ (reading off the first row, $h_j=\sum_n a_{nj} f_n$), each of $h_1,\dots,h_M$ is a linear combination of $f_1,\dots,f_N$. By symmetry the reverse also holds, so $N=M$ (since both sets are linearly independent), the matrix $B_{N,M}$ is invertible and is (more or less) the required $A$ of the OP, while the relation for the other set is immediate by transposing and conjugating the equality $$\overline{A^t_M(g_1,\dots,g_N)\,{A^t_M(k_1,\dots,k_M)}^{-1}}=A,$$ and we are done!
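As a numeric illustration of the conclusion (all names and the particular $B$, $f_i$, $g_i$ below are my own illustrative choices): if $h = B^t f$ and $k = \overline{B^{-1}}\,g$ for a constant invertible $B$, the two sesquilinear sums agree identically.

```python
# Numeric sketch of the conclusion: if h = B^t f and k = conj(B^{-1}) g for a
# constant invertible matrix B, then sum_i f_i conj(g_i) = sum_j h_j conj(k_j).
# The specific B, f, g below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
B = np.array([[1 + 1j, 2], [0, 3 - 1j]])        # constant invertible matrix

f = lambda z: np.array([1.0, z])                 # linearly independent f_i
g = lambda z: np.array([z**2, 1j * z])           # linearly independent g_i

h = lambda z: B.T @ f(z)                         # h_j = sum_i B_ij f_i
k = lambda z: np.conj(np.linalg.inv(B)) @ g(z)   # chosen so the sums agree

for z in rng.standard_normal(5) + 1j * rng.standard_normal(5):
    lhs = f(z) @ np.conj(g(z))
    rhs = h(z) @ np.conj(k(z))
    assert np.isclose(lhs, rhs)
print("sum_i f_i conj(g_i) == sum_j h_j conj(k_j) at random points")
```

The check works because $h^t\bar k = f^t B\,\overline{\overline{B^{-1}}\,g} = f^t B B^{-1}\bar g = f^t\bar g$.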