Given Hermitian matrices ${\bf A}, {\bf B}, {\bf C} \in \mathbb{C}^{N \times N}$, we have the following optimization problem in vector ${\bf x} \in \mathbb{C}^N$
$$\begin{array}{ll} \text{maximize} & {\bf x}^H {\bf A} {\bf x}\\ \text{subject to} & \|{\bf x}\|^2_2 \leq 1\end{array}$$
which, by the KKT stationarity condition, has a solution characterized by
$${\bf A} {\bf x} = \mu {\bf x}$$
where $\mu \geq 0$ is the Lagrange multiplier of the norm constraint. Provided ${\bf A}$ has a positive eigenvalue (so the constraint is active at the optimum), the maximizer is the unit-norm eigenvector corresponding to the dominant (largest) eigenvalue of ${\bf A}$.
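As a quick numerical sanity check of this characterization (a minimal sketch with NumPy; the matrix size and seed are arbitrary choices), the dominant eigenvector of a Hermitian matrix attains the largest value of ${\bf x}^H {\bf A} {\bf x}$ over the unit ball:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5

# Random Hermitian matrix: M + M^H is Hermitian by construction.
M = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A = M + M.conj().T

w, V = np.linalg.eigh(A)   # eigenvalues in ascending order
x_star = V[:, -1]          # eigenvector of the largest eigenvalue
best = (x_star.conj() @ A @ x_star).real  # equals w[-1]

# Random unit-norm feasible points never beat the dominant eigenvector.
for _ in range(1000):
    z = rng.standard_normal(N) + 1j * rng.standard_normal(N)
    z /= np.linalg.norm(z)
    assert (z.conj() @ A @ z).real <= best + 1e-9
```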
Suppose now that ${\bf A} = (1-\alpha) {\bf B} + \alpha {\bf C}$, where $0 \leq \alpha \leq 1$. Then we have
$$((1-\alpha) {\bf B} + \alpha {\bf C}) {\bf x} = \mu {\bf x}$$
In this case, can anyone suggest how to proceed toward a solution, for arbitrary $0 \leq \alpha \leq 1$, in terms of the eigenvectors of ${\bf B}$ and ${\bf C}$?
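To illustrate the difficulty numerically (a sketch with NumPy; the matrices, seed, and $\alpha$ are arbitrary choices), one can form the blend directly and inspect how its dominant eigenvector relates to the eigenvectors of ${\bf B}$. If ${\bf B}$ and ${\bf C}$ commute they share an eigenbasis, and the dominant eigenvector of the blend is one of the common eigenvectors for every $\alpha$; generically, however, it need not coincide with any eigenvector of ${\bf B}$ or ${\bf C}$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4

def random_hermitian(n):
    m = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return m + m.conj().T  # Hermitian by construction

def dominant_eigvec(a):
    # eigh sorts eigenvalues in ascending order; take the last pair.
    w, v = np.linalg.eigh(a)
    return w[-1], v[:, -1]

B = random_hermitian(N)
C = random_hermitian(N)

alpha = 0.3
A = (1 - alpha) * B + alpha * C
mu, x = dominant_eigvec(A)

# Sanity check: x solves A x = mu x with unit norm.
assert np.allclose(A @ x, mu * x)

# Magnitude of the overlap of x with each eigenvector of B.
_, VB = np.linalg.eigh(B)
overlaps = np.abs(VB.conj().T @ x)
print(overlaps)
```

Plotting `overlaps` against $\alpha$ shows how the maximizer moves between the eigenbases of ${\bf B}$ and ${\bf C}$ as $\alpha$ sweeps from 0 to 1.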