How can I get lower/upper bounds on the largest eigenvalue of the following sum of a diagonal matrix and a rank-1 matrix, for a vector $h$ with $h_i>0\ \forall i$:
$$A=2\text{diag}(h)+h \cdot 1^T$$ For instance, for $d=3$ it is the matrix below:
$$2 \left( \begin{array}{ccc} h_1 & 0 & 0 \\ 0 & h_2 & 0 \\ 0 & 0 & h_3 \\ \end{array} \right)+\left( \begin{array}{ccc} h_1 & h_1 & h_1 \\ h_2 & h_2 & h_2 \\ h_3 & h_3 & h_3 \\ \end{array} \right) $$
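For reference, here is a minimal numerical sketch (Python/NumPy; the helper name `lambda_max` is mine). It uses the fact that, since $h_i>0$, conjugating $A$ by $\text{diag}(h)^{1/2}$ gives the symmetric matrix $2\text{diag}(h)+\sqrt{h}\,\sqrt{h}^T$, so the spectrum of $A$ is real:

```python
import numpy as np

def lambda_max(h):
    """Largest eigenvalue of A = 2*diag(h) + h*1^T.

    Since h_i > 0, conjugating A by diag(h)^{1/2} yields the symmetric
    matrix 2*diag(h) + sqrt(h)*sqrt(h)^T, so the spectrum is real and
    eigvalsh applies.
    """
    h = np.asarray(h, dtype=float)
    s = np.sqrt(h)
    return np.linalg.eigvalsh(2 * np.diag(h) + np.outer(s, s))[-1]

print(lambda_max([3.0, 2.0, 1.0]))  # d = 3 example with an arbitrary h
```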
The following has been observed empirically to be an upper bound: $$2\max_i h_i+\sum_i h_i\ge\lambda_\text{max}(A)$$
If we let $h=\left(1,\frac{1}{2},\frac{1}{3},\ldots,\frac{1}{d}\right)$, then for $d=4000$ the true value is $\approx 9.29455$, while the proposed upper bound is $\approx 10.8714$. Furthermore, the relative difference between the bound and the true value seems to stay bounded as we vary $h$.
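A self-contained check of these numbers, using the same symmetric similarity transform as above (assuming I have reproduced the setup correctly; the last digits may differ):

```python
import numpy as np

d = 4000
h = 1.0 / np.arange(1, d + 1)   # h = 1, 1/2, ..., 1/d
s = np.sqrt(h)
# Symmetric matrix similar to A = 2*diag(h) + h*1^T.
lam = np.linalg.eigvalsh(2 * np.diag(h) + np.outer(s, s))[-1]
bound = 2 * h.max() + h.sum()
print(lam, bound)               # roughly 9.29455 and 10.8714
```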
Motivation: $\alpha<\lambda_1(A)$ is necessary and sufficient for the iteration $w\leftarrow w-\alpha \langle w, x\rangle x$ to converge when $x$ is sampled from a centered normal distribution with covariance $\text{diag}(h)$ (derivation).
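For context, a quick Monte-Carlo sketch of that iteration (the dimension $d$, the particular $h$, seed, step count, and the probed step sizes are all arbitrary choices of mine, not part of the derivation). A norm that stays small indicates convergence for that $\alpha$; `inf` indicates divergence:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 50
h = 1.0 / np.arange(1, d + 1)
sd = np.sqrt(h)                 # per-coordinate std dev for x ~ N(0, diag(h))

def final_norm(alpha, steps=20000):
    """Run w <- w - alpha*<w,x>*x and return ||w|| after `steps` samples.

    Returns inf early if the iterates blow up (divergent step size).
    """
    w = np.ones(d)
    for _ in range(steps):
        x = rng.normal(scale=sd)         # one sample with covariance diag(h)
        w -= alpha * (w @ x) * x
        if np.linalg.norm(w) > 1e12:
            return np.inf
    return np.linalg.norm(w)

for alpha in (0.05, 0.2, 1.0):           # arbitrary step sizes to probe
    print(alpha, final_norm(alpha))
```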