In the most general case, there is no analytical solution for the zero of the function
$$f(x)=a^x+b^x-1$$ and iterative numerical methods would be required.
If we assume $a>1$ and $b>1$, $f(x)$ is not very pleasant to look at when graphed, but this is not the case for its logarithmic transform
$$g(x)=\log(a^x+b^x)$$ which looks quite close to a straight line.
Being lazy, expand $g(x)$ as a Taylor series around $x=0$ to obtain
$$g(x)=\log(2)+\log(\sqrt{ab})\,x+O(x^2)\implies x_0=-\frac {2\log(2) } {\log({ab}) }$$ and start Newton's method, generating the sequence
$$x_{n+1}=x_n-\frac{\left(a^{x_n}+b^{x_n}\right) \log \left(a^{x_n}+b^{x_n}\right)}{a^{x_n} \log (a)+b^{x_n} \log (b)}$$
For illustration, using $a=3$ and $b=7$, the iterates will be
$$\left(
\begin{array}{cc}
n & x_n \\
0 & -0.455340 \\
1 & -0.468168 \\
2 & -0.468178
\end{array}
\right)$$ which is quite fast.
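For completeness, here is a minimal Python sketch (my own, not part of the derivation above) that starts from the Taylor-based guess and runs the Newton iteration written above; it reproduces the iterates of the table for $a=3$, $b=7$.

```python
# A minimal sketch (not from the original derivation): Newton's method on
# g(x) = log(a^x + b^x), started from the Taylor-based guess x_0 = -2 log 2 / log(ab).
import math

def newton_iterates(a, b, n_steps=2):
    x = -2.0 * math.log(2.0) / math.log(a * b)   # Taylor-based starting point
    iterates = [x]
    for _ in range(n_steps):
        s = a**x + b**x
        g = math.log(s)                                        # g(x_n)
        gp = (a**x * math.log(a) + b**x * math.log(b)) / s     # g'(x_n)
        x -= g / gp                                            # Newton step
        iterates.append(x)
    return iterates

print(newton_iterates(3, 7))   # [-0.455340..., -0.468168..., -0.468178...]
```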
But we could have a still better approximation by performing a single iteration of Halley's method starting at $x=0$. This would give, as an approximation,
$$x_0=\frac {4\log(2)\log(ab) } {(\log (2)-2) \left(\log ^2(a)+\log ^2(b)\right)-2 (2+\log (2)) \log (a) \log (b) }$$
For the worked example, this would give $x_0=-0.46790$.
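As a quick numerical check, the closed-form expression above can be evaluated directly (my own sketch, not the author's code):

```python
# Evaluate the one-step Halley approximation given above;
# for a = 3, b = 7 it returns approximately -0.46790.
import math

def halley_one_step(a, b):
    la, lb, l2 = math.log(a), math.log(b), math.log(2.0)
    num = 4.0 * l2 * math.log(a * b)
    den = (l2 - 2.0) * (la**2 + lb**2) - 2.0 * (2.0 + l2) * la * lb
    return num / den

print(halley_one_step(3, 7))   # ≈ -0.46790
```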
This estimate could still be improved using a single iteration of higher-order methods (we would still get analytical expressions). The formulae become too long to type here, but for the worked example, as a function of the order of the method, the results would be
$$\left(
\begin{array}{ccc}
n & x_0^{(n)} & \text{method} \\
2 & -0.4553404974 & \text{Newton} \\
3 & -0.4679002951 & \text{Halley} \\
4 & -0.4682565630 & \text{Householder} \\
5 & -0.4681819736 & \text{no name} \\
6 & -0.4681774686 & \text{no name} \\
7 & -0.4681781776 & \text{no name} \\
8 & -0.4681782373 & \text{no name}
\end{array}
\right)$$
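For reproducibility, one standard family that yields a single analytical step of any order is the generalized Householder iteration $x_{k+1}=x_k+d\,\frac{(1/f)^{(d-1)}(x_k)}{(1/f)^{(d)}(x_k)}$, with convergence order $d+1$: $d=1$ is Newton, $d=2$ is Halley, $d=3$ is the classical Householder method. The sympy sketch below applies one such step of each order to $g$ at $x=0$; whether the unnamed rows of the table were produced exactly this way is an assumption on my part.

```python
# A hedged sketch: one generalized Householder step of order d (convergence
# order d + 1) applied to g(x) = log(a^x + b^x) at x = 0, for a = 3, b = 7.
# This may differ from the (unnamed) higher-order methods used for the table above.
import sympy as sp

x = sp.symbols('x')
a, b = 3, 7
g = sp.log(a**x + b**x)

def householder_step(f, x0, d):
    inv = 1 / f
    step = d * sp.diff(inv, x, d - 1) / sp.diff(inv, x, d)   # d = 1 reduces to -f/f'
    return sp.N((x + step).subs(x, x0), 12)

for d in range(1, 8):        # corresponds to orders 2..8 in the table
    print(d + 1, householder_step(g, 0, d))
```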