Adding noise seems crazy to me. "Wrong" is not quite the right word but adding unrelated noise is not necessary and likely only increases the uncertainty in the estimation process. (But I believe you when you say it is a common practice in your field. Also, if you don't have to bin: Don't!)
If you know that the underlying distribution of some binned data (however you bin it) is an exponential distribution, then the standard approach is to get a maximum likelihood estimate. (Note there are also Bayesian approaches you might consider.)
Such an approach also gets you an estimate of the precision of the estimate.
If one has $k$ frequency counts $f_1, f_2,\ldots,f_k$ and $k+1$ corresponding histogram bin borders $b_0<b_1<\cdots<b_k$, then the log of the likelihood is given by
$$\log L=\sum_{i=1}^k f_i \log(F(b_i)-F(b_{i-1}))$$
where $F(x)$ is the cumulative distribution function for the distribution from which the sample is taken. In this case $F(x)=1-e^{-\lambda x}$ when $x\geq0$ and $F(x)=0$ otherwise. We have
$$\log L=\sum_{i=1}^k f_i\log(e^{-\lambda b_{i-1}}-e^{-\lambda b_i})$$
We choose as the estimator of $\lambda$ the value that maximizes the likelihood. Most of the time this requires an iterative procedure, but for an exponential distribution there is a closed-form solution when the bin borders are consecutive non-negative integers $0, 1, 2, 3,\ldots$. In that case the maximum likelihood estimator is
$$\hat{\lambda}=\log \left(\sum _{i=1}^k i f_i\right)-\log \left(\sum _{i=2}^k (i-1) f_i\right)$$
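To see where this closed form comes from: with borders $b_i=i$, the log likelihood above simplifies to

$$\log L=\sum_{i=1}^k f_i\left(-(i-1)\lambda+\log\left(1-e^{-\lambda}\right)\right)$$

Setting the derivative with respect to $\lambda$ to zero gives, with $N=\sum_{i=1}^k f_i$ and $S=\sum_{i=2}^k (i-1)f_i$,

$$-S+N\,\frac{e^{-\lambda}}{1-e^{-\lambda}}=0 \quad\Longrightarrow\quad e^{-\hat\lambda}=\frac{S}{N+S}$$

and since $N+S=\sum_{i=1}^k i f_i$, taking logs gives the estimator just stated.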
Here's how to do it using Mathematica.
(* Bin boundaries and frequencies *)
SeedRandom[12345];
h = HistogramList[RandomVariate[ExponentialDistribution[1/2], 100]]
(* {{0,1,2,3,4,5,6,7,8},{35,31,14,6,6,5,2,1}} *)
(* Log of the likelihood *)
logL = Sum[h[[2, i]] Log[CDF[ExponentialDistribution[λ], h[[1, i + 1]]] -
   CDF[ExponentialDistribution[λ], h[[1, i]]]], {i, 1, Length[h[[2]]]}]
$$\log \left(e^{-7 \lambda }-e^{-8 \lambda }\right)+2 \log \left(e^{-6 \lambda }-e^{-7 \lambda }\right)+5 \log \left(e^{-5 \lambda }-e^{-6 \lambda }\right)+6 \log \left(e^{-4 \lambda }-e^{-5 \lambda }\right)+6 \log \left(e^{-3 \lambda }-e^{-4 \lambda }\right)+14 \log \left(e^{-2 \lambda }-e^{-3 \lambda }\right)+35 \log \left(1-e^{-\lambda }\right)+31 \log \left(e^{-\lambda }-e^{-2 \lambda }\right)$$
(* Initial guess at maximum likelihood estimate: reciprocal of the mean *)
λ0 = 1/((h[[1, 1 ;; Length[h[[1]]] - 1]] + h[[1, 2 ;;]]) . h[[2]]/(2 Total[h[[2]]]))
(* 20/39 *)
(* Find maximum likelihood estimate *)
mle = FindMaximum[{logL, λ > 0}, {{λ, λ0}}]
(* {-165.66485033366567, {λ -> 0.524524468750641}} *)
(* Approximate standard error of the maximum likelihood estimator *)
stdErr = Sqrt[-1/D[logL, {λ, 2}]] /. mle[[2]]
(* 0.05305581097254485 *)
Because the bin borders are consecutive non-negative integers starting at 0, we can apply the closed-form solution:
Log[Sum[i h[[2, i]], {i, 1, 8}]] - Log[Sum[(i - 1) h[[2, i]], {i, 2, 8}]] // N
(* 0.524524 *)
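If you want to sanity-check these numbers outside Mathematica, here is a rough Python equivalent using only the standard library (the ternary search and finite-difference standard error are my own quick stand-ins for `FindMaximum` and the analytic second derivative):

```python
import math

# Frequencies and integer bin borders taken from the HistogramList output above
freqs = [35, 31, 14, 6, 6, 5, 2, 1]   # f_1 .. f_k
borders = list(range(9))              # b_0 .. b_k = 0, 1, ..., 8

def log_likelihood(lam):
    # log L = sum_i f_i log(F(b_i) - F(b_{i-1})) with F(x) = 1 - exp(-lam x)
    return sum(f * math.log(math.exp(-lam * a) - math.exp(-lam * b))
               for f, a, b in zip(freqs, borders, borders[1:]))

# Numerical MLE by ternary search (log L is concave in lambda here)
lo, hi = 1e-3, 5.0
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if log_likelihood(m1) < log_likelihood(m2):
        lo = m1
    else:
        hi = m2
lam_hat = (lo + hi) / 2

# Closed-form estimator for consecutive integer borders starting at 0
num = sum(i * f for i, f in enumerate(freqs, start=1))
den = sum((i - 1) * f for i, f in enumerate(freqs, start=1))
closed_form = math.log(num) - math.log(den)

# Standard error from the observed information, via a central difference
h = 1e-4
d2 = (log_likelihood(lam_hat + h) - 2 * log_likelihood(lam_hat)
      + log_likelihood(lam_hat - h)) / h**2
std_err = math.sqrt(-1 / d2)

print(lam_hat, closed_form, std_err)
```

Both routes agree with the Mathematica results above: the numerical and closed-form estimates are about 0.5245, with a standard error of about 0.053.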
There's no need to involve a geometric distribution.