Consider the following probability density function of a random variable $Y$: $$ f(y \mid \theta)=e^{-(y-\theta)},\quad y\ge\theta, $$ and $0$ otherwise. We take a random sample $(Y_1,Y_2,\ldots,Y_k)$ and want to find a sufficient statistic and a maximum likelihood estimator for $\theta$.
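As a sanity check on the model, here is a quick simulation: since $Y=\theta+E$ with $E\sim\mathrm{Exp}(1)$ has exactly this density, we can sample that way and confirm the sample mean is near $\theta+1$ (the value $\theta=2$ below is made up for illustration):

```python
import random

# The shifted exponential f(y | theta) = exp(-(y - theta)) for y >= theta
# can be sampled as Y = theta + E with E ~ Exp(1).
random.seed(0)
theta = 2.0  # hypothetical value, chosen only for this check
samples = [theta + random.expovariate(1.0) for _ in range(100_000)]

# E[Exp(1)] = 1, so E[Y] = theta + 1; the sample mean should be close to it,
# and no draw can fall below theta.
mean = sum(samples) / len(samples)
print(mean, min(samples))
```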
Now, the likelihood is given by $$ L\left(y_1, y_2, \ldots, y_k \mid \theta\right)=\prod_{i=1}^k e^{-\left(y_i-\theta\right)}=\exp \left(-\sum_{i=1}^k y_i+k \theta\right), $$ which holds when $\theta \le y_i$ for every $i$; otherwise the likelihood is $0$.
Since this expression is increasing in $\theta$, it is maximized by taking $\theta$ as large as possible. Because the density is nonzero only when $y\ge\theta$, my first intuition is that the MLE for $\theta$ is $\hat\theta=\min(y_1,y_2,\ldots,y_k)$, although I am not sure that this is correct.
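This intuition can be checked numerically: the log-likelihood $k\theta-\sum_i y_i$ grows linearly in $\theta$ up to $\theta=\min_i y_i$ and drops to $-\infty$ beyond it, so a grid search should land on the sample minimum. The sample below is made up for illustration:

```python
import math

def log_likelihood(theta, ys):
    # log L = k*theta - sum(ys) when theta <= min(ys);
    # the likelihood is 0 (log = -inf) otherwise.
    if theta > min(ys):
        return -math.inf
    return len(ys) * theta - sum(ys)

ys = [3.1, 2.7, 4.5, 2.9]                  # hypothetical sample
grid = [i / 1000 for i in range(6000)]     # theta values 0.000 .. 5.999
best = max(grid, key=lambda t: log_likelihood(t, ys))
print(best, min(ys))  # the maximizer is the largest feasible theta, min(ys)
```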
For the sufficient statistic, I believe we can choose $S=-\sum_{i=1}^k Y_i$, in which case the likelihood factors as the product of $g(s, \theta)=e^{s+k \theta}$ and $h(y_1,y_2,\ldots,y_k)=1$, and the factorization theorem then tells us that $S$ is a sufficient statistic.
Can someone tell me if I have made a mistake or misunderstood something?