My uni professor has taught us the following:
If the likelihood formed on the basis of a random sample from a distribution belongs to the regular exponential family, then the likelihood equation for finding the ML estimate of the parameter vector $\boldsymbol{\theta}$ is $$\mathop{\mathbb{E}}(\boldsymbol{T}(\boldsymbol{X_1}, ..., \boldsymbol{X_n}))=\boldsymbol{T}(\boldsymbol{x_1}, ..., \boldsymbol{x_n})\tag{1}$$ That is, the likelihood equation is obtained by setting the expectation of $\boldsymbol{T}(\boldsymbol{X_1}, ..., \boldsymbol{X_n})$ equal to its observed value.
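For reference, here is the derivation I believe this result rests on (my own notation, and possibly where my confusion starts). In the natural parameterisation a single observation has density $$f(\boldsymbol{x};\boldsymbol{\eta})=h(\boldsymbol{x})\exp\left\{\boldsymbol{\eta}^\top\boldsymbol{T}(\boldsymbol{x})-A(\boldsymbol{\eta})\right\},$$ so for an i.i.d. sample the log-likelihood is $$\ell(\boldsymbol{\eta})=\sum_{i=1}^n\log h(\boldsymbol{x_i})+\boldsymbol{\eta}^\top\sum_{i=1}^n\boldsymbol{T}(\boldsymbol{x_i})-nA(\boldsymbol{\eta}),$$ and setting $\nabla_{\boldsymbol{\eta}}\ell=\boldsymbol{0}$ while using $\nabla_{\boldsymbol{\eta}}A(\boldsymbol{\eta})=\mathop{\mathbb{E}}(\boldsymbol{T}(\boldsymbol{X}))$ is, as far as I understand it, what produces equation (1).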
I am having some trouble interpreting the observed value on the right-hand side.
If we take the normal distribution (unknown mean $\mu$, known variance $\sigma^2$) as an example, the factorisation below gives the per-observation sufficient statistic $T(X)=\frac{X}{\sigma}$.
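For context, this is the factorisation of the single-observation density I used (so my choice of $T$ and the natural parameter may differ from other texts by constants): $$f(x;\mu)=\frac{1}{\sigma\sqrt{2\pi}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\}=\underbrace{\frac{1}{\sigma\sqrt{2\pi}}e^{-x^2/(2\sigma^2)}}_{h(x)}\exp\left\{\underbrace{\frac{\mu}{\sigma}}_{\eta(\mu)}\cdot\underbrace{\frac{x}{\sigma}}_{T(x)}-\underbrace{\frac{\mu^2}{2\sigma^2}}_{A(\mu)}\right\}$$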
Calculating the LHS of equation (1) with this statistic: $\mathop{\mathbb{E}}(T(X))=\mathop{\mathbb{E}}\left(\frac{X}{\sigma}\right)=\frac{1}{\sigma}\mathop{\mathbb{E}}(X)=\frac{\mu}{\sigma}$.
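As a quick sanity check of that expectation (with made-up values for $\mu$ and $\sigma$, purely illustrative), a small simulation agrees with $\mu/\sigma$:

```python
import numpy as np

# Made-up values purely for illustration: true mean mu, known sd sigma.
rng = np.random.default_rng(42)
mu, sigma = 5.0, 2.0

# Many draws of X ~ N(mu, sigma^2); the sample mean of X/sigma estimates E(X/sigma).
x = rng.normal(mu, sigma, size=1_000_000)
print(np.mean(x / sigma))  # close to 2.5
print(mu / sigma)          # theoretical value mu/sigma = 2.5
```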
I'm not exactly sure how to calculate $\boldsymbol{T}(\boldsymbol{x_1}, ..., \boldsymbol{x_n})$.
Can anyone provide some guidance? (I haven't been able to find any resources that use this result.)