Let $X_1, \cdots, X_n$ be i.i.d. $\mathrm{Unif}(0,\theta)$ and $T = \max\{X_1,X_2,\cdots,X_n\}$. Show that $T$ is a sufficient statistic using the definition. So I need to show that, for $t>0$, $\Bbb P(X_1 \leq x_1, \cdots, X_n \leq x_n \mid T \leq t)$ does not depend on $\theta$. Here are my computations:
$$\begin{aligned}
\Bbb P(X_1 \leq x_1, \cdots, X_n \leq x_n \mid T \leq t)
&=\frac{\Bbb P(X_1 \leq x_1, \cdots, X_n \leq x_n ,\, T \leq t)}{\Bbb P( T \leq t)}
=\frac{\Bbb P(X_1 \leq \min(x_1,t), \cdots, X_n \leq \min(x_n,t))}{\Bbb P(X_1 \leq t, \cdots, X_n \leq t)}\\
&=\frac{\prod_{i=1}^n \Bbb P(X_i \leq \min(x_i,t))}{\prod_{i=1}^n \Bbb P(X_i \leq t)}
=\frac{\prod_{i=1}^n \int_{-\infty}^{\min(x_i,t)}\tfrac{1}{\theta}\Bbb 1_{[0,\theta]}(x)\,dx}{\Big(\int_{-\infty}^{t}\tfrac{1}{\theta}\Bbb 1_{[0,\theta]}(x)\,dx\Big)^n},
\end{aligned}$$
using independence in both the numerator and the denominator.
Working out the integrals case by case is a bit technical, but since the $1/\theta$ factors cancel we can finally write it as $$\Bbb P(X_1 \leq x_1, \cdots, X_n \leq x_n \mid T \leq t)=\frac{\prod_{i=1}^n \min(x_i,t,\theta)}{(\min(t,\theta))^n}$$ (at least for $t>0$ and nonnegative $x_i$).
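As a sanity check on that closed form (not part of the argument itself), here is a minimal Monte Carlo sketch in Python; the values of $n$, $\theta$, $t$ and the $x_i$ are arbitrary choices for illustration, and the two printed numbers should agree up to simulation noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (made-up) parameter values.
theta, n, t = 2.0, 3, 1.5
x = np.array([0.7, 1.2, 5.0])   # thresholds x_1, ..., x_n
N = 1_000_000                   # Monte Carlo sample size

# Draw N i.i.d. samples of (X_1, ..., X_n) ~ Unif(0, theta).
X = rng.uniform(0.0, theta, size=(N, n))
T = X.max(axis=1)

# Empirical estimate of P(X_1 <= x_1, ..., X_n <= x_n | T <= t).
cond = T <= t
empirical = np.mean(np.all(X[cond] <= x, axis=1))

# Closed form derived above: prod_i min(x_i, t, theta) / min(t, theta)^n.
closed_form = np.prod(np.minimum(np.minimum(x, t), theta)) / min(t, theta) ** n

print(empirical, closed_form)
```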
Suppose, for example, that $t\geq \theta$ and that all the $x_i$ are also bigger than $\theta$. Then $\Bbb P(X_1 \leq x_1, \cdots, X_n \leq x_n \mid T \leq t)=\frac{\theta^n}{\theta^n}=1$, which does not depend on $\theta$.
The big issue now becomes clear: if $t \geq \theta$ and exactly one of the $x_i$, say $x_j$, is smaller than $\theta$, then $\Bbb P(X_1 \leq x_1, \cdots, X_n \leq x_n \mid T \leq t)=\frac{x_j\,\theta^{n-1}}{\theta^n}=\frac{x_j}{\theta}$, which depends on $\theta$!
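For concreteness (these numbers are just an illustration I picked, not part of the original setup): take $n=2$, $t=5$, $x_1=0.5$, $x_2=10$. The formula above gives $\frac{\min(0.5,5,\theta)\min(10,5,\theta)}{(\min(5,\theta))^2}$, which is $\frac{0.5\cdot 1}{1}=0.5$ for $\theta=1$ but $\frac{0.5\cdot 2}{4}=0.25$ for $\theta=2$, so the conditional probability really does move with $\theta$ in this regime.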
If the maths are never wrong, then I must be, but where? I was told there must be a mistake in my computations, but they seem fine to me. Can anyone see what's not right?
(I know I could use the equivalent factorisation criterion instead, but I really want to do it by the definition.)