I have very little experience with probability, so apologies if the title is confusing!!
Let $\mu, \nu$ be probability measures on measurable spaces $X, Y$ (if it helps, we can assume $X = Y$ is a compact subset of $\mathbb{R}^d$, but I don't want to place any assumptions involving absolute continuity of $\mu, \nu$ w.r.t. Lebesgue measure). Let $\Gamma(\mu, \nu)$ denote the set of all probability measures on $X \times Y$ with marginals $\mu$ and $\nu$, i.e. those $\gamma$ such that $\gamma(A \times Y) = \mu(A)$ and $\gamma(X \times B) = \nu(B)$ for all measurable $A \subseteq X$ and $B \subseteq Y$.
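(In case it's useful, the marginal conditions can equivalently be stated in integral form: $\gamma \in \Gamma(\mu, \nu)$ iff for all bounded measurable $f : X \to \mathbb{R}$ and $g : Y \to \mathbb{R}$,
$$ \int_{X \times Y} f(x)\ \mathrm{d}\gamma(x,y) = \int_X f\ \mathrm{d}\mu \quad \text{and} \quad \int_{X \times Y} g(y)\ \mathrm{d}\gamma(x,y) = \int_Y g\ \mathrm{d}\nu, $$
which follows from the set-level conditions by the usual indicator $\to$ simple $\to$ monotone limit argument.)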
Fix an arbitrary $\gamma \in \Gamma(\mu, \nu)$ and let $E \subseteq X \times Y$ be measurable with $\gamma(E) > 0$. For $x \in X$ and $y \in Y$ define the sections $E_x = \{y \in Y \mid (x,y) \in E\}$ and $E^y = \{x \in X \mid (x,y) \in E\}$. Does $\gamma(E) > 0$ imply that there always exists $x \in X$ with $\nu(E_x) > 0$ (and, likewise, $y \in Y$ with $\mu(E^y) > 0$)?
I can see this is true when $\gamma$ is the product measure (see, e.g., Folland 2.36), but I'm not seeing how to generalize the proof. By the disintegration theorem (https://en.wikipedia.org/wiki/Disintegration_theorem) we can get things like $$ \gamma(E) = \int_Y \gamma_y(E)\ \mathrm{d} \nu(y) $$ and $$ \gamma(E) = \int_X \gamma_x(E)\ \mathrm{d} \mu(x), $$ whence we know $\gamma_y(E) > 0$ for all $y$ in a set of positive $\nu$-measure and $\gamma_x(E) > 0$ for all $x$ in a set of positive $\mu$-measure (a positive integral only forces the integrand to be positive on a set of positive measure, not a.e.). But it's not clear to me how to turn these into statements about $\mu(E^y)$ and $\nu(E_x)$, respectively. Any advice? Thank you!!
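Edit: for reference, the product-measure argument I'm alluding to is just Tonelli's theorem applied to the indicator function of $E$:
$$ (\mu \times \nu)(E) = \int_{X \times Y} \mathbf{1}_E\ \mathrm{d}(\mu \times \nu) = \int_X \nu(E_x)\ \mathrm{d}\mu(x), $$
so $(\mu \times \nu)(E) > 0$ forces $\nu(E_x) > 0$ for all $x$ in a set of positive $\mu$-measure, and the symmetric identity $\int_Y \mu(E^y)\ \mathrm{d}\nu(y)$ handles the other case. It's this factorization of $\gamma(E)$ through the sections that I don't know how to recover for a general coupling.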