Questions tagged [marginal-probability]

Marginal probability arises from a joint probability measure on a product space. The marginal distributions are the push-forward measures induced by the coordinate projections, so a marginal probability is the probability of a single cylinder-set event. This is in contrast to joint probability or conditional probability, in which additional events are considered.

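As a concrete illustration of the definition above, here is a minimal Python sketch (the joint table is made up for illustration): each marginal is obtained by summing the joint over the other coordinate, i.e. by pushing the joint forward along the coordinate projection.

```python
import numpy as np

# A small joint pmf p(x, y) on {0,1} x {0,1,2}; rows index x, columns index y.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.15, 0.20]])
assert np.isclose(joint.sum(), 1.0)

# Marginal of X: push the joint forward along the projection (x, y) -> x,
# i.e. sum out y. Likewise for Y.
p_x = joint.sum(axis=1)   # [0.40, 0.60]
p_y = joint.sum(axis=0)   # [0.35, 0.35, 0.30]
print(p_x, p_y)
```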

94 questions
7 votes, 3 answers

Gaussian 2-D mixture: mean, mode, and median of the marginals and of the 2-D density

Let $p(x_1,x_2) = \dfrac {4}{10}\mathcal{N}\left( \begin{bmatrix} 10 \\ 2 \end{bmatrix},\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\right) + \dfrac {6}{10} \mathcal{N}\left( \begin{bmatrix} 0 \\ 0 \end{bmatrix},\begin{bmatrix} 8.4 & 2.0 \\ 2.0 & 1.7…
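The marginal of a 2-D Gaussian mixture along one coordinate is itself a 1-D Gaussian mixture with the same weights, the corresponding component means, and the corresponding diagonal covariance entries. A Python sketch with made-up parameters (the excerpt's second covariance matrix is cut off, so these are not the question's numbers): the marginal mean comes out in closed form, while the marginal mode and median are found numerically.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical mixture parameters, for illustration only.
weights = np.array([0.4, 0.6])
means   = [np.array([3.0, 1.0]), np.array([0.0, 0.0])]
covs    = [np.eye(2), np.array([[2.0, 0.5], [0.5, 1.0]])]

def marginal_x1_pdf(x):
    """Marginal density of x1: a 1-D mixture with the same weights,
    component means m[0] and variances C[0, 0]."""
    return sum(w * norm.pdf(x, loc=m[0], scale=np.sqrt(C[0, 0]))
               for w, m, C in zip(weights, means, covs))

# The marginal mean is the weighted average of the component means (exact);
# the marginal mode and median are approximated on a grid.
marginal_mean = sum(w * m[0] for w, m in zip(weights, means))
xs = np.linspace(-10.0, 10.0, 20001)
pdf = marginal_x1_pdf(xs)
marginal_mode = xs[np.argmax(pdf)]
cdf = np.cumsum(pdf) * (xs[1] - xs[0])
marginal_median = xs[np.searchsorted(cdf, 0.5)]
print(marginal_mean, marginal_mode, marginal_median)
```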
4 votes, 3 answers

Given a coupling $\pi(\mu,\nu)$, show that $E_\mu f- E_\nu f= E_\pi [f(X) - f(Y)]$

In the lecture notes for High-Dimensional Probability by van Handel, the following is stated: Let $\mu$ and $\nu$ be probability measures, then $$\mathcal C(\mu,\nu) = \{ \text{Law} (X,Y) : X\sim \mu, Y\sim \nu \} $$ Therefore, any $\pi \in…
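The identity in the title uses nothing beyond the fact that the two marginals of $\pi$ are $\mu$ and $\nu$. A quick numerical check on discrete distributions, using the independent coupling as one concrete element of $\mathcal C(\mu,\nu)$ (a sketch; any other coupling gives the same agreement):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two probability vectors mu, nu on {0, ..., 4} and an arbitrary test function f.
mu = rng.dirichlet(np.ones(5))
nu = rng.dirichlet(np.ones(5))
f  = rng.normal(size=5)

# A coupling pi is a joint matrix with row sums mu and column sums nu;
# the independent coupling is the simplest example.
pi = np.outer(mu, nu)
assert np.allclose(pi.sum(axis=1), mu) and np.allclose(pi.sum(axis=0), nu)

# E_mu f - E_nu f  versus  E_pi[f(X) - f(Y)]
lhs = f @ mu - f @ nu
diff = f[:, None] - f[None, :]          # f(x) - f(y) on the product space
rhs = (pi * diff).sum()
print(np.isclose(lhs, rhs))             # True: only the marginals of pi matter
```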
4 votes, 1 answer

Where does the term "marginal" in "marginal probability" or "marginal distribution" come from?

Where does the term "marginal" in "marginal probability" or "marginal distribution" come from?
3 votes, 1 answer

Is it possible to decompose a joint-conditional probability?

Given three events $E$, $A$ and $B$, is it possible to decompose the joint-conditional probability $P(E \vert A, B)$ as an expression in terms of non-joint conditional probabilities and marginal probabilities, as follows? $$ P(E \vert A, B)…
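Whatever decomposition one is after, $P(E \mid A, B)$ is always computable from the full joint distribution as $P(E \cap A \cap B)/P(A \cap B)$. A small sketch with a randomly generated joint table over three binary events, printing the single-event conditionals alongside for comparison:

```python
import numpy as np

rng = np.random.default_rng(1)

# A full joint distribution over three binary events (E, A, B), for illustration.
joint = rng.dirichlet(np.ones(8)).reshape(2, 2, 2)   # indices: [e, a, b]

# P(E | A, B) from the definition of conditional probability.
p_ab  = joint[:, 1, 1].sum()          # P(A, B)
p_eab = joint[1, 1, 1]                # P(E, A, B)
p_e_given_ab = p_eab / p_ab

# Single-event conditionals, for comparison with any proposed decomposition.
p_e_given_a = joint[1, 1, :].sum() / joint[:, 1, :].sum()
p_e_given_b = joint[1, :, 1].sum() / joint[:, :, 1].sum()
print(p_e_given_ab, p_e_given_a, p_e_given_b)
```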
3 votes, 1 answer

Does positive joint probability imply positivity of a conditional event?

I have very little experience with probability so apologies if the title is confusing!! Let $\mu, \nu$ be probability measures on measure spaces $X,Y$ (if helpful we can assume $X = Y$ are compact subsets of $\mathbb{R}^d$, but I don't want to place…
2 votes, 1 answer

Can marginals determine all probabilities?

Suppose we are given $n$ binary random variables, $X_1,\dots,X_n$. A probability distribution $P$ assigns a probability to every elementary event: $$ P(X_1=x_1\&\ldots \& X_n=x_n)\in[0,1].$$ A regular marginal $m_I$ is a probability function on a…
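If only single-variable marginals are available, they cannot determine the joint probabilities in general. A minimal counterexample with two fair binary variables, where two different joints share identical marginals:

```python
import numpy as np

# Two joint distributions on a pair of binary variables (X1, X2).
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])          # X1, X2 independent fair coins
perfectly_correlated = np.array([[0.5, 0.0],
                                 [0.0, 0.5]])   # X1 = X2 almost surely

for joint in (independent, perfectly_correlated):
    print(joint.sum(axis=1), joint.sum(axis=0))  # both print [0.5 0.5] [0.5 0.5]

# Same one-dimensional marginals, different joints: the single-variable
# marginals alone do not pin down P(X1 = x1, X2 = x2).
```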
2 votes, 1 answer

Probability density of two random variables using characteristic function

I've been trying to solve the following question: $X$ and $Y$ are two real random variables with a probability density of $$f(x,y) = e^{-y} \cdot \mathscr{1}_{0…
2 votes, 2 answers

Compute marginal density given conditional density

Given a continuous random variable $X$ with conditional pdfs $$f_{X=x|Y=0}\qquad \text{and}\qquad f_{X=x|Y=1}$$ and the probability of a discrete random variable $Y$ as $P(Y=0)=0.1,P(Y=1)=0.9$, how can I compute the marginal density $f_X$? Is it…
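Because $Y$ is discrete, the law of total probability gives the marginal as the mixture $f_X(x) = P(Y{=}0)\,f_{X|Y=0}(x) + P(Y{=}1)\,f_{X|Y=1}(x)$. A sketch with hypothetical normal conditionals (the question leaves the conditional densities unspecified):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Hypothetical conditional densities, just to make the formula concrete.
f_x_given_y0 = lambda x: norm.pdf(x, loc=0.0, scale=1.0)
f_x_given_y1 = lambda x: norm.pdf(x, loc=3.0, scale=2.0)
p_y = {0: 0.1, 1: 0.9}

# Law of total probability: f_X(x) = sum_y P(Y = y) * f_{X|Y=y}(x)
f_x = lambda x: p_y[0] * f_x_given_y0(x) + p_y[1] * f_x_given_y1(x)

# Sanity check: the mixture still integrates to 1.
print(quad(f_x, -np.inf, np.inf)[0])   # ~1.0
```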
2 votes, 2 answers

(Expectation) Consider the joint density $f(x,y)=c(x-y)e^{-x}$, $0 \le y \le x$.

Consider the joint density $f(x,y)=c(x-y)e^{-x}$, $0 \le y \le x$. a) Determine the value of $c$. b) Calculate the marginal of $Y$. c) Calculate the expectation $E(Y)$. a) $1=\int_0^x c(x-y)e^{-x} = \frac{cx^2e^{-x}}{2} = 1 \to c = \frac{2e^x}{x^2},…
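The normalization has to be carried out over the whole support $\{0 \le y \le x < \infty\}$, so $c$ cannot come out depending on $x$. A quick symbolic cross-check with SymPy (a sketch of the computation, not the original poster's working):

```python
import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)
f = c * (x - y) * sp.exp(-x)

# Normalize over the full support {0 <= y <= x < oo}, not just the inner integral.
total = sp.integrate(sp.integrate(f, (y, 0, x)), (x, 0, sp.oo))
c_val = sp.solve(sp.Eq(total, 1), c)[0]               # c = 1

# Marginal of Y: integrate x out over x >= y, then take the expectation of Y.
f_Y = sp.integrate(f.subs(c, c_val), (x, y, sp.oo))   # exp(-y)
E_Y = sp.integrate(y * f_Y, (y, 0, sp.oo))            # 1
print(c_val, sp.simplify(f_Y), E_Y)
```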
2 votes, 1 answer

Understanding why we integrate the joint density function over the other variable to get the marginal density

I have a function $f_{x,y}(x,y)$ which represents the joint density function. In order to get the marginal density function in terms of $x$, I need to integrate over the $y$ bounds. Why is that? I assumed that we would integrate over the $x$ bounds.
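The marginal $f_X(x)$ must remain a function of $x$, so it is the other variable that gets integrated out: $f_X(x) = \int f_{X,Y}(x,y)\,dy$. A small symbolic example with the joint density $f(x,y) = x + y$ on the unit square (an illustrative density, not from the question):

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)

# Example joint density on the unit square (it integrates to 1 over the square).
f_xy = x + y

# x must NOT be integrated out if the marginal is to stay a function of x;
# integrating over all possible y values is what removes y.
f_x = sp.integrate(f_xy, (y, 0, 1))   # x + 1/2, still a function of x
f_y = sp.integrate(f_xy, (x, 0, 1))   # y + 1/2

print(f_x, f_y, sp.integrate(f_x, (x, 0, 1)))  # the marginal integrates to 1
```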
2 votes, 1 answer

Transforming a uniform probability density function over the unit disc from polar coordinates to Cartesian coordinates

I need to solve the following exercise and I am not fully sure about my approach, as the results seem odd, so I would like some advice. Given are uniform random distributions of an angle $\theta \in [0,2\pi)$ and radius $r \in [0,1]$, both…
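Assuming the angle and radius are independent, the change of variables $(x,y) = (r\cos\theta, r\sin\theta)$ has Jacobian determinant $r$, so the transformed density is $f_{X,Y}(x,y) = \frac{1}{2\pi\sqrt{x^2+y^2}}$ on the unit disc, which is not uniform on the disc. A Monte Carlo sketch of one consequence that is easy to check:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# r ~ Uniform(0,1), theta ~ Uniform(0, 2*pi), assumed independent.
r = rng.uniform(0.0, 1.0, n)
theta = rng.uniform(0.0, 2 * np.pi, n)
x, y = r * np.cos(theta), r * np.sin(theta)

# Jacobian of (r, theta) -> (x, y) has determinant r, so
# f_{X,Y}(x, y) = (1 / (2*pi)) / r = 1 / (2*pi*sqrt(x^2 + y^2)).
# This is NOT uniform on the disc: P(sqrt(X^2 + Y^2) <= 1/2) = 1/2 here,
# whereas a uniform distribution on the disc would give (1/2)^2 = 1/4.
print(np.mean(np.hypot(x, y) <= 0.5))   # ~0.5
```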
2 votes, 1 answer

Why do elliptical copula densities contain $x_1$ and $x_2$, but Archimedean copula densities contain $u_1$ and $u_2$?

$$c\left(u_{1}, u_{2}\right)=\frac{1}{\sqrt{1-\rho_{12}^{2}}} \exp \left\{-\frac{\rho_{12}^{2}\left(x_{1}^{2}+x_{2}^{2}\right)-2 \rho_{12} x_{1} x_{2}}{2\left(1-\rho_{12}^{2}\right)}\right\}$$ is the copula density of the Gaussian copula, which is…
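The $x_i$ in the elliptical copula density are not extra arguments: they are the quantile transforms $x_i = \Phi^{-1}(u_i)$, so the density is still a function of $(u_1,u_2)$. A sketch evaluating the quoted Gaussian copula density directly in terms of $u_1,u_2$:

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_density(u1, u2, rho):
    """Gaussian copula density written in terms of (u1, u2): the x_i in the
    textbook formula are just the normal quantiles x_i = Phi^{-1}(u_i)."""
    x1, x2 = norm.ppf(u1), norm.ppf(u2)
    return (1.0 / np.sqrt(1.0 - rho**2)) * np.exp(
        -(rho**2 * (x1**2 + x2**2) - 2.0 * rho * x1 * x2) / (2.0 * (1.0 - rho**2))
    )

print(gaussian_copula_density(0.3, 0.7, rho=0.5))
```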
2 votes, 1 answer

Independence between fractional parts of consecutive sums of independent uniforms

Let $X_1,X_2$ be independent $\text{Uniform}(0,1)$ random variables. Define $U_1 = X_1 - \lfloor X_1 \rfloor$ and $U_2 = X_1 + X_2 - \lfloor X_1 + X_2 \rfloor$, where $\lfloor a \rfloor$ is the largest integer less than or equal to $a \in \mathbb{R}$. We…
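A Monte Carlo sketch is a useful first check here (it does not replace the proof, which conditions on $X_1$): empirically, $U_2$ is again Uniform$(0,1)$ and behaves independently of $U_1$.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

x1 = rng.uniform(size=n)
x2 = rng.uniform(size=n)
u1 = x1 - np.floor(x1)                # equals x1 here, kept for symmetry
u2 = (x1 + x2) - np.floor(x1 + x2)    # fractional part of the sum

# Empirical checks consistent with independence: near-zero correlation and
# a joint cell probability that factors into the product of marginal probabilities.
print(np.corrcoef(u1, u2)[0, 1])                       # ~0
print(np.mean((u1 < 0.5) & (u2 < 0.25)), 0.5 * 0.25)   # ~0.125 vs 0.125
```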
2 votes, 0 answers

Concept of transforming the marginal probability $p(y)$ to the conditional probability $p(y|x)$?

I have a function like the following: $p(y) = \int\limits_x \int\limits_z \left(Q(x^2 + y) + yz + z\right)\,dx\,dz$, where $Q(x) = \frac{1}{2\pi}\int\limits_x^\infty e^{-\frac{t^2}{2}}\,dt$ and $x,y,z \in \mathbb{R}$. I would like to find…
2 votes, 2 answers

Suppose $X$ and $Y$ are discrete random variables with a joint p.f. Find the marginal p.f. of $X$ and $Y$

The joint probability function of x and y is: $$f(x,y)=\frac{e^{-2}}{x!(y-x)!}$$ where $$x = 0, 1, ..., y; y= 0, 1, ...$$ My solution to this problem: The marginal p.f. of x is: $$f_x(x)=\sum_{y=0}^\infty \frac{e^{-2}}{x!(y-x)!}$$ Then…
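A numerical cross-check of what the sums should give when the marginal of $X$ is taken over $y \ge x$ (and the marginal of $Y$ over $x = 0,\dots,y$): the marginals come out as Poisson$(1)$ and Poisson$(2)$ respectively. A sketch, truncating the infinite sum over $y$:

```python
import numpy as np
from math import factorial, exp

def joint(x, y):
    """Joint p.f. from the question, supported on 0 <= x <= y."""
    return exp(-2) / (factorial(x) * factorial(y - x)) if 0 <= x <= y else 0.0

Y_MAX = 60   # truncation point for the infinite sum over y (the tail is negligible)

# Marginal of X: sum over y >= x (not over all y >= 0).
f_X = [sum(joint(x, y) for y in range(x, Y_MAX)) for x in range(10)]
# Marginal of Y: finite sum over x = 0..y, which collapses via the binomial theorem.
f_Y = [sum(joint(x, y) for x in range(y + 1)) for y in range(10)]

poisson = lambda k, lam: exp(-lam) * lam**k / factorial(k)
print(np.allclose(f_X, [poisson(x, 1.0) for x in range(10)]))  # X ~ Poisson(1)
print(np.allclose(f_Y, [poisson(y, 2.0) for y in range(10)]))  # Y ~ Poisson(2)
```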