As a brief aside from multivariate distributions, we introduce the Probability Integral Transform.
Definition
Let \(X\) be a continuous random variable with CDF \(F_X(x)\). The Probability Integral Transform states that the random variable \(U\), given by:
$$U=F_X(X)$$
is distributed uniformly over the interval \([0,1]\); i.e., \(U\sim\mathcal{U}(0,1)\).
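A minimal sketch of the transform in NumPy/SciPy (the exponential marginal is an arbitrary choice for illustration): draw from a continuous distribution, apply its own CDF, and check that the result looks uniform.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = stats.expon.rvs(size=100_000, random_state=rng)  # X ~ Exp(1)
u = stats.expon.cdf(x)                               # U = F_X(X)

# A Uniform(0, 1) sample has mean 1/2 and variance 1/12.
print(u.mean(), u.var())
```

The check is statistical rather than exact, but with this many draws the sample mean and variance should be close to 1/2 and 1/12.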
Multivariate Probability Integral Transform
Let \(X=(X_1,X_2,\dots,X_n)\) be a random vector with continuous marginal CDFs \(F_i(x)=P(X_i\leq x)\).
By performing the PIT on each element, we get a random vector where each element has \(\mathcal{U}(0,1)\) marginals:
$$(U_1,U_2,\dots,U_n)=(F_1(X_1),F_2(X_2),\dots,F_n(X_n))$$
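The element-wise transform can be sketched as follows (the marginals here are illustrative and independent; with dependent components the same per-margin transform applies and the dependence structure carries over to the uniforms):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50_000

# A vector with different continuous marginals: X1 ~ N(0, 1), X2 ~ Gamma(2, 1).
x1 = stats.norm.rvs(size=n, random_state=rng)
x2 = stats.gamma.rvs(a=2, size=n, random_state=rng)

# Apply each marginal CDF to its own component: U_i = F_i(X_i).
u1 = stats.norm.cdf(x1)
u2 = stats.gamma.cdf(x2, a=2)

# Each transformed component is Uniform(0, 1).
print(u1.mean(), u2.mean())
```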
Copula
Continuing with our random vector \(U\), we define a copula function as:
$$C(u_1,u_2,\dots,u_n)=P(U_1\leq u_1, U_2\leq u_2,\dots, U_n\leq u_n)$$
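As a quick sanity check on the definition, for independent components the copula factors: \(C(u_1,u_2)=u_1 u_2\). A sketch with simulated independent uniforms (the evaluation point is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.uniform(size=(200_000, 2))  # independent U(0,1) pair

u1, u2 = 0.3, 0.7
# Empirical estimate of C(u1, u2) = P(U1 <= u1, U2 <= u2).
empirical_C = np.mean((u[:, 0] <= u1) & (u[:, 1] <= u2))
print(empirical_C, u1 * u2)  # both ≈ 0.21
```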
Gaussian Copula
We previously covered how to generate correlated Gaussian variables. We can combine this with our multivariate Probability Integral Transform to get dependent uniforms.
Getting the Copula
A corollary to the probability integral transform that we've seen with the quantile function is that, given a variable in the interval (0,1), we can transform it into a random variable with a given distribution by applying that distribution's inverse CDF.
For the Gaussian copula, we apply the standard normal quantile function to turn our uniform values into normally distributed ones. If the data have a Gaussian copula, the result will be distributed as a correlated multivariate normal.
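This check can be sketched as follows. The data here are generated as correlated normals, so the copula is Gaussian by construction; the correlation value is an arbitrary choice for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 100_000
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])

# "Observed" data whose copula is Gaussian by construction.
x = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

# PIT each margin to uniforms, then apply the normal quantile function.
u = stats.norm.cdf(x)
z = stats.norm.ppf(u)

# If the copula is Gaussian, z is multivariate normal with the original correlation.
print(np.corrcoef(z[:, 0], z[:, 1])[0, 1])  # ≈ 0.8
```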
Sampling from the Copula
Sampling reverses the steps above. First, we generate correlated normals from a multivariate normal distribution. We then apply the standard normal CDF to each element to get correlated uniforms. Finally, we apply a quantile function of our choice to each element, giving the components whatever marginal distributions we would like.
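The three steps can be sketched as follows; the correlation and the exponential/gamma target marginals are illustrative choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n = 100_000
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])

# Step 1: correlated standard normals.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

# Step 2: apply the standard normal CDF to each margin -> correlated uniforms.
u = stats.norm.cdf(z)

# Step 3: apply any quantile functions to impose the marginals we want.
y1 = stats.expon.ppf(u[:, 0])        # Exponential(1) marginal
y2 = stats.gamma.ppf(u[:, 1], a=2)   # Gamma(2) marginal

# The marginals are now exponential and gamma, but the dependence survives.
print(y1.mean(), y2.mean(), np.corrcoef(y1, y2)[0, 1])
```

The Pearson correlation of the transformed variables is no longer exactly 0.6 (the monotone transforms distort it), but it stays strongly positive, which is the point: the copula carries the dependence while the quantile functions set the marginals.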
T-Copula
A similar copula with heavier tails can be constructed by using the multivariate t-distribution in place of the multivariate normal, then proceeding as we did for the Gaussian copula.
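A sketch of the t-copula construction (the degrees of freedom and correlation are illustrative): build a multivariate t draw as correlated normals divided by an independent \(\sqrt{\chi^2_\nu/\nu}\), then apply the univariate t CDF to each margin.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 100_000
df = 4
rho = 0.6
shape = np.array([[1.0, rho], [rho, 1.0]])

# Multivariate t draw: correlated normals scaled by a shared chi-square factor.
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=shape, size=n)
g = rng.chisquare(df, size=n) / df
t = z / np.sqrt(g)[:, None]

# Apply the univariate t CDF to each margin -> dependent uniforms (the t-copula).
u = stats.t.cdf(t, df=df)

print(u.mean(axis=0))  # each margin ≈ 0.5
```

From here the uniforms can be pushed through any quantile functions, exactly as in the Gaussian case, but with stronger tail dependence.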