Commit 79c8e534 authored by SANTAGOSTINI Pierre

Details section, equations: some characters in bold.

See also: mvdggd.
parent f80221bd
@@ -18,13 +18,13 @@ kldggd <- function(Sigma1, beta1, Sigma2, beta2, eps = 1e-06) {
#' 0 if the distributions are univariate)
#' and \code{attr(, "k")} (number of iterations).
#'
#' @details Given \eqn{X_1}, a random vector of \eqn{R^p} (\eqn{p > 1}) distributed according to the MGGD
#' with parameters \eqn{(0, \Sigma_1, \beta_1)}
#' and \eqn{X_2}, a random vector of \eqn{R^p} distributed according to the MGGD
#' with parameters \eqn{(0, \Sigma_2, \beta_2)}.
#' @details Given \eqn{\mathbf{X}_1}, a random vector of \eqn{\mathbb{R}^p} (\eqn{p > 1}) distributed according to the MGGD
#' with parameters \eqn{(\mathbf{0}, \Sigma_1, \beta_1)}
#' and \eqn{\mathbf{X}_2}, a random vector of \eqn{\mathbb{R}^p} distributed according to the MGGD
#' with parameters \eqn{(\mathbf{0}, \Sigma_2, \beta_2)}.
#'
#' The Kullback-Leibler divergence between \eqn{X_1} and \eqn{X_2} is given by:
#' \deqn{ \displaystyle{ KL(X_1||X_2) = \ln{\left(\frac{\beta_1 |\Sigma_1|^{-1/2} \Gamma\left(\frac{p}{2\beta_2}\right)}{\beta_2 |\Sigma_2|^{-1/2} \Gamma\left(\frac{p}{2\beta_1}\right)}\right)} + \frac{p}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{p}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma{\left(\frac{\beta_2}{\beta_1} + \frac{p}{2\beta_1}\right)}}{\Gamma{\left(\frac{p}{2 \beta_1}\right)}} \lambda_p^{\beta_2} } }
#' \deqn{ \displaystyle{ KL(\mathbf{X}_1||\mathbf{X}_2) = \ln{\left(\frac{\beta_1 |\Sigma_1|^{-1/2} \Gamma\left(\frac{p}{2\beta_2}\right)}{\beta_2 |\Sigma_2|^{-1/2} \Gamma\left(\frac{p}{2\beta_1}\right)}\right)} + \frac{p}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{p}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma{\left(\frac{\beta_2}{\beta_1} + \frac{p}{2\beta_1}\right)}}{\Gamma{\left(\frac{p}{2 \beta_1}\right)}} \lambda_p^{\beta_2} } }
#' \deqn{ \displaystyle{ \times F_D^{(p-1)}\left(-\beta_2; \underbrace{\frac{1}{2},\dots,\frac{1}{2}}_{p-1}; \frac{p}{2}; 1-\frac{\lambda_{p-1}}{\lambda_p},\dots,1-\frac{\lambda_{1}}{\lambda_p}\right) } }
#'
#' where \eqn{\lambda_1 < ... < \lambda_{p-1} < \lambda_p} are the eigenvalues
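Not part of the package, but the expression above is easy to sanity-check numerically in the special case Sigma1 = c * Sigma2: every eigenvalue of Sigma2^{-1} Sigma1 equals c, so all arguments of the Lauricella function F_D^(p-1) are zero, F_D^(p-1) = 1, and only elementary terms remain. A minimal standalone Python sketch (function name hypothetical, assuming the parametrization with exponent -(1/2)(x' Sigma^{-1} x)^beta implied by the ln 2 term):

```python
from math import gamma, log

def kl_mggd_proportional(c, beta1, beta2, p):
    """KL(X1 || X2) for zero-mean MGGDs with Sigma1 = c * Sigma2.

    Here every eigenvalue lambda_i of Sigma2^{-1} Sigma1 equals c, so all
    arguments of F_D^(p-1) vanish and the Lauricella factor is 1.
    """
    # log term: |Sigma1|^{-1/2} / |Sigma2|^{-1/2} = c^{-p/2}
    term_log = log(c ** (-p / 2) * beta1 * gamma(p / (2 * beta2))
                   / (beta2 * gamma(p / (2 * beta1))))
    term_ln2 = (p / 2) * (1 / beta2 - 1 / beta1) * log(2)
    term_const = -p / (2 * beta1)
    term_gamma = (2 ** (beta2 / beta1 - 1)
                  * gamma(beta2 / beta1 + p / (2 * beta1))
                  / gamma(p / (2 * beta1))
                  * c ** beta2)  # lambda_p^{beta2} with lambda_p = c
    return term_log + term_ln2 + term_const + term_gamma
```

With identical parameters (c = 1, beta1 = beta2) the four terms cancel and the divergence is 0, as it must be.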
@@ -39,6 +39,7 @@ kldggd <- function(Sigma1, beta1, Sigma2, beta2, eps = 1e-06) {
#' and \eqn{X_2}, a random variable distributed according to the generalized Gaussian distribution
#' with parameters \eqn{(0, \sigma_2, \beta_2)}.
#' \deqn{ KL(X_1||X_2) = \displaystyle{ \ln{\left(\frac{\frac{\beta_1}{\sqrt{\sigma_1}} \Gamma\left(\frac{1}{2\beta_2}\right)}{\frac{\beta_2}{\sqrt{\sigma_2}} \Gamma\left(\frac{1}{2\beta_1}\right)}\right)} + \frac{1}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{1}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma{\left(\frac{\beta_2}{\beta_1} + \frac{1}{2\beta_1}\right)}}{\Gamma{\left(\frac{1}{2 \beta_1}\right)}} \left(\frac{\sigma_1}{\sigma_2}\right)^{\beta_2} } }
#' @seealso [mvdggd]: probability density of a MGGD.
#'
#' @author Pierre Santagostini, Nizar Bouhlel
#' @references N. Bouhlel, A. Dziri, Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions.
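In the univariate case the divergence is fully closed-form and can be cross-checked by direct numerical integration. A standalone Python sketch (not part of the package; it assumes the density f(x) = beta / (Gamma(1/(2 beta)) 2^{1/(2 beta)} sqrt(sigma)) exp(-(x^2/sigma)^beta / 2) implied by the ln 2 term above):

```python
from math import gamma, log, exp, sqrt

def ggd_logpdf(x, sigma, beta):
    # log-density of the zero-mean univariate generalized Gaussian:
    # f(x) = beta / (Gamma(1/(2 beta)) * 2^{1/(2 beta)} * sqrt(sigma))
    #        * exp(-(x^2 / sigma)^beta / 2)
    return (log(beta) - log(gamma(1 / (2 * beta))) - log(2) / (2 * beta)
            - 0.5 * log(sigma) - 0.5 * (x * x / sigma) ** beta)

def kl_ggd(sigma1, beta1, sigma2, beta2):
    # closed-form KL(X1 || X2) between two univariate GGDs
    b = beta2 / beta1
    return (log(beta1 * sqrt(sigma2) * gamma(1 / (2 * beta2))
                / (beta2 * sqrt(sigma1) * gamma(1 / (2 * beta1))))
            + 0.5 * (1 / beta2 - 1 / beta1) * log(2)
            - 1 / (2 * beta1)
            + 2 ** (b - 1) * gamma(b + 1 / (2 * beta1))
            / gamma(1 / (2 * beta1)) * (sigma1 / sigma2) ** beta2)

def kl_ggd_numeric(sigma1, beta1, sigma2, beta2, lim=30.0, n=200001):
    # trapezoidal estimate of E_1[log f1(X) - log f2(X)]
    h = 2 * lim / (n - 1)
    acc = 0.0
    for i in range(n):
        x = -lim + i * h
        w = 0.5 if i in (0, n - 1) else 1.0
        acc += w * exp(ggd_logpdf(x, sigma1, beta1)) * (
            ggd_logpdf(x, sigma1, beta1) - ggd_logpdf(x, sigma2, beta2))
    return acc * h
```

For beta1 = beta2 = 1 the GGD reduces to the Gaussian N(0, sigma), and kl_ggd(1, 1, 2, 1) equals (ln 2)/2 - 1/4, the usual KL divergence between N(0, 1) and N(0, 2).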
@@ -29,13 +29,13 @@ Computes the Kullback-Leibler divergence between two random variables distributed
according to multivariate generalized Gaussian distributions (MGGD) with zero means.
}
\details{
Given \eqn{X_1}, a random vector of \eqn{R^p} (\eqn{p > 1}) distributed according to the MGGD
with parameters \eqn{(0, \Sigma_1, \beta_1)}
and \eqn{X_2}, a random vector of \eqn{R^p} distributed according to the MGGD
with parameters \eqn{(0, \Sigma_2, \beta_2)}.
Given \eqn{\mathbf{X}_1}, a random vector of \eqn{\mathbb{R}^p} (\eqn{p > 1}) distributed according to the MGGD
with parameters \eqn{(\mathbf{0}, \Sigma_1, \beta_1)}
and \eqn{\mathbf{X}_2}, a random vector of \eqn{\mathbb{R}^p} distributed according to the MGGD
with parameters \eqn{(\mathbf{0}, \Sigma_2, \beta_2)}.
The Kullback-Leibler divergence between \eqn{X_1} and \eqn{X_2} is given by:
\deqn{ \displaystyle{ KL(X_1||X_2) = \ln{\left(\frac{\beta_1 |\Sigma_1|^{-1/2} \Gamma\left(\frac{p}{2\beta_2}\right)}{\beta_2 |\Sigma_2|^{-1/2} \Gamma\left(\frac{p}{2\beta_1}\right)}\right)} + \frac{p}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{p}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma{\left(\frac{\beta_2}{\beta_1} + \frac{p}{2\beta_1}\right)}}{\Gamma{\left(\frac{p}{2 \beta_1}\right)}} \lambda_p^{\beta_2} } }
\deqn{ \displaystyle{ KL(\mathbf{X}_1||\mathbf{X}_2) = \ln{\left(\frac{\beta_1 |\Sigma_1|^{-1/2} \Gamma\left(\frac{p}{2\beta_2}\right)}{\beta_2 |\Sigma_2|^{-1/2} \Gamma\left(\frac{p}{2\beta_1}\right)}\right)} + \frac{p}{2} \left(\frac{1}{\beta_2} - \frac{1}{\beta_1}\right) \ln{2} - \frac{p}{2\beta_1} + 2^{\frac{\beta_2}{\beta_1}-1} \frac{\Gamma{\left(\frac{\beta_2}{\beta_1} + \frac{p}{2\beta_1}\right)}}{\Gamma{\left(\frac{p}{2 \beta_1}\right)}} \lambda_p^{\beta_2} } }
\deqn{ \displaystyle{ \times F_D^{(p-1)}\left(-\beta_2; \underbrace{\frac{1}{2},\dots,\frac{1}{2}}_{p-1}; \frac{p}{2}; 1-\frac{\lambda_{p-1}}{\lambda_p},\dots,1-\frac{\lambda_{1}}{\lambda_p}\right) } }
where \eqn{\lambda_1 < ... < \lambda_{p-1} < \lambda_p} are the eigenvalues
@@ -73,6 +73,9 @@ N. Bouhlel, A. Dziri, Kullback-Leibler Divergence Between Multivariate Generalized Gaussian Distributions.
IEEE Signal Processing Letters, vol. 26 no. 7, July 2019.
\doi{10.1109/LSP.2019.2915000}
}
\seealso{
\link{mvdggd}: probability density of a MGGD.
}
\author{
Pierre Santagostini, Nizar Bouhlel
}