Notes on Whittaker & Watson, Chapter XII, part 3

Whittaker & Watson, A Course of Modern Analysis. Chapter XII, The Gamma Function.

12.4 The Beta function is defined by the Eulerian integral of the first kind $$ B(p,q) = \int_0^1x^{p-1}(1-x)^{q-1}dx, $$ which converges and defines an analytic function of $p$ and $q$ when $\Re(p),\Re(q)>0$.

The substitution $x\to 1-x$ shows that it is symmetric: $B(p,q)=B(q,p)$.

Integration by parts (integrating $x^{p-1}$ and differentiating $(1-x)^{q}$) gives $B(p,q+1)=\frac{q}{p}B(p+1,q)$.
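Both properties are easy to sanity-check numerically. The sketch below (my own check, not from the text) approximates the defining integral with a crude midpoint rule; the test values $p=2.5$, $q=1.5$ are arbitrary:

```python
# Verify the symmetry B(p,q) = B(q,p) and the recurrence
# B(p,q+1) = (q/p) B(p+1,q) by midpoint-rule quadrature of the defining
# integral. Exponents are chosen > 1 so the integrand is bounded on [0,1].

def beta_quad(p, q, steps=100_000):
    """Midpoint-rule approximation of B(p,q) = ∫_0^1 x^(p-1) (1-x)^(q-1) dx."""
    h = 1.0 / steps
    return h * sum(((i + 0.5) * h) ** (p - 1) * (1 - (i + 0.5) * h) ** (q - 1)
                   for i in range(steps))

p, q = 2.5, 1.5                       # arbitrary test values
assert abs(beta_quad(p, q) - beta_quad(q, p)) < 1e-9        # symmetry
assert abs(beta_quad(p, q + 1) - (q / p) * beta_quad(p + 1, q)) < 1e-6
```

The symmetry holds essentially to machine precision because the midpoint grid is itself symmetric about $x=\tfrac12$; the recurrence holds to quadrature accuracy.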

12.41 To establish that $$ B(m,n)=\frac{\Gamma(m)\Gamma(n)}{\Gamma(m+n)}, $$ consider, for $m,n$ such that $\Re(2m-1), \Re(2n-1) >0$, $$ \begin{align*} \Gamma(m)\Gamma(n) & = \int_0^\infty e^{-x}x^{m-1}dx\int_0^\infty e^{-y}y^{n-1}dy\newline & = 4\lim_{R\to\infty}\int_0^R\int_0^R e^{-(x^2+y^2)}x^{2m-1} y^{2n-1}dx\ dy & x\to x^2,\ y\to y^2\newline & = 4\int_0^\infty e^{-r^2}r^{2(m+n)-1}dr\int_0^{\frac{\pi}{2}}\cos^{2m-1}\theta\sin^{2n-1}\theta\ d\theta & \iint_{\substack{(x,y)\in[0,R]^2\newline\ x^2+y^2>R^2}}\to0\newline &=\Gamma(m+n)B(m,n), & \cos^2\theta\to u \end{align*} $$ since $\int_0^\infty e^{-r^2}r^{2(m+n)-1}dr=\frac{1}{2}\Gamma(m+n)$ and, under $\cos^2\theta=u$, the angular integral equals $\frac{1}{2}B(m,n)$. This expression of the Beta function in terms of the Gamma function extends to all $m,n$ with positive real part by analytic continuation.
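As a quick numerical check of 12.41 (mine, not from the text), compare the quadrature value of $B(m,n)$ against $\Gamma(m)\Gamma(n)/\Gamma(m+n)$ computed from the standard library; the $(m,n)$ pairs are arbitrary:

```python
# Check B(m,n) = Γ(m)Γ(n)/Γ(m+n) numerically: midpoint quadrature of the
# Eulerian integral versus the Gamma-function expression.
import math

def beta_quad(m, n, steps=100_000):
    """Midpoint-rule approximation of B(m,n)."""
    h = 1.0 / steps
    return h * sum(((i + 0.5) * h) ** (m - 1) * (1 - (i + 0.5) * h) ** (n - 1)
                   for i in range(steps))

for m, n in [(1.5, 2.0), (3.0, 4.5), (2.25, 2.25)]:   # arbitrary test pairs
    gamma_form = math.gamma(m) * math.gamma(n) / math.gamma(m + n)
    assert abs(beta_quad(m, n) - gamma_form) < 1e-6
```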

12.42 The substitution $\cos^2\theta=u$ above gives the following result: $$ \int_0^\frac{\pi}{2}\cos^{m-1}\theta\sin^{n-1}\theta\ d\theta = \frac{1}{2}B\left(\frac{m}{2}, \frac{n}{2}\right) = \frac{\Gamma(\frac{m}{2})\Gamma(\frac{n}{2})}{2\Gamma(\frac{m+n}{2})}. $$

12.43 Pochhammer’s extension. Define $\epsilon(\alpha,\beta)$ by an integral along the Pochhammer contour $C$; for $\Re(\alpha),\Re(\beta)>0$, collapsing the contour onto $[0,1]$ gives $$ \begin{align*} \epsilon(\alpha,\beta) &= e^{-i\pi(\alpha+\beta)}\int_Ct^{\alpha-1}(1-t)^{\beta-1}dt\newline &= e^{-i\pi(\alpha+\beta)}(1-e^{2\pi\alpha i})(1-e^{2\pi\beta i})\int_0^1t^{\alpha-1}(1-t)^{\beta-1}dt\newline & = -4\sin\alpha\pi\sin\beta\pi\frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha+\beta)}\newline & = -\frac{4\pi^2}{\Gamma(1-\alpha)\Gamma(1-\beta)\Gamma(\alpha+\beta)}, \end{align*} $$ the last line by Euler’s reflection formula. The RHS is an entire function of $\alpha$ and $\beta$, since $1/\Gamma$ is entire, which furnishes the analytic continuation of the Beta function via $\sin(\alpha\pi)\sin(\beta\pi)B(\alpha,\beta)=-\frac{1}{4}\epsilon(\alpha,\beta)$.
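Two more spot checks (mine, not from the text): the trigonometric integral of 12.42 against its Gamma-function value, and the final equality of 12.43, which is Euler's reflection formula $\Gamma(z)\Gamma(1-z)=\pi/\sin\pi z$ applied twice; the values of $m,n,\alpha,\beta$ below are arbitrary:

```python
# (i) Check ∫_0^{π/2} cos^(m-1)θ sin^(n-1)θ dθ = Γ(m/2)Γ(n/2)/(2Γ((m+n)/2)).
# (ii) Check -4 sin(πα) sin(πβ) Γ(α)Γ(β)/Γ(α+β)
#        = -4π² / (Γ(1-α)Γ(1-β)Γ(α+β)).
import math

def trig_integral(m, n, steps=100_000):
    """Midpoint rule for ∫_0^{π/2} cos^(m-1)θ sin^(n-1)θ dθ."""
    h = (math.pi / 2) / steps
    return h * sum(math.cos((i + 0.5) * h) ** (m - 1) *
                   math.sin((i + 0.5) * h) ** (n - 1) for i in range(steps))

m, n = 3.0, 4.0                        # arbitrary test values
rhs = math.gamma(m / 2) * math.gamma(n / 2) / (2 * math.gamma((m + n) / 2))
assert abs(trig_integral(m, n) - rhs) < 1e-6

a, b = 0.3, 0.6                        # non-integer, 0 < a, b < 1
lhs = -4 * math.sin(math.pi * a) * math.sin(math.pi * b) \
      * math.gamma(a) * math.gamma(b) / math.gamma(a + b)
rhs = -4 * math.pi ** 2 / (math.gamma(1 - a) * math.gamma(1 - b)
                           * math.gamma(a + b))
assert abs(lhs - rhs) < 1e-9
```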

12.5 Dirichlet’s integral: $$ I = \int\cdots\int_{\sum t_r\le1}f(t_1+t_2+\cdots+t_n)t_1^{\alpha_1-1}t_2^{\alpha_2-1}\cdots t_n^{\alpha_n-1}dt_1dt_2\cdots dt_n $$ with $\alpha_r>0$ for $r=1,\cdots,n$.

Let $v=\frac{t_2}{t_1+t_2}$ (so that $t_1+t_2=\frac{t_2}{v}$ and $dt_1=-\frac{t_2}{v^2}dv$ at fixed $t_2$) and $\lambda=t_3+\cdots+t_n$. The integral over $t_1$ and $t_2$ becomes $$ \begin{align*} & \int_0^{1-\lambda}\int_0^{1-t_2-\lambda}f(t_1+t_2+\lambda)t_1^{\alpha_1-1}t_2^{\alpha_2-1}dt_1dt_2\newline =& \int_0^{1-\lambda}dt_2\int_{t_2/(1-\lambda)}^{1} f\left(\frac{t_2}{v}+\lambda\right)(1-v)^{\alpha_1-1}v^{-\alpha_1-1}t_2^{\alpha_1+\alpha_2-1}dv\newline =&\int_0^1 dv\int_{0}^{(1-\lambda)v} f\left(\frac{t_2}{v}+\lambda\right)(1-v)^{\alpha_1-1}v^{-\alpha_1-1}t_2^{\alpha_1+\alpha_2-1}dt_2\newline =&\int_0^1 \int_{0}^{1-\lambda}f\left(\tau_2+\lambda\right)(1-v)^{\alpha_1-1}v^{\alpha_2-1}\tau_2^{\alpha_1+\alpha_2-1}d\tau_2dv &t_2=v\tau_2,\ \text{i.e.}\ \tau_2=t_1+t_2\newline =&\frac{\Gamma(\alpha_1)\Gamma(\alpha_2)}{\Gamma(\alpha_1+\alpha_2)}\int_{0}^{1-\lambda}f\left(\tau_2+\lambda\right)\tau_2^{\alpha_1+\alpha_2-1}d\tau_2. \end{align*} $$ By induction, the integral reduces to $$ I = \frac{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_n)}{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_n)}\int_0^1f(\tau)\cdot\tau^{\alpha_1+\alpha_2+\cdots+\alpha_n-1}d\tau. $$ [It is from this integral that the Dirichlet distribution derives its name. It has the pdf $$ p(x_1,\cdots,x_n)=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n})}x_1^{\alpha_1-1}\cdots x_n^{\alpha_n-1} $$ defined on the simplex $x_1+\cdots+x_n=1$ with $x_i\ge 0, i=1,2,\cdots,n$. 
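The $n=2$ case of the reduction can be verified by brute force (test values mine, not from the text): take $f(t)=t^3$, $\alpha_1=2$, $\alpha_2=3$, and compare a crude midpoint sum over the triangle $t_1+t_2\le1$ with the reduced one-dimensional integral. The tolerance is loose because cells straddling the hypotenuse are classified by their midpoints only:

```python
# Brute-force check of Dirichlet's reduction for n = 2.
import math

a1, a2 = 2.0, 3.0                   # arbitrary test exponents
f = lambda t: t ** 3                # arbitrary smooth test function

# Left side: ∬_{t1+t2 ≤ 1} f(t1+t2) t1^(a1-1) t2^(a2-1) dt1 dt2.
N = 800
h = 1.0 / N
lhs = 0.0
for i in range(N):
    t1 = (i + 0.5) * h
    for j in range(N):
        t2 = (j + 0.5) * h
        if t1 + t2 <= 1.0:          # keep cells whose midpoint is inside
            lhs += f(t1 + t2) * t1 ** (a1 - 1) * t2 ** (a2 - 1)
lhs *= h * h

# Right side: Γ(a1)Γ(a2)/Γ(a1+a2) · ∫_0^1 f(τ) τ^(a1+a2-1) dτ.
M = 100_000
hm = 1.0 / M
tail = hm * sum(f((k + 0.5) * hm) * ((k + 0.5) * hm) ** (a1 + a2 - 1)
                for k in range(M))
rhs = math.gamma(a1) * math.gamma(a2) / math.gamma(a1 + a2) * tail

assert abs(lhs - rhs) / rhs < 2e-2
```

With these values the right side is $\frac{1\cdot 2}{24}\cdot\frac18 = \frac{1}{96}$.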
The moments of the Dirichlet distribution can be calculated by applying Dirichlet’s integral: $$ \begin{align*} E\left[\prod_{i=1}^n x_i^{r_i}\right]&=\int\cdots\int_{\sum_{i=1}^{n-1}x_i\le1} p(x_1,\cdots,x_n)x_1^{r_1}\cdots x_n^{r_n}dx_1\cdots dx_{n-1}\newline & = \int\cdots\int_{\sum_{i=1}^{n-1}x_i\le1}p(x_1,\cdots,x_n)\prod_{i=1}^{n-1} x_i^{r_i}\left(1-\sum_{i=1}^{n-1}x_i\right)^{r_n}dx_1\cdots dx_{n-1}\newline &=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n})}\int\cdots\int_{\sum_{i=1}^{n-1}x_i\le1}\prod_{i=1}^{n-1} x_i^{\alpha_i+r_i-1}\left(1-\sum_{i=1}^{n-1}x_i\right)^{\alpha_{n}+r_n-1}dx_1\cdots dx_{n-1}\newline &=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n})}\frac{\Gamma(\alpha_1+r_1)\cdots\Gamma(\alpha_{n}+r_{n})}{\Gamma(\alpha_1+r_1+\cdots+\alpha_{n}+r_{n})}. \end{align*} $$ For the inverted Dirichlet distribution, the pdf $$ p(x_1,\cdots,x_n)=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n+1})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n+1})}\frac{x_1^{\alpha_1-1}\cdots x_n^{\alpha_n-1}}{(1+x_1+\cdots+x_n)^{\alpha_1+\alpha_2+\cdots+\alpha_{n+1}}} $$ is defined for $x_i>0, i=1,\cdots,n$. Under the transform $t_i=\frac{x_i}{1+\sum_{k=1}^nx_k}$, with Jacobian determinant $\frac{\partial(t_1,\cdots,t_n)}{\partial(x_1,\cdots,x_n)} = (1+x_1+\cdots+x_n)^{-n-1}$, the vector $(t_1,\cdots,t_n,1-t_1-\cdots-t_n)$ follows a Dirichlet distribution with pdf $$ \begin{align*} p_t(t_1,\cdots,t_n) &=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n+1})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n+1})}\frac{t_1^{\alpha_1-1}\cdots t_n^{\alpha_n-1}}{(1+x_1+\cdots+x_n)^{\alpha_{n+1}-1}}\newline &=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n+1})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n+1})}t_1^{\alpha_1-1}\cdots t_n^{\alpha_n-1}(1-t_1-\cdots-t_n)^{\alpha_{n+1}-1}. & 1+\textstyle\sum_k x_k=\left(1-\sum_i t_i\right)^{-1} 
\end{align*} $$ The moments of the inverted Dirichlet distribution then follow in the same way from Dirichlet’s integral, for $\alpha_{n+1}>\sum_i r_i$: $$ \begin{align*} E\left[\prod_{i=1}^n x_i^{r_i}\right]&=\int_0^\infty\cdots\int_0^\infty p(x_1,\cdots,x_n)x_1^{r_1}\cdots x_n^{r_n}dx_1\cdots dx_n\newline & = \int\cdots\int_{\sum t_i\le1}p_t(t_1,\cdots,t_n)\prod_{i=1}^n t_i^{r_i}\left(1-\sum_{i=1}^nt_i\right)^{-\sum_{i=1}^nr_i}dt_1\cdots dt_n \newline &=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n+1})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n+1})}\int\cdots\int_{\sum t_i\le1}\prod_{i=1}^n t_i^{\alpha_i+r_i-1}\left(1-\sum_{i=1}^nt_i\right)^{\alpha_{n+1}-\sum r_i-1}dt_1\cdots dt_n\newline &=\frac{\Gamma(\alpha_1+\alpha_2+\cdots+\alpha_{n+1})}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n+1})}\frac{\Gamma(\alpha_1+r_1)\cdots\Gamma(\alpha_n+r_n)}{\Gamma(\alpha_1+r_1+\cdots+\alpha_n+r_n)}\int_0^1(1-\tau)^{\alpha_{n+1}-\sum r_i-1}\tau^{\sum(\alpha_i+r_i)-1}d\tau\newline & = \frac{\Gamma(\alpha_1+r_1)\cdots\Gamma(\alpha_n+r_n)\,\Gamma(\alpha_{n+1}-\sum r_i)}{\Gamma(\alpha_1)\Gamma(\alpha_2)\cdots\Gamma(\alpha_{n+1})}. \end{align*} $$ ]
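Both moment formulas can be spot-checked in low dimensions (test values mine, not from the text): for $n=2$ the Dirichlet marginal moment reduces to a Beta-distribution moment, and for $n=1$ the inverted Dirichlet is the beta-prime density, so each side is a one-dimensional quadrature. The substitution $x=\frac{u}{1-u}$ maps $\int_0^\infty$ onto $\int_0^1$:

```python
# Spot-check the Dirichlet and inverted-Dirichlet moment formulas.
import math

def midpoint(g, steps=100_000):
    """Midpoint rule for ∫_0^1 g(u) du."""
    h = 1.0 / steps
    return h * sum(g((i + 0.5) * h) for i in range(steps))

# Dirichlet moment, n = 2: reduces to a Beta(a1, a2) moment E[x^r].
a1, a2, r = 2.5, 3.0, 2.0           # arbitrary test values
c = math.gamma(a1 + a2) / (math.gamma(a1) * math.gamma(a2))
num = c * midpoint(lambda x: x ** (a1 + r - 1) * (1 - x) ** (a2 - 1))
formula = c * math.gamma(a1 + r) * math.gamma(a2) / math.gamma(a1 + a2 + r)
assert abs(num - formula) < 1e-5

# Inverted Dirichlet moment, n = 1: E[x^r] = Γ(a1+r)Γ(a2-r)/(Γ(a1)Γ(a2)).
a1, a2, r = 2.5, 4.0, 1.0           # needs a2 > r
c = math.gamma(a1 + a2) / (math.gamma(a1) * math.gamma(a2))
num = c * midpoint(lambda u: (u / (1 - u)) ** (a1 + r - 1)
                   * (1 + u / (1 - u)) ** (-(a1 + a2)) / (1 - u) ** 2)
formula = math.gamma(a1 + r) * math.gamma(a2 - r) / (math.gamma(a1) * math.gamma(a2))
assert abs(num - formula) < 1e-5
```

With these values the inverted-Dirichlet moment equals $\frac{\alpha_1}{\alpha_2-1}=\frac56$.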