An important property of the $\Gamma$ function is
$\Gamma(\alpha+1)=\alpha \Gamma(\alpha)$. Also note that, since the $Beta$ density integrates to one,
$\begin{equation}\begin{split}\displaystyle 1&=\int_0^1 Beta(x|\alpha, \beta)\, dx\\ \implies B(\alpha, \beta) &= \int_0^1 x^{\alpha-1}(1-x)^{\beta-1}\, dx \end{split}\end{equation}$
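As a quick sanity check (not part of the derivation), this identity can be verified numerically; the parameters $\alpha=2.5$, $\beta=4.0$ below are arbitrary illustrative choices and `scipy` supplies the reference value of $B(\alpha,\beta)$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import beta as beta_fn

a, b = 2.5, 4.0  # arbitrary illustrative parameters, not from the text

# Integrate the unnormalized Beta kernel x^(a-1) (1-x)^(b-1) over [0, 1]
kernel_integral, _ = quad(lambda x: x**(a - 1) * (1 - x)**(b - 1), 0, 1)

# The result should equal the Beta function B(a, b)
print(kernel_integral, beta_fn(a, b))
assert np.isclose(kernel_integral, beta_fn(a, b))
```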
Given that $X\sim Beta(\alpha, \beta)$, solve for the expected value
$\begin{equation}\begin{split}\displaystyle &\text{Write definition}\\ \mathbb{E}[X] &= \int_0^1 \frac{x^{\alpha} (1-x)^{\beta-1}}{B(\alpha, \beta)}\, dx\\ &\text{Rewrite using the beta function}\\ &= \frac{B(\alpha+1,\beta)}{B(\alpha,\beta)}\\ &\text{Rewrite using the gamma function and simplify}\\ &= \frac{\Gamma(\alpha+1)\Gamma(\alpha+\beta)}{\Gamma(\alpha+\beta+1)\Gamma(\alpha)}\\ &\text{Use the gamma property to simplify}\\ &= \frac{\alpha}{\alpha+\beta} \end{split}\end{equation}$
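A minimal numerical check of this result, again with the arbitrary parameters $\alpha=2.5$, $\beta=4.0$, comparing the closed form against `scipy`'s Beta mean and a Monte Carlo estimate:

```python
from scipy import stats

a, b = 2.5, 4.0  # arbitrary illustrative parameters

# Closed form alpha / (alpha + beta) vs. scipy's Beta mean and a sample mean
print(a / (a + b))                                                 # 0.3846...
print(stats.beta(a, b).mean())                                     # same value
print(stats.beta(a, b).rvs(size=500_000, random_state=0).mean())   # close to it
```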
Solve for $\mathbb{E}[X^2]$
$\begin{equation}\begin{split}\displaystyle &\text{Write definition}\\ \mathbb{E}[X^2] &= \int_0^1 \frac{x^{\alpha+1} (1-x)^{\beta-1}}{B(\alpha, \beta)}\, dx\\ &\text{Rewrite using the beta function}\\ &= \frac{B(\alpha+2,\beta)}{B(\alpha,\beta)}\\ &\text{Rewrite using the gamma function and simplify}\\ &= \frac{\Gamma(\alpha+2)\Gamma(\alpha+\beta)}{\Gamma(\alpha+\beta+2)\Gamma(\alpha)}\\ &\text{Use the gamma property to simplify}\\ &= \frac{\alpha^2+\alpha}{(\alpha+\beta)(\alpha+\beta+1)} \end{split}\end{equation}$
Solve for $\mathbb{V}ar[X]$
$\begin{equation}\begin{split}\displaystyle \mathbb{V}ar[X] &= \mathbb{E}[X^2] - \mathbb{E}[X]^2\\ &= \frac{\alpha^2+\alpha}{(\alpha+\beta)(\alpha+\beta+1)}-\frac{\alpha^2}{(\alpha+\beta)^2}\\ &= \frac{\alpha\beta}{(\alpha+\beta)^2(\alpha+\beta+1)} \end{split}\end{equation}$
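The second-moment and variance formulas can be checked the same way; this sketch assumes the same illustrative parameters and uses `scipy.stats.beta` as the reference:

```python
from scipy import stats

a, b = 2.5, 4.0  # arbitrary illustrative parameters
dist = stats.beta(a, b)

# Second raw moment E[X^2] vs. (alpha^2 + alpha) / ((alpha + beta)(alpha + beta + 1))
print(dist.moment(2), (a**2 + a) / ((a + b) * (a + b + 1)))

# Variance vs. alpha * beta / ((alpha + beta)^2 (alpha + beta + 1))
print(dist.var(), a * b / ((a + b)**2 * (a + b + 1)))
```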
The Dirichlet distribution is the multidimensional extension of the Beta distribution, with density
$Dir_m({\bf x}|\boldsymbol{\alpha}) = \frac{1}{B(\boldsymbol{\alpha})}\prod_{i=1}^m x_i^{\alpha_i-1}$ for ${\bf x}$ on the simplex ($x_i \geq 0$, $\sum_{i=1}^m x_i = 1$).
Let $A = \sum_{i=1}^m \alpha_i$
Where
$\begin{equation}\begin{split}\displaystyle B(\boldsymbol{\alpha}) &= B(\alpha_1,\cdots,\alpha_m)\\ &= \frac{1}{\Gamma(A)} \prod_{i=1}^m \Gamma(\alpha_i) \end{split}\end{equation}$
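To sanity-check this normalizer, the sketch below evaluates the Dirichlet log-density by hand from $B(\boldsymbol{\alpha})$ and compares it with `scipy.stats.dirichlet`; the parameters $\boldsymbol{\alpha}=(2,3,4)$ and the point on the simplex are arbitrary illustrative choices:

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import dirichlet

alpha = np.array([2.0, 3.0, 4.0])  # arbitrary illustrative parameters
A = alpha.sum()

# log B(alpha) = sum_i log Gamma(alpha_i) - log Gamma(A)
log_B = gammaln(alpha).sum() - gammaln(A)

# Evaluate the Dirichlet log-density by hand at a point on the simplex
x = np.array([0.2, 0.3, 0.5])
log_pdf_manual = np.sum((alpha - 1) * np.log(x)) - log_B

# scipy's implementation should agree
print(log_pdf_manual, dirichlet.logpdf(x, alpha))
assert np.isclose(log_pdf_manual, dirichlet.logpdf(x, alpha))
```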
The marginals of the Dirichlet distribution are Beta distributions. Claim: $X_i \sim Beta(\alpha_i, A-\alpha_i)$
$\begin{equation}\begin{split}\displaystyle &\text{Product rule of probability}\\ P(x_1,\cdots,x_m) &=P(x_i)P(x_1, \cdots, x_{i-1},x_{i+1}, \cdots, x_m | x_i)\\ &\text{Expand: the claim gives $P(x_i)$; given $x_i$, the remaining coordinates rescaled by $1-x_i$ are Dirichlet}\\ &=\frac{x_i^{\alpha_i-1}(1-x_i)^{A-\alpha_i-1}\Gamma(A)}{\Gamma(\alpha_i)\Gamma(A-\alpha_i)} \cdot \frac{\Gamma(A-\alpha_i)}{(1-x_i)^{A-\alpha_i-1}} \prod_{\substack{j=1\\ j\neq i}}^m \frac{x_j^{\alpha_j-1}}{\Gamma(\alpha_j)}\\ &\text{Simplify}\\ &= \Gamma(A)\prod_{j=1}^m \frac{x_j^{\alpha_j-1}}{\Gamma(\alpha_j)}\\ &\text{This recovers the original distribution, verifying the claim}\\ &= \frac{{\bf x}^{\boldsymbol{\alpha}-\vec{1}}}{B(\boldsymbol{\alpha})} \end{split}\end{equation}$
From the marginals, $\mathbb{E}[X_i]=\frac{\alpha_i}{A}$, so $\mathbb{E}[Dir_m(\boldsymbol{\alpha})] = \frac{\boldsymbol{\alpha}}{A}$
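The marginal claim can also be probed empirically: the sketch below draws Dirichlet samples (with the same arbitrary $\boldsymbol{\alpha}=(2,3,4)$) and runs a Kolmogorov-Smirnov test of the $i$-th coordinate against $Beta(\alpha_i, A-\alpha_i)$:

```python
import numpy as np
from scipy import stats

alpha = np.array([2.0, 3.0, 4.0])  # arbitrary illustrative parameters
A = alpha.sum()
i = 0                              # which coordinate's marginal to test

rng = np.random.default_rng(0)
samples = rng.dirichlet(alpha, size=100_000)

# KS test of X_i against Beta(alpha_i, A - alpha_i);
# for a correct claim the p-value should typically not be small
ks = stats.kstest(samples[:, i], stats.beta(alpha[i], A - alpha[i]).cdf)
print(ks.pvalue)

# Marginal mean vs. alpha_i / A
print(samples[:, i].mean(), alpha[i] / A)
```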
Variance
$\mathbb{V}ar[X_i]=\frac{\alpha_i (A-\alpha_i)}{A^2(A+1)}$
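A quick Monte Carlo check of this variance formula, under the same illustrative parameters:

```python
import numpy as np

alpha = np.array([2.0, 3.0, 4.0])  # arbitrary illustrative parameters
A = alpha.sum()
i = 0

rng = np.random.default_rng(0)
samples = rng.dirichlet(alpha, size=500_000)

# Empirical variance of X_i vs. alpha_i (A - alpha_i) / (A^2 (A + 1))
print(samples[:, i].var())
print(alpha[i] * (A - alpha[i]) / (A**2 * (A + 1)))
```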
Solve for $\mathbb{E}[X_i X_j]$
$\begin{equation}\begin{split}\displaystyle &\text{Write definition}\\ \mathbb{E}[X_i X_j] &= \int_0^1 \int_0^{1-x_i} x_i x_j P(x_i , x_j ) dx_j dx_i\\ &\text{Expand: the pair marginal is $Dir(\alpha_i, \alpha_j, A-\alpha_i-\alpha_j)$}\\ &= \int_0^1 \int_0^{1-x_i} \frac{x_i^{\alpha_i} x_j^{\alpha_j} (1-x_i - x_j)^{A-\alpha_i-\alpha_j-1}}{B(\alpha_i, \alpha_j, A-\alpha_i-\alpha_j)} dx_j dx_i\\ &\text{Rewrite using the $B$ function}\\ &= \frac{B(\alpha_i+1, \alpha_j+1,A-\alpha_i -\alpha_j)}{B(\alpha_i, \alpha_j, A-\alpha_i-\alpha_j)}\\ &\text{Simplify}\\ &= \frac{\alpha_i \alpha_j}{A(A+1)} \end{split}\end{equation}$
Solve for $\mathbb{C}ov[X_i, X_j]$
$\begin{equation}\begin{split}\displaystyle &\text{Write definition}\\ \mathbb{C}ov[X_i, X_j] &= \mathbb{E}[X_i X_j] - \mathbb{E}[X_i]\mathbb{E}[X_j]\\ &\text{Expand definition}\\ &= \frac{\alpha_i \alpha_j}{A(A+1)} - \frac{\alpha_i \alpha_j}{A^2}\\ &\text{Simplify}\\ &= - \frac{\alpha_i \alpha_j}{A^2(A+1)} \end{split}\end{equation}$
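Finally, a Monte Carlo check of $\mathbb{E}[X_i X_j]$ and the covariance, again with the arbitrary $\boldsymbol{\alpha}=(2,3,4)$:

```python
import numpy as np

alpha = np.array([2.0, 3.0, 4.0])  # arbitrary illustrative parameters
A = alpha.sum()
i, j = 0, 1

rng = np.random.default_rng(0)
samples = rng.dirichlet(alpha, size=500_000)

# E[X_i X_j] vs. alpha_i alpha_j / (A (A + 1))
print((samples[:, i] * samples[:, j]).mean(), alpha[i] * alpha[j] / (A * (A + 1)))

# Cov[X_i, X_j] vs. -alpha_i alpha_j / (A^2 (A + 1))
print(np.cov(samples[:, i], samples[:, j])[0, 1], -alpha[i] * alpha[j] / (A**2 * (A + 1)))
```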