Applied Math Seminar
Winter Quarter 2009
3:15 p.m.
Sloan Mathematics Corner
Building 380, Room 380-C
Friday, March 6, 2009
Lek-Heng Lim
Mathematics, University of California at Berkeley
Analysis of cumulants
Abstract:
Statistical modeling has been getting some bad press recently, faulted
by the media for having precipitated the present global financial
crisis. The New York Times blamed the crisis on the use of Value-at-Risk
to measure risk exposure, while Wired Magazine blamed it on the use of
copulas to price CDOs. Without endorsing these views, we will argue
that the common problem being highlighted is the lack of rigorous
techniques for analyzing non-Gaussian multivariate data.
We will see that a Gaussian assumption is equivalent to ignoring terms
beyond quadratic in a multivariate power series whose coefficients are
symmetric tensors known as cumulants. In the univariate case, these are
scalar-valued, and the first four are well known: mean, variance,
skewness, and kurtosis. In the multivariate case, the mean and
covariance are vector- and matrix-valued and may be effectively
analyzed using linear algebra. However, higher-order multivariate
cumulants are hypermatrix-valued, and there are no well-known methods
for analyzing them.
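(As a brief aside, not part of the original abstract: the power series
referred to above is the standard cumulant generating function,

    K_X(t) = \log \mathbb{E}\, e^{\langle t, X \rangle}
           = \langle \kappa_1, t \rangle + \tfrac{1}{2}\,\kappa_2(t,t)
           + \tfrac{1}{3!}\,\kappa_3(t,t,t) + \cdots,

whose order-d coefficient \kappa_d is a symmetric tensor: \kappa_1 is
the mean vector, \kappa_2 the covariance matrix, and \kappa_3, \kappa_4,
\ldots the higher-order cumulants. Assuming X is Gaussian amounts to
setting \kappa_d = 0 for all d \ge 3, i.e. truncating the series after
the quadratic term.)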
We will discuss two ways to study symmetric hypermatrices akin to the
spectral theorem for symmetric matrices: (1) decomposing a homogeneous
polynomial into a linear combination of powers of linear forms; (2)
decomposing a symmetric tensor into a multilinear combination of
points on a Stiefel manifold. Both decompositions have beautiful
underlying geometries: (1) secant varieties of the Veronese; (2)
symmetric subspace varieties. We then propose a PCA-like technique for
analyzing cumulants: it identifies "principal components" that
simultaneously account for variation in all cumulants via optimization
over a single Grassmannian.
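(For orientation, and in our own notation rather than the speaker's:
for a symmetric tensor \kappa of order d on R^n, the two decompositions
take roughly the forms

    (1)  \kappa = \sum_{i=1}^{r} \lambda_i\, v_i^{\otimes d},
         equivalently  p(x) = \sum_{i=1}^{r} \lambda_i \langle v_i, x \rangle^{d}
         for the associated homogeneous polynomial p;

    (2)  \kappa = (Q, Q, \ldots, Q) \cdot C,  with Q an n-by-k matrix with
         orthonormal columns (a point on a Stiefel manifold) and C a smaller
         symmetric core tensor of order d.

In (2) only the column space of Q matters, i.e. a point on a
Grassmannian, which is what plays the role of the "principal
components" fitted simultaneously to all cumulants.)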
This is joint work with Jason Morton.