Feature significance for multivariate kernel density estimation
- Publication Type: Journal Article
- Citation: Computational Statistics and Data Analysis, 2008, 52 (9), pp. 4225-4242
- Issue Date: 2008-05-15
Closed Access
| Filename | Description | Size |
|---|---|---|
| 2010000264OK.pdf | | 1.34 MB |
This item is closed access and not available.
Multivariate kernel density estimation provides information about structure in data. Feature significance is a technique for deciding whether features, such as local extrema, are statistically significant. This paper proposes a framework for feature significance in d-dimensional data which combines kernel density derivative estimators and hypothesis tests for modal regions. For the gradient and curvature estimators, distributional properties are given and pointwise test statistics are derived. The hypothesis tests extend the two-dimensional feature significance ideas of Godtliebsen et al. [Godtliebsen, F., Marron, J.S., Chaudhuri, P., 2002. Significance in scale space for bivariate density estimation. Journal of Computational and Graphical Statistics 11, 1-21]. The theoretical framework is complemented by novel visualization for three-dimensional data. Applications to real data sets show that tests based on the kernel curvature estimators perform well in identifying modal regions. These results can be enhanced by corresponding tests with kernel gradient estimators. © 2008 Elsevier Ltd. All rights reserved.
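To make the abstract's central idea concrete, the sketch below illustrates, in broad strokes, how a kernel density estimate and its derivatives can flag candidate modal regions: points where the estimated gradient is near zero and the estimated curvature (Hessian) is negative definite. This is a minimal illustration only, not the paper's method; it uses numerical differencing of a standard `scipy` kernel density estimate on synthetic two-dimensional data, and it does not reproduce the paper's density derivative estimators, pointwise test statistics, or significance thresholds.

```python
# Illustrative sketch only: NOT the paper's feature-significance procedure.
# It shows the general idea of locating candidate modal regions from a kernel
# density estimate and its (numerically approximated) gradient and curvature.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic bimodal 2-d sample (hypothetical data, for illustration only).
data = np.vstack([
    rng.normal([-1.5, 0.0], 0.5, size=(200, 2)),
    rng.normal([1.5, 0.5], 0.5, size=(200, 2)),
]).T  # gaussian_kde expects an array of shape (d, n)

kde = gaussian_kde(data)

# Evaluate the density estimate on a regular grid.
xs = np.linspace(-4.0, 4.0, 101)
ys = np.linspace(-3.0, 3.0, 101)
X, Y = np.meshgrid(xs, ys, indexing="ij")
f_hat = kde(np.vstack([X.ravel(), Y.ravel()])).reshape(X.shape)

# Numerical gradient and second derivatives of the estimated density surface.
dfdx, dfdy = np.gradient(f_hat, xs, ys)
d2fdx2 = np.gradient(dfdx, xs, axis=0)
d2fdy2 = np.gradient(dfdy, ys, axis=1)
d2fdxdy = np.gradient(dfdx, ys, axis=1)

# Flag grid points where the gradient is near zero and the Hessian is
# negative definite (negative second derivative and positive determinant),
# i.e. candidate modal regions. The threshold 0.01 is an arbitrary choice
# for this toy example, not a significance level.
grad_norm = np.hypot(dfdx, dfdy)
hess_det = d2fdx2 * d2fdy2 - d2fdxdy ** 2
candidate = (grad_norm < 0.01) & (d2fdx2 < 0) & (hess_det > 0)

print(f"Grid points flagged as candidate modal regions: {candidate.sum()}")
```

In the paper, the analogous step replaces these ad hoc thresholds with formal hypothesis tests built on the distributional properties of the kernel gradient and curvature estimators.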