(2007) Challenges for computational intelligence, Dordrecht, Springer.

A trend on regularization and model selection in statistical learning

A Bayesian Ying-Yang learning perspective

Lei Xu

pp. 365-406

This chapter summarizes advances on regularization and model selection in statistical learning and discusses a trend from a Bayesian Ying-Yang learning perspective. After a brief introduction to the Bayesian Ying-Yang system and best harmony learning, the chapter addresses its advantages of automatic model selection and of integrating regularization with model selection, and also elaborates its differences from, and relations to, several existing typical learning methods. Taking Gaussian mixture, local subspaces, and local factor analysis as example tasks, detailed model selection criteria are given, together with a general learning procedure that unifies the adaptive algorithms featuring automatic model selection for these tasks. Finally, a trend of studies on model selection (i.e., automatic model selection during parametric learning) is further elaborated, and several theoretical issues for the large-sample-size case and a number of challenges for the small-sample-size case are presented.

Publication details

DOI: 10.1007/978-3-540-71984-7_14

Full citation:

Xu, L. (2007). A trend on regularization and model selection in statistical learning: a Bayesian Ying-Yang learning perspective, in W. Duch & J. Mańdziuk (eds.), Challenges for computational intelligence, Dordrecht, Springer, pp. 365-406.
