DATE: | Tuesday, Dec 8, 2009 |
TIME: | 3:30 pm |
PLACE: | Council Room (SITE 5-084) |
TITLE: | Cascading Customized Classification: a Bayesian Boosting technique |
PRESENTER: | John Li, University of Ottawa |
ABSTRACT:
Bayesian learning techniques include Bayesian Networks (BNs), Naïve Bayes (NB), and NB-like One-Dependence Estimator (ODE) classifiers such as AODE and HNB. Because learning an optimal Bayesian Network structure is an NP-hard problem, NB and the NB-like classifiers simplify Bayesian learning: NB assumes conditional independence among the attributes given the class, while the ODE classifiers allow each attribute only one additional parent. As a result, NB and NB-like classifiers offer advantages such as linear time complexity and competitive performance compared with other classification models. However, our research shows that NB and these NB-like classifiers can still perform poorly, with respect to either prediction accuracy or class ranking, in cases where the class distributions are highly skewed, a situation known as the Class Imbalance Problem (CIP). To address this problem, we investigated whether these individual classifiers can be further enhanced by applying a meta-learning technique. Unfortunately, current meta-learning techniques such as Adaptive Boosting (AdaBoost), Bootstrap Aggregating (Bagging), and MultiBoosting are ineffective at improving NB and NB-like classifiers. In this talk, we present a new algorithm called Cascading Customized Classification (CCC) and show that CCC can effectively enhance NB and NB-like classifiers for both classification and class ranking, while retaining linear time complexity, by building only a few Customized Classifiers (CCs).
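For readers unfamiliar with the baseline being enhanced, the following is a minimal sketch of Naïve Bayes under the conditional-independence assumption the abstract describes; it is an illustration only, not the CCC algorithm of the talk, and the function names, Laplace smoothing parameter, and toy weather data are all assumptions introduced here. Training is a single counting pass and prediction is linear in the number of attributes, which is the source of NB's linear time complexity:

```python
from collections import Counter, defaultdict
import math

def train_nb(X, y, alpha=1.0):
    """One counting pass over the data: class priors and per-attribute counts."""
    class_counts = Counter(y)
    attr_counts = Counter()        # attr_counts[(class, attr_index, value)]
    values = defaultdict(set)      # distinct values per attribute, for smoothing
    for xi, c in zip(X, y):
        for j, v in enumerate(xi):
            attr_counts[(c, j, v)] += 1
            values[j].add(v)
    return class_counts, attr_counts, values, alpha, len(y)

def predict_nb(model, x):
    """P(c | x) is proportional to P(c) * prod_j P(x_j | c); pick the argmax class."""
    class_counts, attr_counts, values, alpha, n = model
    best, best_lp = None, -math.inf
    for c, cc in class_counts.items():
        lp = math.log(cc / n)                          # log prior
        for j, v in enumerate(x):
            num = attr_counts[(c, j, v)] + alpha       # Laplace smoothing
            den = cc + alpha * len(values[j])
            lp += math.log(num / den)                  # log likelihood per attribute
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy categorical data: (outlook, windy) -> play
X = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"),
     ("rain", "yes"), ("rain", "no")]
y = ["yes", "no", "yes", "no", "yes"]
model = train_nb(X, y)
print(predict_nb(model, ("rain", "no")))   # -> yes
```

Note how skewed class counts directly shrink the log-prior term for the minority class, which is one intuition for why NB's accuracy and class ranking degrade under the Class Imbalance Problem discussed in the talk.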