Download English ISI Article No. 29152
Persian Translation of the Article Title

Learning Bayesian network classifiers by risk minimization

English Title
Learning Bayesian network classifiers by risk minimization
Article Code: 29152
Publication Year: 2012
English Article Length: 25 pages (PDF)
Source

Publisher : Elsevier - Science Direct

Journal : International Journal of Approximate Reasoning, Volume 53, Issue 2, February 2012, Pages 248–272

Translated Keywords
Bayesian networks - Classification - Probabilistic graphical models - Structure learning
English Keywords
Bayesian networks, Classification, Probabilistic graphical models, Structure learning
Article Preview

English Abstract

Bayesian networks (BNs) provide a powerful graphical model for encoding the probabilistic relationships among a set of variables, and hence can naturally be used for classification. However, Bayesian network classifiers (BNCs) learned in the common way using likelihood scores usually achieve only mediocre classification accuracy, because these scores are not tailored to classification but rather to a general inference problem. We propose risk minimization by cross validation (RMCV) using the 0/1 loss function, a classification-oriented score for unrestricted BNCs. RMCV extends the classification-oriented scores commonly used in learning restricted BNCs and non-BN classifiers. Using small real and synthetic problems that allow learning over all possible graphs, we empirically demonstrate RMCV's superiority over marginal and class-conditional likelihood-based scores with respect to classification accuracy. Experiments on twenty-two real-world datasets show that BNCs learned with an RMCV-based algorithm significantly outperform the naive Bayesian classifier (NBC), the tree-augmented NBC (TAN), and other BNCs learned with marginal or conditional likelihood scores, and are on par with state-of-the-art non-BN classifiers such as the support vector machine, neural network, and classification tree. These experiments also show that an optimized version of RMCV is faster than all other unrestricted BNCs and comparable to the neural network in run-time. The main conclusion from our experiments is that unrestricted BNCs, when learned properly, can be a good alternative to restricted BNCs and traditional machine-learning classifiers with respect to both accuracy and efficiency.
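
The abstract describes RMCV as scoring a candidate BN classifier by its cross-validated 0/1 loss rather than by a likelihood score. The sketch below illustrates only that scoring idea and is not the paper's implementation: the DiscreteNBC class and the rmcv_score function are illustrative assumptions, with a simple naive Bayes model standing in for a classifier induced by one candidate network structure. In the paper's setting, a structure search would call such a score once per candidate unrestricted structure and keep the structure with the lowest cross-validated error.

```python
# Minimal sketch of cross-validated 0/1-loss scoring (the RMCV idea),
# assuming discrete (integer-coded) features. DiscreteNBC is a placeholder
# classifier, not the paper's unrestricted BN learner.
import numpy as np


class DiscreteNBC:
    """Naive Bayes over discrete features; stands in for a classifier
    obtained from one candidate network structure."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_ = {c: np.mean(y == c) for c in self.classes_}
        self.cond_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            # Laplace-smoothed P(feature j = v | class c) for each feature.
            self.cond_[c] = [
                {v: (np.sum(Xc[:, j] == v) + 1.0) /
                    (len(Xc) + len(np.unique(X[:, j])))
                 for v in np.unique(X[:, j])}
                for j in range(X.shape[1])
            ]
        return self

    def predict(self, X):
        preds = []
        for x in X:
            scores = {}
            for c in self.classes_:
                logp = np.log(self.priors_[c])
                for j, v in enumerate(x):
                    # Small floor for values unseen during training.
                    logp += np.log(self.cond_[c][j].get(v, 1e-6))
                scores[c] = logp
            preds.append(max(scores, key=scores.get))
        return np.array(preds)


def rmcv_score(make_classifier, X, y, k=5, seed=0):
    """Average 0/1 loss over k cross-validation folds (lower is better)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    losses = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        clf = make_classifier().fit(X[train], y[train])
        losses.append(np.mean(clf.predict(X[fold]) != y[fold]))
    return float(np.mean(losses))


# Usage: score = rmcv_score(DiscreteNBC, X, y, k=5)
# A structure-search loop would compute this score for each candidate
# structure and retain the candidate with the smallest score.
```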