Efficient Approximation of the Conditional Relative Entropy with Applications to Discriminative Learning of Bayesian Network Classifiers

Abstract
We propose a minimum-variance unbiased approximation to the conditional relative entropy of the distribution induced by the observed frequency estimates, for multi-class classification tasks. This approximation extends a decomposable scoring criterion, the approximate conditional log-likelihood (aCLL), used primarily for discriminative learning of augmented Bayesian network classifiers. Our contribution is twofold: (i) the approximation addresses multi-class tasks, not only binary classification; and (ii) it covers broader stochastic assumptions than a uniform distribution over the parameters. Specifically, we consider a Dirichlet distribution over the parameters, which is experimentally shown to yield a very good approximation to the CLL. In addition, for Bayesian network classifiers, a closed-form expression is found for the parameters that maximize the scoring criterion.
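The quantity being approximated here, the conditional log-likelihood (CLL), is the sum of log-posteriors of the observed class labels under the classifier's parameters. The sketch below computes the CLL of a naive Bayes classifier whose parameters are the observed frequency estimates (with add-one smoothing, an assumption of this sketch, to avoid zero probabilities). It is a minimal illustration of the CLL score itself, not the authors' aCLL criterion or their Dirichlet-based approximation.

```python
import math
from collections import Counter, defaultdict

def cll(data):
    """Conditional log-likelihood sum_i log P(c_i | x_i) for a naive Bayes
    classifier parameterized by observed frequency estimates.

    `data` is a list of (feature_tuple, class_label) pairs.
    Add-one smoothing is used so every probability is strictly positive.
    """
    classes = sorted({c for _, c in data})
    n_feats = len(data[0][0])
    # Distinct values seen for each feature (needed for smoothing denominators).
    feat_vals = [sorted({x[j] for x, _ in data}) for j in range(n_feats)]

    class_count = Counter(c for _, c in data)
    cond_count = defaultdict(int)  # (feature index, value, class) -> count
    for x, c in data:
        for j, v in enumerate(x):
            cond_count[(j, v, c)] += 1

    def joint_log(x, c):
        # log P(c) + sum_j log P(x_j | c), with add-one smoothing.
        lp = math.log((class_count[c] + 1) / (len(data) + len(classes)))
        for j, v in enumerate(x):
            lp += math.log((cond_count[(j, v, c)] + 1)
                           / (class_count[c] + len(feat_vals[j])))
        return lp

    total = 0.0
    for x, c in data:
        # log P(c | x) = log P(x, c) - log sum_k P(x, k), via log-sum-exp.
        logs = [joint_log(x, k) for k in classes]
        m = max(logs)
        log_norm = m + math.log(sum(math.exp(l - m) for l in logs))
        total += joint_log(x, c) - log_norm
    return total
```

Each term log P(c_i | x_i) is negative under smoothing, so the CLL is always below zero; discriminative structure learning searches for the network that drives this score as close to zero as possible, which is what makes an efficient, decomposable approximation such as aCLL valuable.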
Reference Key: mateus2013entropyefficient (use this key to autocite in the manuscript when using the SciMatic Manuscript Manager or Thesis Manager)
Authors: Paulo Mateus, Pedro Adão, Alexandra M. Carvalho
Journal: Entropy
Year: 2013
DOI: 10.3390/e15072716

Citations

No citations found.