Approximated Information Analysis in Bayesian Inference
Abstract
In models with nuisance parameters, Bayesian procedures based on Markov chain Monte Carlo (MCMC) methods have been developed to approximate the posterior distribution of the parameter of interest. Because these procedures require burdensome MCMC computations, the quality of the approximation and the convergence of the chain are important issues. In this paper, we explore Gibbs sensitivity by using an alternative to the full conditional distribution of the nuisance parameter. The approximate sensitivity of the posterior distribution of interest is studied in terms of information measures, including the Kullback–Leibler divergence. As an illustration, we then apply these results to simple spatial model settings.
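The abstract's core idea — running a Gibbs sampler with an approximated full conditional for the nuisance parameter and gauging the effect on the posterior of interest via the Kullback–Leibler divergence — can be illustrated with a small sketch. The Python example below is an illustrative toy, not the authors' model or approximation: it assumes a conjugate normal model with unknown mean (the parameter of interest) and variance (the nuisance), swaps the exact inverse-gamma conditional for a hypothetical moment-matched log-normal, and estimates the KL divergence between the two resulting marginal posteriors with a crude histogram estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(mu, s2) with mu the parameter of interest and
# s2 the nuisance. Conjugate priors (illustrative choices, not from
# the paper): mu ~ N(m0, v0), s2 ~ Inv-Gamma(a0, b0).
y = rng.normal(2.0, 1.5, size=50)
n, ybar = len(y), y.mean()
m0, v0, a0, b0 = 0.0, 100.0, 2.0, 2.0

def gibbs(approx, n_iter=5000, burn=1000):
    """Gibbs sampler; approx=True replaces the exact Inv-Gamma
    conditional of s2 with a moment-matched log-normal stand-in."""
    s2 = y.var()
    draws = np.empty(n_iter)
    for t in range(n_iter):
        # Exact full conditional of mu | s2, y (normal-normal update).
        v = 1.0 / (n / s2 + 1.0 / v0)
        m = v * (n * ybar / s2 + m0 / v0)
        mu = rng.normal(m, np.sqrt(v))
        # Full conditional of s2 | mu, y is Inv-Gamma(a, b).
        a = a0 + n / 2.0
        b = b0 + 0.5 * np.sum((y - mu) ** 2)
        if approx:
            # Hypothetical alternative conditional: a log-normal with
            # the same mean and variance as Inv-Gamma(a, b) (needs a > 2).
            mean = b / (a - 1.0)
            var = b**2 / ((a - 1.0) ** 2 * (a - 2.0))
            sig2 = np.log(1.0 + var / mean**2)
            s2 = rng.lognormal(np.log(mean) - sig2 / 2.0, np.sqrt(sig2))
        else:
            s2 = 1.0 / rng.gamma(a, 1.0 / b)
        draws[t] = mu
    return draws[burn:]

def kl_hist(p_draws, q_draws, bins=40):
    """Crude histogram estimate of KL(p || q) from two sample sets."""
    edges = np.linspace(min(p_draws.min(), q_draws.min()),
                        max(p_draws.max(), q_draws.max()), bins + 1)
    p, _ = np.histogram(p_draws, edges, density=True)
    q, _ = np.histogram(q_draws, edges, density=True)
    w = np.diff(edges)
    ok = (p > 0) & (q > 0)
    return float(np.sum(w[ok] * p[ok] * np.log(p[ok] / q[ok])))

draws_exact = gibbs(approx=False)
draws_approx = gibbs(approx=True)
print(f"KL(exact || approx) ~ {kl_hist(draws_exact, draws_approx):.4f}")
```

A small KL value here would suggest that the substituted conditional leaves the marginal posterior of the parameter of interest nearly unchanged, which is the kind of sensitivity the paper quantifies with information measures.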
| Reference Key | seo2015entropyapproximated |
|---|---|
| Authors | Jung In Seo; Yongku Kim |
| Journal | Entropy |
| Year | 2015 |
| DOI | 10.3390/e17031441 |
| URL | |
| Keywords | |

Use this key to autocite in the manuscript while using SciMatic Manuscript Manager or Thesis Manager.