

Uploaded: 2014-05-06





Doing Bayesian Data Analysis: A Tutorial with R and BUGS



Document format: PDF | Views: 17 | Uploaded: 2014-05-06 22:54:50
Doing Bayesian Data Analysis: A Tutorial with R and BUGS
John K. Kruschke
Draft of May 11, 2010. Please do not circulate this preliminary draft. If you report Bayesian analyses based on this book, please do cite it!
Copyright © 2010 by John K. Kruschke.

Dedicated to my mother, Marilyn A. Kruschke, and to the memory of my father, Earl R. Kruschke, who both brilliantly exemplified and taught sound reasoning. And, in honor of my father, who dedicated his first book to his children, I also dedicate this book to mine: Claire A. Kruschke and Loren D. Kruschke

Contents

1  This Book's Organization: Read Me First!
   1.1  Real people can read this book
   1.2  Prerequisites
   1.3  The organization of this book
        1.3.1  What are the essential chapters?
        1.3.2  Where's the equivalent of traditional test X in this book?
   1.4  Gimme feedback (be polite)
   1.5  Acknowledgments

Part I: The Basics: Parameters, Probability, Bayes' Rule, and R

2  Introduction: Models we believe in
   2.1  Models of observations and models of beliefs
        2.1.1  Models have parameters
        2.1.2  Prior and posterior beliefs
   2.2  Three goals for inference from data
        2.2.1  Estimation of parameter values
        2.2.2  Prediction of data values
        2.2.3  Model comparison
   2.3  The R programming language
        2.3.1  Getting and installing R
        2.3.2  Invoking R and using the command line
        2.3.3  A simple example of R in action
        2.3.4  Getting help in R
        2.3.5  Programming in R
               2.3.5.1  Editing programs in R
               2.3.5.2  Variable names in R
               2.3.5.3  Running a program
   2.4  Exercises

3  What is this stuff called probability?
   3.1  The set of all possible events
        3.1.1  Coin flips: Why you should care
   3.2  Probability: Outside or inside the head
        3.2.1  Outside the head: Long-run relative frequency
               3.2.1.1  Simulating a long-run relative frequency
               3.2.1.2  Deriving a long-run relative frequency
        3.2.2  Inside the head: Subjective belief
               3.2.2.1  Calibrating a subjective belief by preferences
               3.2.2.2  Describing a subjective belief mathematically
        3.2.3  Probabilities assign numbers to possibilities
   3.3  Probability distributions
        3.3.1  Discrete distributions: Probability mass
        3.3.2  Continuous distributions: Rendezvous with density†
               3.3.2.1  Properties of probability density functions
               3.3.2.2  The normal probability density function
        3.3.3  Mean and variance of a distribution
               3.3.3.1  Mean as minimized variance
        3.3.4  Variance as uncertainty in beliefs
        3.3.5  Highest density interval (HDI)
   3.4  Two-way distributions
        3.4.1  Marginal probability
        3.4.2  Conditional probability
        3.4.3  Independence of attributes
   3.5  R code
        3.5.1  R code for Figure 3.1
        3.5.2  R code for Figure 3.3
   3.6  Exercises

4  Bayes' Rule
   4.1  Bayes' rule
        4.1.1  Derived from definitions of conditional probability
        4.1.2  Intuited from a two-way discrete table
        4.1.3  The denominator as an integral over continuous values
   4.2  Applied to models and data
        4.2.1  Data order invariance
        4.2.2  An example with coin flipping
               4.2.2.1  p(D|θ) is not θ
   4.3  The three goals of inference
        4.3.1  Estimation of parameter values
        4.3.2  Prediction of data values
        4.3.3  Model comparison
        4.3.4  Why Bayesian inference can be difficult
        4.3.5  Bayesian reasoning in everyday life
               4.3.5.1  Holmesian deduction
               4.3.5.2  Judicial exoneration
   4.4  R code
        4.4.1  R code for Figure 4.1
   4.5  Exercises

Part II: All the Fundamentals Applied to Inferring a Binomial Proportion

5  Inferring a Binomial Proportion via Exact Mathematical Analysis
   5.1  The likelihood function: Bernoulli distribution
   5.2  A description of beliefs: The beta distribution
        5.2.1  Specifying a beta prior
        5.2.2  The posterior beta
   5.3  Three inferential goals
        5.3.1  Estimating the binomial proportion
        5.3.2  Predicting data
        5.3.3  Model comparison
               5.3.3.1  Is the best model a good model?
   5.4  Summary: How to do Bayesian inference
   5.5  R code
        5.5.1  R code for Figure 5.2
   5.6  Exercises

6  Inferring a Binomial Proportion via Grid Approximation
   6.1  Bayes' rule for discrete values of θ
   6.2  Discretizing a continuous prior density
        6.2.1  Examples using discretized priors
   6.3  Estimation
   6.4  Prediction of subsequent data
   6.5  Model comparison
   6.6  Summary
   6.7  R code
        6.7.1  R code for Figure 6.2 etc.
   6.8  Exercises

7  Inferring a Binomial Proportion via the Metropolis Algorithm
   7.1  A simple case of the Metropolis algorithm
        7.1.1  A politician stumbles upon the Metropolis algorithm
        7.1.2  A random walk
        7.1.3  General properties of a random walk
        7.1.4  Why we care
        7.1.5  Why it works
   7.2  The Metropolis algorithm more generally
        7.2.1  "Burn-in," efficiency, and convergence
        7.2.2  Terminology: Markov chain Monte Carlo
   7.3  From the sampled posterior to the three goals
        7.3.1  Estimation
               7.3.1.1  Highest density intervals from random samples
               7.3.1.2  Using a sample to estimate an integral
        7.3.2  Prediction
        7.3.3  Model comparison: Estimation of p(D)
   7.4  MCMC in BUGS
        7.4.1  Parameter estimation with BUGS
        7.4.2  BUGS for prediction
        7.4.3  BUGS for model comparison
   7.5  Conclusion
   7.6  R code
        7.6.1  R code for a home-grown Metropolis algorithm
   7.7  Exercises

8  Inferring Two Binomial Proportions via Gibbs Sampling
   8.1  Prior, likelihood and posterior for two proportions
   8.2  The posterior via exact formal analysis
   8.3  The posterior via grid approximation
   8.4  The posterior via Markov chain Monte Carlo
        8.4.1  Metropolis algorithm
        8.4.2  Gibbs sampling
               8.4.2.1  Disadvantages of Gibbs sampling
   8.5  Doing it with BUGS
        8.5.1  Sampling the prior in BUGS
   8.6  How different are the underlying biases?
   8.7  Summary
   8.8  R code
        8.8.1  R code for grid approximation (Figures 8.1 and 8.2)
        8.8.2  R code for Metropolis sampler (Figure 8.3)
        8.8.3  R code for BUGS sampler (Figure 8.6)
        8.8.4  R code for plotting a posterior histogram
   8.9  Exercises

9  Bernoulli Likelihood with Hierarchical Prior
   9.1  A single coin from a single mint
        9.1.1  Posterior via grid approximation
   9.2  Multiple coins from a single mint
        9.2.1  Posterior via grid approximation
        9.2.2  Posterior via Monte Carlo sampling
               9.2.2.1  Doing it with BUGS
        9.2.3  Outliers and shrinkage of individual estimates
        9.2.4  Case study: Therapeutic touch
        9.2.5  Number of coins and flips per coin
   9.3  Multiple coins from multiple mints
        9.3.1  Independent mints
        9.3.2  Dependent mints
        9.3.3  Individual differences and meta-analysis
   9.4  Summary
   9.5  R code
        9.5.1  Code for analysis of therapeutic-touch experiment
        9.5.2  Code for analysis of filtration-condensation experiment
   9.6  Exercises

10  Hierarchical modeling and model comparison
    10.1  Model comparison as hierarchical modeling
    10.2  Model comparison in BUGS
          10.2.1  A simple example
          10.2.2  A realistic example with "pseudopriors"
          10.2.3  Some practical advice when using transdimensional MCMC with pseudopriors
    10.3  Model comparison and nested models
    10.4  Review of hierarchical framework for model comparison
          10.4.1  Comparing methods for MCMC model comparison
          10.4.2  Summary and caveats
    10.5  Exercises

11  Null Hypothesis Significance Testing
    11.1  NHST for the bias of a coin
          11.1.1  When the experimenter intends to fix N
          11.1.2  When the experimenter intends to fix z
          11.1.3  Soul searching
          11.1.4  Bayesian analysis
    11.2  Prior knowledge about the coin
          11.2.1  NHST analysis
          11.2.2  Bayesian analysis
                  11.2.2.1  Priors are overt and should influence
    11.3  Confidence interval and highest density interval
          11.3.1  NHST confidence interval
          11.3.2  Bayesian HDI
    11.4  Multiple comparisons
          11.4.1  NHST correction for experimentwise error
          11.4.2  Just one Bayesian posterior no matter how you look at it
          11.4.3  How Bayesian analysis mitigates false alarms
    11.5  What a sampling distribution is good for
          11.5.1  Planning an experiment
          11.5.2  Exploring model predictions (posterior predictive check)
    11.6  Exercises

12  Bayesian Approaches to Testing a Point ("Null") Hypothesis
    12.1  The estimation (single prior) approach
          12.1.1  Is a null value of a parameter among the credible values?
          12.1.2  Is a null value of a difference among the credible values?
                  12.1.2.1  Differences of correlated parameters
          12.1.3  Region of Practical Equivalence (ROPE)
    12.2  The model-comparison (two-prior) approach
          12.2.1  Are the biases of two coins equal or not?
                  12.2.1.1  Formal analytical solution
                  12.2.1.2  Example application
          12.2.2  Are different groups equal or not?
    12.3  Estimation or model comparison?
          12.3.1  What is the probability that the null value is true?
          12.3.2  Recommendations
    12.4  R code
          12.4.1  R code for Figure 12.5
    12.5  Exercises

13  Goals, Power, and Sample Size
    13.1  The Will to Power
          13.1.1  Goals and Obstacles
          13.1.2  Power
          13.1.3  Sample Size
          13.1.4  Other Expressions of Goals
    13.2  Sample size for a single coin
          13.2.1  When the goal is to exclude a null value
          13.2.2  When the goal is precision
    13.3  Sample size for multiple mints
    13.4  Power: prospective, retrospective, and replication
          13.4.1  Power analysis requires verisimilitude of simulated data
    13.5  The importance of planning
    13.6  R code
          13.6.1  Sample size for a single coin
          13.6.2  Power and sample size for multiple mints
    13.7  Exercises

Part III: The Generalized Linear Model

14  Overview of the Generalized Linear Model
    14.1  The generalized linear model (GLM)
          14.1.1  Predictor and predicted variables
          14.1.2  Scale types: metric, ordinal, nominal
          14.1.3  Linear function of a single metric predictor
                  14.1.3.1  Reparameterization to x threshold form
          14.1.4  Additive combination of metric predictors
                  14.1.4.1  Reparameterization to x threshold form
          14.1.5  Nonadditive interaction of metric predictors
          14.1.6  Nominal predictors
                  14.1.6.1  Linear model for a single nominal predictor
                  14.1.6.2  Additive combination of nominal predictors
                  14.1.6.3  Nonadditive interaction of nominal predictors
          14.1.7  Linking combined predictors to the predicted
                  14.1.7.1  The sigmoid (a.k.a. logistic) function
                  14.1.7.2  The cumulative normal (a.k.a. Phi) function
          14.1.8  Probabilistic prediction
          14.1.9  Formal expression of the GLM
    14.2  Cases of the GLM
          14.2.1  Two or more nominal variables predicting frequency
    14.3  Exercises

15  Metric Predicted Variable on a Single Group
    15.1  Estimating the mean and precision of a normal likelihood
          15.1.1  Solution by mathematical analysis
          15.1.2  Approximation by MCMC in BUGS
          15.1.3  Outliers and robust estimation: The t distribution
          15.1.4  When the data are non-normal: Transformations
    15.2  Repeated measures and individual differences
          15.2.1  Hierarchical model
          15.2.2  Implementation in BUGS
    15.3  Summary
    15.4  R code
          15.4.1  Estimating the mean and precision of a normal likelihood
          15.4.2  Repeated measures: Normal across and normal within
    15.5  Exercises

16  Metric Predicted Variable with One Metric Predictor
    16.1  Simple linear regression
          16.1.1  The hierarchical model and BUGS code
                  16.1.1.1  Standardizing the data for MCMC sampling
                  16.1.1.2  Initializing the chains
          16.1.2  The posterior: How big is the slope?
          16.1.3  Posterior prediction
    16.2  Outliers and robust regression
    16.3  Simple linear regression with repeated measures
    16.4  Summary
    16.5  R code
          16.5.1  Data generator for height and weight
          16.5.2  BRugs: Robust linear regression
          16.5.3  BRugs: Simple linear regression with repeated measures
    16.6  Exercises

17  Metric Predicted Variable with Multiple Metric Predictors
    17.1  Multiple linear regression
          17.1.1  The perils of correlated predictors
          17.1.2  The model and BUGS program
                  17.1.2.1  MCMC efficiency: Standardizing and initializing
          17.1.3  The posterior: How big are the slopes?
          17.1.4  Posterior prediction
    17.2  Hyperpriors and shrinkage of regression coefficients
          17.2.1  Informative priors, sparse data, and correlated predictors
    17.3  Multiplicative interaction of metric predictors
          17.3.1  The hierarchical model and BUGS code
                  17.3.1.1  Standardizing the data and initializing the chains
          17.3.2  Interpreting the posterior
    17.4  Which predictors should be included?
    17.5  R code
          17.5.1  Multiple linear regression
          17.5.2  Multiple linear regression with hyperprior on coefficients
    17.6  Exercises

18  Metric Predicted Variable with One Nominal Predictor
    18.1  Bayesian oneway ANOVA
          18.1.1  The hierarchical prior
                  18.1.1.1  Homogeneity of variance
          18.1.2  Doing it with R and BUGS
          18.1.3  A worked example
                  18.1.3.1  Contrasts and complex comparisons
                  18.1.3.2  Is there a difference?
    18.2  Multiple comparisons
    18.3  Two group Bayesian ANOVA and the NHST t test
    18.4  R code
          18.4.1  Bayesian oneway ANOVA
    18.5  Exercises

19  Metric Predicted Variable with Multiple Nominal Predictors
    19.1  Bayesian multi-factor ANOVA
          19.1.1  Interaction of nominal predictors
          19.1.2  The hierarchical prior
          19.1.3  An example in R and BUGS
          19.1.4  Interpreting the posterior
                  19.1.4.1  Metric predictors and ANCOVA
                  19.1.4.2  Interaction contrasts
          19.1.5  Non-crossover interactions, rescaling, and homogeneous variances
    19.2  Repeated measures, a.k.a. within-subject designs
          19.2.1  Why use a within-subject design? And why not?
    19.3  R code
          19.3.1  Bayesian two-factor ANOVA
    19.4  Exercises

20  Dichotomous Predicted Variable
    20.1  Logistic regression
          20.1.1  The model
          20.1.2  Doing it in R and BUGS
          20.1.3  Interpreting the posterior
          20.1.4  Perils of correlated predictors
          20.1.5  When there are few 1's in the data
          20.1.6  Hyperprior across regression coefficients
    20.2  Interaction of predictors in logistic regression
    20.3  Logistic ANOVA
          20.3.1  Within-subject designs
    20.4  Summary
    20.5  R code
          20.5.1  Logistic regression code
          20.5.2  Logistic ANOVA code
    20.6  Exercises

21  Ordinal Predicted Variable
    21.1  Ordinal probit regression
          21.1.1  What the data look like
          21.1.2  The mapping from metric x to ordinal y
          21.1.3  The parameters and their priors
          21.1.4  Standardizing for MCMC efficiency
          21.1.5  Posterior prediction
    21.2  Some examples
          21.2.1  Why are some thresholds outside the data?
    21.3  Interaction
    21.4  Relation to linear and logistic regression
    21.5  R code
    21.6  Exercises

22  Contingency Table Analysis
    22.1  Poisson exponential ANOVA
          22.1.1  What the data look like
          22.1.2  The exponential link function
          22.1.3  The Poisson likelihood
          22.1.4  The parameters and the hierarchical prior
    22.2  Examples
          22.2.1  Credible intervals on cell probabilities
    22.3  Log linear models for contingency tables
    22.4  R code for Poisson exponential model
    22.5  Exercises

23  Tools in the Trunk
    23.1  Reporting a Bayesian analysis
          23.1.1  Essential points
          23.1.2  Optional points
          23.1.3  Helpful points
    23.2  MCMC burn-in and thinning
    23.3  Functions for approximating highest density intervals
          23.3.1  R code for computing HDI of a grid approximation
          23.3.2  R code for computing HDI of a MCMC sample
          23.3.3  R code for computing HDI of a function
    23.4  Reparameterization of probability distributions
          23.4.1  Examples
          23.4.2  Reparameterization of two parameters

References
Index

Chapter 1
This Book's Organization: Read Me First!

  Oh honey I'm searching for love that is true,
  But driving through fog is so dang hard to do.
  Please paint me a line on the road to your heart,
  I'll rev up my pick up and get a clean start.

1.1  Real people can read this book

This book explains how to actually do Bayesian data analysis, by real people (like you), for realistic data (like yours). The book starts at the basics, with notions of probability and programming, then progresses to advanced hierarchical models that are used in realistic data analysis. In other words, you do not need to already know statistics and programming. This book is speaking to a first-year graduate student or advanced undergraduate in the social or biological sciences: Someone who grew up in Lake Wobegon[1], but who is not the mythical being that has the previous training of a nuclear physicist and then decided to learn about Bayesian statistics.

This book provides broad coverage and ease of access. Section 1.3 describes the contents in a bit more detail, but here are some highlights. This book covers Bayesian analogues of all the traditional statistical tests that are presented in introductory statistics textbooks, including t-tests, analysis of variance (ANOVA), regression, chi-square tests, and so on. This book also covers crucial issues for designing research, such as statistical power and methods for determining the sample size needed to achieve a desired ...

[1] A popular weekly radio show on National Public Radio, called A Prairie Home Companion, features fictional anecdotes about a small town named Lake Wobegon. The stories, written and orated by Garrison Keillor, always end with the phrase, "And that's the news from Lake Wobegon, where all the women are strong, all the men are good looking, and all the children are above average." So, if you grew up there, ...

Kruschke, J. K. (2010). Doing Bayesian Data Analysis: A Tutorial with R and BUGS. Academic Press / Elsevier.
