Saturday, March 31, 2012

Negative Binomial for Count Data

I have noticed that when estimating the parameters of a negative binomial distribution for describing count data, the MCMC chain can become extremely autocorrelated because the parameters are highly correlated. Question: There must be some standard re-parameterization, or other approach, for reducing this autocorrelation. Can someone point me to it?

[Update: See this follow-up post.]

Here is an example of what I mean. First, a histogram of the data, consisting of 50 individual counts. (E.g., for each of 50 subjects, the number of times they checked their e-mail in the last hour.) The histogram of the data has posterior-credible negative binomial distributions superimposed.

Here is the JAGS model statement:
model {
    # Likelihood:
    for( i in 1 : N ) {
        y[i] ~ dnegbin( p , r )
    }
    m <- r*(1-p)/p  # mean of the negative binomial (derived quantity)
    # Prior (gSh and gRa are the gamma shape and rate, supplied as data):
    r ~ dgamma(gSh,gRa)
    p ~ dbeta(1.001,1.001)
}

Although the MCMC chain comes up with reasonable estimates in the long run, there is a lot of autocorrelation along the way:

It is clear that the reason for the autocorrelation is the strong correlation of the parameters:

As I said above, there must be some well-known way to address this issue. Can someone point me to it? Thanks.
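For concreteness, one re-parameterization that is commonly suggested for this kind of problem is to place the priors on the mean m and the dispersion r, and derive p from them, instead of sampling p and r directly. Here is a minimal JAGS sketch of that idea; the gamma hyperparameters mSh, mRa, gSh, and gRa are placeholders to be supplied as data, not values from the analysis above:

model {
    # Likelihood (same as before):
    for( i in 1 : N ) {
        y[i] ~ dnegbin( p , r )
    }
    # Derive the success probability from the mean and dispersion:
    p <- r / ( r + m )
    # Priors on the mean and the dispersion:
    m ~ dgamma(mSh,mRa)
    r ~ dgamma(gSh,gRa)
}

The idea is that the mean and the dispersion tend to be less strongly correlated in the posterior than p and r, so the sampler should mix better under this parameterization.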

Wednesday, March 21, 2012

Classroom-based review in J. of Math. Psych.

In a recent review (full citation below) in the Journal of Mathematical Psychology, Wolf Vanpaemel and Francis Tuerlinckx report results from using the book in a classroom setting. They say
Overall, Kruschke is to be applauded for his incredible efforts at writing such a highly accessible and useful textbook on Bayesian statistics. Doing Bayesian Data Analysis is an impressive piece of work that presents a major step in the dissemination of the Bayesian approach into mainstream psychology and will shape the way future psychologists will deal with their data. We are delighted to use it again in our course and wholeheartedly recommend it to anyone who wants to acquaint students with Bayesian statistics...
Big thanks to Wolf and Francis for using the book, and for the thoughtful review!

Vanpaemel, W., & Tuerlinckx, F. (2012). Doing Bayesian data analysis in the classroom: An experience-based review of John K. Kruschke’s “Doing Bayesian Data Analysis: A Tutorial with R and BUGS” (2011). Journal of Mathematical Psychology, 56, 64–66. DOI: 10.1016/j.jmp.2011.12.001

Sunday, March 11, 2012

Discussion forum added to blog

A discussion forum has been added to the blog; it is linked in the right sidebar.

My thanks to Anne Standish for searching out information about how to set up the forum, and directing me to it.

Saturday, March 10, 2012

Bayesian estimation supersedes the t test

[Updated here.]
Bayesian estimation for two groups provides complete distributions of credible values for the effect size, group means and their difference, standard deviations and their difference, and the normality of the data. The method handles outliers. The decision rule can accept the null value (unlike traditional t tests) when certainty in the estimate is high (unlike Bayesian model comparison using Bayes factors). The method also yields precise estimates of statistical power for various research goals. The software and programs are free, and run on Macintosh, Linux, and Windows platforms. See this linked page for the paper and the software.

Search tag: BEST (for Bayesian estimation)

(A previous post announced an earlier version of this report and software.)

Tuesday, March 6, 2012

Talk and Workshop at DePauw University, April 6

I'll be doing a talk and workshop at DePauw University. Details can be found here.


A list of future and past workshops can be found here.

(Poster composed entirely by folks at DePauw.)

Monday, March 5, 2012

The Book Visits Fisher

On their recent visit to St. Peter's Cathedral in Adelaide, colleagues Dan Navarro and Amy Perfors posed the book at the site of R. A. Fisher's remains. What is the probability of this happening if the null hypothesis were true?

Many thanks to Dan and Amy!
And see the book at Bayes' tomb, too.