Suggested further readings

There are a number of books that approach Bayesian statistics from different starting points, all of which are relevant to NMA.

Statistics:

A beautiful combination of Bayesian statistics with information theory by the late David MacKay. The book is available for free online. Simply beautiful: MacKay, D. J. C. (2003). Information theory, inference and learning algorithms. Cambridge University Press.

Still the standard book. Not exactly perfect for beginners but beautiful: Gelman, A., Carlin, J. B., Stern, H. S., & Rubin, D. B. (1995). Bayesian data analysis. Chapman and Hall/CRC.

A great textbook full of intuitive illustrations: McElreath, R. (2020). Statistical rethinking: A Bayesian course with examples in R and Stan. Chapman and Hall/CRC.

This book is good for learning Bayesian statistics through Python code: Downey, A. (2013). Think Bayes: Bayesian statistics in Python. O’Reilly Media.

Another introductory book: Kruschke, J. (2014). Doing Bayesian data analysis: A tutorial with R, JAGS, and Stan. Academic Press.

Normative Models:

The book that propelled the field to visibility: Knill, D. C., and Richards, W. (Eds.). (1996). Perception as Bayesian inference. Cambridge University Press.

This book is largely focused on Bayesian approaches to cue combination (it includes, for example, a chapter by Welchman on decoding the cortical representation of depth): Trommershauser, J., Kording, K., & Landy, M. S. (Eds.). (2011). Sensory cue integration. Oxford University Press.

Analysis of neural data:

This book contains a good treatment of Bayesian approaches to the analysis of neural data: Kass, R. E., Eden, U. T., & Brown, E. N. (2014). Analysis of neural data. Springer.