Probably the most valuable tip when learning something new is knowing what to read. Here is a (partial) list of the books I found most valuable in my research.

  1. Cox, D. R. Principles of Statistical Inference. A high-level, almost philosophical text on the basic ideas of statistics. Possibly the most important book in this list, but make sure you read it last.

  2. Wasserman, Larry. All of Statistics: A Concise Course in Statistical Inference. A one-stop shop for almost all of the fundamental ideas in statistics.

  3. van der Vaart, A. W. Asymptotic Statistics. Much, if not most, of classical statistics is asymptotic. This book serves both as an introduction to, and as a technical reference for, the vast realm of asymptotic statistics.

  4. Hastie, T., R. Tibshirani, and J. H. Friedman. The Elements of Statistical Learning. A one-stop shop for machine learning, cast in statistical terminology.

  5. Tsybakov, A. B. Introduction to Nonparametric Estimation. My favorite reference on nonparametric statistics.

  6. Greene, William H. Econometric Analysis. My favorite reference on the analysis of linear models. The economic orientation adds very useful applied motivation, and an emphasis on causal inference rarely seen in the statistical literature.

  7. Grimmett, Geoffrey R., and David R. Stirzaker. Probability and Random Processes. An excellent introductory textbook to probability and stochastic processes.

  8. Cover, T. M., and Joy A. Thomas. Elements of Information Theory. As the name suggests, your one-stop shop for information theory.

  9. Wilcox, Rand R. Introduction to Robust Estimation and Hypothesis Testing. A very well written exposition of the vast world of robust statistics. A must-read for anyone who works with actual data, and not only theory.

  10. Robert E. Weiss. Modelling Longitudinal Data. My favorite reference on the topic.

  11. James E. Gentle. Numerical Linear Algebra for Applications in Statistics. Because we may write $\hat{\beta}=(X'X)^{-1}X'y$, but a computer would never invert a matrix to solve this problem.

  12. Venables, W.N. and Ripley, B.D. Modern Applied Statistics with S. Everything you need to actually analyze data in R.

  13. Rasmussen, C. E., and Williams, C. K. I. Gaussian Processes for Machine Learning. The first time I felt I understood RKHSs.

  14. Bai, Z., and Silverstein, J.W. Spectral Analysis of Large Dimensional Random Matrices. For random matrix theory.

  15. Allard, D. and Chilès, J.P. and Delfiner P. Geostatistics: Modeling Spatial Uncertainty. As the name suggests.

  16. Weihs, C. and Mersmann, O. and Ligges, U. Foundations of Statistical Algorithms: With References to R Packages. Another excellent reference on numerical analysis and algorithms.

  17. Nahmias, S., and Cheng, Y. Production and Operations Analysis. A one-stop shop for industrial statistics and industrial engineering.

  18. Cano, E. L., Moguerza, J. M., and Redchuk, A. Six Sigma with R: Statistical Engineering for Process Improvement. For statistical process control.

  19. Cox, D. R., and Reid, N. The Theory of the Design of Experiments. For DOE.

  20. Anderson, T. W. An Introduction to Multivariate Statistical Analysis.

  21. The Princeton Companion to Applied Mathematics. For short, readable, and authoritative explanations of EVERYTHING in applied mathematics.

  22. Bryant, R. E., and O’Hallaron, D. R. Computer Systems: A Programmer’s Perspective. If you want to know how a computer works, for the purpose of programming it (not maintaining it).
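The remark in item 11, that a computer would never invert a matrix to compute least-squares coefficients, can be illustrated with a minimal NumPy sketch (the data here is made up; the point is the two code paths). The textbook formula forms and inverts X'X, while numerical software solves the least-squares problem directly via a matrix factorization, which is more stable and avoids the explicit inverse:

```python
import numpy as np

# Simulated data: 100 observations, 3 predictors (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
beta = np.array([1.0, -2.0, 0.5])
y = X @ beta + rng.normal(scale=0.1, size=100)

# Textbook formula: beta_hat = (X'X)^{-1} X'y -- do NOT compute it this way.
beta_naive = np.linalg.inv(X.T @ X) @ (X.T @ y)

# What software actually does: solve the least-squares problem directly
# via a factorization (np.linalg.lstsq uses the SVD).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Both paths agree here, but the factorization route stays accurate
# when X is ill-conditioned, where the explicit inverse breaks down.
print(np.allclose(beta_naive, beta_hat))
```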

Feel free to use this list, and recommend more entries.

Some other book recommendations worth following: