References
Agrawal, Rakesh, Tomasz Imieliński, and Arun Swami. 1993. “Mining
Association Rules Between Sets of Items in Large Databases.” In
Proceedings of the 1993 ACM SIGMOD International Conference on
Management of Data, 207–16. SIGMOD ’93. New York, NY, USA:
Association for Computing Machinery. https://doi.org/10.1145/170035.170072.
Bahdanau, Dzmitry, Kyunghyun Cho, and Yoshua Bengio. 2014. “Neural
Machine Translation by Jointly Learning to Align and Translate.”
CoRR abs/1409.0473. https://api.semanticscholar.org/CorpusID:11212020.
Belenky, Gregory, Nancy J. Wesensten, David R. Thorne, Maria L. Thomas,
Helen C. Sing, Daniel P. Redmond, Michael B. Russo, and Thomas J.
Balkin. 2003. “Patterns of Performance Degradation and Restoration
During Sleep Restriction and Subsequent Recovery: A Sleep Dose-Response
Study.” Journal of Sleep Research 12: 1–12.
Bellman, R. 1957. “A Markovian Decision Process.”
Journal of Mathematics and Mechanics 6 (5). http://www.jstor.org/stable/24900506.
Boeckmann, A. J., L. B. Sheiner, and S. L. Beal. 1992. NONMEM Users
Guide: Part V, Introductory Guide. NONMEM Project Group, University
of California, San Francisco.
Borne, Kirk. 2021. “Association Rule Mining — Not Your Typical ML
Algorithm.” Medium. https://medium.com/@kirk.borne/association-rule-mining-not-your-typical-ml-algorithm-97acda6b86c2.
Box, George E. P. 1976. “Science and Statistics.”
Journal of the American Statistical Association 71 (356):
791–99.
Breiman, Leo. 1996. “Bagging Predictors.” Machine
Learning 24: 123–40.
———. 2001a. “Random Forests.” Machine Learning 45:
5–32.
———. 2001b. “Statistical Modeling: The Two Cultures.”
Statistical Science 16 (3): 199–231.
Breiman, L., J. H. Friedman, R. A. Olshen, and C. J. Stone. 1984.
Classification and Regression Trees. Wadsworth, Pacific Grove,
CA.
Cleveland, William S. 1979. “Robust Locally Weighted Regression
and Smoothing Scatterplots.” Journal of the American
Statistical Association 74 (368): 829–36. https://doi.org/10.1080/01621459.1979.10481038.
Davidian, M., and D. M. Giltinan. 1995. Nonlinear Models for
Repeated Measurement Data. Chapman & Hall, London.
Ferrari, S. L. P., and F. Cribari-Neto. 2004. “Beta Regression for
Modeling Rates and Proportions.” Journal of Applied
Statistics 31 (7): 799–815.
Fraley, Chris, and Adrian E. Raftery. 2002. “Model-Based
Clustering, Discriminant Analysis, and Density Estimation.”
Journal of the American Statistical Association 97 (458):
611–31. https://doi.org/10.1198/016214502760047131.
Friedman, J. H. 2002. “Stochastic Gradient Boosting.”
Computational Statistics and Data Analysis 38 (4): 367–78.
Friedman, Jerome, Trevor Hastie, and Robert Tibshirani. 2000.
“Additive Logistic Regression: A Statistical View of
Boosting.” The Annals of Statistics 28 (2): 337–407.
Furnival, George M., and Robert W. Wilson. 1974. “Regression by
Leaps and Bounds.” Technometrics 16 (4): 499–511.
García-Portugués, E. 2024. Notes for Predictive Modeling. https://bookdown.org/egarpor/PM-UC3M/.
Gilliland, D., and O. Schabenberger. 2001. “Limits on Pairwise
Association for Equi-Correlated Binary Variables.” Journal of
Applied Statistical Sciences 10: 279–85.
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. 2016. Deep
Learning. MIT Press.
Gower, J. C. 1971. “A General Coefficient of Similarity and Some of
Its Properties.” Biometrics 27 (4): 857–71.
Grue, Lars, and Arvid Heiberg. 2006. “Notes on the History of
Normality–Reflections on the Work of Quetelet and Galton.”
Scandinavian Journal of Disability Research 8 (4): 232–46.
Hartigan, J. A., and M. A. Wong. 1979. “Algorithm AS 136: A
k-Means Clustering Algorithm.” Journal of the Royal
Statistical Society. Series C (Applied Statistics) 28 (1): 100–108.
Harville, D. A. 1976. “Extension of the Gauss-Markov Theorem to
Include the Estimation of Random Effects.” The Annals of
Statistics 4: 384–95.
Hastie, T. J., and R. Tibshirani. 1990. Generalized Additive
Models. Chapman & Hall, London.
Hastie, Trevor, Robert Tibshirani, and Jerome Friedman. 2001. The
Elements of Statistical Learning. Springer Series in Statistics.
New York, NY, USA: Springer New York Inc.
Henderson, C. R. 1950. “The Estimation of Genetic
Parameters.” The Annals of Mathematical Statistics 21
(2): 309–10.
———. 1984. Applications of Linear Models in Animal Breeding.
University of Guelph.
Hinne, Max, Quentin F. Gronau, Don van den Bergh, and Eric-Jan
Wagenmakers. 2020. “A Conceptual Introduction to Bayesian Model
Averaging.” Advances in Methods and Practices in
Psychological Science 3 (2): 200–215. https://doi.org/10.1177/2515245919898657.
James, Gareth, Daniela Witten, Trevor Hastie, and Robert Tibshirani.
2021. An Introduction to Statistical Learning: With Applications in
R, 2nd Ed. Springer. https://www.statlearning.com/.
Kleinbaum, David G., Lawrence L. Kupper, A. Nizam, and Eli S. Rosenberg.
2013. Applied Regression Analysis and Other Multivariable Methods, 5th
Ed. Cengage Learning.
Little, R., and D. Rubin. 1987. Statistical Analysis with Missing
Data. Wiley, New York.
Lundberg, Scott M., Gabriel G. Erion, and Su-In Lee. 2018.
“Consistent Individualized Feature Attribution for Tree
Ensembles.” https://arxiv.org/abs/1802.03888.
Lundberg, Scott M., and Su-In Lee. 2017. “A Unified Approach to
Interpreting Model Predictions.” 31st Conference on Neural
Information Processing Systems (NIPS). https://arxiv.org/abs/1705.07874.
Mallows, C. L. 1973. “Some Comments on Cp.”
Technometrics 15 (4): 661–75.
Mazzanti, Samuele. 2020. “SHAP Values Explained Exactly How You
Wished Someone Explained to You. Making Sense of the Formula Used for
Computing SHAP Values.” Medium. https://towardsdatascience.com/shap-explained-the-way-i-wish-someone-explained-it-to-me-ab81cc69ef30.
McCullagh, P., and J. A. Nelder. 1989. Generalized Linear
Models, 2nd Ed. Chapman & Hall, New York.
McCulloch, Warren S., and Walter Pitts. 1943. “A Logical Calculus
of the Ideas Immanent in Nervous Activity.” Bulletin of
Mathematical Biophysics 5: 115–33.
Mead, R., R. N. Curnow, and A. M. Hasted. 1993. Statistical Methods
in Agriculture and Experimental Biology. CRC Press, New York; Boca
Raton, FL.
Miller, Alan J. 1984. “Selection of Subsets of Regression
Variables.” Journal of the Royal Statistical Society, Series
A 147 (3): 389–425.
Molnar, Christoph. 2022. Interpretable Machine Learning: A Guide for
Making Black Box Models Explainable. 2nd ed. https://christophm.github.io/interpretable-ml-book.
Nash, Warwick J., Tracy L. Sellers, Simon R. Talbot, Andrew J. Cawthorn,
and Wes B. Ford. 1994. “The Population Biology of Abalone
(*Haliotis* Species) in Tasmania. I. Blacklip Abalone (*H. rubra*) from
the North Coast and Islands of Bass Strait.”
Pinheiro, J. C., and D. M. Bates. 1995. “Approximations to the
Log-Likelihood Function in the Nonlinear Mixed-Effects Model.”
Journal of Computational and Graphical Statistics 4: 12–35.
Prater, N. H. 1956. “Estimate Gasoline Yields from Crudes.”
Petroleum Refiner 35 (3).
Raftery, Adrian E. 1995. “Bayesian Model Selection in Social
Research.” Sociological Methodology 25: 111–63.
Ratkowsky, D. A. 1983. Nonlinear Regression Modeling. Marcel
Dekker, New York.
———. 1990. Handbook of Nonlinear Regression Models. Marcel
Dekker, New York.
Sankaran, Kris. 2024. “Data Science Principles for Interpretable
and Explainable AI.” Journal of Data Science, 1–27. https://doi.org/10.6339/24-JDS1150.
Schabenberger, O., and Francis J. Pierce. 2001. Contemporary
Statistical Models for the Plant and Soil Sciences. CRC Press, Boca
Raton.
Schabenberger, O., B. E. Tharp, J. J. Kells, and D. Penner. 1999.
“Statistical Tests for Hormesis and Effective Dosages in Herbicide
Dose Response.” Agronomy Journal 91: 713–21.
Shapley, L. 1953. “A Value for n-Person Games.” In
Contributions to the Theory of Games II, edited by H. W. Kuhn and
A. W. Tucker, 307–17. Princeton University Press, Princeton.
Sutton, Clifton D. 2005. “Classification and Regression Trees,
Bagging, and Boosting.” Handbook of Statistics 24:
303–29.
Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion
Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. 2017.
“Attention Is All You Need.” In Proceedings of the 31st
International Conference on Neural Information Processing Systems,
6000–6010. NIPS’17. Red Hook, NY, USA: Curran Associates Inc.
Zhang, Grace. 2018. “What Is the Kernel Trick? Why Is It
Important?” Medium. https://medium.com/@zxr.nju/what-is-the-kernel-trick-why-is-it-important-98a98db0961d.