Gaussian Distributions in Machine Learning
DOI: https://doi.org/10.54097/fb5qm725

Keywords: Gaussian Distributions, Bayesian Linear Regression, Gaussian Process Regression, Laplace Approximation, Gaussian Expectation Propagation

Abstract
To support the teaching and learning of machine learning courses, this paper presents properties and application examples of Gaussian distributions in machine learning, including classification, regression, and approximation. In each typical case, emphasis is placed on the assumptions and conclusions related to Gaussian distributions, so that students can see clearly that it is the Gaussian distribution that makes the somewhat abstract Bayes’ theorem more concrete and yields closed-form analytical predictive distributions.
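As a minimal illustration of the closed-form predictive distributions mentioned in the abstract, the sketch below works through Bayesian linear regression with a Gaussian prior and Gaussian noise. The prior precision `alpha`, noise precision `beta`, polynomial features, and toy data are all illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

# Assumed hyperparameters for illustration: prior precision alpha,
# noise precision beta (both chosen arbitrarily for this sketch).
alpha, beta = 1.0, 25.0

# Toy data: noisy observations of sin(pi * x).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 20)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, 20)

def features(x, degree=3):
    # Polynomial design matrix Phi with columns x^0 .. x^degree.
    return np.vander(x, degree + 1, increasing=True)

Phi = features(x)

# Because prior and likelihood are both Gaussian, the posterior over the
# weights is Gaussian in closed form: N(m, S) with
#   S^-1 = alpha * I + beta * Phi^T Phi,   m = beta * S Phi^T y.
S_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
S = np.linalg.inv(S_inv)
m = beta * S @ Phi.T @ y

# The predictive distribution at a new input x* is again Gaussian:
#   mean phi(x*)^T m,  variance 1/beta + phi(x*)^T S phi(x*).
phi_star = features(np.array([0.5]))
pred_mean = float(phi_star @ m)
pred_var = float(1 / beta + phi_star @ S @ phi_star.T)
```

Note that no sampling or iterative optimization is needed: the Gaussian assumptions alone turn Bayes' theorem into two matrix formulas, which is exactly the point the abstract emphasizes.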
License
Copyright (c) 2024 Journal of Education and Educational Research

This work is licensed under a Creative Commons Attribution 4.0 International License.