Prediction and Estimation Techniques
This lecture consists of a theoretical session and a tutorial. Its main focus is probabilistic estimation and conditional distributions, which are critical for understanding advanced machine learning. The lecture therefore begins with the definitions of random variables, probability distributions, and probability density functions, and then discusses the mean, variance, covariance, conditional mean, and conditional variance. The second chapter deals with describing an unknown random variable in terms of a known random variable, assuming that a relationship between the two exists. The chapter concludes by determining the relationship between two random variables, assuming that their respective probability density functions are available. The third chapter builds on the second by considering relationships among multiple random variables. Once these chapters are covered, the subsequent four chapters deal with estimation, first with maximum likelihood estimation and then with minimum mean square estimation. This discussion establishes the foundation for the final two chapters, namely, the Kalman filter and the particle filter.
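As a taste of the material covered in the final chapters, the Kalman filter can be sketched in a few lines for the simplest case: tracking a single scalar state from noisy measurements. This is a minimal illustrative sketch, not the lecture's own implementation; the function name `kalman_1d` and the noise variances `q` and `r` are assumptions made for this example.

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter for a (nearly) constant state.

    q: process noise variance, r: measurement noise variance (both assumed known).
    Returns the list of filtered state estimates, one per measurement.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state model is constant, so only the uncertainty grows.
        p = p + q
        # Update: blend prediction and measurement using the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Usage: recover a constant true value of 5.0 from noisy measurements.
random.seed(0)
true_value = 5.0
zs = [true_value + random.gauss(0.0, 0.2) for _ in range(200)]
est = kalman_1d(zs)
```

The same predict/update structure generalizes to the vector-valued case treated in the lecture, where the scalar variances become covariance matrices.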
The lecture is accompanied by numerous real-world data-processing examples using recent machine learning models such as convolutional neural networks and transformers.