Rather than taking a black-box approach to statistical modelling, as has become the trend in Machine Learning practice through off-the-shelf methods, in this workshop we will take a step back and explore some core statistical methods. The topics were selected because they are integral to many Machine Learning methods, so a sound appreciation of them can enhance understanding and complement the application of Machine Learning toolboxes and packages.
In this workshop we will focus on rigorous methods that are explainable and statistically interpretable, applied under clearly stated statistical assumptions, together with verification techniques for assessing whether the assumptions underpinning the models hold.
The topics included in the workshop are largely methodological and conceptual, addressing feature extraction and model building. In this manner, participants can build on these concepts and develop detailed applications for their particular use cases.
Topics discussed include:
- Feature extraction: vector and functional feature methods, covering probabilistic PCA, ICA and their functional variants.
- Persistence and Memory features.
- Time Series Regression Modelling using features.
- Kernels and Gaussian Processes.
- Warped Gaussian Processes.
- Kernel Exponential Family and generalised GLM approaches.
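As a small taste of the style of method covered, the sketch below illustrates Gaussian process regression (one of the topics above) using scikit-learn. The data, kernel choice and hyperparameters here are purely illustrative assumptions, not workshop material:

```python
# Minimal Gaussian process regression sketch (illustrative only).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 1))                    # training inputs
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(40)   # noisy observations

# An RBF kernel captures smooth structure; a white-noise term
# models the observation noise. Both are assumed choices here.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Posterior mean and pointwise predictive uncertainty on a test grid.
X_new = np.linspace(0, 10, 50).reshape(-1, 1)
mean, std = gp.predict(X_new, return_std=True)
```

Note that the predictive standard deviation `std` quantifies the model's uncertainty at each test point, which is exactly the kind of statistically interpretable output the workshop emphasises.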
The course is pitched at a methodological level, providing participants with an introduction to the concepts and statistical modelling approaches in the topics mentioned above. Proofs of the results discussed will be referenced in the slides and provided after the workshop for further details.