LENS Machine Learning School 2021

A host for the tutorial material for the machine learning school 2021. This school took place in the week commencing 15 February 2021. Lecture recordings, where taken, are available at https://youtube.com/playlist?list=PLb8u5qKr67D3oTDELd6DXX-OWKYiZjQbr.

This school is a collaboration of Working Group 4 of the League of advanced European Neutron Sources (LENS), and was also supported by STFC's Scientific Machine Learning (SciML) group and the Jülich Supercomputing Centre (JSC).



o To use this material, if you are unfamiliar with git we recommend downloading the entire repository (the green Code button, then Download ZIP)
o To run the tutorials on Colab (https://colab.research.google.com/) you will need a Google account; select GitHub when prompted for a notebook and enter this repository (https://github.com/jfkcooper/LENS_ML_School_2021), which should then find all of the notebooks
o A lot of the notebooks have "lecturer editions" with answers, or have the answers hidden at the bottom of the page, in case you get stuck
o A Slack workspace has also been created for this school (https://join.slack.com/t/lensmlschool2021/shared_invite/zt-m5hi20cj-NoriZQbku~BuDQgge~BG8A); please join the conversation


Lecture 1: Introduction to deep learning and neural networks (Jos Cooper)

o Terminology

o The perceptron

o Fundamentals of deep learning: neural networks, nodes, weights, biases, activation functions, backpropagation and some of the maths behind it (a short perceptron sketch follows this list)

o Introduction to TensorFlow, PyTorch, and Keras
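
The following is a minimal, illustrative perceptron in plain NumPy (not taken from the school's notebooks): a weight vector, a bias and a sigmoid activation combined in a single forward pass.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Two inputs, with hypothetical hand-picked weights and bias.
x = np.array([0.5, -1.2])
w = np.array([0.8, 0.3])
b = 0.1

# Forward pass: weighted sum of the inputs plus the bias, passed through the activation.
output = sigmoid(np.dot(w, x) + b)
print(output)
```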

Lecture 2: Dense neural networks and regression (Jos Cooper)

o Supervised learning

o Epochs, metrics, batch processing

o Training, validation, testing, prediction (a minimal Keras regression sketch follows this list)
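
As a rough illustration of the workflow in this lecture, here is a minimal Keras regression sketch (assuming TensorFlow 2.x; the toy data, layer sizes and hyper-parameters are illustrative, not the notebooks' values).

```python
import numpy as np
import tensorflow as tf

# Toy regression problem: learn y = 3x + noise.
x_train = np.random.uniform(-1, 1, size=(1000, 1))
y_train = 3.0 * x_train + 0.1 * np.random.randn(1000, 1)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # linear output for regression
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Training with a validation split; batch size and epochs are hyper-parameters.
model.fit(x_train, y_train, epochs=20, batch_size=32, validation_split=0.2, verbose=0)

# Prediction on an unseen input.
print(model.predict(np.array([[0.5]])))
```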

Lecture 3: Convolutional neural networks and classification (Emmanouela Rantsiou)

o Filters, convolution, layers (see the CNN sketch after this list)

o Connections, activations, down-sampling

o Training, classification, metrics

o Pre-processing

o Augmentation, regularization

o Hyper-parameter tuning

o Transfer learning
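
The sketch below shows the typical shape of a small convolutional classifier in Keras (assuming TensorFlow 2.x); the random stand-in data and the architecture are illustrative only.

```python
import numpy as np
import tensorflow as tf

# Fake dataset: 28x28 grey-scale "images" with 10 class labels.
x_train = np.random.rand(256, 28, 28, 1).astype("float32")
y_train = np.random.randint(0, 10, size=(256,))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, kernel_size=3, activation="relu"),  # learnable filters
    tf.keras.layers.MaxPooling2D(),                                # down-sampling
    tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.25),                                 # regularization
    tf.keras.layers.Dense(10, activation="softmax"),               # class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
```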

Lecture 4: Traditional ML methods (Andrew McCluskey)

o Decision trees

o Gradient boosting

o Principal component analysis (PCA); a short scikit-learn sketch follows this list

o Bayesian model selection
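
Here is a short scikit-learn sketch of two of the methods above, a decision tree and PCA (assuming scikit-learn is installed; the iris dataset is just a convenient example).

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Decision tree classifier.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("tree accuracy:", tree.score(X_test, y_test))

# Principal component analysis: project the 4-dimensional data onto its 2 leading components.
pca = PCA(n_components=2).fit(X_train)
print("explained variance ratio:", pca.explained_variance_ratio_)
```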

Lecture 5: Image segmentation (Anders Kaestner)

o Object detection

o Tomography

o SegNet and/or ResNet

o Semi-supervised learning

Lecture 6: Recurrent neural networks (Gagik Vardanyan)

o Time series

o Simple RNNs

o LSTMs (see the sketch after this list)

o GRUs
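
A minimal LSTM time-series sketch in Keras follows (assuming TensorFlow 2.x); the noisy sine wave and window length are illustrative, and the LSTM layer can be swapped for SimpleRNN or GRU to compare the variants.

```python
import numpy as np
import tensorflow as tf

# Build (input window, next value) pairs from a noisy sine wave.
series = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * np.random.randn(2000)
window = 30
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # shape (samples, timesteps, features)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),   # swap for SimpleRNN or GRU to compare
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(model.predict(X[:1]))
```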

Lecture 7: Generative Adversarial Networks, GANs (Kuangdai Leng)

o Introduction to generative models: VAEs and GANs

o GANs: basics and practice (a skeleton GAN definition follows this list)
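
Below is a bare-bones GAN skeleton in Keras (assuming TensorFlow 2.x), showing only the two networks and their objectives; the alternating training loop is omitted and the layer sizes are illustrative, not the lecture's code.

```python
import tensorflow as tf

latent_dim = 16

# Generator: maps random noise to a fake "sample" (here a 28x28 image flattened to 784 values).
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(784, activation="sigmoid"),
])

# Discriminator: classifies a sample as real (1) or generated (0).
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# In training (omitted here) the discriminator is shown real and generated batches,
# while the generator is updated to make the discriminator label its output as real.
```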

Lecture 8: Natural language processing and speech recognition (Gagik Vardanyan & Guanghan Song)

o Semantic space, word2vec (a small word2vec sketch follows this list)

o NLTK, spaCy

o Machine translation, seq-to-seq methods
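
A small word2vec sketch using gensim (assuming gensim >= 4 is installed; gensim is our choice for the illustration and may not be the library used in the lecture).

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
sentences = [
    ["neutron", "scattering", "probes", "structure"],
    ["machine", "learning", "fits", "models", "to", "data"],
    ["neural", "networks", "learn", "representations"],
]

# Train a small skip-gram model (vector_size is the embedding dimension).
model = Word2Vec(sentences, vector_size=32, window=3, min_count=1, sg=1, epochs=50)

# Each word now has a dense vector; similar words end up close in the semantic space.
print(model.wv["neutron"].shape)          # (32,)
print(model.wv.most_similar("neutron"))   # nearest neighbours in the toy space
```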

Lecture 9: Uncertainty and attention (Mario Teixeira Parente)

o Bayesian methods

o Gaussian attention / spatial transformers

Lecture 10: Unsupervised learning - clustering (Marina Ganeva)

Part 1

o Introduction

o Clustering (see the sketch at the end of Part 1)

o Manifold learning
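
A short unsupervised-learning sketch with scikit-learn (assumed installed): k-means clustering plus a 2-D manifold embedding of the same data, with illustrative parameters.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

X, _ = load_digits(return_X_y=True)

# Cluster the 64-dimensional digit images into 10 groups without using the labels.
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X)
print(labels[:20])

# Manifold learning: embed the same data into 2 dimensions for visualisation.
embedding = TSNE(n_components=2, random_state=0).fit_transform(X)
print(embedding.shape)  # (n_samples, 2)
```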

Part 2

o Reinforcement learning
