PROGRAM
Monday 15
Morning lessons (9:00-13:00)
- Learning in the context of machine learning
- Different types of machine learning (terminology)
- The three main components of learning
- Gradient Descent (the best-known optimiser) and its variations
- Typical ML pipeline
- The most used Python libraries in each phase of the ML pipeline
- The most important ML metrics for classification and regression
- One-hot encoding and how it works (see the sketch after this list)
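A minimal sketch of one-hot encoding with pandas; the column name and category values are illustrative, not taken from the course material:

    import pandas as pd

    # Toy categorical column; the name and values are illustrative.
    df = pd.DataFrame({"galaxy_type": ["spiral", "elliptical", "spiral", "irregular"]})

    # Each category becomes its own 0/1 indicator column.
    encoded = pd.get_dummies(df, columns=["galaxy_type"])
    print(encoded)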
Hands-on (14:30-17:30)
- Introduction to Python
- Introduction to numpy, pandas, matplotlib
- Minimisation of functions in Python (and the concept of the learning rate)
- Gradient Descent in Python (and its variations: batch, mini-batch, stochastic; see the sketch after this list)
- Non-linear fitting with Python
- scikit-learn library introduction
- Examples of regression and classification with Python and scikit-learn
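A minimal sketch of batch gradient descent for a straight-line fit with NumPy; the synthetic data, learning rate, and number of epochs are illustrative choices, not the course's exact exercise:

    import numpy as np

    # Synthetic data: y = 2x + 1 plus noise (illustrative, not the course exercise).
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, 100)
    y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=100)

    w, b = 0.0, 0.0
    learning_rate = 0.1  # the step size controls how fast the loss decreases

    for epoch in range(500):
        error = (w * x + b) - y
        # Gradients of the mean-squared-error loss with respect to w and b.
        grad_w = 2.0 * np.mean(error * x)
        grad_b = 2.0 * np.mean(error)
        w -= learning_rate * grad_w
        b -= learning_rate * grad_b

    print(f"fitted slope {w:.2f}, intercept {b:.2f}")  # should be close to 2 and 1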
Tuesday 16
Morning lessons (9:00-13:00)
- Imbalanced datasets and the approaches that can be used in those scenarios
- PCA and its mathematics (limitations, requirements, etc.)
- Best programming practices in ML projects (file structure, GitHub repository structure, etc.)
- Model validation: the approaches that exist and the scenarios each is suited to
- Neural network introduction
- Keras and its main functionalities (see the sketch after this list)
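A minimal sketch of a Keras Sequential model for binary classification, trained on random toy data; the layer sizes and training settings are illustrative, not the models used in the course:

    import numpy as np
    from tensorflow import keras

    # A tiny feed-forward classifier; layer sizes are illustrative.
    model = keras.Sequential([
        keras.layers.Input(shape=(10,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

    # Random toy data, only to show the fit/evaluate calls.
    X = np.random.rand(200, 10)
    y = (X.sum(axis=1) > 5.0).astype(int)
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)
    print(model.evaluate(X, y, verbose=0))  # [loss, accuracy]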
Hands-on (14:30-17:30)
- Model validation with scikit-learn (hold-out approach, k-fold, leave-one-out, etc.)
- Handling imbalanced datasets with imblearn (oversampling, undersampling, SMOTE, etc.; see the sketch after this list)
- Example of a complete project with astrophysics data
- Example of dimensionality reduction and PCA
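A minimal sketch of oversampling with SMOTE from imblearn on a toy imbalanced classification set; the dataset parameters are illustrative:

    from collections import Counter
    from sklearn.datasets import make_classification
    from imblearn.over_sampling import SMOTE

    # Toy imbalanced dataset (90% / 10%); parameters are illustrative.
    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
    print("before:", Counter(y))

    # SMOTE synthesises new minority-class samples until the classes are balanced.
    X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)
    print("after: ", Counter(y_res))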
Wednesday 17
Morning lessons (9:00-13:00)
- Neural Networks: tasks that can be solved with one neuron
- Convolutional Neural Networks
- Regularisation
- Hyper-parameter Tuning
- Computer Vision
Hands-on (14:30-17:30)
- TensorFlow and Keras (introduction to the Sequential and functional APIs, custom training loops, callback classes, etc.)
- Neural Networks (feed-forward and convolutional neural networks)
- Regularisation (L1, L2, dropout)
- Example of a CNN on a computer vision problem (see the sketch after this list)
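A minimal sketch of a small convolutional network with L2 regularisation and dropout in Keras; the 28x28 grey-scale input shape and the layer sizes are illustrative choices, not the course's exact model:

    from tensorflow import keras

    # A small CNN with L2 weight decay and dropout; the 28x28 grey-scale input
    # and the layer sizes are illustrative choices.
    model = keras.Sequential([
        keras.layers.Input(shape=(28, 28, 1)),
        keras.layers.Conv2D(16, kernel_size=3, activation="relu"),
        keras.layers.MaxPooling2D(pool_size=2),
        keras.layers.Flatten(),
        keras.layers.Dense(64, activation="relu",
                           kernel_regularizer=keras.regularizers.l2(1e-4)),
        keras.layers.Dropout(0.3),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()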
Thursday 18
Morning lessons (9:30-12:30)
- Autoencoders
- Generative Adversarial Networks
- PCA (see the sketch after this list)
- Errors in Labels
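A minimal sketch of PCA with scikit-learn on random data; the shapes and the number of components are illustrative:

    import numpy as np
    from sklearn.decomposition import PCA

    # Random data; the shapes and the number of components are illustrative.
    X = np.random.default_rng(0).normal(size=(200, 10))

    pca = PCA(n_components=2)
    X_reduced = pca.fit_transform(X)       # project onto the first two components
    print(X_reduced.shape)                 # (200, 2)
    print(pca.explained_variance_ratio_)   # fraction of variance per component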
Hands-on (14:30-17:30)
- Feature selection (examples)
- Hyperparameter tuning
- Optimisers and neural networks
- Application of advanced topics: callback classes, custom training loops, etc.
- Advanced neural network architectures (autoencoders and generative adversarial networks; see the sketch after this list)
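A minimal sketch of a dense autoencoder built with the Keras functional API; the 784-dimensional input and the 32-dimensional bottleneck are illustrative choices:

    from tensorflow import keras

    # Encoder compresses 784 inputs to a 32-dimensional code, decoder reconstructs
    # them; both sizes are illustrative.
    inputs = keras.Input(shape=(784,))
    encoded = keras.layers.Dense(32, activation="relu")(inputs)
    decoded = keras.layers.Dense(784, activation="sigmoid")(encoded)

    autoencoder = keras.Model(inputs, decoded)
    autoencoder.compile(optimizer="adam", loss="mse")
    autoencoder.summary()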
Friday 19
Morning lessons (9:30-12:30)
- Dr. Kruk's seminar: Exploring astronomy data archives at large scales using deep-learning and crowd-sourcing
- Transfer Learning (see the sketch after this list)
- Feature Selection - an introduction
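A minimal sketch of transfer learning with a pre-trained Keras backbone; MobileNetV2, the frozen base, the input size, and the two-class head are illustrative choices, not necessarily what the course uses:

    from tensorflow import keras

    # Pre-trained MobileNetV2 as a frozen feature extractor plus a new two-class
    # head; the backbone, input size, and head are illustrative choices.
    base = keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                          include_top=False,
                                          weights="imagenet",
                                          pooling="avg")
    base.trainable = False  # keep the pre-trained weights fixed at first

    model = keras.Sequential([
        base,
        keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()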