Overview
This classroom-based training session combines presentations with computer-based examples and case study exercises, carried out using the relevant neural network and deep learning libraries.
Requirements
Knowledge or appreciation of machine learning, systems architecture and programming languages is desirable.
Course Outline
- Overview of neural networks and deep learning
  - The concept of Machine Learning (ML)
  - Why do we need neural networks and deep learning?
  - Selecting networks for different problems and data types
  - Training and validating neural networks
  - Comparing logistic regression to neural networks
- Neural networks
  - Biological inspiration for neural networks
  - Neural networks – the neuron, the perceptron and the MLP (multilayer perceptron) model
  - Training an MLP – the backpropagation algorithm
  - Activation functions – linear, sigmoid, tanh, softmax
  - Loss functions appropriate to forecasting and classification
  - Parameters – learning rate, regularization, momentum
  - Building neural networks in Python (see the MLP sketch after this outline)
  - Evaluating the performance of neural networks in Python
- Basics of Deep Networks
  - What is deep learning?
  - Architecture of deep networks – parameters, layers, activation functions, loss functions, solvers
  - Restricted Boltzmann Machines (RBMs)
  - Autoencoders (see the autoencoder sketch after this outline)
- Deep Network Architectures
  - Deep Belief Networks (DBN) – architecture, applications
  - Autoencoders
  - Restricted Boltzmann Machines
  - Convolutional Neural Networks
  - Recursive Neural Networks
  - Recurrent Neural Networks
- Overview of libraries and interfaces available in Python
  - Caffe
  - Theano
  - TensorFlow
  - Keras
  - MXNet
  - Choosing the appropriate library for a given problem
- Building deep networks in Python
  - Choosing the appropriate architecture for a given problem
  - Hybrid deep networks
  - Training the network – choice of library, architecture definition
  - Tuning the network – initialization, activation functions, loss functions, optimization method
  - Avoiding overfitting – detecting overfitting problems in deep networks, regularization (see the regularization sketch after this outline)
  - Evaluating deep networks
- Case studies in Python
  - Image recognition – CNN (see the CNN sketch after this outline)
  - Detecting anomalies with autoencoders
  - Forecasting time series with RNN (see the RNN sketch after this outline)
  - Dimensionality reduction with an autoencoder
  - Classification with RBMs
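
The short sketches below illustrate a few of the outline topics; they are minimal, self-contained illustrations rather than the course materials, and all data, layer sizes and hyperparameters in them are assumptions chosen for brevity. This first sketch shows the backpropagation algorithm for a small MLP with sigmoid activations, written in plain NumPy and trained on the XOR problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: 2 binary inputs -> 1 binary output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with 4 units; weights initialized randomly, biases at zero
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

lr = 0.5
for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (binary cross-entropy loss, so the output delta is out - y)
    d_out = out - y
    d_h = (d_out @ W2.T) * h * (1 - h)   # chain rule through the hidden layer

    # Gradient descent updates, averaged over the 4 training examples
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0, keepdims=True)

print(np.round(out, 2))  # should be close to [[0], [1], [1], [0]]
```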
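
A sketch of a dense autoencoder in Keras (one of the libraries listed above): the network is trained to reconstruct its own input through a narrow bottleneck, and the bottleneck code or the per-sample reconstruction error can then be used for dimensionality reduction or anomaly detection. The 784-dimensional input (a flattened 28x28 image) and the layer sizes are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

autoencoder = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(128, activation="relu"),    # encoder
    layers.Dense(32, activation="relu"),     # bottleneck (the learned code)
    layers.Dense(128, activation="relu"),    # decoder
    layers.Dense(784, activation="sigmoid"), # reconstruction of the input
])
autoencoder.compile(optimizer="adam", loss="mse")

# The target is the input itself; replace the random data with real data scaled to [0, 1]
X = np.random.rand(1000, 784).astype("float32")
autoencoder.fit(X, X, epochs=5, batch_size=64, validation_split=0.1)

# A large per-sample reconstruction error is a simple anomaly signal
reconstruction_error = np.mean((autoencoder.predict(X) - X) ** 2, axis=1)
```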
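
A sketch of two common ways to keep a deep network from overfitting: dropout layers inside the model and early stopping driven by a held-out validation split, where a growing gap between training and validation loss is the usual symptom to watch for. The toy data, layer sizes, dropout rate and patience are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

X = np.random.rand(2000, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("float32")   # toy binary target

model = keras.Sequential([
    keras.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),                      # randomly drops 30% of activations
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stop training when the validation loss stops improving
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True
)
history = model.fit(
    X, y, epochs=100, batch_size=32,
    validation_split=0.2, callbacks=[early_stop],
)
```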
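
A sketch of a small convolutional network for the image recognition case study, using Keras and the MNIST digit images that ship with it; the two-convolution architecture and training settings are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# 28x28 grayscale digits, scaled to [0, 1] with an explicit channel dimension
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., np.newaxis].astype("float32") / 255.0
x_test = x_test[..., np.newaxis].astype("float32") / 255.0

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),   # one class per digit
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=128, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```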
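
A sketch of one-step-ahead time series forecasting with a recurrent network, here an LSTM in Keras; the synthetic sine-wave series and the 20-step input window are assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Build (window -> next value) training pairs from a noisy sine wave
series = np.sin(np.arange(0, 100, 0.1)) + 0.1 * np.random.randn(1000)
window = 20
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis].astype("float32")     # shape: (samples, timesteps, features)
y = y.astype("float32")

model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    layers.LSTM(32),
    layers.Dense(1),                          # predicted next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.1)

# Forecast the value following the last observed window
next_value = model.predict(series[-window:].reshape(1, window, 1).astype("float32"))
```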