The basics of supervised learning: what parameters are, what a bias node is and why we use a learning rate
Techniques for dealing with data: Splitting Datasets, One-hot Encoding and Handling Missing Values (a small splitting sketch follows this list)
Vectors, matrices and creating faster code using Vectorization
Mathematical concepts such as Optimization, Derivatives and Gradient Descent
Gain a deep understanding behind the fundamentals of Feedforward, Convolutional and Recurrent Neural Networks
Build Feedforward, Convolutional and Recurrent Neural Networks using only the fundamentals
How to use TensorFlow 2.0 and Keras to build models, create TFRecords, and save and load models
Practical project: Style Transfer – Use AI to draw an image in the style of your favorite artist
Practical project: Object Detection – Use AI to detect the bounding box locations of objects inside images
Practical project: Transfer Learning – Learn to leverage large pretrained AI models to work on new datasets
Practical project: One-Shot Learning – Learn to build AI models to perform tasks such as face recognition
Practical project: Text Generation – Build an AI model to generate text similar to Romeo and Juliet
Practical project: Sentiment Classification – Build an AI model to determine whether text is overall negative or positive
Practical project: Attention Model – Build an attention model to make an AI model interpretable
Gain a deep understanding of Supervised Learning techniques by studying the fundamentals and implementing them in NumPy.
Gain hands-on experience using popular Deep Learning frameworks such as TensorFlow 2 and Keras.
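As a small taste of the data-handling topics listed above, here is a minimal NumPy sketch of a shuffled train/test split. The toy data, the 80/20 ratio and the variable names are illustrative assumptions, not code from the course itself.

```python
import numpy as np

# Toy dataset: 10 samples, 3 features, binary labels (illustrative only)
rng = np.random.default_rng(42)
X = rng.normal(size=(10, 3))
y = rng.integers(0, 2, size=10)

# Shuffle the sample indices, then split roughly 80% train / 20% test
indices = rng.permutation(len(X))
split = int(0.8 * len(X))
train_idx, test_idx = indices[:split], indices[split:]

X_train, y_train = X[train_idx], y[train_idx]
X_test, y_test = X[test_idx], y[test_idx]

print(X_train.shape, X_test.shape)  # (8, 3) (2, 3)
```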
Section 1 – The Basics:
– Learn what Supervised Learning is, in the context of AI
– Learn the difference between parametric and non-parametric models
– Learn the fundamentals: Weights and biases, threshold functions and learning rates
– An introduction to the Vectorization technique to help speed up our self-implemented code
– Learn to process real data: Feature Scaling, Splitting Data, One-hot Encoding and Handling Missing Data (see the NumPy sketch after this list)
– Classification vs Regression
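To give a flavour of the NumPy code written in this section, below is a minimal sketch of vectorized feature scaling and one-hot encoding. The toy arrays and variable names are illustrative only, not taken from the course materials.

```python
import numpy as np

# Toy feature matrix: 4 samples, 2 numeric features (illustrative values)
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0],
              [4.0, 500.0]])

# Vectorized min-max feature scaling: one array expression instead of Python loops
X_scaled = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# One-hot encoding of integer class labels
y = np.array([0, 2, 1, 2])
num_classes = y.max() + 1
one_hot = np.eye(num_classes)[y]  # each row is the one-hot vector for that label

print(X_scaled)
print(one_hot)
```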
Section 2 – Feedforward Networks:
– Learn about the Gradient Descent optimization algorithm.
– Implement the Logistic Regression model using NumPy (see the sketch after this list)
– Implement a Feedforward Network using NumPy
– Learn the difference between Multi-task and Multi-class Classification
– Understand the Vanishing Gradient Problem
– Overfitting
– Batching and various Optimizers (Momentum, RMSprop, Adam)
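As a preview of implementing models from the fundamentals, here is a minimal NumPy sketch of Logistic Regression trained with Gradient Descent. The toy data, learning rate and number of epochs are assumptions chosen for illustration, not the course's own code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary classification data: 100 samples, 2 features, labels 0/1
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Parameters: weight vector and bias (scalar)
w = np.zeros(2)
b = 0.0
learning_rate = 0.1

for epoch in range(200):
    # Forward pass: predicted probability for every sample at once (vectorized)
    p = sigmoid(X @ w + b)
    # Gradients of the cross-entropy loss with respect to w and b
    error = p - y
    grad_w = X.T @ error / len(y)
    grad_b = error.mean()
    # Gradient descent update
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```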
Section 3 – Convolutional Neural Networks:
– Fundamentals such as filters, padding, strides and reshaping
– Implement a Convolutional Neural Network using NumPy
– Introduction to TensorFlow 2 and Keras
– Data Augmentation to reduce overfitting (see the Keras sketch after this list)
– Understand and implement Transfer Learning to require less data
– Analyse Object Classification models using Occlusion Sensitivity
– Generate Art using Style Transfer
– One-Shot Learning for Face Verification and Face Recognition
– Perform Object Detection on blood stream images
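Below is a minimal sketch of what a small Keras model with augmentation layers can look like in TensorFlow 2. The input shape, layer sizes and class count are arbitrary choices for illustration, and the RandomFlip/RandomRotation layers assume a recent TensorFlow 2 release.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Augmentation layers followed by a small convolutional classifier
model = tf.keras.Sequential([
    layers.Input(shape=(32, 32, 3)),       # assumed image size: 32x32 RGB
    layers.RandomFlip("horizontal"),        # data augmentation: random horizontal flips
    layers.RandomRotation(0.1),             # data augmentation: small random rotations
    layers.Conv2D(16, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, kernel_size=3, padding="same", activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"), # assumed 10 output classes
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```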
Section 4 – Sequential Data:
– Understand Sequential Data and when data should be modeled sequentially
– Implement a Recurrent Neural Network using NumPy
– Implement LSTMs and GRUs in TensorFlow 2/Keras
– Sentiment Classification, from the basics to more advanced techniques (a minimal Keras sketch follows this list)
– Understand Word Embeddings
– Generate text similar to Romeo and Juliet
– Implement an Attention Model using TensorFlow 2/Keras
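As a rough preview of the Keras side of this section, here is a minimal sketch of an Embedding + LSTM sentiment classifier. The vocabulary size, layer sizes and the hypothetical padded_sequences/labels inputs are assumptions for illustration, not the course's own model.

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 10000  # assumed vocabulary size

# Embedding -> LSTM -> single sigmoid unit for positive/negative sentiment
model = tf.keras.Sequential([
    layers.Embedding(input_dim=vocab_size, output_dim=64),
    layers.LSTM(64),
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Training on integer-encoded, padded text would look something like:
# model.fit(padded_sequences, labels, epochs=5, validation_split=0.1)
```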