Machine Learning for Engineering and Science Applications
- Machine Learning for Engineering and Science Applications – Intro Video
- Introduction to the Course; History of Artificial Intelligence
- Overview of Machine Learning
- Why Linear Algebra? Scalars, Vectors, Tensors
- Basic Operations
- Norms
- Linear Combinations, Span, Linear Independence
- Matrix Operations, Special Matrices, Matrix Decompositions
- Introduction to Probability Theory; Discrete and Continuous Random Variables
- Conditional, Joint, and Marginal Probabilities; Sum Rule and Product Rule; Bayes’ Theorem
- Bayes’ Theorem – Simple Examples
- Independence, Conditional Independence, Chain Rule of Probability
- Expectation
- Variance and Covariance
- Some Relations for Expectation and Covariance (Slightly Advanced)
- Machine Representation of Numbers, Overflow, Underflow, Condition Number
- Derivatives, Gradient, Hessian, Jacobian, Taylor Series
- Matrix Calculus (Slightly Advanced)
- Optimization 1: Unconstrained Optimization
- Introduction to Constrained Optimization
- Introduction to Numerical Optimization: Gradient Descent 1
- Gradient Descent 2: Proof of Steepest Descent, Numerical Gradient Calculation, Stopping Criteria
- Introduction to Packages
- The Learning Paradigm
- A Linear Regression Example
- Linear Regression: Least Squares and Gradient Descent
- Coding Linear Regression
- Generalized Function for Linear Regression
- Goodness of Fit
- Bias-Variance Trade-off
- Gradient Descent Algorithms
- Feedforward Neural Network
- Structure of an Artificial Neuron
- Multinomial Classification: One-Hot Vector
- Multinomial Classification: Introduction
- XOR Gate
- NOR, AND, NAND Gates
- OR Gate Via Classification
- Logistic Regression
- Summary of Week 05
- Introduction to Backpropagation
- Biological Neuron
- Schematic of multinomial logistic regression
- Multinomial Classification – Softmax
- Code for Logistic Regression
- Gradient of Logistic Regression
- Differentiating the Sigmoid
- Binary Entropy Cost Function
- Introduction to Week 5 (Deep Learning)
- Introduction to Convolutional Neural Networks (CNNs)
- Types of Convolution
- CNN Architecture Part 1 (LeNet and AlexNet)
- CNN Architecture Part 2 (VGGNet)
- CNN Architecture Part 3 (GoogLeNet)
- CNN Architecture Part 4 (ResNet)
- CNN Architecture Part 5 (DenseNet)
- Train Network for Image Classification
- Semantic Segmentation
- Hyperparameter Optimization
- Transfer Learning
- Segmentation of Brain Tumors from MRI using Deep Learning
- Introduction to RNNs
- Summary of RNNs
- Deep RNNs and Bi-RNNs
- Why LSTM Works
- LSTM
- RNN Architectures
- Vanishing Gradients and TBPTT
- Training RNNs – Loss and BPTT
- Example – Sequence Classification
- Batch Normalization
- Data Normalization
- Learning Rate Decay, Weight Initialization
- Activation Functions
- Introduction- Week 09
- k-Nearest Neighbors (kNN)
- Binary Decision Trees
- Binary Regression Trees
- Bagging
- Random Forest
- Boosting
- Gradient boosting
- Unsupervised Learning and k-Means
- Agglomerative Clustering
- Probability Distributions: Gaussian and Bernoulli
- Covariance Matrix of Gaussian Distribution
- Central Limit Theorem
- Naïve Bayes
- Introduction to Maximum Likelihood Estimation (MLE)
- PCA Part 1
- PCA Part 2
- Support Vector Machines
- MLE, MAP and Bayesian Regression
- Introduction to Generative Models
- Generative Adversarial Networks (GAN)
- Variational Auto-encoders (VAE)
- Applications: Cardiac MRI – Segmentation & Diagnosis
- Applications: Cardiac MRI Analysis – TensorFlow Code Walkthrough
- Introduction to Week 12
- Application 1 Description – Fin Heat Transfer
- Application 1 Solution
- Application 2 Description – Computational Fluid Dynamics
- Application 2 Solution
- Application 3 Description – Topology Optimization
- Application 3 Solution
- Application 4 – Solution of PDE/ODE using Neural Networks
- Summary and Road Ahead