## Machine Learning for Engineering and Science Applications

• Machine Learning for Engineering and Science Applications – Intro Video
• Introduction to the Course; History of Artificial Intelligence
• Overview of Machine Learning
• Why Linear Algebra? Scalars, Vectors, Tensors
• Basic Operations
• Norms
• Linear Combinations, Span, Linear Independence
• Matrix Operations, Special Matrices, Matrix Decompositions
• Introduction to Probability Theory; Discrete and Continuous Random Variables
• Conditional, Joint, and Marginal Probabilities; Sum Rule and Product Rule; Bayes’ Theorem
• Bayes’ Theorem – Simple Examples
• Independence, Conditional Independence, Chain Rule of Probability
• Expectation
• Variance and Covariance
• Some Relations for Expectation and Covariance (Slightly Advanced)
• Machine Representation of Numbers, Overflow, Underflow, Condition Number
• Optimization – 1: Unconstrained Optimization
• Introduction to Constrained Optimization
• Introduction to Numerical Optimization; Gradient Descent – 1
• Gradient Descent – 2: Proof of Steepest Descent, Numerical Gradient Calculation, Stopping Criteria
• Introduction to Packages
• A Linear Regression Example
• Linear Regression: Least Squares and Gradient Descent
• Coding Linear Regression
• Generalized Function for Linear Regression
• Goodness of Fit
• Feedforward Neural Network
• Structure of an Artificial Neuron
• Multinomial Classification – One Hot Vector
• Multinomial Classification – Introduction
• XOR Gate
• NOR, AND, NAND Gates
• OR Gate Via Classification
• Logistic Regression
• Summary of Week 05
• Introduction to Backpropagation
• Biological neuron
• Schematic of multinomial logistic regression
• Multinomial Classification – Softmax
• Code for Logistic Regression
• Differentiating the sigmoid
• Binary Cross-Entropy Cost Function
• Introduction to Week 5 (Deep Learning)
• Introduction to Convolutional Neural Networks (CNNs)
• Types of convolution
• CNN Architecture Part 1 (LeNet and AlexNet)
• CNN Architecture Part 2 (VGGNet)
• CNN Architecture Part 3 (GoogLeNet)
• CNN Architecture Part 4 (ResNet)
• CNN Architecture Part 5 (DenseNet)
• Train Network for Image Classification
• Semantic Segmentation
• Hyperparameter optimization
• Transfer Learning
• Segmentation of Brain Tumors from MRI using Deep Learning
• Introduction to RNNs
• Summary of RNNs
• Deep RNNs and Bi-RNNs
• Why LSTM Works
• LSTM
• RNN Architectures
• Training RNNs – Loss and BPTT
• Example – Sequence Classification
• Batch Normalization
• Data Normalization
• Learning Rate Decay, Weight Initialization
• Activation Functions
• Introduction – Week 09
• k-Nearest Neighbours (k-NN)
• Binary decision trees
• Binary regression trees
• Bagging
• Random Forest
• Boosting
• Unsupervised Learning & K-means
• Agglomerative clustering
• Probability Distributions: Gaussian, Bernoulli
• Covariance Matrix of Gaussian Distribution
• Central Limit Theorem
• Naïve Bayes
• Introduction to Maximum Likelihood Estimation (MLE)
• PCA part 1
• PCA part 2
• Support Vector Machines
• MLE, MAP and Bayesian Regression
• Introduction to Generative Models
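
The linear-regression and gradient-descent lectures listed above (linear regression, least squares, coding linear regression) can be illustrated with a short, self-contained sketch. This is not the course's own code; the function name `fit_linear` and the hyperparameters `lr` and `n_iters` are illustrative choices.

```python
import numpy as np

def fit_linear(X, y, lr=0.05, n_iters=2000):
    """Fit y ≈ X @ w + b by batch gradient descent on mean squared error.

    Illustrative sketch of the least-squares / gradient-descent topic;
    hyperparameter values are arbitrary, not from the course.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(n_iters):
        resid = X @ w + b - y                 # predictions minus targets
        w -= lr * (2.0 / n) * (X.T @ resid)   # gradient of MSE w.r.t. w
        b -= lr * (2.0 / n) * resid.sum()     # gradient of MSE w.r.t. b
    return w, b

# Synthetic noiseless data y = 3x + 1, so the fit should recover w ≈ 3, b ≈ 1.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = 3.0 * X[:, 0] + 1.0
w, b = fit_linear(X, y)
print(w, b)  # approximately [3.] and 1.0
```

The same loop generalizes to the multi-feature case covered in the "Generalized Function for Linear Regression" lecture, since `X` may have any number of columns.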