Difference between EEE3773 & EEE4773 from the Professors
Hi EE Majors,
- A brief overview of what topics are covered in each course:
EEE3773 – Types of learning, evaluation metrics and goodness-of-fit measures, k-nearest neighbors, decision trees, random forests, bootstrap sampling, linear classifiers (LDA, perceptron, logistic regression), feature selection and feature extraction, experimental design and hyperparameter tuning, neural networks (MLP, architectures), deep learning (CNNs), the scikit-learn library, PyTorch, Git and GitHub.
EEE4773 – Types of learning, experimental design, the curse of dimensionality, maximum likelihood estimation, maximum a posteriori estimation, conjugate priors, probabilistic generative models, the naïve Bayes classifier, mixture models (Gaussian mixture model), the expectation-maximization (EM) algorithm, evaluation metrics and goodness-of-fit measures, k-means clustering, k-nearest neighbors, discriminative classification (LDA, perceptron, logistic regression), Lagrange multipliers, hard-margin and soft-margin SVMs, the kernel trick, dimensionality reduction (PCA, LDA), manifold learning (MDS, ISOMAP, LLE), neural networks (MLP, backpropagation, CNNs, ML vs. DL), Git and GitHub.
- EEE3773 takes a practical approach to learning ML: students learn how to use off-the-shelf libraries to implement ML algorithms during lectures and assignments (a rough sketch of this style follows the list below).
- EEE4773 takes a theoretical approach to learning ML: students learn the basic principles behind the algorithms, write the models out in equation form, and solve for the parameters. The course is still computational, but instead of calling libraries, students implement their analytical solutions themselves in Python (a from-scratch sketch also follows below).
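To give a feel for the 3773 "off-the-shelf library" style, here is a minimal sketch using scikit-learn. The dataset, the k-NN model choice, and the train/test split are my own assumptions for illustration, not actual course material.

```python
# Library-driven style (EEE3773 flavor): fit and evaluate an existing model.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Load a small built-in dataset and split it for a simple train/test evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Fit an off-the-shelf k-nearest neighbors classifier and report test accuracy.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, knn.predict(X_test)))
```

The point is that the library handles the math; the student's job is choosing models, metrics, and experimental design.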
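By contrast, here is a sketch of the 4773 "derive it, then code your own solution" style: logistic regression fit by gradient descent on the negative log-likelihood, using only NumPy. The toy data, learning rate, and iteration count are my own assumptions, not taken from the course.

```python
# From-scratch style (EEE4773 flavor): implement the analytical solution yourself.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs in 2-D, plus a bias column.
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])
X = np.hstack([np.ones((100, 1)), X])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Gradient descent on the negative log-likelihood; the gradient works out to
# X^T (sigmoid(Xw) - y), which the student would derive by hand first.
w = np.zeros(X.shape[1])
lr = 0.1
for _ in range(1000):
    grad = X.T @ (sigmoid(X @ w) - y)
    w -= lr * grad / len(y)

preds = (sigmoid(X @ w) >= 0.5).astype(int)
print("Training accuracy:", (preds == y).mean())
```

Same algorithm as a library call like scikit-learn's LogisticRegression, but the derivation and the update rule are the student's own work.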
If a student knows they want a career focused on AI/ML and they feel confident in their programming ability (particularly Python), then they should go ahead and take 4773 directly. If they want more background before taking the more theoretically rigorous course, or just want an introduction to ML (without the intention to study it further), then 3773 is the better choice.