Mathematical Foundations of Machine Learning
Information about Udemy Seminars
To take a seminar hosted on Udemy, follow the corresponding «Μπείτε στο Udemy» ("Go to Udemy") link; you pay the fee within the Udemy environment, and the seminar is then available to you under the terms and conditions stated there.
The prices shown on Seminarpro may differ from those on Udemy, because Udemy occasionally changes its prices according to its promotional policy. The purchase price is always the one displayed on Udemy.
Seminars listed on Seminarpro that come from Udemy contain videos with Greek subtitles. The remaining activities, such as notes or comprehension questions, are in English.
Payment is made directly to Udemy, which is responsible for issuing the corresponding receipts.
Seminar Description
Mathematics forms the core of data science and machine learning. To be the best data scientist you can be, you therefore need a thorough understanding of the most relevant mathematics.
Getting started in data science is easy thanks to high-level libraries like Scikit-learn and Keras.
But understanding the mathematics behind the algorithms in these libraries opens up endless possibilities. From spotting modeling issues to inventing new and more powerful solutions, understanding the mathematics behind it all can dramatically increase the impact you have over the course of your career.
Led by deep learning guru Dr. Jon Krohn, this course provides a firm grasp of the mathematics, specifically the linear algebra and calculus, that underlies machine learning algorithms and data science models.
Course Sections
- Linear Algebra Data Structures
- Tensor Operations
- Matrix Properties
- Eigenvectors and Eigenvalues
- Matrix Operations for Machine Learning
- Limits
- Derivatives and Differentiation
- Automatic Differentiation
- Partial-Derivative Calculus
- Integral Calculus
In each of these sections, you will find plenty of practical assignments, Python code demos, and hands-on exercises for a more effective understanding.
1. Introduction (Text lesson)
This is a warm welcome to the Mathematical Foundations of Machine Learning series of interactive video tutorials. It provides an overview of the Linear Algebra, Calculus, Probability, Stats, and Computer Science that we'll cover in the series and that together make for a complete machine learning practitioner.
2. What Linear Algebra Is (Video lesson)
In this first video of my Mathematical Foundations of Machine Learning series, I introduce the basics of Linear Algebra and how Linear Algebra relates to Machine Learning, as well as providing a brief lesson on the origins and applications of modern algebra.
3. Plotting a System of Linear Equations (Video lesson)
In this video, we recap the sheriff and robber exercise from the preceding video, now viewing the calculations graphically using an interactive code demo in Python.
4. Linear Algebra Exercise (Video lesson)
This video provides an applied linear algebra exercise (involving solar panels) to challenge your understanding of the content from the preceding video.
5. Tensors (Video lesson)
In this video I describe tensors, the fundamental building block of linear algebra for any kind of machine learning.
6. Scalars (Video lesson)
This is the first video in the course that makes heavy use of hands-on code demos. As described in the video, the default approach we assume for executing this code is within Jupyter notebooks within the (free!) Google Colab environment.
Pro tip: To prevent abuse of Colab (for, say, bitcoin mining), Colab sessions time out after a period of inactivity -- typically about 30 to 60 minutes. If your session times out, you'll lose all of the variables you had in memory, but you can quickly get back on track by following these three steps:
1. Click on the code cell you'd like to execute next.
2. Select "Runtime" from the Colab menubar near the top of your screen.
3. Select the "Run before" option. This executes all of the preceding cells and then you're good to go!
7. Vectors and Vector Transposition (Video lesson)
This video addresses the theory and notation of 1-dimensional tensors, also known as vector tensors. In addition, we’ll do some hands-on code exercises to create and transpose vector tensors in NumPy, TensorFlow and PyTorch, the leading Python libraries for working with tensors.
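As a rough sketch of the kind of demo this lesson works through (the exact notebook code may differ, and the values here are placeholders), a vector can be created and transposed in NumPy and PyTorch like this:

```python
import numpy as np
import torch

# A 1-D array in NumPy is a vector tensor
x = np.array([25, 2, 5])
print(x.shape)                     # (3,)

# Transposition only has a visible effect once there is a second dimension
x_row = np.array([[25, 2, 5]])     # shape (1, 3): a row vector
print(x_row.T)                     # shape (3, 1): a column vector

# The same idea in PyTorch
x_pt = torch.tensor([[25, 2, 5]])
print(x_pt.T)
```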
8. Norms and Unit Vectors (Video lesson)
This video builds on the preceding one by explaining how vectors can represent a particular magnitude and direction through space. In addition, I’ll introduce norms, which are functions that quantify vector magnitude, and unit vectors. We’ll also do some hands-on exercises to code some common norms in machine learning, including L2 Norm, L1 Norm, Squared L2 Norm, and others.
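For orientation, here is a minimal sketch of these norms in NumPy; the vector values are arbitrary and the course notebooks may compute the norms differently:

```python
import numpy as np

x = np.array([25., 2., 5.])

l2_norm = np.linalg.norm(x)        # L2 norm: sqrt of the sum of squared elements
l1_norm = np.sum(np.abs(x))        # L1 norm: sum of absolute values
squared_l2 = np.dot(x, x)          # squared L2 norm: x dotted with itself

unit_x = x / l2_norm               # unit vector: same direction, magnitude 1
print(l2_norm, l1_norm, squared_l2)
print(np.linalg.norm(unit_x))      # 1.0
```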
9. Basis, Orthogonal, and Orthonormal Vectors (Video lesson)
This quick video addresses special types of vectors (basis, orthogonal, and orthonormal), which are critical for machine learning applications. We’ll also do a hands-on code exercise to mathematically demonstrate orthogonal vectors in NumPy.
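A tiny illustrative check (not necessarily the course's own notebook code): two standard basis vectors are orthogonal because their dot product is zero, and orthonormal because each also has unit length.

```python
import numpy as np

i = np.array([1, 0])   # basis vector along the x-axis
j = np.array([0, 1])   # basis vector along the y-axis

print(np.dot(i, j))                          # 0: the vectors are orthogonal
print(np.linalg.norm(i), np.linalg.norm(j))  # 1.0 1.0: unit length, so also orthonormal
```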
10. Matrix Tensors (Video lesson)
This video covers 2-dimensional tensors, also known as matrices (or matrixes). We’ll cover matrix notation, and do a hands-on code demo on calculating matrices in NumPy, TensorFlow, and PyTorch.
11. Generic Tensor Notation (Video lesson)
In this video, we generalize tensor notation to tensors with any number of dimensions, including the high-dimensional tensors common to machine learning models. We also jump into a hands-on code demo to create 4-dimensional tensors in PyTorch and TensorFlow.
12. Exercises on Algebra Data Structures (Video lesson)
In this video, I present three questions to test your comprehension of the Linear Algebra concepts introduced in the preceding handful of videos.
13. Segment Intro (Video lesson)
This video introduces the second section, which is on Tensor Operations.
14. Tensor Transposition (Video lesson)
This video introduces the theory of tensor transposition, and we carry out hands-on demos of transposition in NumPy, TensorFlow, and PyTorch.
15. Basic Tensor Arithmetic, incl. the Hadamard Product (Video lesson)
This video demonstrates basic tensor arithmetic (including the Hadamard product) through hands-on code demos in NumPy, TensorFlow, and PyTorch.
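As an illustrative sketch (not the exact course demo, and only in NumPy), element-wise arithmetic looks like this; note that `*` is the Hadamard product, not matrix multiplication:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[10, 20], [30, 40]])

print(A + B)        # element-wise addition
print(A * B)        # Hadamard (element-wise) product, not matrix multiplication
print(A * 2 + 1)    # scalars broadcast across every element
```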
16. Tensor Reduction (Video lesson)
In this video, we perform hands-on code demos in NumPy, TensorFlow, and PyTorch in order to learn about reduction, a common tensor operation in ML.
17. The Dot Product (Video lesson)
This video covers the dot product, one of the most common tensor operations in machine learning, particularly deep learning. We’ll carry out hands-on code demos in NumPy, TensorFlow, and PyTorch to see the dot product in action.
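A minimal sketch of the dot product with placeholder values; the course demo may use different tensors and all three libraries:

```python
import numpy as np
import torch

x = np.array([25, 2, 5])
y = np.array([0, 1, 2])

# Dot product: multiply element-wise, then sum the results
print(np.dot(x, y))        # 25*0 + 2*1 + 5*2 = 12

# The same operation in PyTorch
print(torch.dot(torch.tensor([25., 2., 5.]), torch.tensor([0., 1., 2.])))
```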
18. Exercises on Tensor Operations (Video lesson)
This video provides three exercises to test your comprehension of the preceding videos on basic tensor operations.
19. Solving Linear Systems with Substitution (Video lesson)
In this video, we use substitution to solve systems of linear equations on paper.
20. Solving Linear Systems with Elimination (Video lesson)
In this video, we use elimination to solve systems of linear equations on paper.
21. Visualizing Linear Systems (Video lesson)
This video demonstrates how to visualize the systems of linear equations we solved in the preceding videos (on substitution and elimination). This video features hands-on code demos in Python that provide a crisp, geometric visualization of the lines in each system as well as the points that we solve for when we solve a system of linear equations.
22. Segment Intro (Video lesson)
We are now moving on to Matrix Properties, the third section of the course. Congratulations on making it here! In this section, we’ll be covering matrix properties that are vital to machine learning, including the Frobenius norm, matrix multiplication, matrix inversion and more. And of course, we’ll be doing plenty of hands-on code demos along the way.
23. The Frobenius Norm (Video lesson)
This video explores the Frobenius norm, a function that allows us to quantify the size of a matrix. We’ll use a hands-on code demo in NumPy to solidify our understanding of the topic.
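For reference, one way to compute the Frobenius norm in NumPy, shown as a hedged sketch rather than the course's exact notebook:

```python
import numpy as np

X = np.array([[1., 2.], [3., 4.]])

print(np.linalg.norm(X))           # Frobenius norm (NumPy's default for matrices)
print(np.sqrt(np.sum(X ** 2)))     # the same value, computed by hand
```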
24. Matrix Multiplication (Video lesson)
This video demonstrates matrix multiplication – the single most important and widely-used mathematical operation in machine learning. To ensure you get a solid grip on the principles of this key skill, we’ll use color diagrams, calculations by hand, interactive code demos, and an applied learning example.
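A small illustrative example (with made-up numbers, not the course's applied example) of matrix-by-vector and matrix-by-matrix multiplication in NumPy:

```python
import numpy as np

A = np.array([[3, 4], [5, 6], [7, 8]])   # shape (3, 2)
b = np.array([1, 2])                     # shape (2,)

# Matrix-by-vector multiplication: each output element is the dot product of a row of A with b
print(np.dot(A, b))                      # [11 17 23]

B = np.array([[1, 9], [2, 0]])           # shape (2, 2)
print(A @ B)                             # matrix-by-matrix product, shape (3, 2)
```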
25. Symmetric and Identity Matrices (Video lesson)
This video explores symmetric matrices, a special class of matrix tensors. The most important symmetric matrix to machine learning is the identity matrix. We’ll detail it, and other symmetric matrices, including with a hands-on code demo in PyTorch.
26. Matrix Multiplication Exercises (Video lesson)
Here are three exercises to test your comprehension of the matrix properties that we’ve learned so far.
27. Matrix Inversion (Video lesson)
This video introduces matrix inversion, a wildly useful transformation for machine learning. I’ll introduce the concept, and then we’ll use a series of colorful equations and hands-on code demos to solve for values in a simple regression-style problem.
While detailing how to determine the inverse of a matrix is outside the scope of this course, if you're keen to learn more on the topic, a clear tutorial can be found here: https://www.mathsisfun.com/algebra/matrix-inverse.html
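As a hedged sketch of the idea (the equations and values in the video differ), here is how matrix inversion can solve a small regression-style system in NumPy:

```python
import numpy as np

# A toy system Xw = y with as many equations as unknowns
X = np.array([[4., 2.], [-5., -3.]])
y = np.array([4., -7.])

X_inv = np.linalg.inv(X)          # the inverse exists only for certain square matrices
w = np.dot(X_inv, y)
print(w)                          # the unknowns that satisfy both equations

print(np.linalg.solve(X, y))      # same answer, computed more stably
```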
28. Diagonal Matrices (Video lesson)
This video introduces diagonal matrices, a special matrix class that is important in machine learning.
29. Orthogonal Matrices (Video lesson)
This video covers the unique properties of orthogonal matrices as well as their relevance to machine learning.
30. Orthogonal Matrix Exercises (Video lesson)
In this quick video from my Mathematical Foundations of Machine Learning series, I present a series of paper-and-pencil exercises that test your comprehension of the orthogonal matrix properties covered in the preceding video, as well as many of the other key matrix properties we covered earlier on.
31. Segment Intro (Video lesson)
Welcome to Subject 2 of the course! In this introductory video, I provide an overview of the topics covered in this subject, as well as a quick recap of the essential linear algebra topics we've covered so far -- topics you need to know to make the most of Subject 2.
32. Applying Matrices (Video lesson)
In this video, we go over three matrix application exercises together. Having a firm grasp of matrix application is critical to understanding affine transformations, eigenvectors, and eigenvalues -- the topics coming up next in the series!
33. Affine Transformations (Video lesson)
In this video we use hands-on code demos in NumPy to carry out affine transformations, a particular type of matrix transformation that may adjust angles or distances between vectors, but preserves parallelism. These operations can transform the target tensor in a variety of ways including scaling, shearing, or rotation. Affine transformations are also key to appreciating eigenvectors and eigenvalues, the focus of the next videos in the series.
34. Eigenvectors and Eigenvalues (Video lesson)
In this video, I leverage colorful illustrations and hands-on code demos in Python to make it intuitive and easy to understand eigenvectors and eigenvalues, concepts that may otherwise be tricky to grasp.
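A minimal NumPy sketch of the defining property, using an arbitrary example matrix rather than the one from the video:

```python
import numpy as np

A = np.array([[-1, 4], [2, -2]])

eigenvalues, eigenvectors = np.linalg.eig(A)   # columns of `eigenvectors` are the eigenvectors

v = eigenvectors[:, 0]
lam = eigenvalues[0]
print(np.allclose(A @ v, lam * v))             # True: applying A only scales v by its eigenvalue
```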
35. Matrix Determinants (Video lesson)
In this video, I cover matrix determinants. A determinant is a special scalar value that we can calculate for any given matrix. It has a number of very useful properties, as well as an intimate relationship with eigenvalues that we’ll explore later on.
36. Determinants of Larger Matrices (Video lesson)
We’ve covered how to compute the determinant of a 2x2 matrix, but what if a matrix is larger than that? Well, that’s what this video’s for! In it, we’ll use recursion to calculate the determinant of larger matrices.
37. Determinant Exercises (Video lesson)
All right, we’ve covered all the theory you need to calculate 2x2 determinants or larger determinants by hand. In this video, I have three exercises to test your comprehension of that theory.
38. Determinants and Eigenvalues (Video lesson)
This video illustrates the relationship between determinants and eigenvalues, using hands-on code demos in Python to give you an intuitive, working understanding of what’s going on.
39. Eigendecomposition (Video lesson)
In this video we use hands-on code demos in Python to provide you with a working understanding of the eigendecomposition of a matrix and how we make use of it in machine learning.
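As an illustrative sketch with an arbitrary matrix (not necessarily the one used in the video), eigendecomposition factors a matrix and lets us rebuild it exactly:

```python
import numpy as np

A = np.array([[4., 2.], [-5., -3.]])

lambdas, V = np.linalg.eig(A)

# Eigendecomposition: A = V diag(lambdas) V^-1
A_rebuilt = V @ np.diag(lambdas) @ np.linalg.inv(V)
print(np.allclose(A, A_rebuilt))   # True
```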
40. Eigenvector and Eigenvalue Applications (Video lesson)
In this video, I provide real-world applications of eigenvectors and eigenvalues, with special mention of applications that are directly relevant to machine learning.
41. Segment Intro (Video lesson)
Welcome to the final section of videos on linear algebra! In these videos, we cover the last key pieces of essential linear algebra you need to know to understand machine learning algorithms, including Singular Value Decomposition, Moore-Penrose Pseudoinversion, the Trace Operator, and Principal Component Analysis.
42. Singular Value Decomposition (Video lesson)
With a focus on hands-on code demos in Python, in this video I introduce the theory and practice of singular value decomposition, a common linear algebra operation in the field of machine learning.
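For orientation, a small NumPy sketch of SVD on an arbitrary non-square matrix (the course demo may differ):

```python
import numpy as np

A = np.array([[-1., 2.], [3., -2.], [5., 7.]])   # any real matrix, square or not

U, d, VT = np.linalg.svd(A)       # d holds the singular values

# Rebuild A from its factors: A = U D V^T, with d placed on the diagonal of D
D = np.concatenate((np.diag(d), [[0., 0.]]), axis=0)   # pad D to match A's shape
print(np.allclose(A, U @ D @ VT))                      # True
```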
43. Data Compression with SVD (Video lesson)
In this video, we take advantage of the singular value decomposition theory that we covered in the preceding video to dramatically compress data within a hands-on Python demo.
44. The Moore-Penrose Pseudoinverse (Video lesson)
This video introduces Moore-Penrose pseudoinversion, a linear algebra concept that enables us to invert non-square matrices. The pseudoinverse is a critical machine learning concept because it solves for unknown variables within the non-square systems of equations that are common in machine learning. To show you how it works, we’ll use a hands-on code demo.
45. Regression with the Pseudoinverse (Video lesson)
This is one of my favorite videos in the entire course! In it, we use Moore-Penrose pseudoinversion to solve for unknowns, enabling us to fit a line to points with linear algebra alone. When I first learned how to do this, it blew my mind -- I hope it blows your mind too!
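As a hedged sketch of the approach, with made-up data points rather than those from the video, fitting a line with the pseudoinverse can look like this in NumPy:

```python
import numpy as np

# Points that lie roughly along a line (values are made up for illustration)
x = np.array([0., 1., 2., 3., 4., 5., 6., 7.])
y = np.array([1.86, 1.31, .62, .33, .09, -.67, -1.23, -1.37])

# Design matrix with a column of ones so we can solve for intercept and slope together
X = np.column_stack((np.ones_like(x), x))

# w = X+ y, where X+ is the Moore-Penrose pseudoinverse of X
intercept, slope = np.linalg.pinv(X) @ y
print(intercept, slope)            # the least-squares line through the points
```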
46. The Trace Operator (Video lesson)
This is a quick video on the Trace Operator, a relatively simple linear algebra concept, but one that frequently comes in handy for rearranging linear algebra equations, including ones that are common in machine learning.
47. Principal Component Analysis (PCA) (Video lesson)
Via highly visual hands-on code demos in Python, this video introduces Principal Component Analysis, a prevalent and powerful machine learning technique for finding patterns in unlabeled data.
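One common way to run PCA in Python is via scikit-learn; this is an illustrative sketch using the classic iris dataset as an assumption, not necessarily the dataset or library calls used in the video:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

iris = load_iris()                     # 150 flowers, 4 measurements each
pca = PCA(n_components=2)              # keep the two directions of greatest variance
X_2d = pca.fit_transform(iris.data)

print(X_2d.shape)                      # (150, 2): ready for a 2-D scatter plot
print(pca.explained_variance_ratio_)   # share of variance captured by each component
```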
48. Resources for Further Study of Linear Algebra (Video lesson)
Welcome to the final linear algebra video of the course! It’s a quick one to leave you with my favorite linear algebra resources so that you can dig deeper into the topics that pique your interest the most, if desired.
49. Segment Intro (Video lesson)
In the third subject of the course, we’ll use differentiation, including powerful automatic differentiation algorithms, to learn how to optimize learning algorithms. We’ll start with an introduction to what calculus is and learn what limits are in order to understand differentiation from first principles, primarily through the use of hands-on code demos in Python.
50. Intro to Differential Calculus (Video lesson)
This video uses colorful visual analogies to introduce what differential calculus is, at a high level.
51. Intro to Integral Calculus (Video lesson)
This video is a quick high-level intro to integral calculus.
52. The Method of Exhaustion (Video lesson)
This video introduces a centuries-old calculus technique called the Method of Exhaustion, which not only provides us with a richer understanding of how modern calculus works, but is still relevant today.
53. Calculus of the Infinitesimals (Video lesson)
In this video, we use a hands-on code demo in Python to deeply understand how approaching a curve infinitely closely enables us to determine the slope of the curve.
54. Calculus Applications (Video lesson)
In this video, I provide specific examples of how calculus is applied in the real world, with an emphasis on applications to machine learning.
55. Calculating Limits (Video lesson)
This video is a big one, but have no fear! It has lots of interactive code demos in Python and opportunities to work through paper-and-pencil exercises to ensure that learning about the critical subject of limits is not only interesting but also fun.
56. Exercises on Limits (Video lesson)
Feel like you’ve got a good handle on how to calculate limits? Let’s make sure with a handful of comprehension exercises.
57. Segment Intro (Video lesson)
In this section of Calculus videos, we use a combination of color-coded equations, paper-and-pencil exercises, and hands-on Python code demos to deeply understand how differentiation allows us to find derivatives.
58. The Delta Method (Video lesson)
In this video, we use a hands-on code demo in Python to develop a deep understanding of the Delta Method, a centuries-old differential calculus technique that enables us to determine the slope of a curve.
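A minimal sketch of the idea in plain Python, using an arbitrary example function rather than the one from the video: as delta shrinks, the slope of the secant line approaches the true derivative.

```python
def f(x):
    return x ** 2 + 2 * x + 2

def secant_slope(f, x, delta):
    # Delta Method: slope of the secant line through x and x + delta
    return (f(x + delta) - f(x)) / delta

# As delta shrinks, the secant slope approaches the true derivative, 2x + 2
for delta in (1.0, 0.1, 0.001, 0.000001):
    print(delta, secant_slope(f, 2.0, delta))   # approaches 6.0 at x = 2
```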
59. How Derivatives Arise from Limits (Video lesson)
This video picks up right where we left off, working out the solution to the exercise I left you with at the end of the preceding video, "The Delta Method". As we work through the solution, we’ll derive, from first principles, the most common representation of the equation of differentiation! This is a fun one in which we use hands-on code demos in Python to deeply understand how we can determine the slope of any curve.
60. Derivative Notation (Video lesson)
In this quick video, we cover all of the most common notation for derivatives.
61. The Derivative of a Constant (Video lesson)
The next several videos will provide you with clear and colorful examples of all of the most important differentiation rules, including all of the rules that are directly relevant to machine learning such as how to find the derivative of cost functions — something we’ll tackle later in the course as an important part of the Calculus II subject. For now, we’ll kick the derivative rules off with a rule about constants.
62. The Power Rule (Video lesson)
This quick video covers the Power Rule, one of the most common and important differentiation rules.
63. The Constant Multiple Rule (Video lesson)
Today’s video covers the Constant Multiple Rule. The Constant Multiple Rule is often used in conjunction with the Power Rule, which was covered in the preceding video.
64. The Sum Rule (Video lesson)
This video covers the Sum Rule, a critical rule for differentiation.
65. Exercises on Derivative Rules (Video lesson)
Feeling comfortable with the derivative rules we’ve covered so far:
1. The derivative of a constant
2. The power rule
3. The constant multiple rule
4. And the sum rule?
Let’s test your understanding of them with five fun exercises that bring all of the rules together.
66. The Product Rule (Video lesson)
In this video I describe the product rule, which allows us to compute the derivative of a product of two functions by handling each factor separately. The product rule can be tremendously useful for simplifying complex derivatives, and for cases where the product of the two terms cannot be computed before differentiation.
67. The Quotient Rule (Video lesson)
The quotient rule is applicable in the same situations as the product rule, except it involves the division of two variables instead of multiplication.
68. The Chain Rule (Video lesson)
This video introduces the chain rule, which is arguably the single most important differentiation rule for machine learning. It facilitates several of the most ubiquitous ML algorithms, such as gradient descent and backpropagation — algorithms we detail later in this video series.
69. Advanced Exercises on Derivative Rules (Video lesson)
Combining the more basic derivative rules from earlier in the ML Foundations series with the product rule, quotient rule, and chain rule covered most recently, we’re now set for relatively advanced exercises that will confirm your comprehension of all of the rules.
70. The Power Rule on a Function Chain (Video lesson)
The Power Rule on a Function Chain, as its name suggests, merges two other derivative rules, the Power Rule and the Chain Rule, into a single easy step.
71. Segment Intro (Video lesson)
The content we covered in the earlier Calculus sections of the course set us up perfectly for this segment, Automatic Differentiation. AutoDiff is a computational technique that allows us to move beyond calculating derivatives by hand and scale up the calculation of derivatives to the massive scales that are common in machine learning.
72. What Automatic Differentiation Is (Video lesson)
This video introduces what Automatic Differentiation — also known as AutoGrad, Reverse-Mode Differentiation, and Computational Differentiation — is.
73. Autodiff with PyTorch (Video lesson)
In this video, we use a hands-on code demo in PyTorch to see AutoDiff in action first-hand, enabling us to compute the derivatives of equations instantaneously.
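As a minimal illustration (the video's notebook may use a different function), PyTorch's autograd can differentiate y = x² at a point like this:

```python
import torch

x = torch.tensor(5.0, requires_grad=True)   # track operations involving x

y = x ** 2        # forward pass builds the computational graph
y.backward()      # reverse-mode autodiff computes dy/dx

print(x.grad)     # 2x = 10
```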
74. Autodiff with TensorFlow (Video lesson)
In this video, we use a hands-on code demo in TensorFlow to see AutoDiff in action first-hand, enabling us to compute the derivatives of equations instantaneously.
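The TensorFlow equivalent, again as a minimal sketch rather than the exact course notebook, uses tf.GradientTape:

```python
import tensorflow as tf

x = tf.Variable(5.0)

with tf.GradientTape() as tape:   # records operations for differentiation
    y = x ** 2

print(tape.gradient(y, x))        # dy/dx = 2x = 10.0
```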
75. The Line Equation as a Tensor Graph (Video lesson)
In this video, we get ourselves set up for applying Automatic Differentiation within a Machine Learning loop by first discussing how to represent an equation as a Tensor Graph and then actually creating that graph in Python code using the PyTorch library.
76. Machine Learning with Autodiff (Video lesson)
In preceding videos in this series, we learned all the most essential differential calculus theory needed for machine learning. In this epic video, it all comes together to enable us to perform machine learning from first principles and fit a line to data points. To make learning interactive and intuitive, this video focuses on hands-on code demos featuring the Python library PyTorch.
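As a hedged sketch of the overall workflow (with made-up data and hyperparameters, not the video's exact notebook), fitting a line to points with autodiff and gradient descent in PyTorch can look like this:

```python
import torch

# Made-up points scattered around a line
x = torch.tensor([0., 1., 2., 3., 4., 5., 6., 7.])
y = torch.tensor([1.86, 1.31, .62, .33, .09, -.67, -1.23, -1.37])

# Parameters to learn: slope m and intercept b
m = torch.tensor([0.9], requires_grad=True)
b = torch.tensor([0.1], requires_grad=True)

optimizer = torch.optim.SGD([m, b], lr=0.01)

for epoch in range(1000):
    optimizer.zero_grad()
    y_hat = m * x + b                       # forward pass: the line equation
    loss = torch.mean((y_hat - y) ** 2)     # mean squared error
    loss.backward()                         # autodiff fills in the gradients
    optimizer.step()                        # nudge m and b downhill

print(m.item(), b.item())                   # close to the least-squares slope and intercept
```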