We will get started.
Today's lecture is on the Restricted Boltzmann Machine.
You have already learned about three different kinds of machine learning: supervised, where we have labeled data; unsupervised; and reinforcement learning.
The Restricted Boltzmann Machine is a generative machine learning model, and it is normally categorized as unsupervised learning.
Okay, so we will learn about that.
When we talk about generative models, I think you already know that machine learning can generate very realistic-looking images of people, or of anything else for that matter.
Essentially, these are generative models that can produce images that are realistic but different from the training images, and you cannot distinguish them from real ones.
Under the hood, what these machine learning models learn is the probability distribution of the training set.
For example, if you give the model images as data, it will learn the probability distribution of those images, and based on that it can generate new images.
So the goal of a generative model is to use a neural network to generate previously unseen examples, where examples means data points, according to some probability distribution learned from the training samples.
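As a minimal illustration of this idea (not the RBM itself), a sketch that "learns" a distribution from a toy one-dimensional training set and then samples previously unseen examples from it; the data values and the Gaussian fit here are assumptions for illustration:

```python
import random

# Toy "training set": 1-D data points standing in for images.
training_data = [1.9, 2.1, 2.0, 2.2, 1.8, 2.05, 1.95]

# "Learn" a probability distribution: here simply a Gaussian fit
# (mean and standard deviation of the training set).
mean = sum(training_data) / len(training_data)
var = sum((x - mean) ** 2 for x in training_data) / len(training_data)
std = var ** 0.5

# "Generate" previously unseen examples by sampling from the
# learned distribution.
random.seed(0)
new_samples = [random.gauss(mean, std) for _ in range(3)]
print(mean, std, new_samples)
```

A real generative model learns a far richer, high-dimensional distribution, but the principle is the same: fit a distribution to the data, then sample from it.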
You might remember that you can use a recurrent neural network to generate text.
For example, given some text, it will also learn a probability distribution, and based on that distribution it will be able to guess what the next word or sentence would be.
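A minimal sketch of that next-word idea, using a toy bigram count model instead of a real recurrent network (the corpus and function names here are made up for illustration): learn the conditional distribution of the next word from text, then sample from it.

```python
import random
from collections import Counter, defaultdict

# Toy corpus; a real model would use an RNN, but the principle is
# the same: learn P(next word | current word) from text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count word-to-word transitions.
transitions = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current][nxt] += 1

def sample_next(word):
    """Sample the next word from the learned conditional distribution."""
    counts = transitions[word]
    words = list(counts.keys())
    weights = list(counts.values())
    return random.choices(words, weights=weights)[0]

random.seed(0)
print(sample_next("the"))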
So this is also something there but what we are planning is something different.
It is called Boltzmann machine.
This is different in the sense that it will really connect you to concepts from statistical
mechanics of spin systems.
So this is really close to physicists.
So in fact, it started in 1980s and 90s.
John Huffield and other people really started working on this kind of models.
So these are models where you have some spin systems and for those spin system you can
write some energy functional.
Like you can have energy of the spins and their connectivities and they found an analogy
between this stochastic nature of this artificial neurons and the spins.
So spins can have plus or minus or up and down.
Similarly artificial neural network also like if we have a sigmoid function you consider
that you have that activation function that gives you 0 and 1.
Presenters
Zugänglich über
Offener Zugang
Dauer
01:07:52 Min
Aufnahmedatum
2024-07-11
Hochgeladen am
2024-07-12 10:49:04
Sprache
en-US