Welcome everybody to the Tuesday session.
For most of you it's the second part for today.
And we are back to pattern analysis.
And on Tuesdays we have just 45 minutes,
so no mind map, no big picture;
we'll dig right away into the theory.
What we are currently considering is
the problem of linear discriminant analysis.
Yesterday we talked about naive Bayes approaches
and Gaussians and Gaussian classifiers,
and on the road towards linear discriminant analysis
we considered Gaussian-distributed features,
class conditionals that are Gaussians.
Yesterday we also introduced a feature transform
that seems to be quite important.
If we have a Gaussian where the level sets of the covariance
matrix form some kind of squeezed, sausage-shaped structure,
we can find a transformation that more or less
results in so-called spherical data, where the covariance
matrix has level sets that are concentric circles,
no ellipses anymore.
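To make that concrete, here is a minimal sketch of such a whitening transform, assuming the data come from a single Gaussian; the function name `whiten` and the synthetic data are my own illustration, not from the lecture:

```python
import numpy as np

def whiten(X):
    """Map anisotropic ('squeezed sausage') Gaussian data to
    spherical data whose sample covariance is the identity,
    so the level sets become concentric circles.

    X: (n_samples, n_features) array.
    """
    Xc = X - X.mean(axis=0)              # center the data
    Sigma = np.cov(Xc, rowvar=False)     # sample covariance matrix
    d, U = np.linalg.eigh(Sigma)         # Sigma = U diag(d) U^T
    W = np.diag(1.0 / np.sqrt(d)) @ U.T  # whitening matrix D^{-1/2} U^T
    return Xc @ W.T

# Quick check on strongly anisotropic synthetic data:
rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0], [0.0, 0.3]])
X = rng.standard_normal((1000, 2)) @ A.T
print(np.cov(whiten(X), rowvar=False))   # ~ identity matrix
```

After this transform, Euclidean distances in the new coordinates correspond to Mahalanobis distances in the original ones, which is what makes the step useful for the classifier discussion that follows.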
And then we also found out that under certain circumstances
the Bayes classifier is closely related to the nearest
neighbor classifier.
Can somebody tell me briefly what the core idea was?
In which case is the nearest neighbor classifier a Bayes
classifier? For which particular situation?
Is your name Steve?
No.
Where is Steve?
No Steve here?
OK, I have heard about you.
Somebody told me you're always late and you're not attending
the exercises on a regular basis.
So.
I have to keep my name in the course.
Pardon me?
I have to keep my name in the course.
Yes, that's fine.
There was no need to justify what you were doing.
So, yeah, good question.
Steve, you're my candidate.
So, nearest neighbor classifier and Bayes classifier.
Sometimes.
Yes, when?
When?
Yes?
Perhaps if the centers are the means of the Gaussians?
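For reference, a sketch of why that answer works, assuming equal priors and class conditionals that are Gaussians sharing the same spherical covariance; the symbols $\mu_k$ and $\sigma$ are my notation, not from the transcript:

```latex
% With equal priors p(y_k) and p(x \mid y_k) = \mathcal{N}(x; \mu_k, \sigma^2 I),
% the Bayes decision rule reduces to a nearest-mean rule:
\begin{align*}
\hat{y} &= \arg\max_k \; p(y_k)\, p(x \mid y_k) \\
        &= \arg\max_k \; \exp\!\Bigl(-\tfrac{1}{2\sigma^2}\,\lVert x - \mu_k \rVert^2\Bigr) \\
        &= \arg\min_k \; \lVert x - \mu_k \rVert^2 .
\end{align*}
```

In that situation the Bayes classifier is exactly a nearest neighbor classifier whose stored prototypes are the class means.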