clustering notes btech cse handwritten.pdf

gulshanparjapati71, Sep 04, 2025
Slide Content

What is Cluster Analysis?

Cluster analysis is an unsupervised learning technique. It groups the data in such a way that objects in the same group (cluster) are more similar to each other than to objects in the other groups.

Clustering methods:
- K-means clustering
- Hierarchical clustering
- Expectation-Maximization algorithm
- Density-based clustering

K-means clustering

Algorithm:
Step 1: Randomly select k cluster centres v1, v2, ..., vk.
Step 2: Calculate the distance between each data point a_i and each cluster centre v_j.
Step 3: Assign data point a_i to the cluster centre v_j for which the distance ||a_i - v_j|| is minimum.
Step 4: Recalculate each cluster centre by taking the average of that cluster's data points.
Step 5: Repeat step 2 to step 4 until the recalculated cluster centres are the same as the previous ones, i.e. no reassignment of data points happens.
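
A minimal from-scratch sketch of these steps in Python with NumPy (the function and variable names are my own, not from the notes); it uses the Euclidean distance defined in the next section:

import numpy as np

def kmeans(points, k, max_iters=100, seed=0):
    """Plain K-means following steps 1-5 above (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Step 1: randomly select k data points as the initial cluster centres.
    centres = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(max_iters):
        # Step 2: distance between every data point a_i and every centre v_j.
        dists = np.linalg.norm(points[:, None, :] - centres[None, :, :], axis=2)
        # Step 3: assign each point to the centre with the minimum distance.
        labels = dists.argmin(axis=1)
        # Step 4: recalculate each centre as the mean of its assigned points.
        new_centres = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centres[j]
            for j in range(k)
        ])
        # Step 5: stop when the centres no longer change (no reassignment).
        if np.allclose(new_centres, centres):
            break
        centres = new_centres
    return labels, centres

# Example usage on a small 2-D dataset.
data = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.0], [5.1, 4.8], [9.0, 1.0]])
labels, centres = kmeans(data, k=2)
print(labels, centres)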

Distance between data points

We assume that each data point a_i is a d-dimensional vector. The distance between two data points a_i and a_j is taken to be the Euclidean distance

    d(a_i, a_j) = ||a_i - a_j|| = sqrt( (a_i1 - a_j1)^2 + ... + (a_id - a_jd)^2 ),

and each data point is assigned to the cluster whose centre is nearest.
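
The same distance written with NumPy, as a quick sketch (both forms below are equivalent):

import numpy as np

a_i = np.array([1.0, 2.0, 3.0])   # a d-dimensional data point
a_j = np.array([4.0, 6.0, 3.0])

# Explicit form: square root of the sum of squared coordinate differences.
d_explicit = np.sqrt(np.sum((a_i - a_j) ** 2))
# Equivalent built-in form.
d_norm = np.linalg.norm(a_i - a_j)
print(d_explicit, d_norm)   # both print 5.0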

Hierarchical clustering

Each observation starts in its own cluster (agglomerative approach), or there is a single cluster containing all observations (divisive approach). The decision regarding whether two clusters are to be merged or not is taken based on a measure of dissimilarity between the clusters.

Measures of dissimilarity between clusters (linkage criteria):

Single linkage:   d(C1, C2) = min{ d(p, q) : p in C1, q in C2 }
Complete linkage: d(C1, C2) = max{ d(p, q) : p in C1, q in C2 }
Average linkage:  d(C1, C2) = (1 / (|C1| |C2|)) * sum of d(p, q) over p in C1, q in C2
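
A small NumPy/SciPy sketch of these three linkage measures between two clusters (the helper functions are illustrative, not part of the notes):

import numpy as np
from scipy.spatial.distance import cdist

def single_linkage(C1, C2):
    # Minimum pairwise distance between the two clusters.
    return cdist(C1, C2).min()

def complete_linkage(C1, C2):
    # Maximum pairwise distance between the two clusters.
    return cdist(C1, C2).max()

def average_linkage(C1, C2):
    # Mean of all pairwise distances between the two clusters.
    return cdist(C1, C2).mean()

C1 = np.array([[0.0, 0.0], [0.0, 1.0]])
C2 = np.array([[3.0, 0.0], [4.0, 0.0]])
print(single_linkage(C1, C2), complete_linkage(C1, C2), average_linkage(C1, C2))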

Illustration of Agglomerative Clustering

Given a distance matrix, construct a hierarchical clustering (dendrogram) by the agglomerative method.

Step 1: Assign each item to its own cluster, so that we have N = 5 clusters, each containing just one item.
        Data set = {a, b, c, d, e}
        Initial cluster set C1: {a}, {b}, {c}, {d}, {e}
Step 2: Find the closest (most similar) pair of clusters and merge them into a single cluster.
Step 3: Compute the dissimilarities between the new cluster and each of the old clusters.
Step 4: Repeat step 2 and step 3 until all items are clustered into a single cluster of size N.
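
A sketch of this procedure using SciPy's hierarchical-clustering routines on a toy five-item data set (the coordinates are made up for illustration):

import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

# Five items a..e with illustrative 2-D coordinates (not from the notes).
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9], [3.0, 3.0]])
labels = ['a', 'b', 'c', 'd', 'e']

# Each merge corresponds to steps 2-3 above; 'single' selects single linkage.
Z = linkage(X, method='single')
dendrogram(Z, labels=labels)
plt.title('Agglomerative clustering of {a, b, c, d, e}')
plt.show()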

Divisive Analysis (DIANA)

Step 1: Start with all N objects in a single cluster C1.
Step 2: Move the object with the maximum average dissimilarity to the other objects of C1 into a new cluster C2.
Step 3: For each remaining object x in C1, compute
            D_x = average{ d(x, g) : g in C1, g != x } - average{ d(x, y) : y in C2 }.
Step 4: Find the object x in C1 for which D_x is largest. If D_x > 0, then move x to C2.
Step 5: Repeat steps 3 and 4 until all differences D_x are negative. Then the original cluster is split into C1 and C2.
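
DIANA is not part of SciPy or scikit-learn, so the following is a rough NumPy sketch of a single split following the steps above (the function name and details are my own, assuming Euclidean dissimilarities):

import numpy as np
from scipy.spatial.distance import pdist, squareform

def diana_split(points):
    # Split one cluster into two using the splinter-group procedure above.
    d = squareform(pdist(points))          # pairwise dissimilarity matrix
    c1 = list(range(len(points)))          # all objects start in C1
    # Step 2: move the object with the largest average dissimilarity into C2.
    avg = d[np.ix_(c1, c1)].sum(axis=1) / (len(c1) - 1)
    c2 = [c1.pop(int(np.argmax(avg)))]
    while len(c1) > 1:
        # Step 3: D_x = average dissimilarity within C1 minus average to C2.
        D = np.array([
            d[x, [g for g in c1 if g != x]].mean() - d[x, c2].mean()
            for x in c1
        ])
        # Steps 4-5: move the object with the largest positive D_x, else stop.
        best = int(np.argmax(D))
        if D[best] <= 0:
            break
        c2.append(c1.pop(best))
    return c1, c2

pts = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.0], [5.0, 5.0], [5.1, 4.9]])
print(diana_split(pts))   # indices of the two resulting clusters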

Expectation-Maximization Algorithm

The expectation-maximization (EM) algorithm is used to find the parameters of a statistical model when the model depends on unobserved (latent or missing) variables, so the likelihood equations cannot be solved directly using the MLE method.

EM algorithm:
Step 1: Choose initial values of the parameters θ to be estimated.
Step 2: Expectation step (E-step) - Using the observed available data of the dataset, estimate (guess) the values of the missing data.
Step 3: Maximization step (M-step) - The complete data generated after the expectation step is used to update the parameters θ by maximizing the likelihood function.
Step 4: Repeat step 2 and step 3 until convergence.
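
As a deliberately tiny toy example of these four steps (my own illustration, not from the notes), the sketch below uses EM to estimate the mean of a normal sample in which some values are missing:

import numpy as np

# Toy data: a normal sample in which some values are missing (np.nan).
x = np.array([2.1, 1.9, np.nan, 2.5, np.nan, 1.7, 2.2])
observed = x[~np.isnan(x)]
n_missing = int(np.isnan(x).sum())

mu = 0.0                      # Step 1: initial guess for the parameter theta = mu
for _ in range(50):
    # Step 2 (E-step): guess the missing values using the current parameter.
    filled = np.concatenate([observed, np.full(n_missing, mu)])
    # Step 3 (M-step): update mu by maximizing the complete-data likelihood;
    # for a normal mean this is just the average of the completed data.
    new_mu = filled.mean()
    # Step 4: repeat until convergence.
    if abs(new_mu - mu) < 1e-9:
        break
    mu = new_mu
print(mu)   # converges to the mean of the observed values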

EM algorithm for the Gaussian Mixture problem

Here the data points are unlabeled: the cluster membership of each data point is the missing (latent) variable, and the EM algorithm is used to estimate the parameters of the mixture.
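
In practice, scikit-learn's GaussianMixture fits exactly this kind of model with the EM algorithm; a brief usage sketch on made-up data:

import numpy as np
from sklearn.mixture import GaussianMixture

# Unlabeled 1-D data drawn from two normal distributions (illustrative).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(5.0, 0.5, 200)])
X = x.reshape(-1, 1)

# GaussianMixture is fitted with the EM algorithm described above.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(gmm.weights_)              # mixing coefficients pi_j
print(gmm.means_.ravel())        # means mu_j
print(gmm.covariances_.ravel())  # variances sigma_j^2
labels = gmm.predict(X)          # hard cluster assignment for each point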

Suppose we are given n observations x1, x2, ..., xn of a random variable X. Let X be a mixture of k normal distributions, and let the probability density be

    p(x) = sum over j = 1..k of  π_j * N(x | μ_j, σ_j²).

The likelihood function L = product over i = 1..n of p(x_i) is maximized with respect to the means μ_j, the variances σ_j², and the mixing coefficients π_j.

Step 1: Initialise the means μ_j, the variances σ_j², and the mixing coefficients π_j.
Step 2: E-step - For each data point, estimate the responsibilities, i.e. the posterior probabilities of it belonging to each component.
Step 3: M-step - Recalculate the parameters using the current responsibilities.
Step 4: Evaluate the log-likelihood function and check for convergence of either the parameters or the log-likelihood function. If converged, then stop; else go to step 2.
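
A from-scratch NumPy sketch of steps 1-4 for a one-dimensional mixture of k normals (a simplified teaching version with made-up names, not a robust implementation):

import numpy as np

def em_gmm_1d(x, k, n_iter=200, tol=1e-6, seed=0):
    # EM for a 1-D Gaussian mixture, following steps 1-4 above.
    rng = np.random.default_rng(seed)
    n = len(x)
    # Step 1: initialise the means, variances and mixing coefficients.
    mu = rng.choice(x, size=k, replace=False)
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # Step 2 (E-step): responsibilities r_ij = P(component j | x_i).
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # Step 3 (M-step): recalculate the parameters from the responsibilities.
        nk = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        pi = nk / n
        # Step 4: evaluate the log-likelihood and check for convergence.
        ll = np.log(dens.sum(axis=1)).sum()
        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return mu, var, pi

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(6.0, 0.8, 300)])
print(em_gmm_1d(x, k=2))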