Contoh Studi Kasus Decision Tree (Decision Tree Case Study Example)
Author:
Amalia Utami
Quiz Solution, Monday, 10 May 2010
Sistem Pendukung Keputusan (Decision Support Systems)

Based on the following data, construct the decision tree.
Solution: Root Node

[1] Compute the initial entropy.

Total instances = 7; Yes instances = 4; No instances = 3.

Entropy(S) = -P_Yes log2 P_Yes - P_No log2 P_No
           = -(4/7) log2(4/7) - (3/7) log2(3/7)
           = -0.57 log2 0.57 - 0.43 log2 0.43
           = -0.57(-0.811) - 0.43(-1.218)
           = 0.462 + 0.524
           = 0.986
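The entropy formula above can be checked with a short Python sketch; the function name is mine, not part of the quiz. Note that the exact value is 0.985: the quiz's 0.986 comes from rounding the probabilities to 0.57 and 0.43 before taking logarithms.

```python
from math import log2

def entropy(yes: int, no: int) -> float:
    """Binary entropy of a partition given its Yes/No class counts."""
    total = yes + no
    if total == 0 or yes == 0 or no == 0:
        return 0.0  # an empty or pure partition has zero entropy
    p_yes, p_no = yes / total, no / total
    return -p_yes * log2(p_yes) - p_no * log2(p_no)

# Initial entropy of the 7 instances (4 Yes, 3 No)
print(round(entropy(4, 3), 3))  # 0.985
```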
[2] Compute the entropy and information gain of each attribute to determine the root node.

Attribute Age:
  <=30   : Yes = 0, No = 2
  31..40 : Yes = 2, No = 0
  >40    : Yes = 2, No = 1
Entropy(Age <= 30) = -P_Yes log2 P_Yes - P_No log2 P_No
                   = -(0/2) log2(0/2) - (2/2) log2(2/2)
                   = -0 log2 0 - 1 log2 1
                   = 0

Entropy(Age = 31..40) = -P_Yes log2 P_Yes - P_No log2 P_No
                      = -(2/2) log2(2/2) - (0/2) log2(0/2)
                      = -1 log2 1 - 0 log2 0
                      = 0
(IF-UTAMA, Ver/Rev: 0/0)
Entropy(Age > 40) = -P_Yes log2 P_Yes - P_No log2 P_No
                  = -(2/3) log2(2/3) - (1/3) log2(1/3)
                  = -0.67 log2 0.67 - 0.33 log2 0.33
                  = -0.67(-0.578) - 0.33(-1.6)
                  = 0.387 + 0.528
                  = 0.915

InformationGain(Age) = Entropy(S) - Σ_{v ∈ {<=30, 31..40, >40}} (|S_v|/|S|) Entropy(S_v)
                     = 0.986 - (2/7 x 0) - (2/7 x 0) - (3/7 x 0.915)
                     = 0.986 - 0.392
                     = 0.594
Attribute Income:
  High   : Yes = 1, No = 2
  Medium : Yes = 1, No = 0
  Low    : Yes = 2, No = 1
Entropy(Income = High) = -P_Yes log2 P_Yes - P_No log2 P_No
                       = -(1/3) log2(1/3) - (2/3) log2(2/3)
                       = -0.33 log2 0.33 - 0.67 log2 0.67
                       = -0.33(-1.6) - 0.67(-0.578)
                       = 0.528 + 0.387
                       = 0.915

Entropy(Income = Medium) = -P_Yes log2 P_Yes - P_No log2 P_No
                         = -(1/1) log2(1/1) - (0/1) log2(0/1)
                         = -1 log2 1 - 0 log2 0
                         = 0

Entropy(Income = Low) = -P_Yes log2 P_Yes - P_No log2 P_No
                      = -(2/3) log2(2/3) - (1/3) log2(1/3)
                      = -0.67 log2 0.67 - 0.33 log2 0.33
                      = -0.67(-0.578) - 0.33(-1.6)
                      = 0.387 + 0.528
                      = 0.915
InformationGain(Income) = Entropy(S) - Σ_{v ∈ {High, Medium, Low}} (|S_v|/|S|) Entropy(S_v)
                        = 0.986 - (3/7 x 0.915) - (1/7 x 0) - (3/7 x 0.915)
                        = 0.986 - 0.392 - 0 - 0.392
                        = 0.202

Attribute Student:
  Yes : Yes = 2, No = 1
  No  : Yes = 2, No = 2
Entropy(Student = No) = -P_Yes log2 P_Yes - P_No log2 P_No
                      = -(2/4) log2(2/4) - (2/4) log2(2/4)
                      = -0.5 log2 0.5 - 0.5 log2 0.5
                      = -0.5(-1) - 0.5(-1)
                      = 0.5 + 0.5
                      = 1

Entropy(Student = Yes) = -P_Yes log2 P_Yes - P_No log2 P_No
                       = -(2/3) log2(2/3) - (1/3) log2(1/3)
                       = -0.67 log2 0.67 - 0.33 log2 0.33
                       = -0.67(-0.578) - 0.33(-1.6)
                       = 0.387 + 0.528
                       = 0.915

InformationGain(Student) = Entropy(S) - Σ_{v ∈ {Yes, No}} (|S_v|/|S|) Entropy(S_v)
                         = 0.986 - (3/7 x 0.915) - (4/7 x 1)
                         = 0.986 - 0.392 - 0.571
                         = 0.023

Attribute Leasing_rating:
  Fair      : Yes = 1, No = 3
  Excellent : Yes = 2, No = 1
Entropy(Leasing_rating = Fair) = -P_Yes log2 P_Yes - P_No log2 P_No
                               = -(1/4) log2(1/4) - (3/4) log2(3/4)
                               = -0.25 log2 0.25 - 0.75 log2 0.75
                               = -0.25(-2) - 0.75(-0.415)
                               = 0.5 + 0.311
                               = 0.811

Entropy(Leasing_rating = Excellent) = -P_Yes log2 P_Yes - P_No log2 P_No
                                    = -(2/3) log2(2/3) - (1/3) log2(1/3)
                                    = -0.67 log2 0.67 - 0.33 log2 0.33
                                    = -0.67(-0.578) - 0.33(-1.6)
                                    = 0.387 + 0.528
                                    = 0.915

InformationGain(Leasing_rating) = Entropy(S) - Σ_{v ∈ {Fair, Excellent}} (|S_v|/|S|) Entropy(S_v)
                                = 0.986 - (4/7 x 0.811) - (3/7 x 0.915)
                                = 0.986 - 0.463 - 0.392
                                = 0.131

Information gain summary:
  Age            : 0.594
  Income         : 0.202
  Student        : 0.023
  Leasing_rating : 0.131

Since the attribute Age has the highest information gain, it becomes the root node, and the partial decision tree is:

  Age?
  +-- <=30   -> No
  +-- 31..40 -> Yes
  +-- >40    -> (to be determined)
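The four root-level gains can be reproduced from the count tables above; this is a sketch with helper names of my own choosing. The exact values (0.592, 0.198, 0.020, 0.128) differ in the third decimal from the hand calculation, which rounds probabilities to two places, but the ranking is identical and Age still wins.

```python
from math import log2

def entropy(yes: int, no: int) -> float:
    """Binary entropy of a partition given its Yes/No class counts."""
    total = yes + no
    if total == 0 or yes == 0 or no == 0:
        return 0.0
    p = yes / total
    return -p * log2(p) - (1 - p) * log2(1 - p)

def info_gain(parent, partitions):
    """parent and each partition are (yes, no) count pairs."""
    n = sum(parent)
    gain = entropy(*parent)
    for yes, no in partitions:
        gain -= (yes + no) / n * entropy(yes, no)
    return gain

PARENT = (4, 3)  # 4 Yes, 3 No
COUNTS = {
    "Age":            [(0, 2), (2, 0), (2, 1)],  # <=30, 31..40, >40
    "Income":         [(1, 2), (1, 0), (2, 1)],  # High, Medium, Low
    "Student":        [(2, 1), (2, 2)],          # Yes, No
    "Leasing_rating": [(1, 3), (2, 1)],          # Fair, Excellent
}
for attr, parts in COUNTS.items():
    print(f"{attr}: {info_gain(PARENT, parts):.3f}")
```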
[3] Compute the entropy and information gain of each attribute to determine the child node for the >40 branch.

Instances with Age > 40 = 3; Yes instances = 2; No instances = 1.

Attribute Income:
  High   : Yes = 0, No = 0
  Medium : Yes = 1, No = 0
  Low    : Yes = 1, No = 1
Entropy(Income = High) = 0 (the High partition is empty, so it contributes no entropy)

Entropy(Income = Medium) = -P_Yes log2 P_Yes - P_No log2 P_No
                         = -(1/1) log2(1/1) - (0/1) log2(0/1)
                         = -1 log2 1 - 0 log2 0
                         = 0

Entropy(Income = Low) = -P_Yes log2 P_Yes - P_No log2 P_No
                      = -(1/2) log2(1/2) - (1/2) log2(1/2)
                      = -0.5 log2 0.5 - 0.5 log2 0.5
                      = -0.5(-1) - 0.5(-1)
                      = 0.5 + 0.5
                      = 1

InformationGain(Income) = Entropy(S) - Σ_{v ∈ {High, Medium, Low}} (|S_v|/|S_Age>40|) Entropy(S_v)
                        = 0.986 - (0/3 x 0) - (1/3 x 0) - (2/3 x 1)
                        = 0.986 - 0 - 0 - 0.667
                        = 0.319

Attribute Student:
  Yes : Yes = 1, No = 0
  No  : Yes = 1, No = 1
Entropy(Student = No) = -P_Yes log2 P_Yes - P_No log2 P_No
                      = -(1/2) log2(1/2) - (1/2) log2(1/2)
                      = -0.5 log2 0.5 - 0.5 log2 0.5
                      = -0.5(-1) - 0.5(-1)
                      = 0.5 + 0.5
                      = 1

Entropy(Student = Yes) = -P_Yes log2 P_Yes - P_No log2 P_No
                       = -(1/1) log2(1/1) - (0/1) log2(0/1)
                       = -1 log2 1 - 0 log2 0
                       = 0

InformationGain(Student) = Entropy(S) - Σ_{v ∈ {Yes, No}} (|S_v|/|S_Age>40|) Entropy(S_v)
                         = 0.986 - (1/3 x 0) - (2/3 x 1)
                         = 0.986 - 0 - 0.667
                         = 0.319

Attribute Leasing_rating:
  Fair      : Yes = 2, No = 0
  Excellent : Yes = 0, No = 1
Entropy(Leasing_rating = Fair) = -P_Yes log2 P_Yes - P_No log2 P_No
                               = -(2/2) log2(2/2) - (0/2) log2(0/2)
                               = -1 log2 1 - 0 log2 0
                               = 0

Entropy(Leasing_rating = Excellent) = -P_Yes log2 P_Yes - P_No log2 P_No
                                    = -(0/1) log2(0/1) - (1/1) log2(1/1)
                                    = -0 log2 0 - 1 log2 1
                                    = 0

InformationGain(Leasing_rating) = Entropy(S) - Σ_{v ∈ {Fair, Excellent}} (|S_v|/|S_Age>40|) Entropy(S_v)
                                = 0.986 - (2/3 x 0) - (1/3 x 0)
                                = 0.986

Information gain summary for the >40 branch:
  Income         : 0.319
  Student        : 0.319
  Leasing_rating : 0.986

Since the attribute Leasing_rating has the highest information gain, it becomes the child node for the >40 branch.
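The branch-level gains can be checked the same way. One caveat: the quiz plugs the root entropy 0.986 into the branch-level gain formula, while textbook ID3 uses the entropy of the Age > 40 subset itself (about 0.918). This sketch, with my own helper names, does the latter; the absolute numbers differ, but the ranking is unchanged and Leasing_rating still wins.

```python
from math import log2

def entropy(yes: int, no: int) -> float:
    """Binary entropy of a partition given its Yes/No class counts."""
    total = yes + no
    if total == 0 or yes == 0 or no == 0:
        return 0.0
    p = yes / total
    return -p * log2(p) - (1 - p) * log2(1 - p)

def info_gain(parent, partitions):
    """Gain relative to the parent subset's own entropy (textbook ID3)."""
    n = sum(parent)
    gain = entropy(*parent)  # entropy of the Age > 40 subset, ~0.918
    for yes, no in partitions:
        gain -= (yes + no) / n * entropy(yes, no)
    return gain

SUBSET = (2, 1)  # Age > 40: 2 Yes, 1 No
COUNTS = {
    "Income":         [(0, 0), (1, 0), (1, 1)],  # High, Medium, Low
    "Student":        [(1, 0), (1, 1)],          # Yes, No
    "Leasing_rating": [(2, 0), (0, 1)],          # Fair, Excellent
}
for attr, parts in COUNTS.items():
    print(f"{attr}: {info_gain(SUBSET, parts):.3f}")
```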
[4] The resulting decision tree is:

  Age?
  +-- <=30   -> No
  +-- 31..40 -> Yes
  +-- >40    -> Leasing_rating?
                +-- Fair      -> Yes
                +-- Excellent -> No
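The finished tree can be written down as a nested structure with a small classifier; the dict layout and function name here are my own illustration, not part of the quiz.

```python
# Final tree from the derivation: Age at the root, and the >40
# branch resolved by Leasing_rating. The <=30 and 31..40 branches
# are pure in the training data, so they are leaves directly.
tree = {
    "attribute": "Age",
    "branches": {
        "<=30": "No",
        "31..40": "Yes",
        ">40": {
            "attribute": "Leasing_rating",
            "branches": {"Fair": "Yes", "Excellent": "No"},
        },
    },
}

def classify(node, instance):
    """Walk the tree until a leaf label (a string) is reached."""
    while isinstance(node, dict):
        node = node["branches"][instance[node["attribute"]]]
    return node

print(classify(tree, {"Age": ">40", "Leasing_rating": "Fair"}))  # Yes
```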