Introduction to Program Development with WEKA by...
TRANSCRIPT
8/7/2019 WEKA by
Waikato Environment for Knowledge Analysis
Presentation by: Mr. Wassanun Sangjun [Nick]
CS@KMUTNB
WEKA
Topics in today's presentation

1) Explorer Application
   1.1) How to use data?
   1.2) Preprocess Tab
   1.3) Classify Tab
2) Classify Tab
   2.1) Test Options
   2.2) Classifier Output
3) Multilayer Perceptron [MLP]
   3.1) Principle of MLP
Explorer Application

How do we load data into WEKA? From the main window, open the Explorer application, then click Open file... to choose a dataset.
Explorer Application

WEKA accepts data files in .CSV and .ARFF format. A .CSV file can be prepared in MS Excel.
Explorer Application

An .ARFF file has three sections:
1) @relation name: the name of the dataset.
2) @attribute column-name datatype: one line per column; the datatype can be Numeric, Nominal, String, or DateTime.
3) @data: the data rows, with values listed in the same order as the attributes.
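Putting the three sections together, a minimal .ARFF file might look like this (the relation and attribute names here are invented for illustration; note that a Nominal attribute lists its allowed values in braces):

```
@relation weather

@attribute temperature numeric
@attribute outlook {sunny, overcast, rainy}
@attribute play {yes, no}

@data
30.5,sunny,yes
18.2,rainy,no
```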
[Screenshot: Explorer window with numbered callouts 1-5]
Explorer Application

[Screenshot: Preprocess Tab with numbered callouts 1-5]

The Preprocess Tab lists the dataset's attributes; selecting an attribute shows its summary statistics (Maximum, Minimum, Mean, SD). The Classify Tab, covered next, is where classifiers are built and evaluated.
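The per-attribute statistics shown in the Preprocess Tab can be reproduced by hand. A minimal Python sketch (the sample values are invented), assuming the SD shown is the sample standard deviation (divided by n - 1):

```python
import math

def attribute_stats(values):
    """Return (maximum, minimum, mean, sd) for a list of numeric values.

    sd here is the sample standard deviation (divide by n - 1).
    """
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / (n - 1)
    return max(values), min(values), mean, math.sqrt(variance)

# Example with invented temperature readings
print(attribute_stats([30.5, 18.2, 25.0, 21.3]))
```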
Classify Tab

[Screenshot: Classify Tab with numbered callouts 1-3]
Classify Tab

[Screenshot: Classify Tab with numbered callouts 1-4]

Choose a classifier (here, Multilayer Perceptron) and run it. The Classifier Output panel reports evaluation measures such as the Correlation Coefficient and the Mean Absolute Error; which measures appear depends on whether the class attribute is numeric or nominal.
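As an illustration of the two measures named above (this is not WEKA's own code, and the example values are invented), they can be computed as:

```python
import math

def mean_absolute_error(actual, predicted):
    """MAE: average absolute difference between actual and predicted values."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def correlation_coefficient(actual, predicted):
    """Pearson correlation between actual and predicted values."""
    n = len(actual)
    ma = sum(actual) / n
    mp = sum(predicted) / n
    cov = sum((a - ma) * (p - mp) for a, p in zip(actual, predicted))
    sa = math.sqrt(sum((a - ma) ** 2 for a in actual))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    return cov / (sa * sp)

actual = [1.0, 2.0, 3.0, 4.0]
predicted = [1.1, 1.9, 3.2, 3.8]
print(mean_absolute_error(actual, predicted))
print(correlation_coefficient(actual, predicted))
```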
Classify Tab

The Test Options panel offers 4 choices:

1) Use training set: use all of the data to both train and test.
2) Supplied test set: provide your own separate datasets for training and testing.
3) Cross-validation (folds): split the data into the given number of folds, leave one fold out for testing and train on the rest, and repeat until every fold has been tested. Example with 3 folds (3 cycles):
   3.1) fold 1 for test, folds 2 and 3 for train.
   3.2) fold 2 for test, folds 1 and 3 for train.
   3.3) fold 3 for test, folds 1 and 2 for train.
4) Percentage split: sample the data by percentage. Example: 25% means 25% for training and 75% for testing.
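The 3-fold rotation in the example above can be sketched in Python (the data values are invented; WEKA additionally randomizes, and for nominal classes stratifies, the data first, which this sketch omits):

```python
def cross_validation_splits(data, folds):
    """Yield (test_fold, train_rest) pairs, one cycle per fold."""
    groups = [data[i::folds] for i in range(folds)]  # round-robin split
    for i in range(folds):
        test = groups[i]
        train = [x for j, g in enumerate(groups) if j != i for x in g]
        yield test, train

data = list(range(9))
for test, train in cross_validation_splits(data, 3):
    print("test:", test, "train:", train)
```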
Multilayer Perceptron [MLP]

The Multilayer Perceptron is a type of artificial neural network.
Multilayer Perceptron [MLP]

An MLP has 4 components:

1) Input Layer: one node per input attribute; each input is normalized to the range -1 to 1.
2) Hidden Layer: one or more layers of neurons between input and output; each neuron combines its inputs using weights.
3) Output Layer: the neurons that produce the prediction.
4) Neurons: the processing units that make up the layers.
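A minimal sketch of how the layers above fit together (all weights, biases, and input values here are invented, and sigmoid is just one common choice of activation): each neuron takes a weighted sum of its inputs plus a bias and applies an activation function, and the layers are chained input -> hidden -> output.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through the sigmoid."""
    return sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)

def mlp_forward(inputs, hidden_layer, output_layer):
    """hidden_layer / output_layer: lists of (weights, bias), one per neuron."""
    hidden = [neuron(inputs, w, b) for w, b in hidden_layer]
    return [neuron(hidden, w, b) for w, b in output_layer]

# 2 inputs -> 2 hidden neurons -> 1 output neuron (weights invented)
hidden_layer = [([0.5, -0.4], 0.1), ([0.3, 0.8], -0.2)]
output_layer = [([1.0, -1.0], 0.0)]
print(mlp_forward([0.25, -0.75], hidden_layer, output_layer))
```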