-Artificial Neural Network- Chapter 4
Prof. 李麗華, Department of Information Management, Chaoyang University of Technology
Posted on 21-Dec-2015
Outline
• ADALINE
• MADALINE
• Least-Square Learning Rule
• The Proof of the Least-Square Learning Rule
ADALINE
• ADALINE (Adaptive Linear Neuron) is a network model proposed by Bernard Widrow in 1959. It consists of a single processing element.

[Figure: inputs X1, X2, X3, … feeding a single processing element (PE)]
Method
• Method: the output value of each unit must be +1 or −1.

  net = Σ_i W_i X_i = W_0 X_0 + W_1 X_1 + W_2 X_2 + … + W_n X_n

  Y = +1 if net ≥ 0
  Y = −1 if net < 0

This is different from the perceptron's transfer function.
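The net computation and hard-limited transfer above can be sketched in a few lines of Python (NumPy assumed; the weights and input below are made-up values, with X0 = 1 acting as the bias input):

```python
import numpy as np

def adaline_output(w, x):
    """Compute net = sum_i W_i * X_i and apply the hard-limit transfer:
    Y = +1 if net >= 0, otherwise -1."""
    net = np.dot(w, x)
    return 1 if net >= 0 else -1

# Hypothetical weights and input; x[0] = 1 is the bias term X0.
w = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 1.0, 1.0])
print(adaline_output(w, x))  # net = 0.5 - 1.0 + 2.0 = 1.5, so Y = 1
```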
Method (cont.)
• Weight adjustment:

  ΔW_i = η (T − Y) X_i , where T is the expected output
  W_i ← W_i + ΔW_i

• ADALINE can solve only linearly separable problems (this is its limitation).
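A minimal training loop for this update rule, assuming a learning rate η (the slide does not show its value) and the hard-limited output Y from the previous slide; the training patterns are the three vectors used in this chapter's exercise:

```python
import numpy as np

def train_adaline(X, T, eta=0.1, epochs=50):
    """Delta-rule training: Delta W_i = eta * (T - Y) * X_i, then
    W_i <- W_i + Delta W_i.  Rows of X are input vectors (first
    column is the bias X0 = 1); T holds the expected outputs (+1/-1)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, T):
            y = 1 if np.dot(w, x) >= 0 else -1   # hard-limit transfer
            w += eta * (t - y) * x               # weight adjustment
    return w

# The three training patterns from this chapter's exercise.
X = np.array([[1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([1, 1, -1])
w = train_adaline(X, T)
preds = [1 if np.dot(w, x) >= 0 else -1 for x in X]
print(w, preds)
```

Because the data is linearly separable, the loop settles on weights that classify all three patterns correctly.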
MADALINE
• MADALINE (Multilayer ADALINE): a network composed of many ADALINEs.

[Figure: inputs X_i, …, X_n with weights W_ij feeding units that compute net_j and produce outputs Y_j]

• After the second layer, a majority vote is used:
  if more than half of the net_j ≥ 0, then output +1; otherwise, output −1.
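A sketch of the majority vote, assuming one weight row per ADALINE unit (the weights and input below are made-up values):

```python
import numpy as np

def madaline_output(W, x):
    """W holds one weight row per ADALINE unit; each unit computes
    net_j = W_j . x.  Majority vote: output +1 if more than half of
    the net_j are >= 0, otherwise -1."""
    nets = W @ x
    votes = np.sum(nets >= 0)
    return 1 if votes > len(nets) / 2 else -1

# Three hypothetical ADALINE units on a 3-component input (bias included).
W = np.array([[ 1.0, -1.0,  0.5],
              [-2.0,  1.0,  1.0],
              [ 0.5,  0.5, -1.0]])
x = np.array([1.0, 1.0, 1.0])
print(madaline_output(W, x))  # all three net_j >= 0, so the vote gives +1
```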
Least-Square Learning Rule (1/2)
• Least-Square Learning Rule:

  Net_j = W^t X_j = Σ_{i=0}^{n} W_i X_ij = W_0 X_0j + W_1 X_1j + … + W_n X_nj ,  1 ≤ j ≤ p

  where
  W = (W_0, W_1, …, W_n)^t , i.e., the weight vector,
  X_j = (X_0j, X_1j, …, X_nj)^t , i.e., the j-th input vector.
Least-Square Learning Rule (2/2)
• By applying the least-square learning rule, the weights are:

  W* = R⁻¹ P , or R W* = P

  where R is the correlation matrix:

  R = (1/p) Σ_{j=1}^{p} X_j X_j^t = (R'_1 + R'_2 + … + R'_p) / p , with R'_j = X_j X_j^t

  P = (1/p) Σ_{j=1}^{p} T_j X_j
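A sketch of this closed-form rule using NumPy; the data reuses the three training vectors from the exercise that follows, so the result can be compared with the slides' answer:

```python
import numpy as np

def least_square_weights(X, T):
    """W* = R^{-1} P, with R = (1/p) sum_j X_j X_j^t (correlation
    matrix) and P = (1/p) sum_j T_j X_j.  Rows of X are the input
    vectors X_j^t; T holds the expected outputs T_j."""
    p = len(T)
    R = sum(np.outer(xj, xj) for xj in X) / p
    P = sum(tj * xj for tj, xj in zip(T, X)) / p
    return np.linalg.solve(R, P)   # solves R W = P without forming R^{-1}

# Training data from the exercise: X_j = (1,1,0), (1,0,1), (1,1,1).
X = np.array([[1, 1, 0], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([1, 1, -1], dtype=float)
print(least_square_weights(X, T))   # W* ≈ (3, -2, -2), as in the exercise
```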
Exercise: Use ADALINE (1/4)
• Example: X_1 = (1, 1, 0)^t , X_2 = (1, 0, 1)^t , X_3 = (1, 1, 1)^t ;  T_1 = 1, T_2 = 1, T_3 = −1.

        X1   X2   X3   Tj
  X_1    1    1    0    1
  X_2    1    0    1    1
  X_3    1    1    1   −1
• Sol. First calculate R:

  R = (1/3)(X_1 X_1^t + X_2 X_2^t + X_3 X_3^t)

  X_1 X_1^t = | 1 1 0 |    X_2 X_2^t = | 1 0 1 |    X_3 X_3^t = | 1 1 1 |
              | 1 1 0 |                | 0 0 0 |                | 1 1 1 |
              | 0 0 0 |                | 1 0 1 |                | 1 1 1 |

  R = (1/3) | 3 2 2 |
            | 2 2 1 |
            | 2 1 2 |

• Next calculate P:

  P = (1/3)(T_1 X_1 + T_2 X_2 + T_3 X_3)
    = (1/3)[(1, 1, 0)^t + (1, 0, 1)^t − (1, 1, 1)^t] = (1/3)(1, 0, 0)^t

• Solve R W* = P:

  (1/3) | 3 2 2 | | W_1 |         | 1 |
        | 2 2 1 | | W_2 | = (1/3) | 0 |
        | 2 1 2 | | W_3 |         | 0 |

  ⇒ W_1 = 3 , W_2 = −2 , W_3 = −2
• Verify the net: net = 3X_1 − 2X_2 − 2X_3
  Substituting (1, 1, 0): net = 3 − 2 = 1, so Y = 1 ok
  Substituting (1, 0, 1): net = 3 − 2 = 1, so Y = 1 ok
  Substituting (1, 1, 1): net = 3 − 2 − 2 = −1, so Y = −1 ok

[Figure: the trained ADALINE with weights 3, −2, −2 on inputs X1, X2, X3, producing output Y]

Homework: find a method for computing the inverse matrix directly.
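The substitution check above can be reproduced mechanically (a short sketch; NumPy is used only for the dot product):

```python
import numpy as np

# Weights from the worked solution: net = 3*X1 - 2*X2 - 2*X3.
w = np.array([3.0, -2.0, -2.0])
samples = [([1, 1, 0], 1), ([1, 0, 1], 1), ([1, 1, 1], -1)]
for x, t in samples:
    net = np.dot(w, x)
    y = 1 if net >= 0 else -1
    print(x, net, y, y == t)   # every pattern reproduces its target T
```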
Proof of Least Square Learning Rule (1/3)
• We use the least mean square error to ensure the minimum total error. As the total error approaches zero, the best solution is found. Therefore, we look for the minimum of ⟨ε_k²⟩, the mean of the squared errors ε_k = T_k − Y_k over the L training samples.
• Proof:

  ⟨ε_k²⟩ = (1/L) Σ_{k=1}^{L} ε_k² = (1/L) Σ_{k=1}^{L} (T_k − Y_k)²
         = (1/L) Σ_{k=1}^{L} (T_k² − 2 T_k Y_k + Y_k²)
         = (1/L) Σ_{k=1}^{L} T_k² − (2/L) Σ_{k=1}^{L} T_k Y_k + (1/L) Σ_{k=1}^{L} Y_k²
         = ⟨T_k²⟩ − (2/L) Σ_{k=1}^{L} T_k Y_k + (1/L) Σ_{k=1}^{L} W^t X_k X_k^t W
Proof of Least Square Learning Rule (2/3)
(cont'd from the previous page)

  p.s. Y_k = Σ_{i=0}^{n} w_i x_ik = X_k^t W = W^t X_k ,
  so Y_k² = (W^t X_k)(X_k^t W) = W^t X_k X_k^t W .

  Therefore:

  ⟨ε_k²⟩ = ⟨T_k²⟩ − 2 [(1/L) Σ_{k=1}^{L} T_k X_k^t] W + W^t [(1/L) Σ_{k=1}^{L} X_k X_k^t] W
Proof of Least Square Learning Rule (3/3)
(cont'd from the previous page)

  Let R = (1/L) Σ_{k=1}^{L} X_k X_k^t , i.e., R is an n × n matrix, also called the correlation matrix.
  Let P = (1/L) Σ_{k=1}^{L} T_k X_k .

  Then ⟨ε_k²⟩ = ⟨T_k²⟩ − 2 P^t W + W^t R W .

  We want to find the W such that ⟨ε_k²⟩ is minimal. Setting the gradient to zero:

  ∂⟨ε_k²⟩ / ∂W = −2P + 2RW = 0

  ⇒ R W* = P , or equivalently W* = R⁻¹ P
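A numerical sanity check of this derivation, on made-up random data: the W* obtained from R and P should match NumPy's least-squares solution of Y_k = W^t X_k, and perturbing W* should never reduce ⟨ε_k²⟩:

```python
import numpy as np

# Hypothetical data: 50 input vectors X_k (n = 4 components) and targets T_k.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
T = rng.normal(size=50)
L = len(T)

R = X.T @ X / L                 # R = (1/L) sum_k X_k X_k^t
P = X.T @ T / L                 # P = (1/L) sum_k T_k X_k
W_star = np.linalg.solve(R, P)  # R W* = P

def mse(w):
    """<eps_k^2> with the linear output Y_k = W^t X_k."""
    return np.mean((T - X @ w) ** 2)

# Moving away from W* in any direction should not decrease the error.
print(mse(W_star), mse(W_star + 0.1))
```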