Chapter 8: Artificial Intelligence


Uploaded by brody-wyatt on 01-Jan-2016


DESCRIPTION

CHAPTER 8: Artificial Intelligence. CH8 outline: what artificial intelligence is, definitions of AI, the development of AI, and an introduction to AI techniques. 8.1 What is artificial intelligence? A brief review of more than a decade of "human versus machine" matches: in 1996, chess grandmaster Garry Kasparov faced the computer Deep Blue and won 4-2. In 1997, after more than a year of research and improvement by IBM, "Deeper Blue" (Deep Blue II) was born and played Kasparov again; Deep Blue II defeated Kasparov, a result that shocked the world. In February 2003 - PowerPoint PPT Presentation

TRANSCRIPT


CHAPTER 8: Artificial Intelligence (CH8)

8.1 What is artificial intelligence? A brief review of the "human versus machine" matches: in 1996, Garry Kasparov played the computer Deep Blue and won 4-2. In 1997, after more than a year of improvement by IBM, Deep Blue II defeated Kasparov. In February 2003, Kasparov played the program Deep Junior to a 3-3 draw, and in November 2003 he drew 2-2 against X3D Fritz.

In February 2011, IBM's Watson defeated human champions on the quiz show Jeopardy!, another milestone for artificial intelligence (AI). 8.2 Definitions of artificial intelligence. Many definitions have been proposed, including those of Charniak and McDermott (1985), Winston (1992), Schalkoff (1990), Luger and Stubblefield (1993), Bellman (1978), Haugeland (1985), Kurzweil (1990), and Rich and Knight (1991). The Turing test (Alan Turing, 1950) asks whether a machine can exhibit behavior indistinguishable from a human's.

The notion of an intelligent agent (cont.). 8.3 The development of artificial intelligence: Warren McCulloch and Walter Pitts proposed a model of artificial neurons in 1943; in 1956, John McCarthy organized the Dartmouth conference, where the term "artificial intelligence" was coined. 8.4 An introduction to AI techniques: expert systems, decision trees, fuzzy theory (proposed by Zadeh in 1965, built on the notion of a membership function), artificial neural networks, and genetic algorithms (introduced by John Holland in 1975), whose main steps are: 1) encode candidate solutions as chromosomes; 2) apply selection, crossover, and mutation; 3) evaluate fitness; 4) form the next generation; 5) repeat until a termination criterion is met.

How can a computer play chess? The 1997 IBM Deep Blue relied on game-tree search. [Slide figure: a game tree whose leaf values (e.g. -67, -24, +33, +40) are backed up through the tree to choose the best move.]

Rule-based representation: knowledge is expressed as "If ..., Then ..." rules, each pairing a condition with a conclusion.

Question answering: the inference engine matches the known facts against the "If ..., Then ..." rules stored in the rule base; knowledge can also be represented as a semantic net.

An expert system should also explain its reasoning, answering the user's "Why?" questions. Inference can proceed forward, from facts to conclusions, or backward, from a goal to the facts that would support it. Classification: from labeled training data, supervised learning builds a classifier (a classification model) that assigns new instances to classes.
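The forward-chaining idea sketched above can be shown in a few lines. This is a minimal sketch; the rule base and facts here are invented for illustration, not from the slides.

```python
# A minimal forward-chaining sketch. The rules and facts below are
# hypothetical examples; a real expert system stores them in a rule base.
def forward_chain(facts, rules):
    """Repeatedly fire rules whose premises are all known facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)   # fire the rule: add its conclusion
                changed = True
    return facts

# Hypothetical If-Then rules: If gives-milk Then mammal; and so on.
rules = [
    (["gives-milk"], "mammal"),
    (["mammal", "eats-meat"], "carnivore"),
    (["carnivore", "tawny", "dark-spots"], "cheetah"),
]
facts = forward_chain(["gives-milk", "eats-meat", "tawny", "dark-spots"], rules)
print("cheetah" in facts)
```

Backward chaining would instead start from the goal ("cheetah") and recursively try to establish each rule's premises.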

Decision trees: the ID3 algorithm (Quinlan). Building a tree involves: 1. selecting, at each node, the attribute that best separates the training examples; 2. partitioning the examples by that attribute's values and recursing. An example, the PlayTennis training data:

Day  Outlook   Temperature  Humidity  Wind    PlayTennis
D1   Sunny     Hot          High      Weak    No
D2   Sunny     Hot          High      Strong  No
D3   Overcast  Hot          High      Weak    Yes
D4   Rain      Mild         High      Weak    Yes
D5   Rain      Cool         Normal    Weak    Yes
D6   Rain      Cool         Normal    Strong  No
D7   Overcast  Cool         Normal    Strong  Yes
D8   Sunny     Mild         High      Weak    No
D9   Sunny     Cool         Normal    Weak    Yes
D10  Rain      Mild         Normal    Weak    Yes
D11  Sunny     Mild         Normal    Strong  Yes
D12  Overcast  Mild         High      Strong  Yes
D13  Overcast  Hot          Normal    Weak    Yes
D14  Rain      Mild         High      Strong  No

The resulting tree can be converted into rules:
(1): If Outlook is Sunny and Humidity is High, Then NoPlay;
(2): If Outlook is Sunny and Humidity is Normal, Then YesPlay;
(3): If Outlook is Overcast, Then YesPlay;
(4): If Outlook is Rain and Wind is Strong, Then NoPlay;
(5): If Outlook is Rain and Wind is Weak, Then YesPlay.
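The attribute ID3 places at the root is the one with the highest information gain. A small sketch of that computation on the PlayTennis data above (the helper names `entropy` and `gain` are mine, not from the slides):

```python
from math import log2
from collections import Counter

# PlayTennis training data from the slide (Day column omitted).
# Columns: Outlook, Temperature, Humidity, Wind, PlayTennis.
data = [
    ("Sunny","Hot","High","Weak","No"), ("Sunny","Hot","High","Strong","No"),
    ("Overcast","Hot","High","Weak","Yes"), ("Rain","Mild","High","Weak","Yes"),
    ("Rain","Cool","Normal","Weak","Yes"), ("Rain","Cool","Normal","Strong","No"),
    ("Overcast","Cool","Normal","Strong","Yes"), ("Sunny","Mild","High","Weak","No"),
    ("Sunny","Cool","Normal","Weak","Yes"), ("Rain","Mild","Normal","Weak","Yes"),
    ("Sunny","Mild","Normal","Strong","Yes"), ("Overcast","Mild","High","Strong","Yes"),
    ("Overcast","Hot","Normal","Weak","Yes"), ("Rain","Mild","High","Strong","No"),
]
ATTRS = ["Outlook", "Temperature", "Humidity", "Wind"]

def entropy(rows):
    """Entropy of the class label (last column) over a set of rows."""
    counts = Counter(r[-1] for r in rows)
    total = len(rows)
    return -sum(c / total * log2(c / total) for c in counts.values())

def gain(rows, i):
    """Information gain of splitting on attribute index i."""
    by_value = {}
    for r in rows:
        by_value.setdefault(r[i], []).append(r)
    remainder = sum(len(s) / len(rows) * entropy(s) for s in by_value.values())
    return entropy(rows) - remainder

gains = {ATTRS[i]: round(gain(data, i), 3) for i in range(4)}
print(gains)  # Outlook has the largest gain, so ID3 splits on it first
```

Outlook's gain (about 0.247) exceeds that of Humidity, Wind, and Temperature, which is why Outlook appears at the root of the tree the rules above were read from.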

A crisp set draws a sharp boundary: one is either "old" (1) or "not old" (0), so if 25 counts as young, does turning 26 suddenly make one old? Is 60 old or not? A fuzzy set instead assigns graded membership, e.g. 1, 0.8, 0.6, 0.4, 0.2, 0. Question: to what degree does a height of 168 count as "tall"?

Example: the fuzzy set "close to 0", with membership grades e.g. A(3) = 0.01, A(1) = 0.09, A(0.25) = 0.62, A(0) = 1.

Define a membership function: A(x) = 1 / (1 + 10x^2).


Very close to 0: conventionally obtained by squaring the grade, A(x) = [1 / (1 + 10x^2)]^2.
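The membership function can be checked against the slide's sample values. The formula 1 / (1 + 10x^2) here is reverse-engineered from those values, and modeling "very" by squaring is the standard fuzzy-logic convention, not something the slide states explicitly:

```python
# Membership function for "close to 0", chosen to reproduce the slide's
# sample values A(3) = 0.01, A(1) = 0.09, A(0.25) = 0.62, A(0) = 1.
def close_to_0(x):
    return 1.0 / (1.0 + 10.0 * x * x)

def very_close_to_0(x):
    # "very" is conventionally modeled by squaring the membership grade
    return close_to_0(x) ** 2

for x in (3, 1, 0.25, 0):
    print(x, round(close_to_0(x), 2))
```

Squaring lowers every grade below 1, so "very close to 0" is a stricter (smaller) fuzzy set than "close to 0", as intuition demands.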

A membership function maps each element into [0, 1]. E.g., for the fuzzy set "high": x -> [0, 1], so an element may be "high" to degree 0.6, to degree 0.8, or to degree 0.1. Fuzzy sets are simple, intuitively pleasing, and a generalization of crisp sets.

Vague concepts blur the line between member and non-member: "high" versus "not high" admits grades 1, 0.8, 0.6, 0.4, 0.2, 0 rather than the crisp 0 or 1, and membership shifts gradually instead of abruptly as in a crisp set. Fuzzy operators: AND takes the minimum, e.g. (0.8) AND (0.6) = 0.6; OR takes the maximum, e.g. (0.8) OR (0.6) = 0.8; NOT subtracts from 1, e.g. NOT 0.8 = 0.2.

Question: given the membership grades below, evaluate each candidate by "C1 AND C2 OR C3".

     C1    C2    C3
A1   0     0.8   0.3
A2   1     0.6   0.8
A3   0     0.3   0.9
A4   0.7   0.1   0.5
A5   0.8   0.5   0.3

Answer:
E(A1) = (0 AND 0.8) OR 0.3 = 0 OR 0.3 = 0.3
E(A2) = (1 AND 0.6) OR 0.8 = 0.8
E(A3) = (0 AND 0.3) OR 0.9 = 0.9
E(A4) = (0.7 AND 0.1) OR 0.5 = 0.5
E(A5) = (0.8 AND 0.5) OR 0.3 = 0.5


A worked example. A = {A1, A2, A3, A4, A5} is a set of alternatives and C = {C1, C2, C3} a set of criteria:

            C1 (big eyes)  C2 (small mouth)  C3 (good shape)
A1 (Mary)   0              0.8               0.3
A2 (Judy)   1              0.6               0.8
A3 (Jan)    0              0.3               0.9
A4 (Mandy)  0.7            0.1               0.5
A5 (Nancy)  0.8            0.5               0.3

Assume the requirement is "C1 and C2 or C3", and let E(Ai) be the evaluation function.

E(A1) = (0 AND 0.8) OR 0.3 = 0 OR 0.3 = 0.3
E(A2) = (1 AND 0.6) OR 0.8 = 0.6 OR 0.8 = 0.8
E(A3) = (0 AND 0.3) OR 0.9 = 0 OR 0.9 = 0.9  <- the best choice
E(A4) = (0.7 AND 0.1) OR 0.5 = 0.1 OR 0.5 = 0.5
E(A5) = (0.8 AND 0.5) OR 0.3 = 0.5 OR 0.3 = 0.5
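The evaluation above is just min for AND and max for OR applied row by row; a short sketch reproducing the table's numbers:

```python
# Evaluate E(Ai) = (C1 AND C2) OR C3 with min for AND and max for OR,
# using the candidate grades from the slide's table.
candidates = {
    "A1 (Mary)":  (0.0, 0.8, 0.3),
    "A2 (Judy)":  (1.0, 0.6, 0.8),
    "A3 (Jan)":   (0.0, 0.3, 0.9),
    "A4 (Mandy)": (0.7, 0.1, 0.5),
    "A5 (Nancy)": (0.8, 0.5, 0.3),
}
scores = {name: max(min(c1, c2), c3) for name, (c1, c2, c3) in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print(best)  # A3 (Jan) scores 0.9, the best choice
```

Note how A3 wins on criterion C3 alone: the OR lets a single strong criterion dominate even when the AND part is 0.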

Artificial neural networks: the basic unit is an artificial neuron (cell). [Slide figure: a neuron with inputs a0 through a11.] A perceptron learning example, with training instances I1 = (2, 1, 3, +), I2 = (1, 5, 1, -), I3 = (2, 4, 2, -), I4 = (3, 2, 1, +):

Cycle  Chosen instance  Current weight vector
0                       (0, 0, 0)
1      (2, 1, 3, +)     (2, 1, 3)
2      (1, 5, 1, -)     (1, -4, 2)
3      (2, 4, 2, -)     (1, -4, 2)
4      (3, 2, 1, +)     (4, -2, 3)
5      (2, 1, 3, +)     (4, -2, 3)
6      (1, 5, 1, -)     (4, -2, 3)
7      (2, 4, 2, -)     (2, -6, 1)
8      (3, 2, 1, +)     (5, -4, 2)
9      (2, 1, 3, +)     (5, -4, 2)
10     (1, 5, 1, -)     (5, -4, 2)
11     (2, 4, 2, -)     (5, -4, 2)

Separable vs. non-separable problems.
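The trace is the classic perceptron mistake-driven update. A minimal sketch, assuming the rule implied by the table: predict "+" when w·x > 0, and on a mistake add the instance to w for "+" examples or subtract it for "-" examples:

```python
# Perceptron training sketch reproducing the slide's weight-vector trace.
# Instances are visited in fixed order I1, I2, I3, I4, repeating.
instances = [((2, 1, 3), +1), ((1, 5, 1), -1), ((2, 4, 2), -1), ((3, 2, 1), +1)]

def train(instances, cycles=11):
    w = [0, 0, 0]                                    # Cycle 0: (0, 0, 0)
    for c in range(cycles):
        x, label = instances[c % len(instances)]
        score = sum(wi * xi for wi, xi in zip(w, x))
        predicted = +1 if score > 0 else -1
        if predicted != label:                       # mistake: update weights
            w = [wi + label * xi for wi, xi in zip(w, x)]
    return w

print(train(instances))  # [5, -4, 2], matching the table after cycle 11
```

From cycle 8 onward no instance is misclassified, so the weights stay at (5, -4, 2): the four instances are linearly separable, which is exactly the condition under which the perceptron converges.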

Back-Propagation Model

Genetic algorithms evolve a population of candidate solutions using crossover and mutation.

Chromosome encodings may be binary, integer, real-valued, or alphabet-based. The operators are selection, crossover (one-point, two-point, or uniform), and mutation (flipping a 0 to 1 or a 1 to 0). A fitness function evaluates each chromosome. Example: maximize the fitness 31p - p^2 for p in [0, 31], encoding p as a 5-bit genome; the maximum is reached at p = 15 (01111) and p = 16 (10000).

The fitness function is 31p - p^2, with p encoded in 5 bits (2^5 = 32 possible values). An initial population of four chromosomes and its fitness (average 117.75):

Chromosome  p   Fitness
10110       22  176
00011       3   87
00010       2   58
11001       25  150

Each generation applies selection, crossover, and mutation.

Roulette-wheel selection: each chromosome's selection probability is proportional to its fitness value.

Chromosome  Fitness  Share   Expected count
10110       176      37.4%   1.50
00011       87       18.5%   0.74
00010       58       12.3%   0.49
11001       150      31.8%   1.27

After selection the population is {10110, 11001, 00010, 10110}, raising the average fitness from 117.75 to 140. Crossover: with crossover probability pc, pairs exchange tails; e.g., crossing 10110 and 00010 after position 2 yields the children 10010 and 00110. After crossover the population becomes:

Chromosome  p   Fitness
10010       18  234
11001       25  150
00110       6   150
10110       22  176

Mutation then flips occasional bits, e.g. flipping the third bit of 11001 gives 11101:

Chromosome  p   Fitness
10010       18  234
11101       29  58
00110       6   150
10110       22  176

Iterating these steps drives the population toward the optimal solution. An example: given a function f, find its maximum.
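The whole loop for the 5-bit problem fits in a short sketch. This is a minimal illustration, not the slide's exact run: the population size (8), generation count (30), and mutation rate (0.05) are my choices, and the random draws will not match the slide's hand-worked numbers.

```python
import random

def fitness(bits):
    """Fitness 31p - p^2 of a 5-bit chromosome encoding p in [0, 31]."""
    p = int(bits, 2)
    return 31 * p - p * p

def select(pop):
    """Roulette-wheel selection: probability proportional to fitness."""
    total = sum(fitness(b) for b in pop)
    r = random.uniform(0, total)
    acc = 0.0
    for b in pop:
        acc += fitness(b)
        if acc >= r:
            return b
    return pop[-1]

def crossover(a, b):
    """One-point crossover at a random cut point."""
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(bits, rate=0.05):
    """Flip each bit with a small probability."""
    return "".join("10"[int(c)] if random.random() < rate else c for c in bits)

random.seed(0)
pop = [format(random.randrange(32), "05b") for _ in range(8)]
for generation in range(30):
    new_pop = []
    while len(new_pop) < len(pop):
        c1, c2 = crossover(select(pop), select(pop))
        new_pop += [mutate(c1), mutate(c2)]
    pop = new_pop
best = max(pop, key=fitness)
print(best, int(best, 2), fitness(best))
```

Since 31p - p^2 peaks at 240 (p = 15 or 16), a run typically ends with chromosomes near 01111 or 10000.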

Step 1. Define a suitable representation. With 2 bits per chromosome: 00 -> 0, 01 -> 1, 10 -> 2, 11 -> 3; mapped into [0, 1]: 00 -> 0, 01 -> 0.333, 10 -> 0.666, 11 -> 1. With 12 bits per chromosome: e.g., t = 0 is 000000000000, t = 1 is 111111111111, t = 0.680 is 101011100001.

Step 2. Create an initial population of N individuals (N = population size; assume N = 40).

Step 3. Define a suitable fitness function f to evaluate the individuals. E.g., the first six individuals:

No.  Bit string     t      f(t)
1    000110000000   0.094  0.974
2    010011001101   0.300  0.917
3    000111111100   0.124  0.644
4    101101000111   0.705  0.444
5    111011000100   0.923  0.154
6    011100111111   0.453  0.125

Step 4. Perform the crossover and mutation operations to generate possible offspring. Crossover offspring inherit some characteristics of their parents, e.g.:

Parent 1: 00011 0000001
Parent 2: 01001 1001101

Child 1: 000111001101
Child 2: 010010000001

Mutation is used to avoid local optima.
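The children above come from one-point crossover at position 5, and Step 1's decoding is just a scaled binary conversion; both can be reproduced exactly (the helper names are mine):

```python
# Reproducing the slide's crossover children and the Step-1 decoding.
def one_point_crossover(p1, p2, point):
    """Swap the tails of two bit strings after the given position."""
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def decode(bits):
    """Step-1 mapping of a bit string into [0, 1]: value / (2^n - 1)."""
    return int(bits, 2) / (2 ** len(bits) - 1)

c1, c2 = one_point_crossover("000110000001", "010011001101", 5)
print(c1, c2)                            # the slide's Child 1 and Child 2
print(round(decode("101011100001"), 3))  # 0.68, matching t = 0.680 in Step 1
```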

E.g., bit change: flipping single bits of 111100000100 gives

0 1 1 1 0 0 0 0 0 1 0 0
1 1 1 1 0 1 0 0 0 1 0 0

E.g., inversion: inverting a segment of 111100000100 gives

1 1 1 0 1 1 0 0 0 1 0 0

New offspring produced by the operators:

No.  Bit string     t      f(t)
1    000110011110   0.101  0.999
2    000110000001   0.094  0.974
3    010011001101   0.300  0.917
4    000111111100   0.124  0.644
5    101101000111   0.705  0.444

Step 5. Replace individuals; e.g., the first six individuals become:

No.  Bit string     t      f(t)
1    000110011110   0.101  0.999  (new)
2    000110000001   0.094  0.974
3    010011001101   0.300  0.917
4    000111111100   0.124  0.644
5    101101000111   0.705  0.444
6    011100111111   0.453  0.125

Step 6. If the termination criteria are not satisfied, go to Step 4; otherwise, stop the genetic algorithm. Termination criteria: the maximum number of generations, the time limit, or population convergence. Experiment.

8.5


Grand challenges of AI: the translating telephone, the accident-avoiding car, and learning systems.