
TRANSCRIPT

Page 1: Lecture 2 – Problem Solving, Search and Optimization

Lecture 2 – Problem Solving, Search and Optimization

Shuaiqiang Wang (王帅强)
School of Computer Science and Technology

Shandong University of Finance and Economics
http://www2.sdufe.edu.cn/wangsq/

[email protected]

Page 2: Lecture 2 – Problem Solving, Search and Optimization

Examples

Page 3: Lecture 2 – Problem Solving, Search and Optimization

What are Problems Here?

• Property
  – Nondeterministic
  – Partially observable

• State: a representation of current information
• Solution: a plan or a policy
  – A feasible/optimal state
  – A feasible/optimal sequence of states

• Search!

Page 4: Lecture 2 – Problem Solving, Search and Optimization

Example: 4 Queens

[Board: a 4×4 chessboard with four queens; the goal is to place one queen per row so that no two attack each other]

Page 5: Lecture 2 – Problem Solving, Search and Optimization

Search tree root: ( ), the empty board with no queens placed.

Page 6: Lecture 2 – Problem Solving, Search and Optimization

[Board: queen at (1,1)]

Expand the root ( ) with its first child: ((1,1)).

Page 7: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,1) and (2,3)]

Expand ((1,1)): the first safe square in row 2 gives ((1,1) (2,3)).

Page 8: Lecture 2 – Problem Solving, Search and Optimization

[Board: queen at (1,1) only]

((1,1) (2,3)) is a dead end (no square in row 3 is safe), so the search backtracks.

Page 9: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,1) and (2,4)]

Try the next safe square in row 2: ((1,1) (2,4)).

Page 10: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,1), (2,4) and (3,2)]

Expand to ((1,1) (2,4) (3,2)).

Page 11: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,1) and (2,4)]

((1,1) (2,4) (3,2)) is a dead end (no square in row 4 is safe), so the search backtracks.

Page 12: Lecture 2 – Problem Solving, Search and Optimization

[Board: queen at (1,1)]

Row 3 offers no other safe square, so ((1,1) (2,4)) also fails; backtrack again.

Page 13: Lecture 2 – Problem Solving, Search and Optimization

[Board: empty]

Row 2 is now exhausted, so the whole subtree under ((1,1)) fails; backtrack to the root.

Page 14: Lecture 2 – Problem Solving, Search and Optimization

[Board: queen at (1,2)]

Try the root's next child: ((1,2)).

Page 15: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,2) and (2,4)]

Expand to ((1,2) (2,4)).

Page 16: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,2), (2,4) and (3,1)]

Expand to ((1,2) (2,4) (3,1)).

Page 17: Lecture 2 – Problem Solving, Search and Optimization

[Board: queens at (1,2), (2,4), (3,1) and (4,3); no two attack each other]

Expand to ((1,2) (2,4) (3,1) (4,3)): all four rows are filled, so this is a solution. The complete search tree visited:

( )
├── ((1,1))
│   ├── ((1,1) (2,3)) ✗
│   └── ((1,1) (2,4))
│       └── ((1,1) (2,4) (3,2)) ✗
└── ((1,2))
    └── ((1,2) (2,4))
        └── ((1,2) (2,4) (3,1))
            └── ((1,2) (2,4) (3,1) (4,3)) ✓
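The trace above is depth-first backtracking. A minimal sketch of it in Python, generalized to n queens, with states as tuples of (row, column) placements as on the slides:

```python
def safe(state, row, col):
    """A queen at (row, col) must share no column or diagonal with state."""
    return all(c != col and abs(r - row) != abs(c - col)
               for r, c in state)

def solve(n, state=()):
    """Depth-first backtracking: extend the state one row at a time,
    as in the slide trace, returning the first complete placement."""
    row = len(state) + 1
    if row > n:
        return state                      # all n queens placed
    for col in range(1, n + 1):
        if safe(state, row, col):
            result = solve(n, state + ((row, col),))
            if result is not None:
                return result             # success somewhere below
    return None                           # dead end: backtrack

print(solve(4))   # ((1, 2), (2, 4), (3, 1), (4, 3))
```

Run on n = 4, it visits exactly the nodes in the tree above and returns the slide's solution.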

Page 18: Lecture 2 – Problem Solving, Search and Optimization

Search Strategies

• Search strategies are evaluated along the following dimensions:
  – Completeness: does it always find a solution if one exists?
  – Time complexity: number of nodes generated
  – Space complexity: maximum number of nodes held in memory
  – Optimality: does it always find a least-cost solution?

Page 19: Lecture 2 – Problem Solving, Search and Optimization

Categories

• Uninformed search
  – Breadth-first search
  – Depth-first search

• Informed search
  – A* search
  – Hill-climbing search
  – Simulated annealing search
  – Genetic algorithms

Page 20: Lecture 2 – Problem Solving, Search and Optimization

A* Search

• Idea: avoid expanding paths that are already expensive

• Evaluation function f(n) = g(n) + h(n)
  – g(n) = cost so far to reach n
  – h(n) = estimated cost from n to goal
  – f(n) = estimated total cost of path through n to goal

Pages 21–25: Lecture 2 – Problem Solving, Search and Optimization

Example

[Figures: a five-slide, step-by-step A* search example; the images are not recoverable from the transcript]

Page 26: Lecture 2 – Problem Solving, Search and Optimization

Algorithm

• Add the starting square (or node) to the open list.
• Repeat the following:
  a) Look for the lowest F cost square on the open list. We refer to this as the current square.
  b) Switch it to the closed list.
  c) For each of the 8 squares adjacent to this current square:
     – If it is not walkable or if it is on the closed list, ignore it. Otherwise do the following.
     – If it isn't on the open list, add it to the open list. Make the current square the parent of this square. Record the F, G, and H costs of the square.
     – If it is on the open list already, check to see if this path to that square is better, using G cost as the measure. A lower G cost means a better path. If so, change the parent of the square to the current square, and recalculate the G and F scores of the square. If you are keeping your open list sorted by F score, you may need to resort the list to account for the change.
  d) Stop when you:
     – Add the target square to the closed list, in which case the path has been found, or
     – Fail to find the target square, and the open list is empty. In this case, there is no path.
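A runnable sketch of this grid algorithm in Python. It is a minimal version under stated assumptions: unit step cost for all 8 moves, Chebyshev distance as H (admissible for that cost model), and a heap instead of a sorted open list; none of these specifics come from the slide.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D grid; grid[r][c] == 0 means walkable, 1 means blocked."""
    def h(p):  # Chebyshev distance: admissible for unit-cost 8-way moves
        return max(abs(p[0] - goal[0]), abs(p[1] - goal[1]))

    open_heap = [(h(start), 0, start)]          # entries are (F, G, square)
    parent, g_cost, closed = {start: None}, {start: 0}, set()
    while open_heap:
        _, g, cur = heapq.heappop(open_heap)    # a) lowest-F square
        if cur in closed:
            continue
        closed.add(cur)                          # b) move to closed list
        if cur == goal:                          # d) target closed: rebuild path
            path = []
            while cur is not None:
                path.append(cur)
                cur = parent[cur]
            return path[::-1]
        r, c = cur
        for dr in (-1, 0, 1):                    # c) the 8 adjacent squares
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nxt = (r + dr, c + dc)
                if not (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])):
                    continue
                if grid[nxt[0]][nxt[1]] or nxt in closed:
                    continue                     # not walkable or already closed
                if g + 1 < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = g + 1          # new or better path: update G, parent
                    parent[nxt] = cur
                    heapq.heappush(open_heap, (g + 1 + h(nxt), g + 1, nxt))
    return None                                  # d) open list empty: no path
```

The single `g + 1 < g_cost.get(...)` test covers both of the slide's open-list cases: an unseen square (stored G is infinite) and a square reached again by a cheaper path. For example, `astar([[0,0,0],[1,1,0],[0,0,0]], (0,0), (2,0))` routes around the wall.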

Page 27: Lecture 2 – Problem Solving, Search and Optimization

Hill-Climbing

Problem: depending on initial state, can get stuck in local maxima

Page 28: Lecture 2 – Problem Solving, Search and Optimization

Example

Page 29: Lecture 2 – Problem Solving, Search and Optimization

Algorithm
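The slide's pseudocode is an image that did not survive the transcript. As a stand-in, a minimal steepest-ascent hill-climbing sketch in Python; the names `neighbors` and `value` are assumptions, not from the slides:

```python
def hill_climbing(initial, neighbors, value, max_steps=10_000):
    """Steepest-ascent hill climbing: repeatedly move to the best
    neighbour, stopping at a (possibly local) maximum."""
    current = initial
    for _ in range(max_steps):
        best = max(neighbors(current), key=value, default=None)
        if best is None or value(best) <= value(current):
            return current          # no uphill neighbour: local maximum
        current = best
    return current
```

For instance, `hill_climbing(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3)**2)` climbs to 3, the global maximum of this toy objective.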

Page 30: Lecture 2 – Problem Solving, Search and Optimization

Simulated Annealing

• Idea: escape local maxima by allowing some "bad" moves but gradually decrease their frequency
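A minimal sketch of this idea in Python; the geometric cooling schedule and all parameter defaults are illustrative assumptions:

```python
import math, random

def simulated_annealing(initial, neighbor, value, t0=1.0, cooling=0.995,
                        t_min=1e-4):
    """Always accept uphill moves; accept downhill ("bad") moves with
    probability exp(delta / T); cool T so bad moves become rarer."""
    current, best, t = initial, initial, t0
    while t > t_min:
        cand = neighbor(current)
        delta = value(cand) - value(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = cand                   # accepted (possibly a bad move)
            if value(current) > value(best):
                best = current
        t *= cooling                         # gradually decrease temperature
    return best
```

Early on, T is large and exp(delta/T) is close to 1 even for bad moves, which lets the search escape local maxima; as T shrinks, the behaviour approaches pure hill climbing.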

Page 31: Lecture 2 – Problem Solving, Search and Optimization

Genetic Algorithm

In general, a GA has 5 basic components:

• A genetic representation of potential solutions to the problem.
• A way to create a population (an initial set of potential solutions).
• An evaluation function rating solutions in terms of their fitness.
• Genetic operators that alter the genetic composition of offspring (selection, crossover, mutation, etc.).
• Parameter values that the genetic algorithm uses (population size, probabilities of applying genetic operators, etc.).

Page 32: Lecture 2 – Problem Solving, Search and Optimization

General Structure

[Flowchart: the general structure of a genetic algorithm]

• start: encode the initial solutions into a population P(t), t ← 0; chromosomes are bit strings such as 1100101010, 1011101110, 0011011001, 1100110001
• crossover: pairs of chromosomes exchange segments to produce offspring C_c(t), e.g. 1100101010 and 1011101110 yield 1100101110 and 1011101010
• mutation: random bits flip to produce offspring C_m(t), e.g. 0011011001 becomes 0011001001
• decoding: chromosomes become solution candidates; fitness computation evaluates them
• roulette-wheel selection over P(t) + C(t), where C(t) = C_c(t) ∪ C_m(t), forms the new population
• termination condition? Y: stop and return the best solution; N: loop back with the new population
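A minimal sketch of this loop in Python, assuming `fitness` returns positive values (required for roulette-wheel weights); all names and parameter defaults are illustrative:

```python
import random

def genetic_algorithm(fitness, n_bits=10, pop_size=20, p_cx=0.9,
                      p_mut=0.01, generations=100):
    """Bit-string chromosomes, one-point crossover, bit-flip mutation,
    roulette-wheel selection over P(t) + C(t), as in the flowchart."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size // 2):               # crossover -> C_c(t)
            a, b = random.sample(pop, 2)
            if random.random() < p_cx:
                cut = random.randrange(1, n_bits)
                offspring += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
        for child in offspring:                      # mutation -> C_m(t)
            for i in range(n_bits):
                if random.random() < p_mut:
                    child[i] ^= 1
        pool = pop + offspring                       # P(t) + C(t)
        weights = [fitness(c) for c in pool]         # fitness computation
        pop = random.choices(pool, weights=weights, k=pop_size)  # roulette wheel
    return max(pop, key=fitness)
```

For instance, `genetic_algorithm(lambda c: 1 + sum(c))` evolves toward the all-ones chromosome.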

Page 33: Lecture 2 – Problem Solving, Search and Optimization

Example

Page 34: Lecture 2 – Problem Solving, Search and Optimization

Heuristic Search = Optimization

[Diagram: heuristic search and optimization are two views of the same task]

• Search is guided by an evaluation function f(n); optimization is guided by an objective function f(x)
• Both look for the solution that optimizes that function

Page 35: Lecture 2 – Problem Solving, Search and Optimization

Optimization

• Definition:

$$\begin{aligned}
\min_{\mathbf{x}}\ & f(\mathbf{x}) \\
\text{s.t.}\ & g_i(\mathbf{x}) \le 0, \quad i = 1, 2, \dots, m \\
& h_j(\mathbf{x}) = 0, \quad j = 1, 2, \dots, n
\end{aligned}$$

• Local search
  – Hill-Climbing
  – Simulated Annealing
  – Genetic Algorithms

Page 36: Lecture 2 – Problem Solving, Search and Optimization

Conventional Optimization

• Based on derivation/gradient
  – Construct F(x) based on f, g and h, and let

$$\frac{\partial F(\mathbf{x})}{\partial x_i} = 0$$

  – For example:

$$F(\mathbf{x}) = f(\mathbf{x}) + \sum_i \lambda_i\, g_i(\mathbf{x}) + \sum_j \mu_j\, h_j(\mathbf{x}), \qquad \text{s.t. } \lambda_i \ge 0$$

• Problem: for many problems, F(x) is very complicated, and it is very difficult to solve the differential equations
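To make "construct F and solve the stationarity equations" concrete, a small sketch with sympy on an illustrative toy problem (minimize x₁² + x₂² subject to x₁ + x₂ = 1); the problem and variable names are assumptions for the example:

```python
import sympy as sp

x1, x2, mu = sp.symbols("x1 x2 mu")

# Toy problem: min f = x1**2 + x2**2  s.t.  h = x1 + x2 - 1 = 0
f = x1**2 + x2**2
h = x1 + x2 - 1
F = f + mu * h                       # construct F(x) from f and h

# Stationarity: set all partial derivatives of F to zero and solve
stationary = sp.solve([sp.diff(F, v) for v in (x1, x2, mu)], (x1, x2, mu))
print(stationary)                    # {x1: 1/2, x2: 1/2, mu: -1}
```

This works because the toy F is simple; the slide's point is that for realistic f, g, h these equations quickly become intractable, which motivates the iterative methods that follow.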

Page 37: Lecture 2 – Problem Solving, Search and Optimization

Gradient Descent

Page 38: Lecture 2 – Problem Solving, Search and Optimization

Gradient

$$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \qquad
\nabla f(\mathbf{x}) = \begin{pmatrix} \partial f(\mathbf{x})/\partial x_1 \\ \partial f(\mathbf{x})/\partial x_2 \\ \vdots \\ \partial f(\mathbf{x})/\partial x_n \end{pmatrix}$$

$$\nabla\big(f(\mathbf{x}) + g(\mathbf{x})\big) = \nabla f(\mathbf{x}) + \nabla g(\mathbf{x}), \qquad
\nabla\big(a\, f(\mathbf{x})\big) = a\, \nabla f(\mathbf{x})$$

Page 39: Lecture 2 – Problem Solving, Search and Optimization

Example 1

$$f(\mathbf{x}) = \mathbf{a}^T \mathbf{x} = (a \;\; b) \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = a x_1 + b x_2$$

$$\nabla f(\mathbf{x}) = \begin{pmatrix} \partial f(\mathbf{x})/\partial x_1 \\ \partial f(\mathbf{x})/\partial x_2 \end{pmatrix} = \begin{pmatrix} a \\ b \end{pmatrix} = \mathbf{a}$$

Page 40: Lecture 2 – Problem Solving, Search and Optimization

Property 1

$$\nabla\big(\mathbf{a}^T \mathbf{x}\big) = \mathbf{a}$$

Proof:

$$f(\mathbf{x}) = \mathbf{a}^T \mathbf{x} = \sum_{i=1}^n a_i x_i, \qquad
\nabla f(\mathbf{x}) = \begin{pmatrix} \partial f(\mathbf{x})/\partial x_1 \\ \partial f(\mathbf{x})/\partial x_2 \\ \vdots \\ \partial f(\mathbf{x})/\partial x_n \end{pmatrix}
= \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \mathbf{a}$$

Page 41: Lecture 2 – Problem Solving, Search and Optimization

Example 2

$$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \qquad A = \begin{pmatrix} a & b \\ b & a \end{pmatrix}$$

$$f(\mathbf{x}) = \mathbf{x}^T A \mathbf{x}
= (x_1 \;\; x_2) \begin{pmatrix} a & b \\ b & a \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
= (x_1 \;\; x_2) \begin{pmatrix} a x_1 + b x_2 \\ b x_1 + a x_2 \end{pmatrix}
= a x_1^2 + 2 b x_1 x_2 + a x_2^2$$

$$\nabla f(\mathbf{x}) = \begin{pmatrix} 2 a x_1 + 2 b x_2 \\ 2 b x_1 + 2 a x_2 \end{pmatrix}
= 2 \begin{pmatrix} a & b \\ b & a \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = 2 A \mathbf{x}$$

Page 42: Lecture 2 – Problem Solving, Search and Optimization

Property 2

Let $A$ be a symmetric matrix ($A^T = A$, i.e. $a_{k,l} = a_{l,k}$). Then

$$\nabla\big(\mathbf{x}^T A \mathbf{x}\big) = 2 A \mathbf{x}$$

Proof:

$$f(\mathbf{x}) = \mathbf{x}^T A \mathbf{x}
= (x_1 \;\; x_2 \;\; \cdots \;\; x_n)
\begin{pmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,n} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} \\ \vdots & \vdots & & \vdots \\ a_{n,1} & a_{n,2} & \cdots & a_{n,n} \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
= \sum_{i=1}^n \sum_{j=1}^n a_{i,j}\, x_i x_j$$

Page 43: Lecture 2 – Problem Solving, Search and Optimization

Proof (cont)

Separating out the terms that contain $x_k$,

$$f(\mathbf{x}) = \sum_{i \ne k} \sum_{j \ne k} a_{i,j}\, x_i x_j
+ \sum_{i \ne k} a_{i,k}\, x_i x_k + \sum_{j \ne k} a_{k,j}\, x_k x_j + a_{k,k}\, x_k^2$$

$$\frac{\partial f(\mathbf{x})}{\partial x_k}
= \sum_{i \ne k} a_{i,k}\, x_i + \sum_{j \ne k} a_{k,j}\, x_j + 2 a_{k,k}\, x_k
= \sum_{i=1}^n a_{i,k}\, x_i + \sum_{j=1}^n a_{k,j}\, x_j
= 2 \sum_{j=1}^n a_{k,j}\, x_j \quad (A \text{ is symmetric})$$

Page 44: Lecture 2 – Problem Solving, Search and Optimization

Proof (cont)

Let $\mathbf{a}_k = (a_{k,1}, a_{k,2}, \dots, a_{k,n})^T$ denote the $k$-th row of $A$, so that $\dfrac{\partial f(\mathbf{x})}{\partial x_k} = 2\, \mathbf{a}_k^T \mathbf{x}$. Then

$$\nabla f(\mathbf{x}) = \begin{pmatrix} \partial f(\mathbf{x})/\partial x_1 \\ \partial f(\mathbf{x})/\partial x_2 \\ \vdots \\ \partial f(\mathbf{x})/\partial x_n \end{pmatrix}
= 2 \begin{pmatrix} \mathbf{a}_1^T \\ \mathbf{a}_2^T \\ \vdots \\ \mathbf{a}_n^T \end{pmatrix} \mathbf{x}
= 2 A \mathbf{x}$$

Page 45: Lecture 2 – Problem Solving, Search and Optimization

Example 3

$$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}, \qquad A = \begin{pmatrix} a & b \\ b & a \end{pmatrix}, \qquad
f(\mathbf{x}) = \mathbf{x}^T A \mathbf{x} = a x_1^2 + 2 b x_1 x_2 + a x_2^2$$

$$\nabla^2 f(\mathbf{x}) = \begin{pmatrix} 2a & 2b \\ 2b & 2a \end{pmatrix} = 2 A$$

Page 46: Lecture 2 – Problem Solving, Search and Optimization

Property 3

Let $A$ be a symmetric matrix ($A^T = A$, i.e. $a_{k,l} = a_{l,k}$). Then

$$\nabla^2\big(\mathbf{x}^T A \mathbf{x}\big) = 2 A$$

Proof:

$$f(\mathbf{x}) = \mathbf{x}^T A \mathbf{x} = \sum_{i=1}^n \sum_{j=1}^n a_{i,j}\, x_i x_j, \qquad
\frac{\partial f(\mathbf{x})}{\partial x_k} = 2 \sum_{j=1}^n a_{k,j}\, x_j, \qquad
\frac{\partial^2 f(\mathbf{x})}{\partial x_k\, \partial x_l} = 2 a_{k,l}$$

Hence $\nabla^2 f(\mathbf{x}) = 2 A$.
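A quick numerical sanity check of Properties 2 and 3, as a sketch with numpy; the symmetric matrix and the test point are arbitrary choices for the check:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2                       # an arbitrary symmetric matrix
x = rng.normal(size=4)
f = lambda v: v @ A @ v                 # f(x) = x^T A x
eps, I = 1e-4, np.eye(4)

# Property 2: the finite-difference gradient should match 2 A x
grad_fd = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps) for e in I])
print(np.allclose(grad_fd, 2 * A @ x))            # True

# Property 3: the finite-difference Hessian should match 2 A
hess_fd = np.array([[(f(x + eps*ei + eps*ej) - f(x + eps*ei - eps*ej)
                      - f(x - eps*ei + eps*ej) + f(x - eps*ei - eps*ej))
                     / (4 * eps**2) for ej in I] for ei in I])
print(np.allclose(hess_fd, 2 * A, atol=1e-6))     # True
```

Because f is quadratic, the central differences are exact up to rounding, so both checks pass for any symmetric A and any x.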

Page 47: Lecture 2 – Problem Solving, Search and Optimization

Principles

According to the first-order Taylor approximation of $f(\mathbf{x})$:

$$f(\mathbf{x}_n + h \mathbf{u}) = f(\mathbf{x}_n) + h\, \nabla f(\mathbf{x}_n)^T \mathbf{u} + o(h)$$

where $h$ is the learning rate and $\mathbf{u}$ is a unit vector representing the direction. Let $\mathbf{x}_{n+1} = \mathbf{x}_n + h \mathbf{u}$, which is the value of $\mathbf{x}$ in the next iteration. Our optimization objective is:

$$\mathbf{u}_n = \arg\min_{\mathbf{u}}\; f(\mathbf{x}_n + h \mathbf{u}) - f(\mathbf{x}_n)
= \arg\min_{\mathbf{u}}\; h\, \nabla f(\mathbf{x}_n)^T \mathbf{u} + o(h)$$

The optimal solution is the negative gradient direction: $\mathbf{u}_n = -\nabla f(\mathbf{x}_n) / \|\nabla f(\mathbf{x}_n)\|$.

Page 48: Lecture 2 – Problem Solving, Search and Optimization

Algorithm

For n = 1, 2, …, N_max:
  g_n ← ∇f(x_n)
  if ‖g_n‖ ≤ ε, return x_n
  x_{n+1} ← x_n − h g_n
End
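A runnable sketch of this loop; the step size h, the tolerance, and the quadratic test problem are illustrative choices:

```python
import numpy as np

def gradient_descent(grad_f, x0, h=0.1, eps=1e-6, n_max=10_000):
    """The gradient-descent loop above: step against the gradient
    until its norm falls below eps or the iteration budget runs out."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_max):
        g = grad_f(x)
        if np.linalg.norm(g) <= eps:
            return x                 # gradient (nearly) zero: done
        x = x - h * g
    return x

# Example: f(x) = x^T A x with A = [[2, 1], [1, 2]], so grad f = 2 A x
A = np.array([[2.0, 1.0], [1.0, 2.0]])
print(gradient_descent(lambda x: 2 * A @ x, [1.0, -1.5]))  # -> ~[0, 0]
```

On this test problem the iterates contract toward the minimizer at the origin, taking on the order of sixty steps to reach the tolerance.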

Page 49: Lecture 2 – Problem Solving, Search and Optimization

Principles

According to the second-order Taylor approximation of $f(\mathbf{x})$:

$$f(\mathbf{x}_n + \mathbf{d}) = f(\mathbf{x}_n) + \nabla f(\mathbf{x}_n)^T \mathbf{d} + \tfrac{1}{2}\, \mathbf{d}^T H(\mathbf{x}_n)\, \mathbf{d} + o(\|\mathbf{d}\|^2)$$

It can be written as:

$$f(\mathbf{x}_n + \mathbf{d}) = f(\mathbf{x}_n) + \nabla f(\mathbf{x}_n)^T \mathbf{d} + \tfrac{1}{2}\, \mathbf{d}^T \nabla^2 f(\mathbf{x}_n)\, \mathbf{d} + o(\|\mathbf{d}\|^2)$$

Let $\mathbf{x}_{n+1} = \mathbf{x}_n + \mathbf{d}$, which is the value of $\mathbf{x}$ in the next iteration. Our optimization objective is:

$$\mathbf{d}_n = \arg\min_{\mathbf{d}}\; f(\mathbf{x}_n + \mathbf{d}) - f(\mathbf{x}_n)
= \arg\min_{\mathbf{d}}\; \nabla f(\mathbf{x}_n)^T \mathbf{d} + \tfrac{1}{2}\, \mathbf{d}^T \nabla^2 f(\mathbf{x}_n)\, \mathbf{d}$$

Let $F(\mathbf{d}) = \nabla f(\mathbf{x}_n)^T \mathbf{d} + \tfrac{1}{2}\, \mathbf{d}^T \nabla^2 f(\mathbf{x}_n)\, \mathbf{d}$. By Properties 1 and 2,

$$\nabla F(\mathbf{d}) = \nabla f(\mathbf{x}_n) + \nabla^2 f(\mathbf{x}_n)\, \mathbf{d} = 0$$

The optimal solution is:

$$\mathbf{d}_n = -\big(\nabla^2 f(\mathbf{x}_n)\big)^{-1} \nabla f(\mathbf{x}_n) = -H_n^{-1} \mathbf{g}_n$$

Page 50: Lecture 2 – Problem Solving, Search and Optimization

Algorithm

For n = 1, 2, …, N_max:
  g_n ← ∇f(x_n)
  if ‖g_n‖ ≤ ε, return x_n
  H_n ← ∇²f(x_n)
  x_{n+1} ← x_n − H_n⁻¹ g_n
End
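A runnable sketch of this loop on the same illustrative test problem; it solves H_n d = −g_n rather than forming the inverse explicitly, a standard equivalent of the update above:

```python
import numpy as np

def newton_method(grad_f, hess_f, x0, eps=1e-6, n_max=100):
    """The Newton loop above: each step moves by d = -H_n^{-1} g_n."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_max):
        g = grad_f(x)
        if np.linalg.norm(g) <= eps:
            return x
        H = hess_f(x)
        x = x + np.linalg.solve(H, -g)   # x_{n+1} = x_n - H_n^{-1} g_n
    return x

# Example: f(x) = x^T A x converges in one step, since f is quadratic
A = np.array([[2.0, 1.0], [1.0, 2.0]])
print(newton_method(lambda x: 2 * A @ x,      # grad f = 2 A x (Property 2)
                    lambda x: 2 * A,          # hess f = 2 A  (Property 3)
                    [1.0, -1.5]))             # -> [0, 0]
```

On a quadratic, the second-order model is exact, so Newton's method jumps to the minimizer in a single step, whereas the gradient-descent sketch earlier needed dozens.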

Page 51: Lecture 2 – Problem Solving, Search and Optimization

Thank You!