Design and Analysis of Algorithms — ailab.cs.nchu.edu.tw/course/algorithms/105/al10.pdf
TRANSCRIPT
1
Design and Analysis of
Algorithms
演算法設計與分析 (Design and Analysis of Algorithms)
Lecture 10
November 30, 2016
洪國寶
2
Homework # 8
1. 20.2-1 (p. 488) / 19.2-1 (p. 518)
2. 20.4-1 (p. 496) / 19.4-1 (p. 526)
3. Provide a sequence of operations on a Fibonacci heap (initially empty) such that Figure 20.1 (a) (p. 478) (Figure 19.2 (a) (p. 508)) is produced. Justify your answer.
4. Provide a sequence of operations on a Fibonacci heap such that, for any n ≥ 0, a binomial tree Bn is produced.
5. Execute the following operations on an initially empty Fibonacci heap: insert(18), insert(14), insert(17), insert(28), insert(32), insert(37), insert(25), insert(36), insert(53), insert(40), extract-min(), decrease-key(40, 30), delete(36), extract-min(). For all intermediate steps, illustrate the resulting Fibonacci heap. New elements should always be inserted to the right of the current minimum. The consolidation operation after extract-min() starts with the next element on the right-hand side of the deleted minimum.
Due December 7, 2016
3
Outline
• Review
• Data structures for disjoint sets
• Elementary graph algorithms
4
Review: Complexity of Mergeable Heaps
• Extract-Min(H): deletes the node with the minimum key.
• Decrease-Key(H, x, k): assigns to node x the new key value k, which is assumed to be no greater than its current key value.
5
Review: Fibonacci Heaps
• Fibonacci heap history.
– Fredman and Tarjan (1986)
– Ingenious data structure and analysis.
• Fibonacci heap intuition.
– Similar to binomial heaps, but less structured.
– Decrease-key and union run in O(1) time.
– "Lazy" unions and inserts
• We do not attempt to consolidate trees in a Fibonacci heap
when we unite two heaps or insert a new node.
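The lazy behavior can be sketched in a few lines of Python (an illustrative toy, not the textbook's circular doubly linked root list; LazyHeap and its fields are hypothetical names, and the roots here hold bare keys standing in for whole trees):

```python
# Hypothetical sketch of "lazy" Fibonacci-heap insert and union:
# both only touch the root list and the min pointer, so both are O(1).

class LazyHeap:
    def __init__(self):
        self.roots = []   # a plain list stands in for the circular root list
        self.min = None   # key of the current minimum root

    def insert(self, key):
        self.roots.append(key)            # splice into root list; no consolidation
        if self.min is None or key < self.min:
            self.min = key                # update min pointer if needed

    def union(self, other):
        self.roots.extend(other.roots)    # concatenate root lists
        if other.min is not None and (self.min is None or other.min < self.min):
            self.min = other.min
```

Consolidation is deferred until the next extract-min, which is where the real structural work happens.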
6
Fibonacci Heaps: Structure
• Fibonacci heap.
– Set of min-heap ordered trees.
[Figure: a Fibonacci heap H — a collection of min-heap ordered trees over the keys 3, 7, 17, 18, 23, 24, 26, 30, 35, 39, 41, 44, 46, 52; the min pointer points to the root with key 3, and marked nodes are shaded.]
7
Fibonacci Heaps: Potential Function
• Key quantities.
– degree[x] = degree of node x.
– mark[x] = mark of node x (black or gray).
– t(H) = # trees.
– m(H) = # marked nodes.
– Φ(H) = t(H) + 2m(H) = potential function.
[Figure: the same Fibonacci heap H as on the previous slide; here t(H) = 5 and m(H) = 3, so Φ(H) = 5 + 2·3 = 11. The root pointed to by min has degree = 3.]
8
Fibonacci Heap Operation 5: Extract-Min
• Extract min.
– Delete min and concatenate its children into root list.
– Consolidate trees so that no two roots have same degree.
[Figure: the root list after deleting the minimum — the children of the old minimum have been spliced into the root list; "current" and "min" pointers mark where consolidation proceeds.]
9
Fibonacci Heap Extract Min Analysis
• Notation.
– D(n) = max degree of any node in a Fibonacci heap with n nodes.
– t(H) = # trees in heap H.
– Φ(H) = t(H) + 2m(H).
• Actual cost: O(D(n) + t(H)).
• Amortized cost: O(D(n)).
– t(H') ≤ D(n) + 1 since no two trees have the same degree.
– ΔΦ(H) ≤ D(n) + 1 − t(H).
Scale up the units of potential to dominate the constant hidden in O(t(H)).
10
• Decrease key of element x to k.
– Case 0: min-heap property not violated.
  • decrease key of x to k
  • change heap min pointer if necessary
– Case 1: parent of x is unmarked.
  • decrease key of x to k
  • cut off link between x and its parent
  • mark parent
  • add tree rooted at x to root list, updating heap min pointer
– Case 2: parent of x is marked.
  • decrease key of x to k
  • cut off link between x and its parent p[x], and add x to root list
  • cut off link between p[x] and p[p[x]], add p[x] to root list
    – If p[p[x]] is unmarked, mark it.
    – If p[p[x]] is marked, cut off p[p[x]], unmark it, and repeat. (Cascading cuts)
Fibonacci Heap Decrease Key
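Case 2's cascading cut can be sketched on a stripped-down node structure (a hedged illustration, not CLRS's exact pointer layout; Node, cut, and cascading_cut are hypothetical names, and the root list is a plain Python list):

```python
# Hypothetical sketch of the cascading-cut mechanism from the slide.
# Node fields mirror the discussion: parent pointer, child list, mark bit.

class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []
        self.mark = False

def cut(x, roots):
    """Detach x from its parent and move it to the root list."""
    p = x.parent
    p.children.remove(x)
    x.parent = None
    x.mark = False          # roots are always unmarked
    roots.append(x)

def cascading_cut(p, roots):
    """Walk upward from p: losing a first child marks a node,
    losing a second child cuts it and continues at its parent."""
    while p.parent is not None:
        if not p.mark:
            p.mark = True   # first child lost: mark and stop
            return
        grand = p.parent
        cut(p, roots)       # second child lost: cut, keep climbing
        p = grand
```

After cutting x itself (Case 1 or 2), calling cascading_cut on x's old parent performs exactly the mark-or-cut cascade described above.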
11
• Notation.
– t(H) = # trees in heap H.
– m(H) = # marked nodes in heap H.
– Φ(H) = t(H) + 2m(H).
• Actual cost: O(c), where c = number of cuts.
• Amortized cost: O(1).
– t(H') = t(H) + c
– m(H') ≤ m(H) − c + 2
  • each cascading cut (except the last one) unmarks a node
  • the last cascading cut could potentially mark a node
– ΔΦ ≤ c + 2(−c + 2) = 4 − c.
Fibonacci Heap Decrease Key
scale up the units of potential to dominate the constant hidden in O(c).
12
Fibonacci Heaps: Bounding Max Degree
• Definition.
D(N) = max degree in a Fibonacci heap with N nodes.
• Key lemma.
D(N) ≤ log_φ N, where φ = (1 + √5) / 2.
• Corollary.
Delete and Extract-Min take O(log N) amortized time.
13
Outline
• Review
• Data structures for disjoint sets
• Elementary graph algorithms
14
Data structure for disjoint sets
• Discuss the data structures to maintain a
collection of pair-wise disjoint dynamic sets.
– Operations: make-set, union, find-set
– Implementations: linked-lists, rooted trees
– Applications: many
15
Disjoint Sets
• Maintain a collection S = {S1, …, Sk} of disjoint dynamic sets.
• Each set has a representative member.
• Support Operations:
– Make-Set(x): Make a new singleton set containing object x (x is the representative).
– Union(x, y): Merge the two sets containing objects x and y.
– Find-Set(x): Returns a pointer to the representative of the set containing x.
• Complexity: In terms of
– n = # of Make-Set operations.
– m = total # of operations.
– Note: m ≥ n.
16
Applications for disjoint sets
• Fortran compilers (COMMON and EQUIVALENCE
statements)
• Computational geometry problems
• Unification in logic programming
• Longest common subsequences
• Graph problems (minimum spanning trees,
connected components, …)
17
Connected components
• Given a graph G = (V, E), a path of length k from a
vertex u to a vertex u' is a sequence of vertices
⟨v0, v1, …, vk⟩ such that u = v0, u' = vk, and (vi−1, vi) ∈ E for i = 1, 2, …, k.
• If there is a path p from u to u', we say that u' is
reachable from u via p.
• The connected components of a graph are the
equivalence classes of vertices under the is-
reachable-from relation.
18
Compute the connected components
19
20
Determine whether two vertices are in
the same component
Time complexity?
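The two slides above can be sketched with a minimal dict-based disjoint set (hypothetical helper names; this bare version uses no heuristics, so each Find-Set may take O(n) in the worst case — the heuristics discussed below improve that). Same-component queries then cost two Find-Sets:

```python
# Sketch of CONNECTED-COMPONENTS / SAME-COMPONENT via a disjoint set.

def connected_components(vertices, edges):
    parent = {v: v for v in vertices}     # MAKE-SET for each vertex

    def find(v):                          # FIND-SET: follow parent pointers
        while parent[v] != v:
            v = parent[v]
        return v

    for u, v in edges:                    # UNION the endpoints of each edge
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
    return parent, find

def same_component(find, u, v):
    """Two vertices are in the same component iff FIND-SET agrees."""
    return find(u) == find(v)
```

For example, with edges {(a,b), (b,c), (d,e)}, vertices a and c land in one component while a and d do not.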
21
Data structure for disjoint sets
• Discuss the data structures to maintain a
collection of pair-wise disjoint dynamic sets.
– Operations: make-set, union, find-set
– Implementations: linked-lists, rooted trees
– Applications: many
22
Linked Lists
Store set {a, b, c} as a linked list a → b → c, with a tail pointer and each element pointing back to the representative (the head).
Make-Set and Find-Set are O(1).
Union(x, y): Append x’s list onto the end of y’s list. Update
representative pointers in x’s list (Figure 21.2).
•Time is linear in the length of x’s list.
•Running time for a sequence of m ops can take Θ(m²) time.
(Figure 21.3) (Not very good.)
23
union
Union(x, y): Append x’s list onto the end of y’s list.
Update representative pointers in x’s list.
24
Example
n make-set operations
n-1 union operations
Total time: next slide
25
Example (Cont.)
Operation        “Time”
M-S(x1)          1
M-S(x2)          1
⋮                ⋮
M-S(xn)          1
U(x1, x2)        1
U(x2, x3)        2
U(x3, x4)        3
⋮                ⋮
U(xn−1, xn)      n − 1

m = 2n − 1 operations.
Total Time is:
• Θ(n) = Θ(m) for Make-Set ops.
• Σᵢ₌₁ⁿ⁻¹ i = Θ(n²) = Θ(m²) for Union ops.
• Θ(m²) total.
• Θ(m) amortized per operation.
26
Improvement: Weighted-Union Heuristic
• Keep track of list length in representative.
(Time/space tradeoff)
• Modify Union so that smaller list is
appended to longer one.
• Time for Union is now proportional to the
length of the smaller list.
27
Amortized Running Time of WUH
Theorem 21.1: A sequence of m operations takes
O(m + n log n) time.
Proof:
M-S and F-S contribute O(m) total.
What about Union? (See the next slide.)
Time is dominated by the total number of times we change a rep.
pointer.
A given object’s rep. pointer can change at most log n times.
28
What about Union?
(Each ✓ marks an object whose rep. pointer changes in that Union.)

        o1  o2  o3  o4  …  on   total
U1      ✓   ✓                   2
U2      ✓   ✓                   2
U3      ✓   ✓   ✓               3
⋮
Uj      ✓   ✓   ✓   ✓   ✓      O(n)
Total                           O(?)
29
Proof of Theorem 21.1 (Continued)
Note: n = no. of M-S’s = no. of objects.
After object x’s rep. ptr. has been changed once, its set has ≥ 2 members.
…………………………………………… twice ………… ≥ 4 members.
…………………………………………… three times …… ≥ 8 members.
…………………………………………… log k times …… ≥ k members.
k ≤ n, so x’s rep. pointer can change at most log n times.
O(n log n) for n objects.
O(m + n log n) total.
30
Data structure for disjoint sets
• Discuss the data structures to maintain a
collection of pair-wise disjoint dynamic sets.
– Operations: make-set, union, find-set
– Implementations: linked-lists, rooted trees
– Applications: many
31
Disjoint-Set Forests
M-S, F-S: Easy.
Union: As follows…

Union
Will speed up sequences of Union, M-S, and F-S operations by
means of two heuristics.
[Figure: Union(x, y) makes the root x point to the root y; a sample tree over the set {a, b, c, d}, whose root is the representative.]
32
33
Heuristics to improve the running
time of the rooted tree representation
• Weighted-union heuristic
– union by size: make the root of the smaller tree
point to the root of the larger, arbitrarily
breaking a tie
– union by rank: make the root of the shallower
tree point to the root of the other, arbitrarily
breaking a tie
34
Heuristics to improve the running
time of the rooted tree representation
• Find-path compaction
– path compression: make every encountered node point to the root node
– path splitting: make every encountered node (except the last and next to last) point to its grandparent
– path halving: make every other encountered node (except the last and next to last) point to its grandparent
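The compaction variants can be sketched over a parent-pointer dict (hypothetical names; path compression is shown recursively, path halving iteratively):

```python
# Sketch of two find-path compaction heuristics over parent[x] pointers.

def find_compress(parent, x):
    """Path compression: after the recursion unwinds, every node on the
    find path points directly at the root."""
    if parent[x] != x:
        parent[x] = find_compress(parent, parent[x])
    return parent[x]

def find_halving(parent, x):
    """Path halving: every other node on the find path is pointed at its
    grandparent, so the path roughly halves in one pass."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # point x at its grandparent
        x = parent[x]                   # ...and hop to it
    return x
```

Halving does the same asymptotic job as compression but in a single loop with no second pass or recursion.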
35
Two heuristics used in the textbook
1) Union by Rank
• Store the rank of the tree in the rep.
• Rank is an upper bound on height.
• Make the root with smaller rank point to the root with larger rank.
2) Path Compression
• During Find-Set, “flatten” the tree.
[Figure: F-S(a) on a chain a → b → c → d makes a, b, and c all point directly to the root d.]
36
Operations
Make-Set(x)
  p[x] := x;
  rank[x] := 0

Union(x, y)
  Link(Find-Set(x), Find-Set(y))

Link(x, y)
  if rank[x] > rank[y] then
    p[y] := x
  else
    p[x] := y;
    if rank[x] = rank[y] then
      rank[y] := rank[y] + 1

Find-Set(x)
  if x ≠ p[x] then
    p[x] := Find-Set(p[x])
  return p[x]
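The pseudocode above translates almost line for line into a runnable sketch (the DisjointSet class name and dict-based fields are illustrative choices, not from the text):

```python
# Union by rank + path compression, following the slide's pseudocode.

class DisjointSet:
    def __init__(self):
        self.p = {}       # p[x]: parent pointer
        self.rank = {}    # rank[x]: upper bound on the height of x

    def make_set(self, x):
        self.p[x] = x
        self.rank[x] = 0

    def find_set(self, x):
        if x != self.p[x]:
            self.p[x] = self.find_set(self.p[x])   # path compression
        return self.p[x]

    def link(self, x, y):
        if self.rank[x] > self.rank[y]:
            self.p[y] = x                          # smaller rank under larger
        else:
            self.p[x] = y
            if self.rank[x] == self.rank[y]:
                self.rank[y] += 1                  # tie: y's rank grows

    def union(self, x, y):
        self.link(self.find_set(x), self.find_set(y))
```

Note that ranks only grow on ties, which is what keeps every rank at most log n (Corollary 22.5 below).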
37
Find-Set
[Figure: a chain a → b → c. F-S(a) recurses:
  p[a] := F-S(b)
    p[b] := F-S(c)  { returns c }
each call returns c, so a and b now point directly to c.]
38
Example
MS(a) ; MS(b) ; ... ; MS(i) ; MS(j)
[Figure: ten singleton trees a/0, b/0, …, j/0 (node/rank; arrows are parent pointers).]
U(a,b) ; U(c,d) ; U(e,f) ; U(g,h) ; U(i,j)
[Figure: five two-node trees — b/1 with child a/0, d/1 with child c/0, f/1 with child e/0, h/1 with child g/0, j/1 with child i/0.]
39
Example (Continued)
U(a,d)
[Figure: rank[b] = rank[d] = 1, so the tree rooted at b/1 (child a/0) is linked under d, giving d/2 with children b/1 and c/0; trees f/1, h/1, j/1 are unchanged.]
40
Example (Continued)
U(f,h)
[Figure: rank[f] = rank[h] = 1, so the tree rooted at f/1 (child e/0) is linked under h, giving h/2 with children f/1 and g/0; trees d/2 and j/1 are unchanged.]
41
Example (Continued)
U(d,h)
[Figure: rank[d] = rank[h] = 2, so d is linked under h and h's rank increases: h/3 with children d/2, f/1, and g/0; tree j/1 is unchanged.]
42
Example (Continued)
U(e,j)
[Figure: F-S(e) follows e → f → h and path compression makes e point directly to h; then Link(h, j) with rank[h] = 3 > rank[j] = 1 puts j under h. Result: h/3 with children d/2, f/1, g/0, e/0, and j/1.]
43
Example (Continued)
FS(i)
[Figure: F-S(i) follows i → j → h, and path compression (PC) makes i point directly to h. The nodes on the find path are annotated with charges for the upcoming O(m log* n) analysis — BC (block charge) or PC (path charge) — with ranks grouped into Block 0, Block 0, Block 1, Block 2.]
44
Example (Continued)
FS(a)
[Figure: F-S(a) follows a → b → d → h; path compression makes a and b point directly to h (d already does).]
45
Time Complexity
• We will cover the complexity analysis
found in CLR rather than CLRS.
– Note: This was Chapter 22 in CLR, which is
why the remaining lemmas etc. are
numbered the way they are.
– CLRS uses the potential method (pp. 509–517,
2nd edition; pp. 573–581, 3rd edition);
CLR uses the aggregate method.
46
Potential function
47
Potential function
A₄(1) ≫ 10⁸⁰
48
Time Complexity
• Tight upper bound on time complexity:
O(m α(n)).
– α(n) is almost a constant.
• A slightly easier bound of O(m log*n)
is established in CLR.
49
Bound we will establish
• We establish O(m log*n) as an upper bound.
• log* n = min{i ≥ 0 : log⁽ⁱ⁾ n ≤ 1}.
– In particular, log* of a tower of k 2's is k:
  log* 2^2^…^2 (k twos) = k.
– And hence: log* 2^65536 = 5.
– Thus, log* n ≤ 5 for all practical purposes.
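The definition of log* n can be checked directly (a small sketch; log_star is a hypothetical name, and math.log2 stands in for the slide's log):

```python
# log* n: the number of times log2 must be iterated until the value
# drops to <= 1, per the definition on the slide.

import math

def log_star(n):
    i = 0
    while n > 1:
        n = math.log2(n)   # apply log once more
        i += 1
    return i
```

For instance, 65536 → 16 → 4 → 2 → 1 takes four iterations, matching a tower of four 2's.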
50
Properties of Ranks
Lemma 22.2:
(i) ∀x: rank[x] ≤ rank[p[x]].
(ii) ∀x with x ≠ p[x]: rank[x] < rank[p[x]].
(iii) rank[x] is initially 0.
(iv) rank[x] does not decrease.
(v) Once x ≠ p[x] holds, rank[x] does not change.
(vi) rank[p[x]] is a monotonically increasing function of time.
Proof:
By induction on number of operations (see example Slides 42 - 48).
51
Lemma 22.3
Lemma 22.3: For all tree roots x, size(x) ≥ 2^rank[x],
where size(x) = no. of nodes in the tree rooted at x.
Proof:
Induction on the number of Link operations.
Basis:
Before the first Link, all ranks are 0 and each tree contains one node.
Inductive Step:
Consider Link(x,y).
Assume the lemma holds before this operation.
We show it holds after.
2 cases.
52
Case 1: rank[x] ≠ rank[y]. Assume rank[x] < rank[y].
[Figure: Link(x,y) makes x a child of y.]
Note: rank'(x) = rank(x)
      rank'(y) = rank(y)
      size'(y) = size(x) + size(y)
               ≥ 2^rank(x) + 2^rank(y)
               ≥ 2^rank(y)
               = 2^rank'(y)
No ranks or sizes change for any nodes other than y.
53
Case 2: rank[x] = rank[y]
[Figure: Link(x,y) makes x a child of y and increments rank[y].]
Note: rank'(x) = rank(x)
      rank'(y) = rank(y) + 1
      size'(y) = size(x) + size(y)
               ≥ 2^rank(x) + 2^rank(y)
               = 2^(rank(y) + 1)
               = 2^rank'(y)
54
Lemma 22.4
Lemma 22.4: For any integer r ≥ 0, there are at most n/2^r nodes of
rank r.
Corollary 22.5: Every node has rank at most log n.
55
Proving the Time Bound
Lemma 22.6: Suppose we convert a sequence S of m MS, U, and
FS operations into a sequence S' of m' MS, Link, and FS operations
by turning each Union into two FS operations followed by a
Link. Then, if sequence S' runs in O(m' log*n) time, sequence S
runs in O(m log*n) time.
Only have to consider MS, Link, FS operations.
56
Theorem 22.7
Theorem 22.7: A sequence of m MS, L, and FS operations, n of which are MS
operations, can be performed in worst-case time O(m log*n).
Proof: Use blackboard
57
Remarks
58
Remarks (Cont.)
59
Outline
• Review
• Data structures for disjoint sets
• Elementary graph algorithms
60
Outline of the course (1/1)
• Introduction (1-4)
• Data structures (10-14)
• Dynamic programming (15)
• Greedy methods (16)
• Amortized analysis (17)
• Advanced data structures (6, 19-21)
• Graph algorithms (22-25)
• NP-completeness (34-35)
• Other topics (5, 31)
61
Graph algorithms
• Topics:
– Elementary graph algorithms
– Minimum spanning trees
– Shortest paths
• Reading:
– Chapters 22, 23, 24, 25
62
Graphs
• A graph G = (V, E) consists of a set V of vertices (nodes) and a set E of directed or undirected edges.
– For analysis, we use V for |V| and E for |E|.
• Any binary relation is a graph.
– Network of roads and cities, circuit representation, etc.
63
Directed graphs
• A directed graph (or digraph) G is a pair (V, E), where V is a finite set and E is a binary relation on V. – The set V is called the vertex set of G, and its elements
are called vertices (singular: vertex).
– The set E is called the edge set of G, and its elements are called edges.
• Vertices are represented by circles in the figure, and edges are represented by arrows. Note that self-loops--edges from a vertex to itself--are possible.
64
Undirected graphs
• In an undirected graph G = (V, E), the edge set E consists of unordered pairs of vertices, rather than ordered pairs. That is, an edge is a set {u, v}, where u, v ∈ V and u ≠ v.
• By convention, we use the notation (u, v) for an edge, rather than the set notation {u,v}, and (u,v) and (v, u) are considered to be the same edge.
• In an undirected graph, self-loops are forbidden, and so every edge consists of exactly two distinct vertices.
65
Figure B.2 Directed and undirected graphs.
(a) A directed graph G = (V, E), where V = {1,2,3,4,5,6} and E =
{(1,2), (2,2), (2,4), (2,5), (4,1), (4,5), (5,4), (6,3)}. The edge (2,2)
is a self-loop.
(b) An undirected graph G = (V,E), where V = {1,2,3,4,5,6} and
E = {(1,2), (1,5), (2,5), (3,6)}. The vertex 4 is isolated.
(c) The subgraph of the graph in part (a) induced by the vertex set
{1,2,3,6}.
66
Definitions
• Many definitions for directed and undirected
graphs are the same, although certain terms have
slightly different meanings in the two contexts.
• If (u, v) is an edge in a directed graph G = (V, E),
we say that (u, v) is incident from or leaves vertex
u and is incident to or enters vertex v.
• If (u, v) is an edge in an undirected graph G = (V,
E), we say that (u, v) is incident on vertices u and
v.
67
Examples
• The edges leaving vertex 2 in Figure B.2(a) are (2, 2), (2, 4), and (2, 5). The edges entering vertex 2 are (1, 2) and (2, 2).
• In Figure B.2(b), the edges incident on vertex 2 are (1, 2) and (2, 5).
68
Definitions (Cont.)
• If (u, v) is an edge in a graph G = (V, E), we
say that vertex v is adjacent to vertex u.
– When the graph is undirected, the adjacency
relation is symmetric.
– When the graph is directed, the adjacency
relation is not necessarily symmetric.
69
Example (Cont.)
• In parts (a) and (b) of Figure B.2, vertex 2 is adjacent to vertex 1, since the edge (1, 2) belongs to both graphs. Vertex 1 is not adjacent to vertex 2 in Figure B.2(a), since the edge (2, 1) does not belong to the graph.
70
Definitions (Cont.)
• The degree of a vertex in an undirected graph is
the number of edges incident on it.
• In a directed graph,
– the out-degree of a vertex is the number of edges
leaving it, and
– the in-degree of a vertex is the number of edges
entering it.
• The degree of a vertex in a directed graph is its in-
degree plus its out-degree.
71
Example (Cont.)
• Vertex 2 in Figure B.2(a) has in-degree 2, out-
degree 3, and degree 5.
72
Definitions (Cont.)
• A path of length k from a vertex u to a vertex u' in a graph G = (V, E) is a sequence of vertices ⟨v0, v1, …, vk⟩ such that u = v0, u' = vk, and (vi−1, vi) ∈ E for i = 1, 2, …, k. The length of the path is the number of edges in the path. The path contains the vertices v0, v1, …, vk and the edges (v0, v1), (v1, v2), …, (vk−1, vk).
• If there is a path p from u to u', we say that u' is reachable from u via p, which we sometimes write as u ⇝ u' if G is directed. A path is simple if all vertices in the path are distinct.
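Reachability as defined above can be tested with a standard depth-first search sketch (hypothetical names; adj maps each vertex to its out-neighbors, anticipating the representation slides at the end of the lecture):

```python
# u' is reachable from u iff a depth-first search from u visits u'.

def reachable(adj, u, target):
    seen, stack = {u}, [u]
    while stack:
        v = stack.pop()
        if v == target:
            return True
        for w in adj.get(v, []):   # explore each out-neighbor once
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return False
```

Using the edge set of Figure B.2(a), vertex 5 is reachable from vertex 1, while vertex 6 is not reachable from vertex 3 — matching the strongly-connected-components example later in the lecture.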
73
Example (Cont.)
• In Figure B.2(a), the path 1, 2, 5, 4 is a simple
path of length 3. The path 2, 5, 4, 5 is not simple.
74
Definitions (Cont.)
• A subpath of path p = v0, v1, …, vk is a
contiguous subsequence of its vertices.
– That is, for any 0 ≤ i ≤ j ≤ k, the subsequence of
vertices vi, vi+1, …, vj is a subpath of p.
75
Definitions (Cont.)
• In a directed graph, a path v0, v1, …, vk forms a cycle if v0 = vk and the path contains at least one edge.
– The cycle is simple if, in addition, v1, v2, …, vk are distinct.
– A self-loop is a cycle of length 1.
– Two paths v0, v1, v2, …, vk−1, v0 and v'0, v'1, v'2, …, v'k−1, v'0 form the same cycle if there exists an integer j such that v'i = v((i+j) mod k) for i = 0, 1, …, k − 1.
• A directed graph with no self-loops is simple. In an undirected graph, a path v0, v1, …, vk forms a cycle if v0 = vk and v1, v2, …, vk are distinct.
• A graph with no cycles is acyclic.
76
Example (Cont.)
• In Figure B.2(a), the path 1, 2, 4, 1 forms the same cycle
as the paths 2, 4, 1, 2 and 4, 1, 2, 4 . This cycle is simple,
but the cycle 1, 2, 4, 5, 4, 1 is not. The cycle 2, 2 formed
by the edge (2, 2) is a self-loop.
• In Figure B.2(b), the path 1, 2, 5, 1 is a cycle.
77
Definitions (Cont.)
• An undirected graph is connected if every pair of vertices is connected by a path.
– The connected components of a graph are the equivalence classes of vertices under the "is reachable from" relation.
– An undirected graph is connected if it has exactly one connected component, that is, if every vertex is reachable from every other vertex.
78
Example (Cont.)
• The graph in Figure B.2(b) has three connected
components: {1, 2, 5}, {3, 6}, and {4}. Every
vertex in {1,2,5} is reachable from every other
vertex in {1, 2, 5}.
79
Definitions (Cont.)
• A directed graph is strongly connected if
every two vertices are reachable from each
other.
– The strongly connected components of a graph
are the equivalence classes of vertices under
the "are mutually reachable" relation.
– A directed graph is strongly connected if it has
only one strongly connected component.
80
Example (Cont.)
• The graph in Figure B.2(a) has three strongly connected
components: {1, 2, 4, 5}, {3}, and {6}. All pairs of vertices
in {1, 2, 4, 5} are mutually reachable. The vertices {3, 6} do
not form a strongly connected component, since vertex 6
cannot be reached from vertex 3.
81
Definitions (Cont.)
• Two graphs G = (V, E) and G' = (V', E') are
isomorphic if there exists a bijection f : V → V'
such that (u, v) ∈ E if and only if (f(u), f(v)) ∈ E'.
– In other words, we can relabel the vertices of G
to be vertices of G', maintaining the
corresponding edges in G and G'.
82
Example (Cont.)
• Figure B.3(a) shows a pair of isomorphic graphs G and G' with respective vertex sets V = {1, 2, 3, 4, 5, 6} and V' = {u, v, w, x, y, z}. The mapping from V to V' given by f(1) = u, f(2) = v, f(3) = w, f(4) = x, f(5) = y, f(6) = z is the required bijective function.
• The graphs in Figure B.3(b) are not isomorphic. Although both graphs have 5 vertices and 7 edges, the top graph has a vertex of degree 4 and the bottom graph does not.
83
Definitions (Cont.)
• We say that a graph G' = (V', E') is a
subgraph of G = (V, E) if V' ⊆ V and E' ⊆ E.
Given a set V' ⊆ V, the subgraph of G
induced by V' is the graph G' = (V', E'),
where E' = {(u, v) ∈ E : u, v ∈ V'}.
84
Example (Cont.)
• The subgraph induced by the vertex set {1, 2,
3, 6} in Figure B.2(a) appears in Figure B.2
(c) and has the edge set {(1, 2), (2, 2), (6, 3)}.
85
Definitions (Cont.)
• Given an undirected graph G = (V, E), the directed version of G is the directed graph G' = (V, E'), where (u, v) ∈ E' if and only if (u, v) ∈ E.
– That is, each undirected edge (u, v) in G is replaced in the directed version by the two directed edges (u, v) and (v, u).
• Given a directed graph G = (V, E), the undirected version of G is the undirected graph G' = (V, E'), where (u, v) ∈ E' if and only if u ≠ v and (u, v) ∈ E.
– That is, the undirected version contains the edges of G "with their directions removed" and with self-loops eliminated.
• In a directed graph G = (V, E), a neighbor of a vertex u is any vertex that is adjacent to u in the undirected version of G. That is, v is a neighbor of u if either (u, v) ∈ E or (v, u) ∈ E. In an undirected graph, u and v are neighbors if they are adjacent.
86
Definitions (Cont.)
• Several kinds of graphs are given special names.
– A complete graph is an undirected graph in which every pair of vertices is adjacent.
– A bipartite graph is an undirected graph G = (V, E) in which V can be partitioned into two sets V1 and V2 such that (u, v) ∈ E implies either u ∈ V1 and v ∈ V2 or u ∈ V2 and v ∈ V1. That is, all edges go between the two sets V1 and V2.
– An acyclic, undirected graph is a forest, and a connected, acyclic, undirected graph is a (free) tree. We often take the first letters of "directed acyclic graph" and call such a graph a dag.
87
Definitions (Cont.)
• The contraction of an undirected graph G =
(V, E) by an edge e = (u, v) is a graph
G' = (V', E'), where V' = (V − {u, v}) ∪ {x} and x
is a new vertex. The set of edges E' is
formed from E by deleting the edge (u, v)
and, for each vertex w incident to u or v,
deleting whichever of (u, w) and (v, w) is in E
and adding the new edge (x, w).
88
Representations of Graphs: Adjacency List
• Adjacency list: An array Adj of |V| lists, one for each vertex in V. For each u ∈ V, Adj[u] contains pointers to all the vertices adjacent to u.
• Advantage: O(V+E) storage, good for sparse graphs.
• Drawback: Need to traverse a list to find an edge.
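Building Adj as a dict of lists is a direct sketch (hypothetical helper; the directed flag handles the undirected convention of recording each edge at both endpoints):

```python
# Adjacency-list representation: O(V + E) storage.

def adjacency_list(vertices, edges, directed=True):
    adj = {v: [] for v in vertices}
    for u, v in edges:
        adj[u].append(v)
        if not directed:
            adj[v].append(u)   # undirected: the edge appears in both lists
    return adj
```

With the directed graph of Figure B.2(a), Adj[2] holds 2, 4, and 5, while Adj[3] is empty.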
89
Representations of Graphs: Adjacency Matrix
• Adjacency matrix: A |V| × |V| matrix A = (aij) such that aij = 1 if (i, j) ∈ E, and aij = 0 otherwise.
• Advantage: O(1) time to find an edge.
• Drawback: O(V²) storage, more suitable for dense graphs.
• Q: How to save space if the graph is undirected?
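A sketch of the matrix representation, plus one answer to the question above: for an undirected graph A is symmetric (aij = aji), so storing only the entries with i ≤ j roughly halves the space (hypothetical helper names; vertices are numbered 0..n−1):

```python
# Adjacency-matrix representation: O(1) edge queries, O(V^2) storage.

def adjacency_matrix(n, edges):
    a = [[0] * n for _ in range(n)]
    for u, v in edges:
        a[u][v] = 1            # a_ij = 1 iff (i, j) is an edge
    return a

def upper_triangle(a):
    """Pack a symmetric n x n matrix into its upper triangle
    (including the diagonal): n(n+1)/2 entries instead of n^2."""
    n = len(a)
    return [a[i][j] for i in range(n) for j in range(i, n)]
```

For n = 3 the packed form keeps 6 entries rather than 9, and the saving approaches a factor of 2 as n grows.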
90
Tradeoffs between Adjacency List
and Matrix
91
Questions?