Recurrent Neural Network (2nd)
ujava.org workshop
2016-10-03
www.idosi.com
CEO Shindong KANG
ujava.org
spaceapi.org
Recurrent Stream
Recurrent Iteration
Convergence & Attractor
Divergence
Cobweb
Cobweb Plot (Cobweb Diagram)
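The fixed-point iteration that a cobweb plot visualizes can be sketched with the logistic map; the map and parameter values here are illustrative, not taken from the slides.

```python
# Cobweb-style fixed-point iteration: x_{n+1} = f(x_n).
def iterate(f, x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(f(xs[-1]))
    return xs

logistic = lambda r: (lambda x: r * x * (1 - x))

# For r = 2.5 the orbit converges to the attractor x* = 1 - 1/r = 0.6
# (stable because |f'(x*)| = 0.5 < 1); larger r values diverge into chaos.
orbit = iterate(logistic(2.5), 0.2, 100)
print(round(orbit[-1], 4))  # 0.6
```

Plotting each step as a vertical move to the curve and a horizontal move to the diagonal y = x gives the cobweb diagram named on the slide.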
Elman Recurrent Neural Network
Jordan Recurrent Neural Network
Sequence data
RNN
RNN (hidden state, h)
RNN
RNN
Vanilla word
Vanilla
Common euphemism
"Vanilla" means regular, or without any fancy stuff
JFYI: fancy stuff
Vanilla RNN
One-hot encoding
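One-hot encoding maps each symbol to a vector that is all zeros except for a single 1 at that symbol's index. A minimal sketch, using a hypothetical character vocabulary:

```python
# Build a character vocabulary and one-hot encoder (example vocab from "hello").
vocab = sorted(set("hello"))                      # ['e', 'h', 'l', 'o']
char_to_ix = {ch: i for i, ch in enumerate(vocab)}

def one_hot(ch):
    v = [0] * len(vocab)
    v[char_to_ix[ch]] = 1                         # single 1 at the char's index
    return v

print(one_hot('h'))  # [0, 1, 0, 0]
```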
Softmax
Lighten Softmax
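Assuming "lighten" here refers to the standard numerical-stability trick, softmax is usually computed after subtracting the maximum logit, which leaves the result unchanged but avoids overflow:

```python
import math

def softmax(z):
    # Subtracting max(z) shifts every exponent to <= 0, so exp() cannot overflow;
    # the shift cancels in the ratio, leaving the probabilities identical.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

probs = softmax([1000.0, 1000.0])   # naive math.exp(1000) would overflow
print(probs)  # [0.5, 0.5]
```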
Character Level Language Model
Input Vector
CLLM Vanilla RNN
CLLM Vanilla RNN
CLLM Vanilla RNN
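One step of the vanilla RNN used in a character-level language model can be sketched as follows; the sizes and weight names (Wxh, Whh, Why) are illustrative, not from the slides.

```python
import numpy as np

np.random.seed(0)
H, V = 8, 4                          # hidden size, vocabulary size (illustrative)
Wxh = np.random.randn(H, V) * 0.01   # input-to-hidden
Whh = np.random.randn(H, H) * 0.01   # hidden-to-hidden (the recurrence)
Why = np.random.randn(V, H) * 0.01   # hidden-to-output
bh, by = np.zeros(H), np.zeros(V)

def rnn_step(x, h_prev):
    # h_t = tanh(Wxh x + Whh h_{t-1} + bh);  y_t = Why h_t + by
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)
    y = Why @ h + by                 # logits over the vocabulary (feed to softmax)
    return h, y

x = np.eye(V)[1]                     # one-hot input character
h, y = rnn_step(x, np.zeros(H))
print(y.shape)                       # (4,)
```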
RNN
One to one
One to many (image captioning)
Many to one (sentiment analysis)
Many to many
Many to many (video captioning)
Deep Learning for Music Composition
RNN handwriting recognition
Captioning
Multi-Layer RNN
BPTT (Back-Propagation Through Time)
Deep gradients: products of many multiplied gradients
Exploding Gradient & Vanishing Gradient
Exploding Gradients: Gradient Clipping
Vanishing Gradients: memory loss
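Gradient clipping, the standard remedy for exploding gradients, rescales a gradient whose norm exceeds a threshold; the threshold value below is a common default, not one from the slides.

```python
import numpy as np

def clip_gradient(grad, max_norm=5.0):
    # Rescale so the L2 norm never exceeds max_norm; direction is preserved.
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

g = np.array([30.0, 40.0])                 # norm 50, far above the threshold
print(np.linalg.norm(clip_gradient(g)))    # 5.0
```

Vanishing gradients have no such quick fix, which is what motivates the LSTM and GRU architectures later in the deck.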
RNN Weights
Error of BPTT
Cost function of BPTT
Outer Product of vector
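The outer product of two vectors produces a matrix with every pairwise product; in BPTT this is how a vector-valued error and a hidden state combine into a weight-matrix gradient. The vectors below are illustrative values:

```python
import numpy as np

# outer(a, b)[i, j] = a[i] * b[j]
a = np.array([1.0, 2.0])         # e.g. an output error vector
b = np.array([3.0, 4.0, 5.0])    # e.g. a hidden state vector
print(np.outer(a, b))
# [[ 3.  4.  5.]
#  [ 6.  8. 10.]]
```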
BPTT of V
BPTT of W
BPTT of W
BPTT sum
Bidirectional RNN
Deep Bidirectional RNN
Sigmoid function
Derivative of sigmoid function
Hyperbolic tangent function
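The two activation functions above are closely related: the sigmoid's derivative has the closed form σ'(x) = σ(x)(1 − σ(x)), and tanh is a rescaled sigmoid, tanh(x) = 2σ(2x) − 1. Both identities can be checked numerically:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1 - s)

x = 0.7
# tanh is a rescaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1
print(abs(math.tanh(x) - (2 * sigmoid(2 * x) - 1)) < 1e-12)  # True
print(sigmoid_prime(0.0))                                    # 0.25
```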
LSTM & GRU computing target
LSTM concept
LSTM (Long Short-Term Memory)
Sc = cell state, h = activation function, Yc = cell output
LSTM (Long Short-Term Memory)
LSTM (Long Short-Term Memory)
LSTM equations
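The standard LSTM equations can be sketched as one forward step; the stacked weight layout and the sizes below are illustrative choices, not taken from the slides.

```python
import numpy as np

def sigma(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, h_prev, c_prev, W, b):
    # All 4 gates from one affine map of [h_prev, x] (W stacks their weights):
    #   i = sigma(.)  input gate, f = sigma(.)  forget gate,
    #   o = sigma(.)  output gate, g = tanh(.)  candidate cell state
    z = W @ np.concatenate([h_prev, x]) + b
    zi, zf, zo, zg = np.split(z, 4)
    i, f, o, g = sigma(zi), sigma(zf), sigma(zo), np.tanh(zg)
    c = f * c_prev + i * g       # cell state: forget part of the old, add the new
    h = o * np.tanh(c)           # gated output / new hidden state
    return h, c

np.random.seed(1)
H, D = 3, 2                      # hidden size, input size (illustrative)
W = np.random.randn(4 * H, H + D) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step(np.ones(D), np.zeros(H), np.zeros(H), W, b)
print(h.shape, c.shape)          # (3,) (3,)
```

The additive update of c is what lets gradients flow across many steps without vanishing, which is the point of the "computing target" slide above.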
LSTM to simple RNN
LSTM with step delay
n-th layer, k-th cell, 1: step delay
Learns when the input gate and output gate are ON
GRU concept
GRU concept
r: reset gate, 1 = input level; z: update gate, 0 = fully new update
GRU (Gated Recurrent Unit, Dr. Cho)
GRU equations
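One GRU forward step can be sketched as below, following the gate convention on the slide above (z = 0 means a fully new update); the weight names and sizes are illustrative. Note that some references use the opposite convention for z.

```python
import numpy as np

def sigma(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev, Wz, Wr, Wh):
    hx = np.concatenate([h_prev, x])
    z = sigma(Wz @ hx)                                       # update gate
    r = sigma(Wr @ hx)                                       # reset gate
    h_tilde = np.tanh(Wh @ np.concatenate([r * h_prev, x]))  # candidate state
    return z * h_prev + (1 - z) * h_tilde                    # z = 0: fully new

np.random.seed(2)
H, D = 3, 2                          # hidden size, input size (illustrative)
Wz, Wr, Wh = (np.random.randn(H, H + D) * 0.1 for _ in range(3))
h = gru_step(np.ones(D), np.zeros(H), Wz, Wr, Wh)
print(h.shape)                       # (3,)
```

With only two gates and no separate cell state, the GRU is a lighter alternative to the LSTM while keeping the gated, additive-style update.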
Activation functions in RNN, LSTM, and GRU
No Good
Good
LSTM & GRU
LSTM & GRU multilayer
Thank you !
Intelligent City Ltd.
Shindong KANG