[D2-31] Monotonicity over OR combinations of features and safe pruning

[D2-31] Learning sparse models via monotonicity over OR combinations of features and safe pruning (eligible for the Student Best Presentation Award). We consider a sparse model whose terms are all OR combinations of binary features. 中川和也 (Nagoya Inst. of Tech.), 鈴村真矢 (Nagoya Inst. of Tech.), 烏山昌幸 (Nagoya Inst. of Tech.), 津田宏治 (Univ. of Tokyo), 竹内一郎 (Nagoya Inst. of Tech.)


TRANSCRIPT

  • [D2-31] Monotonicity over OR combinations of features and safe pruning for learning sparse models. 中川和也, 鈴村真矢, 烏山昌幸 (Nagoya Inst. of Tech.), 津田宏治 (Univ. of Tokyo), 竹内一郎 (Nagoya Inst. of Tech.)

    We consider a sparse model whose terms are all OR combinations of binary features; for three features, with labels y ∈ {+1, -1},

    f(x) = β_1 x_1 + β_2 x_2 + β_3 x_3 + β_{1,2} (x_1 ∨ x_2) + β_{1,3} (x_1 ∨ x_3) + β_{2,3} (x_2 ∨ x_3) + β_{1,2,3} (x_1 ∨ x_2 ∨ x_3)

    Safe pruning: when the screening criterion for a feature subset (e.g. {1}) falls below its threshold, its coefficient is fixed to zero, and by monotonicity the same holds for every superset, so β_1 = 0, β_{1,2} = 0, and β_{1,2,3} = 0 are pruned at once (a rough sketch follows below).
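
    The pruning idea can be illustrated with a rough sketch (not the authors' algorithm; the function and bound below are my own simplification). It enumerates OR-combination features depth-first and, using only the elementwise monotonicity z_S ≤ z_{S'} ≤ 1 for S ⊆ S', derives an interval bound on the correlation of any superset's feature with a residual vector r; if the whole interval lies inside (-λ, λ), the subset and all of its supersets are pruned in one step.

    import numpy as np

    def screen_or_features(X, r, lam):
        """Depth-first enumeration of OR-combination features of binary data X.

        Returns the index subsets S whose OR feature z_S may have |<r, z_S>| >= lam.
        Pruning: for any superset S' of S we have z_S <= z_{S'} <= 1 elementwise,
        which gives an interval bound on <r, z_{S'}>; if the interval lies inside
        (-lam, lam), S and every superset are skipped at once.
        """
        n, d = X.shape
        survivors = []

        def visit(S, z):
            pos, neg = r > 0, r < 0
            upper = r[pos].sum() + r[neg] @ z[neg]
            lower = r[pos] @ z[pos] + r[neg].sum()
            if max(upper, -lower) < lam:      # safe: prune S and all of its supersets
                return
            if S and abs(r @ z) >= lam:
                survivors.append(S)
            for j in range((S[-1] + 1) if S else 0, d):
                visit(S + (j,), np.maximum(z, X[:, j]))   # OR in one more feature

        visit((), np.zeros(n))
        return survivors

    # toy usage with three binary features, as in the model above
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(20, 3)).astype(float)
    r = rng.normal(size=20)      # e.g. a residual vector from the current model
    print(screen_or_features(X, r, lam=2.0))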

  • D2-32: Materials informatics, Pr in ZnO (Introduction / Methodology). Training data pair binary descriptors with energies, e.g. { E1, x1 = (0,0,1,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,1,0) } and { E2, x2 = (0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0) }; descriptor encoding 0: Cu, 1: Ag. Reference: J. Roh et al., J. Am. Ceram. Soc. (2015). http://www.dokindokin.com/

  • D2-33: Discrete-time dynamical-system model x_{k+1} = f(x_k, u_k), with state x_k and input u_k (a minimal rollout sketch follows below).
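
    As a minimal illustration of the recursion (the poster's actual f and estimation method are not shown in the transcript; the dynamics below are hypothetical), rolling the model forward looks like this:

    import numpy as np

    def f(x, u):
        """Hypothetical nonlinear dynamics standing in for the unknown f."""
        A = np.array([[0.9, 0.1], [0.0, 0.8]])
        B = np.array([[0.0], [1.0]])
        return A @ x + B @ u + 0.05 * np.tanh(x)

    def rollout(x0, inputs):
        """Iterate x_{k+1} = f(x_k, u_k) over a given input sequence."""
        xs = [x0]
        for u_k in inputs:
            xs.append(f(xs[-1], u_k))
        return np.stack(xs)

    traj = rollout(np.zeros(2), [np.array([np.sin(0.3 * k)]) for k in range(50)])
    print(traj.shape)   # (51, 2): initial state plus 50 steps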

  • D2-34

  • D2-35: Adversarial examples for CNNs (affiliations include JST CREST); comparison of clean vs. adversarial inputs.

  • D2-36: Distributions over a graph-structured label set Y with G = (Y, E); normalized model q̄(x) = q(x)/Z (Z: normalizing constant) and Bregman divergences between p(y|·) and q(y|·).

  • D2-37: Learning from imbalanced data (majority class 95%, minority class 5%); comparison of undersampling and oversampling; example label sequence 0000111000000; 30 min. A resampling sketch follows below.
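
    For the 95% / 5% imbalance above, a minimal sketch of random undersampling and oversampling at the index level (the poster's actual resampling scheme may differ):

    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.choice([0, 1], size=1000, p=[0.95, 0.05])   # 0: majority (95%), 1: minority (5%)
    maj, mino = np.flatnonzero(y == 0), np.flatnonzero(y == 1)

    # Undersampling: keep every minority sample, subsample the majority to match.
    under_idx = np.concatenate([rng.choice(maj, size=len(mino), replace=False), mino])

    # Oversampling: keep every majority sample, resample the minority with replacement.
    over_idx = np.concatenate([maj, rng.choice(mino, size=len(maj), replace=True)])

    print(len(under_idx), len(over_idx))   # two balanced index sets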

  • D2-38: Accelerated proximal gradient methods (joint work with TOH Kim-Chuan, NUS). Convergence rates compared on the slide:
    Backtracking (Beck & Teboulle, 09): O(1/k^2)
    Modified Backtracking (Scheinberg et al., 14): O(1/k^2)
    Restarting (O'Donoghue & Candes, 13): unknown
    Maintaining top-speed (Monteiro et al., 15): unknown
    An additional O((log k / k)^2) term also appears on the slide. Application to an SVM formulation (Takeda, Mitsugi, & Kanamori, 13). A FISTA-with-restart sketch follows below.
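
    As background for the restarting variant cited above (O'Donoghue & Candes, 13), a generic FISTA-with-restart sketch for a composite problem; the quadratic-plus-l1 objective here is only a stand-in for the objective on the poster:

    import numpy as np

    def soft_threshold(v, tau):
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def fista_restart(A, b, lam, n_iter=500):
        """Accelerated proximal gradient (FISTA) for 0.5*||Ax - b||^2 + lam*||x||_1
        with a function-value restart: whenever the objective increases, the
        momentum is dropped (t reset to 1), avoiding the oscillations of plain FISTA."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
        x = z = np.zeros(A.shape[1])
        t, prev_obj = 1.0, np.inf
        for _ in range(n_iter):
            grad = A.T @ (A @ z - b)
            x_new = soft_threshold(z - grad / L, lam / L)
            obj = 0.5 * np.sum((A @ x_new - b) ** 2) + lam * np.sum(np.abs(x_new))
            if obj > prev_obj:                 # restart: discard momentum, retry from x
                t, z = 1.0, x
                continue
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            z = x_new + ((t - 1.0) / t_new) * (x_new - x)
            x, t, prev_obj = x_new, t_new, obj
        return x

    rng = np.random.default_rng(0)
    A = rng.normal(size=(100, 50))
    x_true = rng.normal(size=50) * (rng.random(50) < 0.2)
    b = A @ x_true + 0.01 * rng.normal(size=100)
    print(np.count_nonzero(np.abs(fista_restart(A, b, lam=1.0)) > 1e-8))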

  • D2-39: AUC-based evaluation (NEC); comparison between groups A and B.

  • D2-40 (NTT): time points t1, t2, t3.

  • D2-41: MDL-based method combined with EM (MDL + EM). 2016/11/17 JST CREST Poster Preview.

  • D2-42: time points t1, t2, t3, t4, t5 (figure with panels numbered 1-4).

  • D2-43: Training linear SVMs at scale when the data do not fit in memory, building on selective block minimization [1] (10%).
    [1] K. W. Chang and D. Roth. Selective block minimization for faster convergence of limited memory large-scale linear models. In KDD, pages 699-707, 2011.

  • D2-44: Theoretical Analysis for Parameter Transfer Learning.

  • D2-45 (NEC): LPV (linear parameter-varying) modeling; state-space models for the "with luggage" and "without luggage" cases.

  • D2-46

  • D2-47: Recurrent infomax for input-driven recurrent neural networks; the mutual information I(x(t-1); x(t)) between successive states is maximized, with inputs u(t-1), u(t) and outputs y(t-1), y(t).

  • D2-48: Recurrent Neural Networks (NTT). RNN training by back-propagation over layers l and time steps t; architectures with additional recurrent delays (LSTM, Memory NN) and delay configurations (1+10, 1+5+10, 10). Forward propagation with delay set P:

    x^{l,t}_j = f( Σ_i w^l_{ji} x^{l-1,t}_i + Σ_{p∈P} Σ_k (w_p)^l_{jk} x^{l,t-p}_k + b^l_j )

    Forward prop with P = {1, 2} (a NumPy sketch follows below).
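
    A NumPy sketch of this forward propagation for a single layer with delay set P = {1, 2} (shapes and the tanh nonlinearity are illustrative assumptions; this reproduces only the recurrence above, not the poster's method itself):

    import numpy as np

    def forward(X_in, W_in, W_delay, b, P=(1, 2), f=np.tanh):
        """One recurrent layer with delayed recurrent connections.

        X_in:    (T, d_in)  activations x^{l-1,t} of the layer below
        W_in:    (d, d_in)  weights w^l_{ji} on the layer below
        W_delay: dict p -> (d, d) weights (w_p)^l_{jk} on this layer at time t-p
        b:       (d,)       bias b^l_j
        Returns  (T, d)     activations x^{l,t}
        """
        T, d = X_in.shape[0], b.shape[0]
        X = np.zeros((T, d))
        for t in range(T):
            pre = W_in @ X_in[t] + b
            for p in P:
                if t - p >= 0:                 # add (w_p)^l x^{l,t-p} for each delay p
                    pre += W_delay[p] @ X[t - p]
            X[t] = f(pre)
        return X

    rng = np.random.default_rng(0)
    T, d_in, d = 20, 3, 5
    X_in = rng.normal(size=(T, d_in))
    W_in = 0.3 * rng.normal(size=(d, d_in))
    W_delay = {1: 0.3 * rng.normal(size=(d, d)), 2: 0.3 * rng.normal(size=(d, d))}
    print(forward(X_in, W_in, W_delay, 0.1 * rng.normal(size=d)).shape)   # (20, 5)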

  • D2-49: gamreg, an R package for robust regression. Follow-up to IBIS2015; at IBIS2016 the tuning parameter is selected by cross-validation (n = 59, p = 22,283). R package: gamreg.

  • D2-50: Super-resolution of X-ray CT images with SRCNN. Original image vs. bicubic interpolation (27.36 dB) vs. SRCNN (29.55 dB; 500M iterations, about 2 days of training). 9-1-5 3-layer Super-Resolution Convolutional Neural Network (SRCNN). Computer simulation study: 8000 x 8000 pixel original image, 9000 projections, binning of 3 in the sinogram, FBP reconstruction, no noise added. cf. Dong et al., ECCV, 2014 (an architecture sketch follows below).
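
    A PyTorch sketch of the 9-1-5 three-layer SRCNN cited above (Dong et al., ECCV 2014); channel widths 64/32 follow that paper, and the CT-specific training setup is not reproduced:

    import torch
    import torch.nn as nn

    class SRCNN(nn.Module):
        """9-1-5 three-layer SRCNN: patch extraction, non-linear mapping, and
        reconstruction, applied to an image already upscaled by bicubic interpolation."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 64, kernel_size=9, padding=4),   # f1 = 9
                nn.ReLU(inplace=True),
                nn.Conv2d(64, 32, kernel_size=1),             # f2 = 1
                nn.ReLU(inplace=True),
                nn.Conv2d(32, 1, kernel_size=5, padding=2),   # f3 = 5
            )

        def forward(self, x):
            return self.net(x)

    model = SRCNN()
    patch = torch.randn(1, 1, 33, 33)       # one grayscale patch, bicubic-upscaled
    print(model(patch).shape)               # torch.Size([1, 1, 33, 33])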

  • D2-51: Corner Transfer Matrix Renormalization Group (CTMRG) and Loopy Belief Propagation; relation and comparison between CTMRG and loopy BP.

  • D2-52: Model selection for LDA via the MDL principle [Rissanen 1978] with the NML (normalized maximum likelihood) code length (JST-CREST). The NML code length is the maximized negative log-likelihood plus a parametric-complexity term:

    L_NML(x) = - log p(x | θ̂(x)) + log Σ_{x'} p(x' | θ̂(x'))

  • D2-53: Shapelet-based learning (shapelets: discriminative subsequences of time series).

  • D2-54: Normalized cut (Ncut) and spectral clustering (SC) in the large-sample limit. Background: Ncut (Shi and Malik, 2000, IEEE PAMI); consistency of spectral clustering (von Luxburg et al., 2008, AoS); equivalence of SC/Ncut with weighted kernel k-means in an RKHS (Dhillon et al., 2007, IEEE PAMI); graph Laplacian matrix. The slide contrasts the n → ∞ behavior of SC and Ncut (the limits need not coincide). A spectral-clustering sketch follows below.
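
    As background for the Ncut / SC / weighted-kernel-k-means connections cited above, a minimal normalized spectral clustering sketch (Gaussian affinity, symmetric normalized Laplacian, k-means on its bottom-k eigenvectors); this is the standard procedure, not the poster's asymptotic analysis:

    import numpy as np
    from sklearn.cluster import KMeans

    def spectral_clustering(X, k, sigma=1.0):
        """Normalized spectral clustering: Gaussian affinity W, L_sym = I - D^{-1/2} W D^{-1/2},
        then k-means on the row-normalized bottom-k eigenvectors of L_sym."""
        sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.exp(-sq / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        d = W.sum(axis=1)
        D_isqrt = np.diag(1.0 / np.sqrt(d))
        L_sym = np.eye(len(X)) - D_isqrt @ W @ D_isqrt
        _, vecs = np.linalg.eigh(L_sym)                   # eigenvalues in ascending order
        U = vecs[:, :k]
        U = U / np.linalg.norm(U, axis=1, keepdims=True)  # row normalization
        return KMeans(n_clusters=k, n_init=10).fit_predict(U)

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)), rng.normal([3, 3], 0.3, (50, 2))])
    print(np.bincount(spectral_clustering(X, k=2)))       # roughly 50 / 50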

  • D2-55: Fault detection and diagnosis. Figure panels: fault magnitude per operational unit variable (including CO2) over 0-20 min, and the corresponding VECRBC panels over steps 0-20.

  • D2-56: Kriging (Introduction / Results). Application to Al2O3 (10,000); axes dx, dy, dz.

  • D2-57 (authors include Varga Istvan): Yelp data.

  • D2-58 (JST / ThinkCyte): classification evaluated by AUC; MDS visualization; about 98% AUC reported; AUC comparison with a CNN.

  • D2-59 (Preferred Networks): method based on the reparameterization trick; scale factors ×1e-5 to ×1e-8 appear on the slide (a sketch of the trick follows below).
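
    Since the slide names the reparameterization trick, here is its standard form in PyTorch (z = mu + sigma * eps with eps ~ N(0, I), so gradients flow through mu and log_var; the surrounding model on the poster is not reproduced):

    import torch

    def reparameterized_sample(mu, log_var):
        """Reparameterization trick: draw eps ~ N(0, I) and return mu + sigma * eps,
        which is differentiable with respect to mu and log_var."""
        eps = torch.randn_like(mu)
        return mu + torch.exp(0.5 * log_var) * eps

    mu = torch.zeros(4, requires_grad=True)
    log_var = torch.zeros(4, requires_grad=True)
    z = reparameterized_sample(mu, log_var)
    z.sum().backward()
    print(mu.grad, log_var.grad)   # both gradients exist: sampling is now differentiable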

  • D2-60: Kullback-Leibler divergence based method; data x = (x1, x2) with labels y ∈ { }; time points t1, t2, t3.

  • D2-61: AdaGrad with lazy updates for regularized stochastic optimization

    min_{x ∈ R^d} { E_{(a,b)}[ ℓ(⟨a, x⟩, b) ] + R(x) }

    AdaGrad; Lazy Update (a sketch follows below).
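
    A sketch of composite AdaGrad with lazy updates for the objective above, taking R(x) = lam * ||x||_1, the logistic loss for ℓ, and sparse inputs a (a standard lazy-update device; the poster's exact scheme may differ). Coordinates whose feature is zero receive only a soft-threshold shrink per step, and that shrink is deferred and applied in closed form when the coordinate is next touched:

    import numpy as np

    def soft_threshold(v, tau):
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def adagrad_lazy_l1(data, d, lam=0.01, eta=0.1, eps=1e-8):
        """Composite AdaGrad for min_x E[log(1 + exp(-b <a, x>))] + lam ||x||_1,
        updating only the coordinates whose feature value is nonzero."""
        x, G = np.zeros(d), np.zeros(d)        # weights and accumulated squared gradients
        last = np.zeros(d, dtype=int)          # step at which each coordinate was last touched

        def catch_up(idx, t):
            # apply the deferred soft-threshold shrinks accumulated since last[idx]
            H = eps + np.sqrt(G[idx])
            x[idx] = soft_threshold(x[idx], (t - last[idx]) * eta * lam / H)
            last[idx] = t

        for t, (a_idx, a_val, b) in enumerate(data, start=1):
            catch_up(a_idx, t - 1)             # bring the touched coordinates up to date
            margin = b * np.dot(a_val, x[a_idx])
            g = -b / (1.0 + np.exp(margin)) * a_val    # gradient on the touched coordinates
            G[a_idx] += g ** 2
            H = eps + np.sqrt(G[a_idx])
            x[a_idx] = soft_threshold(x[a_idx] - eta * g / H, eta * lam / H)
            last[a_idx] = t
        catch_up(np.arange(d), len(data))      # final catch-up for every coordinate
        return x

    # toy usage: sparse examples given as (indices, values, label in {-1, +1})
    rng = np.random.default_rng(0)
    data = [(rng.choice(100, size=5, replace=False), rng.normal(size=5), rng.choice([-1, 1]))
            for _ in range(1000)]
    print(np.count_nonzero(adagrad_lazy_l1(data, d=100)))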

  • D2-62: Shun Watanabe (Tokyo Univ. of Agriculture and Technology), A Neyman-Pearson-like Test for Multi-terminal Hypothesis Testing. Two terminals observe X^n and Y^n with (X^n, Y^n) ~ PXY or QXY; encoders f1, f2 use zero-rate communication (e.g. send marginal types) to a decision function g. The proposed Neyman-Pearson-like test is compared with the Hoeffding-like test [Han-Kobayashi 89]; information-geometric figures depict the decision regions via the sets E(x y(P)), E(x y(Q)) and M(x(P), y(P)), M(x(Q), y(Q)) around PXY, P̃XY, and QXY.

    Findings: the Neyman-Pearson-like test has a better error trade-off than the Hoeffding-like test for finite block length; it has the same (optimal) large-deviation performance as the Hoeffding-like test; an information-geometric observation [Amari-Han 89] plays an important role. (A classical Neyman-Pearson test sketch follows below.)
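
    For reference, the classical single-terminal Neyman-Pearson likelihood-ratio test between two finite-alphabet distributions P and Q (the multi-terminal construction with zero-rate communication from the poster is not reproduced here):

    import numpy as np

    def neyman_pearson_test(samples, P, Q, threshold):
        """Accept hypothesis P iff the normalized log-likelihood ratio
        (1/n) * sum_i log(P[x_i] / Q[x_i]) is at least the threshold; sweeping the
        threshold traces out the optimal trade-off between the two error probabilities."""
        llr = np.mean(np.log(P[samples]) - np.log(Q[samples]))
        return llr >= threshold

    P = np.array([0.5, 0.3, 0.2])     # hypothesis P
    Q = np.array([0.2, 0.3, 0.5])     # hypothesis Q
    rng = np.random.default_rng(0)
    x_from_P = rng.choice(3, size=1000, p=P)
    x_from_Q = rng.choice(3, size=1000, p=Q)
    print(neyman_pearson_test(x_from_P, P, Q, 0.0),   # typically True  (accept P)
          neyman_pearson_test(x_from_Q, P, Q, 0.0))   # typically False (reject P)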

  • D2-63: Substructure patterns with wildcards for chemical activity prediction. Training compounds (fragments over atoms such as N, H, S, O) are labeled +1 or -1: Active (+1) / Inactive (-1); candidate patterns may contain wildcard positions.

  • D2-64

  • D2-65: Privacy-preserving data publishing: comparison of raw data and k-anonymized data (a k-anonymity check sketch follows below).
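
    As a minimal illustration of the raw vs. k-anonymized comparison, the check below counts how many records share each combination of quasi-identifiers; a table is k-anonymous when every such group has at least k records (the toy records and attribute names are made up, and the poster's anonymization algorithm is not shown):

    from collections import Counter

    def is_k_anonymous(records, quasi_identifiers, k):
        """A table is k-anonymous w.r.t. the quasi-identifiers if every combination
        of their values occurs in at least k records."""
        groups = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
        return min(groups.values()) >= k

    raw = [
        {"age": 34, "zip": "46400", "disease": "flu"},
        {"age": 35, "zip": "46401", "disease": "cold"},
        {"age": 51, "zip": "46500", "disease": "flu"},
        {"age": 58, "zip": "46502", "disease": "cold"},
    ]
    anonymized = [
        {"age": "30-39", "zip": "464**", "disease": "flu"},
        {"age": "30-39", "zip": "464**", "disease": "cold"},
        {"age": "50-59", "zip": "465**", "disease": "flu"},
        {"age": "50-59", "zip": "465**", "disease": "cold"},
    ]
    qi = ("age", "zip")
    print(is_k_anonymous(raw, qi, k=2), is_k_anonymous(anonymized, qi, k=2))   # False, True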

  • D2-66: Figure panels for the data set labeled [A2]: 2-D maps with a "Multiplicity" color scale (both axes roughly 0 to 3) and 3-D scatter plots.