
An Effective & Interactive Approach to Particle Tracking for DNA Melting Curve Analysis

李穎忠
DEPARTMENT OF COMPUTER SCIENCE & INFORMATION ENGINEERING, NATIONAL TAIWAN UNIVERSITY

DNA Melting Curve Analysis
Used for the detection of DNA sequence variants

DNA Melting Analysis in Temperature-Gradient Micro-channel

[Figure: temperature-gradient micro-channel with heater, carrier (bead/droplet), thermometer, and substrate]

2/54

DNA Melting Curve Analysis

[Plot: fluorescent intensity versus temperature, with the melting temperature marked]

3/54

DNA Melting Curve Analysis

4/54

Motivation
People label each particle (carrier) frame by frame

That is time-consuming

We design an annotation tool to reduce human effort

5/54

Related Work
Particle tracking:

ParticleTracker: An ImageJ plugin for multiple particle detection and tracking [Sbalzarini et al., Journal of structural biology 2005]

u-track [Jaqaman et al., Nature Methods 2008]

Interactive video annotation:

Tracking with active learning [Vondrick et al., NIPS 2011]

Interactive object detection [Yao et al., CVPR 2012]

6/54

Proposed System

User annotation

Detection of bounding circle of the particle

Acquisition of labels at other frames by tracking the particle

User correction

Update of tracker & labels

Acquisition of all correct labels

7/54

Detecting Bounding Circle of a Particle

Median filter

Otsu's method

Edge detection

Least-squares fitting

Dilation

Erosion

8/54
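A minimal sketch of the detection pipeline above, assuming OpenCV/NumPy and 8-bit grayscale frames; the kernel sizes and Canny thresholds are illustrative choices, not the ones used in the actual system. The returned edge coordinates feed the circle fit sketched after slide 10.

```python
import cv2
import numpy as np

def detect_edge_points(gray_frame):
    """Rough sketch of the slide's pipeline: median filter, Otsu threshold,
    dilation/erosion clean-up, then edge detection."""
    # Median filter to suppress noise (kernel size 5 is an assumed choice).
    smoothed = cv2.medianBlur(gray_frame, 5)
    # Otsu's method picks the binarization threshold automatically.
    _, binary = cv2.threshold(smoothed, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological clean-up (3x3 structuring element is an assumption).
    kernel = np.ones((3, 3), np.uint8)
    binary = cv2.dilate(binary, kernel, iterations=1)
    binary = cv2.erode(binary, kernel, iterations=1)
    # Edge detection on the cleaned mask (thresholds are assumptions).
    edges = cv2.Canny(binary, 50, 150)
    ys, xs = np.nonzero(edges)   # edge-pixel coordinates for the circle fit
    return xs, ys
```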

Least-Squares Fitting of Bounding Circle
Assume the coordinates of the detected edge are $(x_1, y_1), \dots, (x_N, y_N)$.

Let $(x_c, y_c)$ and $r$ denote the center and the radius of the circle, respectively.

$$\begin{cases} (x_c - x_1)^2 + (y_c - y_1)^2 = r^2 \\ \quad\vdots \\ (x_c - x_N)^2 + (y_c - y_N)^2 = r^2 \end{cases}
\;\Rightarrow\;
\begin{cases} 2x_1 x_c + 2y_1 y_c + (r^2 - x_c^2 - y_c^2) = x_1^2 + y_1^2 \\ \quad\vdots \\ 2x_N x_c + 2y_N y_c + (r^2 - x_c^2 - y_c^2) = x_N^2 + y_N^2 \end{cases}$$

9/54

Least-Squares Fitting of Bounding Circle

$$\begin{bmatrix} 2x_1 & 2y_1 & 1 \\ \vdots & \vdots & \vdots \\ 2x_N & 2y_N & 1 \end{bmatrix}
\begin{bmatrix} x_c \\ y_c \\ r^2 - x_c^2 - y_c^2 \end{bmatrix}
=
\begin{bmatrix} x_1^2 + y_1^2 \\ \vdots \\ x_N^2 + y_N^2 \end{bmatrix}$$

$$\mathbf{A}\mathbf{z} = \mathbf{B} \quad\Rightarrow\quad \mathbf{z} = (\mathbf{A}^{\mathsf{T}}\mathbf{A})^{-1}\mathbf{A}^{\mathsf{T}}\mathbf{B}$$

10/54
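Solving $\mathbf{A}\mathbf{z}=\mathbf{B}$ can be written directly in NumPy; a sketch in which np.linalg.lstsq replaces the explicit normal-equation inverse for numerical stability, and the function name is mine.

```python
import numpy as np

def fit_circle_least_squares(xs, ys):
    """Fit (x_c, y_c, r) to edge points by solving A z = B in the least-squares
    sense, where z = (x_c, y_c, r^2 - x_c^2 - y_c^2)."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    B = xs**2 + ys**2
    z, *_ = np.linalg.lstsq(A, B, rcond=None)
    xc, yc, c = z                       # c = r^2 - x_c^2 - y_c^2
    r = np.sqrt(c + xc**2 + yc**2)
    return xc, yc, r
```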

Possible Choices of Trackers

Linear interpolation

Correlation filter based tracker [Zhang et al., ECCV 2014]

Normalized cross-correlation matching

11/54

Linear Interpolation

[Diagram: frame timeline, frames 1–12]

12/54

Linear Interpolation: User Correction

[Diagram: frame timeline, frames 1–12]

13/54

Linear Interpolation: Update of Labels

[Diagram: frame timeline, frames 1–12]

14/54

Linear Interpolation: Update of Labels

[Diagram: frame timeline, frames 1–12]

15/54

Linear Interpolation: User Correction

[Diagram: frame timeline, frames 1–12]

16/54
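A sketch of the interpolation step between two user-labeled frames, with labels stored as (x_c, y_c, r); the function name and data layout are assumptions.

```python
import numpy as np

def interpolate_labels(frame_a, label_a, frame_b, label_b):
    """Linearly interpolate (x_c, y_c, r) between two labeled frames.
    Returns a dict mapping each in-between frame index to its label."""
    label_a = np.asarray(label_a, dtype=float)   # (x_c, y_c, r) at frame_a
    label_b = np.asarray(label_b, dtype=float)   # (x_c, y_c, r) at frame_b
    labels = {}
    for t in range(frame_a + 1, frame_b):
        w = (t - frame_a) / (frame_b - frame_a)
        labels[t] = tuple((1 - w) * label_a + w * label_b)
    return labels

# Example: a particle labeled at frames 1 and 12; frames 2-11 are interpolated.
# After a user correction at frame 7, the segments 1-7 and 7-12 are re-interpolated.
```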

Correlation Filter Based Tracker

[Zhang et al., ECCV 2014]

The desired response over the input image $f$ is $G = h \otimes f$, and the correlation filter $h$ is obtained as

$$h = \mathcal{F}^{-1}\!\left(\frac{\mathcal{F}(G)}{\mathcal{F}(f)}\right)
  = \mathcal{F}^{-1}\!\left(\frac{\mathcal{F}\!\left(e^{-\left\|\frac{\mathbf{x}-\mathbf{x}^*}{\alpha}\right\|}\right)}{\mathcal{F}(f)}\right)$$

17/54

Online Update of Filter

Frame 1: $H_1 = h_1$

18/54

Online Update of Filter

Frame 2: $H_1 \otimes F$, then $H_2 = (1-\rho)\,H_1 + \rho\, h_2$

19/54
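A rough frequency-domain sketch of the filter computation and the online update above. It follows the slide's formulas with a small regularizer added to avoid division by zero; the Gaussian parameter, the learning rate ρ, and the omission of windowing and other details of the actual tracker of Zhang et al. are all simplifications.

```python
import numpy as np

def gaussian_target(shape, center, alpha):
    """Desired response G, peaked at the target center: e^{-||x - x*|| / alpha}."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((xs - center[0]) ** 2 + (ys - center[1]) ** 2)
    return np.exp(-dist / alpha)

def train_filter(patch, center, alpha, eps=1e-3):
    """h = F^{-1}( F(G) / F(f) ); kept in the frequency domain, eps avoids 0/0."""
    G = gaussian_target(patch.shape, center, alpha)
    return np.fft.fft2(G) / (np.fft.fft2(patch) + eps)

def update_filter(H_prev, h_t, rho=0.075):
    """Online update: H_t = (1 - rho) * H_{t-1} + rho * h_t (rho is assumed)."""
    return (1 - rho) * H_prev + rho * h_t

def respond(H, patch):
    """Correlation response on a new patch; the peak gives the new (y, x) location."""
    response = np.real(np.fft.ifft2(H * np.fft.fft2(patch)))
    return np.unravel_index(np.argmax(response), response.shape)
```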

One-Way Method

[Diagram sequence, slides 20–23: frames 1–3 tracked forward one at a time; when the user corrects frame 3, re-train the filter]

23/54

Two-Way Method

[Diagram sequence, slides 24–30: a frame timeline of frames 1–14 illustrating the order in which the two-way method tracks and corrects labels]

30/54

Normalized Cross-Correlation Matching

Given an image $f$ and a template $t$, normalized cross-correlation (NCC) measures the similarity between each part of $f$ and $t$:

$$\gamma(u,v) = \frac{\sum_{x,y}\bigl(f(x,y)-\bar{f}_{u,v}\bigr)\bigl(t(x-u,\,y-v)-\bar{t}\bigr)}{\sqrt{\sum_{x,y}\bigl(f(x,y)-\bar{f}_{u,v}\bigr)^2 \sum_{x,y}\bigl(t(x-u,\,y-v)-\bar{t}\bigr)^2}}$$

[Figure: template, input image, and output NCC map]

31/54
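OpenCV's cv2.matchTemplate with the TM_CCOEFF_NORMED mode computes this γ map directly; a minimal sketch of one matching step (the function name is mine).

```python
import cv2

def ncc_match(frame_gray, template_gray):
    """Slide the template over the frame and return the best-matching
    top-left location and its NCC score."""
    # TM_CCOEFF_NORMED is the mean-subtracted, normalized correlation gamma(u, v).
    ncc_map = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(ncc_map)
    return max_loc, max_val
```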

Normalized Cross-Correlation Matching

[Figure: template and frame 1]

32/54

Normalized Cross-Correlation Matching

[Figure: matching in frame 2]

33/54

One-Way Method

[Diagram sequence, slides 34–37: frames 1–3 tracked forward with NCC; after the user correction, update the template]

37/54

Two-Way Method

[Diagram: frame timeline of frames 1–14 under the two-way method]

38/54

Failure in Tracking with Normalized Cross-Correlation

[Figure: template of particle 1; particles 1 and 2 in the frame]

39/54

Combining NCC & Extrapolation

[Figure: frames t−2, t−1, and t; the location of each particle in frame t (marked ×) is predicted from the previous two frames]

The NCC score is combined with an extrapolation score $\delta$, where

$$\delta(u,v) = e^{-\left\|\frac{(u-x',\; v-y')}{\sigma \cdot l}\right\|^2}$$

and $(x', y')$ is the predicted location.

40/54

Combining NCC & Extrapolation

[Figure: the NCC score, the score of the predicted location, and the combined score]

41/54
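A sketch of the combined score $\phi = w\gamma + (1-w)\delta$ from the last two slides: the predicted center is linearly extrapolated from the two previous centers, and δ down-weights NCC peaks far from it. The weight w, σ, and the choice of the template size as the length scale l are assumptions.

```python
import cv2
import numpy as np

def combined_score_match(frame_gray, template_gray, prev_centers,
                         w=0.7, sigma=1.0):
    """phi(u, v) = w * gamma(u, v) + (1 - w) * delta(u, v): NCC weighted by a
    Gaussian around the location extrapolated from the last two centers."""
    gamma = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    th, tw = template_gray.shape
    # Linear extrapolation of the center: x' = 2*x_{t-1} - x_{t-2} (same for y).
    (x_tm2, y_tm2), (x_tm1, y_tm1) = prev_centers[-2], prev_centers[-1]
    x_pred, y_pred = 2 * x_tm1 - x_tm2, 2 * y_tm1 - y_tm2
    # Predicted top-left corner on gamma's (u, v) grid.
    u_pred, v_pred = x_pred - tw / 2, y_pred - th / 2
    vs, us = np.mgrid[0:gamma.shape[0], 0:gamma.shape[1]]
    l = max(th, tw)   # assumed length scale
    delta = np.exp(-((us - u_pred) ** 2 + (vs - v_pred) ** 2) / (sigma * l) ** 2)
    phi = w * gamma + (1 - w) * delta
    v_best, u_best = np.unravel_index(np.argmax(phi), phi.shape)
    return u_best + tw / 2, v_best + th / 2   # estimated center in frame t
```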

Experiments
Evaluate how much human effort our system can reduce

Simulate the process of annotating a video with our system

Evaluation metric: number of manual annotations

A tracked bounding box is counted as a correct label if the distance between its center and the center of the ground-truth bounding box is at most 10 pixels

42/54
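The 10-pixel criterion can be written as a small helper (a sketch; the name is mine).

```python
import numpy as np

def is_correct_label(tracked_center, gt_center, tol=10.0):
    """A tracked bounding box counts as correct if its center lies within
    `tol` pixels (10 in the slides) of the ground-truth center."""
    return np.hypot(tracked_center[0] - gt_center[0],
                    tracked_center[1] - gt_center[1]) <= tol
```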

Methods

Interp

CF-1way

CF-2way

NCC-1way

NCC-2way

NCC-Extrap-1way

NCC-Extrap-2way

43/54

The Order of Labeling
For the methods that do not restrict the order of labeling:

Always correct the label with the maximum center location error

For the other methods: same as the video display order

44/54
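For the methods that allow an arbitrary order, the simulation can pick the next frame to correct as sketched below (a hypothetical helper, assuming tracked and ground-truth centers are both available to the simulator).

```python
import numpy as np

def next_frame_to_correct(tracked_centers, gt_centers):
    """Return the frame index with the maximum center location error.
    Both arguments are arrays of shape (num_frames, 2)."""
    errors = np.linalg.norm(np.asarray(tracked_centers)
                            - np.asarray(gt_centers), axis=1)
    return int(np.argmax(errors))
```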

Video Dataset

Name       # frames   # particles   # annotations
Droplet1   1203       15            635
Droplet2   637        53            4192
Bead       420        5             727

Video Droplet1 is used for parameter tuning, which is performed with a brute-force search

45/54
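The brute-force parameter tuning can be read as a plain grid search; the parameter grid and the count_corrections objective below are placeholders for whatever the actual experiments evaluated.

```python
import itertools

def grid_search(param_grid, count_corrections):
    """Brute-force search: try every parameter combination on the tuning video
    and keep the one needing the fewest manual corrections.
    `count_corrections(params)` stands in for the annotation simulation."""
    best_params, best_cost = None, float("inf")
    for values in itertools.product(*param_grid.values()):
        params = dict(zip(param_grid.keys(), values))
        cost = count_corrections(params)
        if cost < best_cost:
            best_params, best_cost = params, cost
    return best_params, best_cost

# Example grid for NCC-Extrap-1way (values are illustrative only):
# grid_search({"w": [0.5, 0.6, 0.7, 0.8], "sigma": [0.5, 1.0, 2.0]}, simulate)
```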

Parameter Tuning for CF-1way

[Plot: tuning results for the ground-truth correlation]

46/54

Parameter Tuning for CF-1way

$$H_t = (1-\rho)\,H_{t-1} + \rho\, h_t$$

47/54

Parameter Tuning for NCC-Extrap-1way

$$\delta(u,v) = e^{-\left\|\frac{(u-x',\; v-y')}{\sigma \cdot l}\right\|^2}$$

48/54

Parameter Tuning for NCC-Extrap-1way

$$\phi(u,v) = w \times \gamma(u,v) + (1-w) \times \delta(u,v)$$

49/54

Result

Number (and percentage) of manual annotations required:

Method            Droplet2 (# annotations = 4192)   Bead (# annotations = 727)
Interp            457 (10.90%)                      88 (12.10%)
CF-1way           1475 (35.19%)                     79 (10.89%)
CF-2way           1973 (47.07%)                     112 (15.41%)
NCC-1way          56 (1.34%)                        11 (1.51%)
NCC-2way          129 (3.08%)                       21 (2.89%)
NCC-Extrap-1way   53 (1.26%)                        9 (1.24%)
NCC-Extrap-2way   115 (2.74%)                       20 (2.75%)

50/54

Error Analysis for NCC-Extrap-1way

51/54

Error Analysis for NCC-Extrap-1way

52/54

Error Analysis for NCC-Extrap-1way

53/54

[Figure labels: Target, Error]

Conclusions
We designed a system for particle annotation in video sequences

Our system can reduce the human effort of annotation

Combining NCC and extrapolation achieves the best result

It is better to annotate the video in its display order

Future work: use polynomial curve fitting to predict the location of the particle in the next frame

54/54
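The future-work idea generalizes the two-point extrapolation used earlier: fit low-degree polynomials x(t) and y(t) to the recent trajectory and evaluate them at the next frame. A sketch with np.polyfit; the degree and the window of frames are assumptions.

```python
import numpy as np

def predict_next_location(frames, xs, ys, degree=2):
    """Fit polynomials x(t), y(t) to the last few labeled frames and
    evaluate them at the next frame index."""
    frames = np.asarray(frames, dtype=float)
    next_t = frames[-1] + 1
    x_next = np.polyval(np.polyfit(frames, xs, degree), next_t)
    y_next = np.polyval(np.polyfit(frames, ys, degree), next_t)
    return x_next, y_next
```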

Thank you for listening
