paris video tech - 1st edition: dailymotion - improving the user experience through...
TRANSCRIPT
User Experience – Streaming Analytics
Introduction : Dailymotion Facts
• 100 million video views each day
• 60% desktop - 31% mobile - 8% tablet - 1% TV
• 97% VOD - 3% live
• Worldwide (France = ~30%)
• 69% HTML5 - 31% Flash
Goal : What is user experience ?
• Increasing user engagement => raising revenue
• Multi-dimensional : loading, engagement, rebuffering, video quality
• User experience is context sensitive
  • device
  • content
Goal : Improving user experience
• How to implement the data pipeline ?
• How to understand what is going on ?
  • User quality metrics
  • Video quality metrics
  • Network quality metrics
• How to improve user experience ?
  • Optimize delivery
  • Optimize player
Data pipeline : Architecture Overview
Player events -> Data aggregation -> Visualization
Data visualization : Heatmap example
• real-time activity
• watched > 0
• 1 month
Data pipeline : Basic Rules
• More data beats better models - avoiding overfitting
• Better data beats more data - cleaning outliers
• The 80/20 rule
• P-value - measures uncertainty
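The p-value bullet can be made concrete. Below is a minimal sketch, not Dailymotion's actual pipeline: a two-proportion z-test estimating how likely an observed difference in rebuffering rate between two groups of sessions is due to chance. Function names and the normal-CDF approximation are illustrative.

```javascript
// Standard normal CDF via the Abramowitz & Stegun polynomial approximation.
function normalCdf(z) {
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const d = 0.3989423 * Math.exp((-z * z) / 2);
  const p = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - p : p;
}

// Two-sided two-proportion z-test: given rebuffering counts and session
// totals for two groups, small p-values mean the rate difference is
// unlikely under the null hypothesis that both groups share one true rate.
function rebufferingPValue(rebufA, totalA, rebufB, totalB) {
  const pA = rebufA / totalA;
  const pB = rebufB / totalB;
  const pooled = (rebufA + rebufB) / (totalA + totalB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  const z = (pA - pB) / se;
  return 2 * (1 - normalCdf(Math.abs(z)));
}
```

With samples on the order of a million sessions per group, as in the A/B test described later, even small rate differences become detectable this way.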
Data pipeline : choosing key metrics
• Choosing metrics => the process is not deterministic
• User engagement : played, watched, watched ratio ?
• Rebuffering event => waitTime > X ms ?
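The "waitTime > X ms" question amounts to a threshold filter over player wait events: only stalls longer than X count as rebuffering. A hypothetical sketch; the threshold value and the event shape are assumptions:

```javascript
// Count only stalls longer than a threshold as rebuffering events, so that
// sub-perceptual hiccups do not pollute the metric. 500 ms is an assumed
// value for the slide's "X ms".
const REBUFFERING_THRESHOLD_MS = 500;

function rebufferingStats(waitEvents, thresholdMs = REBUFFERING_THRESHOLD_MS) {
  const stalls = waitEvents.filter((e) => e.waitTimeMs > thresholdMs);
  return {
    rebufferingNb: stalls.length,
    rebufferingTimeMs: stalls.reduce((sum, e) => sum + e.waitTimeMs, 0),
  };
}
```

The choice of X directly changes the rebufferingNb and rebuffering-ratio figures analyzed below, which is why the slide flags metric choice as non-deterministic.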
Data analysis : latency / CDN
• CDN comparison
• Routing optimization
• Country : KR
• Stream type : recorded
Data analysis : kbps / CDN
• CDN comparison
• Routing optimization
• Country : KR
• Stream type : recorded
Data analysis : seekNb / engagement
• seekNb
• negative correlation
• stream type : recorded
• 1 month
Data analysis : buffering ratio / engagement
• buffering measure choice
• negative correlation
• stream type : recorded
• 1 month
Data analysis : rebufferingNb / engagement
• rebufferingNb
• negative correlation
• stream type : recorded
• 1 month
Data analysis : level avg
• quality switch
• ABR algorithm
• stream type : recorded
• 1 month
State of ABR - stream tech comparison - VoD
rebufferingNb, percentage per tech worldwide
native  83.6%
hls.js  89.4%
flashls 90.6%
State of ABR - stream tech comparison - live
rebufferingNb, percentage per tech worldwide
native  70.4%
hls.js  73.6%
flashls 80.6%
Data-Driven Development : ABR Algorithm
• introduce history parameter to bandwidth estimation in
• inspired from
• ABR now based on two bandwidth moving averages
  • a fast one : adapting down quickly
  • a slow one : adapting up more slowly
• bw estimate = min(fast, slow)
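The two-moving-average scheme can be sketched with a pair of exponentially weighted moving averages. This is an illustrative reconstruction of the idea, not hls.js's exact code (hls.js exposes similar knobs in its config, e.g. abrEwmaFastVoD / abrEwmaSlowVoD); the half-life values below are assumed examples mirroring the s= / f= parameters tuned later.

```javascript
// Exponentially weighted moving average with a half-life expressed in
// number of samples; includes a bias correction so early values are usable.
class Ewma {
  constructor(halfLife) {
    this.alpha = Math.exp(Math.log(0.5) / halfLife); // per-sample decay
    this.estimate = 0;
    this.weight = 0;
  }
  sample(value) {
    this.estimate = this.alpha * this.estimate + (1 - this.alpha) * value;
    this.weight = this.alpha * this.weight + (1 - this.alpha);
  }
  value() {
    return this.weight > 0 ? this.estimate / this.weight : 0;
  }
}

// Dual-average bandwidth estimator: the fast average reacts quickly to a
// drop, the slow one resists transient spikes; min(fast, slow) therefore
// adapts down quickly and up more slowly, as the slide describes.
class BandwidthEstimator {
  constructor(slowHalfLife = 9, fastHalfLife = 5) { // assumed example values
    this.slow = new Ewma(slowHalfLife);
    this.fast = new Ewma(fastHalfLife);
  }
  sample(bitsPerSecond) {
    this.slow.sample(bitsPerSecond);
    this.fast.sample(bitsPerSecond);
  }
  estimate() {
    return Math.min(this.fast.value(), this.slow.value());
  }
}
```

Feeding each fragment's measured throughput into sample() and reading estimate() before the next level decision reproduces the conservative behavior the slide aims for.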
ABR magic numbers
• uses
• Are these magic numbers suitable for our use case ?
A/B testing ABR
• define 20 traffic segments, each using a different config
• enable in production …

Iteration 1      Fast average   Slow average
control group    0              0
test group 1     0              1
test group 2     0              2
...              1              1
test group 18    1              9
test group 19    1              10
A/B testing ABR
• wait for enough samples (~1 million per group)
• compare key metrics
  • rebuffering rate
  • rebuffering ratio
  • user engagement
  • average quality, quality switches
• iterate/circle around the best samples
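Stable traffic segmentation for such a test can be as simple as hashing a viewer id into one of 20 buckets; a hypothetical sketch (the hash choice and names are illustrative, not Dailymotion's implementation):

```javascript
// FNV-1a hash of a string, used here purely for deterministic bucketing.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h >>> 0;
}

const SEGMENT_COUNT = 20;

// Each viewer always lands in the same segment across sessions, so every
// segment's player can be pinned to one (fast, slow) config from the table.
function segmentFor(viewerId) {
  return fnv1a(viewerId) % SEGMENT_COUNT; // 0 = control, 1..19 = test groups
}
```

Deterministic assignment matters: re-randomizing per session would mix configs within one viewer's history and blur the engagement comparison.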
State of ABR - stream tech comparison - VoD
number of rebuffering, percentage per tech worldwide
native           83.6%
hls.js           89.4%
flashls          90.6%
hls.js,s=15,f=4  90.7%
hls.js,s=9,f=4   90.2%
State of ABR - stream tech comparison - live
number of rebuffering, percentage per tech worldwide
native          70.4%
hls.js,s=0,f=0  73.6%
flashls         80.6%
hls.js,s=9,f=5  79.3%
hls.js,s=7,f=5  74.7%
nb of level switches - live (chart comparing hls.js,s=0,f=0 vs hls.js,s=9,f=5)
Next data-driven improvements
• network delivery
  • use streaming metrics to rank CDNs per region / ISP
  • redirect streams to the best CDN based on past history
• transcoding
  • A/B test different fragment durations
• media engine / player optimization
  • start rendition
  • progressive fragment parsing (Fetch API)
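The CDN-ranking idea can be sketched as a per-(region, ISP) aggregation of the streaming metrics discussed earlier. Field names and the scoring weight below are assumptions, not Dailymotion's actual formula:

```javascript
// Rank CDNs inside one (region, ISP) bucket from collected session metrics.
// Lower score is better: mean latency plus a heavy rebuffering penalty.
function rankCdns(samples) {
  // samples: [{ cdn, latencyMs, rebufferingRatio }]
  const byCdn = new Map();
  for (const s of samples) {
    const agg = byCdn.get(s.cdn) || { latency: 0, rebuf: 0, n: 0 };
    agg.latency += s.latencyMs;
    agg.rebuf += s.rebufferingRatio;
    agg.n += 1;
    byCdn.set(s.cdn, agg);
  }
  return [...byCdn.entries()]
    .map(([cdn, a]) => ({ cdn, score: a.latency / a.n + 10000 * (a.rebuf / a.n) }))
    .sort((x, y) => x.score - y.score);
}
```

New sessions in that bucket would then be redirected to the first entry of the ranking, refreshed as fresh metrics arrive.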
[email protected]@dailymotion.comhttps://github.com/dailymotion/hls.js
thanks!