Anton Semenchenko (EPAM Systems, DPI.Solutions)

Comparative Analysis of Desktop AUT Automation Tools


TRANSCRIPT

Page 1:

Anton Semenchenko (EPAM Systems, DPI.Solutions)

Comparative Analysis of Desktop AUT Automation Tools

Page 2:

Agenda, part 1 (general)

1. Problem
2. Solutions 2016

Page 3:

Agenda, part 2 (tools and criteria)

1. Tools to be compared (15)
2. How / why we selected this list of tools
3. Comparison criteria types (3)
4. Stakeholder-oriented comparison criteria (7)
5. Mixed comparison criteria (7)
6. Tech-staff-oriented comparison criteria (8)
7. How / why we selected these lists of criteria
8. How to select proper criteria for your project

Page 4:

Agenda, part 3 (comparison analyses)

1. Mixed comparison criteria
2. Tech-staff-oriented comparison criteria
3. Stakeholder-oriented comparison criteria
4. Define our "standard" context
5. Summarized scores
6. How to calculate scores
7. How to use scores / presentation
8. 4 summarized tables

Page 5:

Agenda, part 4 (tools, "how to" and examples)

1. How to define the proper tool based on selected criteria
2. How to link information from the presentation to QA Automation metrics
3. How to link information from the presentation to Project Health Check
4. How to link information from the presentation to QA Automation ROI
5. Tools tiny overview
6. Tools overview structure
7. Example of tool usage structure

Page 6:

Agenda, part 5 (trends, science and "what's next")

1. Define a Trend! Is it possible?
2. Trend – an option
3. Why so?
4. What's next

Page 7:

Problem

• For Web automation, there is a de facto leader

Page 8:

Problem

• It is not that simple when it comes to desktop apps

Page 9:

Tools to be compared

• TestComplete Desktop
• Unified Functional Testing (UFT)
• Ranorex
• Telerik Test Studio
• Zeenyx AscentialTest
• MS VS Coded UI
• CUIT
• AUTOIT
• Sikuli
• Jubula
• Robot Framework
• Winium
• WinAppDriver
• QTWebDriver
• PyWinAuto

Page 10:

How / why did we select this list of tools?

Page 11:

Comparison criteria types

1. Stakeholder-oriented
2. Tech-staff-oriented
3. Mixed

Page 12:

Stakeholder-oriented comparison criteria

1. Approximate complexity of auto-test development
2. Approximate complexity of auto-test support
3. Approximate "entrance" level
4. Required technical skills level
5. Test readability
6. How fast tests run
7. Ability to re-use the "Business Logic" layer in another technical context

Page 13:

Mixed comparison criteria

1. Supported platforms
2. Supported technologies
3. Licensing
4. Maturity
5. Record-Play system support
6. Standard actions pack

Page 14:

Tech-staff-oriented comparison criteria

1. Programming languages support
2. Tools for mapping
3. Self-made (custom) architecture support
4. Data-Driven Testing support
5. Test-Driven Development support
6. Key-word driven support
7. Behavior-Driven Development support
8. Continuous integration system support

Page 15:

How / why did we select these lists of criteria?

Page 16:

How to select proper criteria for your project

Page 17:

Mixed comparison criteria

Page 18:

Supported platforms – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 19:

Supported platforms

Tool | Platforms | Mark
TestComplete Desktop | Windows |
Unified Functional Testing | Windows |
Ranorex | Windows |
Telerik Test Studio | Windows |
Zeenyx AscentialTest | Windows |
MS VS Coded UI; CUIT | Windows |
AUTOIT | Windows |
Sikuli | Windows, Unix-like | Good
Jubula | Windows, Unix-like | Good
Robot Framework | Windows, Unix-like | Good
Winium / WinAppDriver; QTWebDriver | Windows / Windows; cross-platform | – / –; Good
PyWinAuto | Windows |

Page 20:

Supported technologies – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 21:

Supported technologies

Tool | Technologies | Mark
TestComplete Desktop | C/C++, WinForms, WPF, Java, Qt |
Unified Functional Testing | WinForms, WPF, Java, SAP |
Ranorex | WinForms, WPF, Java, Qt, SAP |
Telerik Test Studio | WPF | Bad
Zeenyx AscentialTest | WinForms, WPF, Java | Bad
MS VS Coded UI; CUIT | WinForms (partial), WPF | Bad
AUTOIT | OS level | Good
Sikuli | Image-recognition based | Good
Jubula | WinForms, WPF, Java | Bad
Robot Framework | Uses AutoIT (and co.) inside | Good
Winium / WinAppDriver; QTWebDriver | WinForms, WPF / Any; QT | Bad
PyWinAuto | Win32 API, WinForms (partial, Win32 API based) | Bad

Page 22:

Licensing – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 23:

Licensing

Tool | License | Mark
TestComplete Desktop | Paid | Bad
Unified Functional Testing | Paid | Bad
Ranorex | Paid | Bad
Telerik Test Studio | Paid | Bad
Zeenyx AscentialTest | Paid | Bad
MS VS Coded UI; CUIT | Paid | Bad
AUTOIT | Free |
Sikuli | Open source | Good
Jubula | Open source | Good
Robot Framework | Open source | Good
Winium / WinAppDriver; QTWebDriver | Open source | Good
PyWinAuto | Open source | Good

Page 24:

Maturity – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 25:

Maturity

Tool | Maturity
TestComplete Desktop | Good
Unified Functional Testing | Good
Ranorex | Good
Telerik Test Studio | Good
Zeenyx AscentialTest |
MS VS Coded UI; CUIT | Good
AUTOIT |
Sikuli |
Jubula |
Robot Framework |
Winium / WinAppDriver; QTWebDriver | Bad
PyWinAuto |

Page 26:

Record-Play support – do we really need it?

Page 27:

Record-Play support

Tool | Record-Play | Mark
TestComplete Desktop | Yes | Good
Unified Functional Testing | Yes | Good
Ranorex | Yes | Good
Telerik Test Studio | Yes | Good
Zeenyx AscentialTest | No |
MS VS Coded UI; CUIT | No |
AUTOIT | No |
Sikuli | No |
Jubula | No |
Robot Framework | No |
Winium / WinAppDriver; QTWebDriver | No |
PyWinAuto | No |

Page 28:

Standard actions pack – do we really need it?

Page 29:

Standard actions pack

Tool | STD actions | Mark
TestComplete Desktop | No |
Unified Functional Testing | No |
Ranorex | No |
Telerik Test Studio | No |
Zeenyx AscentialTest | Yes | Good
MS VS Coded UI; CUIT | No |
AUTOIT | No |
Sikuli | Yes | Good
Jubula | Yes | Good
Robot Framework | No |
Winium / WinAppDriver; QTWebDriver | No |
PyWinAuto | Yes / No (via SWAPY) |

Page 30:

Tech-staff-oriented comparison criteria

Page 31:

Programming languages – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 32:

Programming languages support

Tool | Languages | Mark
TestComplete Desktop | Python, C#Script, JScript, C++Script, VBScript, DelphiScript | Good
Unified Functional Testing | VBScript | Bad
Ranorex | C#, VB.Net |
Telerik Test Studio | C#, VB.Net |
Zeenyx AscentialTest | Own DSL | Bad
MS VS Coded UI; CUIT | C#, VB.Net |
AUTOIT | Own Basic-like language | Bad
Sikuli | Jython, Java |
Jubula | - |
Robot Framework | Own DSL, Java, Python |
Winium / WinAppDriver; QTWebDriver | Java, JavaScript, PHP, Python, Ruby, C# | Good
PyWinAuto | CPython |

Page 33:

Tools for mapping – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 34:

Tools for mapping

Tool | Tools for mapping | Mark
TestComplete Desktop | Yes | Good
Unified Functional Testing | Yes | Good
Ranorex | Yes | Good
Telerik Test Studio | Yes | Good
Zeenyx AscentialTest | Yes / No | Good
MS VS Coded UI; CUIT | No |
AUTOIT | No |
Sikuli | Yes / No |
Jubula | Yes | Good
Robot Framework | No |
Winium / WinAppDriver; QTWebDriver | No |
PyWinAuto | No |

Page 35:

Custom architecture – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 36:

Custom architecture

Tool | Custom architecture | Mark
TestComplete Desktop | Yes / No |
Unified Functional Testing | Yes / No |
Ranorex | Yes / No |
Telerik Test Studio | Yes / No |
Zeenyx AscentialTest | No | Bad
MS VS Coded UI; CUIT | Yes | Good
AUTOIT | No | Bad
Sikuli | Yes | Good
Jubula | No / Yes |
Robot Framework | Yes | Good
Winium / WinAppDriver; QTWebDriver | Yes | Good
PyWinAuto | Yes | Good

Page 37:

DDT support – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 38:

DDT support

Tool | DDT support | Mark
TestComplete Desktop | Yes | Good
Unified Functional Testing | Yes | Good
Ranorex | Yes | Good
Telerik Test Studio | Yes | Good
Zeenyx AscentialTest | Yes | Good
MS VS Coded UI; CUIT | Yes | Good
AUTOIT | No | Bad
Sikuli | Yes / No | Good
Jubula | Yes | Good
Robot Framework | Yes | Good
Winium / WinAppDriver; QTWebDriver | Yes | Good
PyWinAuto | Yes | Good

Page 39:

TDD support – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 40:

TDD support

Tool | TDD | Mark
TestComplete Desktop | Yes / No | Good
Unified Functional Testing | Yes / No | Good
Ranorex | Yes / No | Good
Telerik Test Studio | Yes / No | Good
Zeenyx AscentialTest | No | Bad
MS VS Coded UI; CUIT | Yes | Good
AUTOIT | No | Bad
Sikuli | Yes / No | Good
Jubula | Yes | Good
Robot Framework | Yes | Good
Winium / WinAppDriver; QTWebDriver | Yes | Good
PyWinAuto | Yes | Good

Page 41:

Key-word driven – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 42:

Key-word driven support

Tool | Key-word | Mark
TestComplete Desktop | No | Bad
Unified Functional Testing | No | Bad
Ranorex | Yes / No |
Telerik Test Studio | Yes / No |
Zeenyx AscentialTest | Yes | Good
MS VS Coded UI; CUIT | Yes / No |
AUTOIT | No | Bad
Sikuli | Yes / No |
Jubula | No | Bad
Robot Framework | Yes | Good
Winium / WinAppDriver; QTWebDriver | Yes / No |
PyWinAuto | Yes / No |

Page 43:

BDD support – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 44:

BDD support

Tool | BDD | Mark
TestComplete Desktop | No | Bad
Unified Functional Testing | No | Bad
Ranorex | Yes | Good
Telerik Test Studio | Yes | Good
Zeenyx AscentialTest | No | Bad
MS VS Coded UI; CUIT | Yes | Good
AUTOIT | No | Bad
Sikuli | Yes | Good
Jubula | No | Bad
Robot Framework | Yes / No |
Winium / WinAppDriver; QTWebDriver | Yes | Good
PyWinAuto | Yes | Good

Page 45:

CI support – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 46:

CI support

Tool | CI | Mark
TestComplete Desktop | Automated Build Studio |
Unified Functional Testing | Jenkins plugin |
Ranorex | Jenkins |
Telerik Test Studio | Bamboo |
Zeenyx AscentialTest | Test Execution Management |
MS VS Coded UI; CUIT | Any | Good
AUTOIT | - / Any |
Sikuli | - / Any Java-compatible |
Jubula | No | Bad
Robot Framework | Jenkins plugin |
Winium / WinAppDriver; QTWebDriver | Any | Good
PyWinAuto | Any | Good

Page 47:

Stakeholder-oriented comparison criteria

Page 48:

Define our “standard” context

Page 49:

Approximate complexity of auto-test development

Tool | Development | Mark
TestComplete Desktop | ~3h |
Unified Functional Testing | ~3h |
Ranorex | ~2h | Good
Telerik Test Studio | ~2h | Good
Zeenyx AscentialTest | ~2h | Good
MS VS Coded UI; CUIT | ~3h; ~2h | –; Good
AUTOIT | ~1h | Good
Sikuli | ~2h | Good
Jubula | ~2h | Good
Robot Framework | ~4h |
Winium / WinAppDriver; QTWebDriver | ~3h / ~6h -> ~2h | – / Bad -> Good
PyWinAuto | ~1h | Good

Page 50:

Approximate complexity of auto-test support (per year)

Tool | Support | Mark
TestComplete Desktop | ~3h | Bad
Unified Functional Testing | ~3h | Bad
Ranorex | ~2h | Good
Telerik Test Studio | ~2h | Good
Zeenyx AscentialTest | ~3h | Bad
MS VS Coded UI; CUIT | ~2h; ~1h | Good
AUTOIT | ~4h | Bad
Sikuli | ~5h | Bad
Jubula | ~2h | Good
Robot Framework | ~1h | Good
Winium / WinAppDriver; QTWebDriver | ~2h / ~10h -> ~1h | Good / Bad -> Good
PyWinAuto | ~2h | Good

Page 51:

Approximate “entrance” level – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 52:

Approximate "entrance" level

Tool | Level
TestComplete Desktop | High
Unified Functional Testing | High
Ranorex |
Telerik Test Studio |
Zeenyx AscentialTest |
MS VS Coded UI; CUIT | High
AUTOIT | Low
Sikuli | Low
Jubula |
Robot Framework | High
Winium / WinAppDriver; QTWebDriver | High ->
PyWinAuto |

Page 53:

Required “technical skills” level – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 54:

Required "technical skills" level

Tool | Level
TestComplete Desktop |
Unified Functional Testing |
Ranorex |
Telerik Test Studio |
Zeenyx AscentialTest | Low
MS VS Coded UI; CUIT | High;
AUTOIT | Low
Sikuli | Low
Jubula | Low
Robot Framework | High
Winium / WinAppDriver; QTWebDriver | High ->
PyWinAuto | Low

Page 55:

Test readability – “the worst” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 56:

Test readability

Tool | Level
TestComplete Desktop |
Unified Functional Testing |
Ranorex |
Telerik Test Studio |
Zeenyx AscentialTest | High
MS VS Coded UI; CUIT |
AUTOIT | Low
Sikuli | High
Jubula | High
Robot Framework | -> High
Winium / WinAppDriver; QTWebDriver | -> High
PyWinAuto | High

Page 57:

How fast tests run – “the best” tool?

1. TestComplete Desktop

2. Unified Functional Testing (UFT)

3. Ranorex

4. Telerik Test Studio

5. Zeenyx AscentialTest

6. MS VS Coded UI

7. CUIT

8. AUTOIT

9. Sikuli

10. Jubula

11. Robot Framework

12. Winium

13. WinAppDriver

14. QTWebDriver

15. PyWinAuto

Page 58:

How fast tests run

Tool | Level
TestComplete Desktop | Bad
Unified Functional Testing | Bad
Ranorex |
Telerik Test Studio |
Zeenyx AscentialTest |
MS VS Coded UI; CUIT | Good
AUTOIT | Good
Sikuli | Bad
Jubula | Bad
Robot Framework | Good
Winium / WinAppDriver; QTWebDriver | Good
PyWinAuto | Good

Page 59:

Ability to re-use the "Business Logic" layer

Tool | Re-use | Mark
TestComplete Desktop | No | Bad
Unified Functional Testing | No | Bad
Ranorex | Yes | Good
Telerik Test Studio | Yes | Good
Zeenyx AscentialTest | No | Bad
MS VS Coded UI; CUIT | Yes | Good
AUTOIT | No | Bad
Sikuli | Yes | Good
Jubula | No | Bad
Robot Framework | Yes | Good
Winium / WinAppDriver; QTWebDriver | Yes | Good
PyWinAuto | Yes | Good
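The criterion above is easiest to picture as a thin layering: test scenarios call a business-logic layer, and only that layer touches the UI driver, so the scenarios survive a change of driver or UI technology. A purely illustrative Python sketch (the class and the set_text/click protocol are hypothetical, not any tool's API):

    # A business-logic layer that can be re-used across UI drivers.
    # LoginPage and the driver protocol below are hypothetical.
    class LoginPage:
        """Knows the login workflow, not the UI technology behind it."""

        def __init__(self, driver):
            self.driver = driver  # any adapter exposing set_text() and click()

        def login(self, user, password):
            self.driver.set_text("user_field", user)
            self.driver.set_text("password_field", password)
            self.driver.click("login_button")

    # Swapping the adapter (WebDriver-backed, pywinauto-backed, ...) re-uses
    # the same business-logic layer in another technical context.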

Page 60:

Summarized scores

Page 61:

How to calculate scores
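The tables on the following pages appear consistent with simple counting: each Good mark contributes +1, each Bad mark contributes -1, and unmarked cells contribute 0. A minimal Python sketch of that scheme, using Sikuli's six mixed-criteria marks from the tables above as the example row:

    # Good = +1, Bad = -1, unmarked = 0; a tool's score is the sum.
    MARK_VALUE = {"Good": +1, "Bad": -1, None: 0}

    def score(marks):
        """Sum the per-criterion marks for one tool."""
        return sum(MARK_VALUE[mark] for mark in marks.values())

    # Sikuli's mixed-criteria marks, as read from the tables above.
    sikuli_mixed = {"platforms": "Good", "technologies": "Good",
                    "licensing": "Good", "maturity": None,
                    "record_play": None, "std_actions": "Good"}
    print(score(sikuli_mixed))  # -> 4, matching Sikuli's mixed score of +4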

Page 62:

How to use scores

Page 63:

Stakeholder-oriented score

Tool | Score
TestComplete Desktop | -2
Unified Functional Testing | -2
Ranorex | +3
Telerik Test Studio | +3
Zeenyx AscentialTest | +1
MS VS Coded UI; CUIT | +1
AUTOIT | +1
Sikuli | +3
Jubula | +2
Robot Framework | +2
Winium / WinAppDriver; QTWebDriver | +2
PyWinAuto | +6

Page 64:

Mixed score

Tool | Score
TestComplete Desktop | +1
Unified Functional Testing | +1
Ranorex | +1
Telerik Test Studio | 0
Zeenyx AscentialTest | -1
MS VS Coded UI; CUIT | -1
AUTOIT | +1
Sikuli | +4
Jubula | +1
Robot Framework | +2
Winium / WinAppDriver; QTWebDriver | -2
PyWinAuto | -1

Page 65:

Tech-staff-oriented score

Tool | Score
TestComplete Desktop | +2
Unified Functional Testing | 0
Ranorex | +4
Telerik Test Studio | +4
Zeenyx AscentialTest | -1
MS VS Coded UI; CUIT | +4
AUTOIT | -6
Sikuli | +4
Jubula | +1
Robot Framework | +4
Winium / WinAppDriver; QTWebDriver | +6
PyWinAuto | +5

Page 66:

Summarized score (stakeholder-oriented + mixed + tech-staff-oriented)

Tool | Score
TestComplete Desktop | +1
Unified Functional Testing | -1
Ranorex | +8
Telerik Test Studio | +7
Zeenyx AscentialTest | -1
MS VS Coded UI; CUIT | +4
AUTOIT | -4
Sikuli | +11
Jubula | +4
Robot Framework | +8
Winium / WinAppDriver; QTWebDriver | +6
PyWinAuto | +10

Page 67:

How to define the proper tool based on selected criteria
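One hedged way to turn the scores into a decision: keep only the criteria your project cares about and weight them before summing, so "the proper tool" is explicitly relative to your context. A sketch in Python (the marks are excerpted from the comparison tables above; the weights are hypothetical and should reflect your own project):

    # Pick a tool by weighting only the criteria that matter to your project.
    MARK_VALUE = {"Good": +1, "Bad": -1, None: 0}

    # Hypothetical project: licensing matters most, then technology coverage.
    weights = {"licensing": 3, "technologies": 2, "ci_support": 1}

    # Marks excerpted from the comparison tables above.
    tools = {
        "Sikuli":    {"licensing": "Good", "technologies": "Good", "ci_support": None},
        "Ranorex":   {"licensing": "Bad",  "technologies": None,   "ci_support": None},
        "PyWinAuto": {"licensing": "Good", "technologies": "Bad",  "ci_support": "Good"},
    }

    def weighted_score(marks):
        return sum(w * MARK_VALUE[marks.get(c)] for c, w in weights.items())

    best = max(tools, key=lambda name: weighted_score(tools[name]))
    print(best)  # -> Sikuli for these weights; change the weights, change the answer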

Page 68:

How to

1. Link information from the presentation to QA Automation metrics
2. Link information from the presentation to Project Health Check
3. Link information from the presentation to QA Automation ROI

Page 69:

Tools tiny overview

Page 70:

Tools overview structure

1. Pros
2. Cons
3. What kind of project / product / problem / situation certain tools can be used for

Page 71:

Example of tool usage structure

1. Plus several examples of each tool's usage

Example structure:
• Values:
  – Value the individual
  – Act as a team
  – Strive for excellence
  – Focus on customer
  – Act with integrity
• Prisms:
  – Technology
  – Delivery
  – Leadership

Page 72:

Project A

Page 73:

Project A

Page 74:

Project A

Page 75:

TestComplete Desktop

1. Pros
• Low entrance level
• High level of test-script flexibility
• Huge knowledge base (at about MSDN level)
• Wide choice of script languages that look like common languages

2. Cons
• Very expensive license
• Very specific own script languages

Page 76:

Unified Functional Testing

1. Pros
• Low "entrance" level
• High level of test-script flexibility
• Good tech support

2. Cons
• Tightly coupled to other HP solutions
• Very specific own DSL

Page 77:

Ranorex

1. Pros
• Low "entrance" level
• Test scripts are written in common languages (C#, VB.Net)
• Good tech support

2. Cons
• Paid license

Page 78:

Telerik Test Studio (Desktop)

1. Pros
• Low "entrance" level
• Great parameterization of keyword tests
• DDT support using common formats (CSV, XLS, DB)
• Converts tests to common languages (C#, VB.NET)

2. Cons
• WPF applications only

Page 79:

Zeenyx AscentialTest

1. Pros
• Supports complex logic
• Great organization of DDT
• Support for standard .Net libraries

2. Cons
• Takes time to learn how to use
• Specific own DSL

Page 80:

MS VS Coded UI

1. Pros
• "Native" for Windows
• Supports a huge set of UI technologies
• Generated UI map
• Ready-to-go infrastructure
• Good documentation and support

2. Cons
• License cost
• Relatively "low-level" API

Page 81:

MS VS Coded UI + CUIT

1. Pros
• The same as for MS VS Coded UI
• Elegant "high-level" API

2. Cons
• The same as for MS VS Coded UI

Page 82:

AutoIT

1. Pros
• Easy
• Universal
• Free

2. Cons
• No ready-to-use verification instruments
• Test = exe file
• No ready-to-use reports

Page 83:

Sikuli

1. Pros
• IDE is easy to learn and use (see the sketch below)
• Standard actions pack
• Tests can be written in common languages (Java, Python)
• Works on different platforms and with any application
• Free

2. Cons
• Low test reliability
• Slow test execution
• No ability to work with text
• Tests are complicated to support
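For flavor, a minimal Sikuli-style script. Sikuli scripts are Python (Jython) run inside the Sikuli IDE, which provides click/wait/type and the Key constants; the .png names below are hypothetical screenshots captured beforehand:

    # Runs inside the Sikuli IDE; every locator is an image, not a control ID.
    wait("app_window.png", 10)   # wait up to 10 s for the window to appear
    click("search_button.png")   # click wherever the image matches on screen
    type("quarterly report")     # type into whatever control has focus
    type(Key.ENTER)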

Page 84:

Jubula

1. Pros
• IDE is easy to use
• Supports working from a requirements base
• Integrated DB for storing test data and results
• Free

2. Cons
• Lacks the flexibility inherent to script tests
• No CI support

Page 85:

Robot Framework

1. Pros
• Own simple, easy-to-read keyword-based language
• Plugins for different IDEs
• Works with different OSes
• Support for different programming languages
• Tools for creating user-own libraries (sketched below)
• Free

2. Cons
• High entrance level
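To illustrate the "user-own libraries" point: a Robot Framework keyword library is just a Python module or class, and each method becomes a keyword ("Start Application", "Stop Application") in .robot files. A minimal hypothetical sketch:

    # DesktopLibrary.py - a minimal user keyword library for Robot Framework.
    import subprocess

    class DesktopLibrary:
        def start_application(self, path):
            """Launch the application under test (keyword: Start Application)."""
            self._process = subprocess.Popen([path])

        def stop_application(self):
            """Terminate the application under test (keyword: Stop Application)."""
            self._process.terminate()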

Page 86:

Winium

1. Pros
• Familiar syntax and API (see the sketch below)
• Supports all the languages that Selenium WebDriver supports
• Free

2. Cons
• "Immature" testing tool
• Incomplete way of locating elements
• A lack of documentation
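Because Winium speaks the WebDriver protocol, tests use the ordinary Selenium client. A minimal Python sketch, assuming Winium.Desktop.Driver.exe is already running on its default port 9999 (Selenium 3 style API; the locators are hypothetical):

    # Drives a desktop app through Winium over the WebDriver protocol.
    from selenium import webdriver

    driver = webdriver.Remote(
        command_executor="http://localhost:9999",
        desired_capabilities={"app": r"C:\Windows\System32\notepad.exe"})
    try:
        window = driver.find_element_by_class_name("Notepad")
        window.find_element_by_class_name("Edit").send_keys("Hello from Winium")
    finally:
        driver.quit()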

Page 87:

WinAppDriver

1. Pros
• Familiar syntax and API
• "Native" for Windows
• Free

2. Cons
• "Immature" testing tool
• Complicated (in special-case usage)
• A lack of documentation

Page 88:

QTWebDriver

1. Pros
• Familiar syntax and API
• QT-application oriented / "native" (unique tool)
• Free

2. Cons
• "Immature" testing tool
• Complicated (in special-case usage)
• A lack of documentation

Page 89:

PyWinAuto

1. Pros
• Extremely simple to use (see the sketch below)
• Easy to support
• Free

2. Cons
• Does not support all popular UI technologies
• CPython only
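To illustrate "extremely simple to use", the classic pywinauto example (attribute access resolves windows and controls by best-match name; Notepad is just a convenient target):

    # Start Notepad, type into it, and drive its menu - all via Win32 APIs.
    from pywinauto.application import Application

    app = Application().start("notepad.exe")
    app.UntitledNotepad.Edit.type_keys("Hello from pywinauto", with_spaces=True)
    app.UntitledNotepad.menu_select("Help->About Notepad")
    app.AboutNotepad.OK.click()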

Page 90:

Define a Trend! Is it possible?

Page 91:

Trend

1. There is a potential leader for Desktop Automation

Page 93:

Why so?

Non-technical scientific proof of the Trend

• Peter Drucker, "Management Challenges for the 21st Century"

Note: It's a topic for a whole big conversation, and I'm sure we're going to get back to it, but not today…

Page 94:

How to

1. Use this presentation in different project phases
2. Use this presentation based on the main project roles

Page 95:

What's next (just a possible way)

• Shu
1. Use the Presentation
2. Please follow the recommendations:
   a) "How to select proper criteria for your project"
   b) "How to define the proper tool based on selected criteria"
   c) "How to link information from the presentation to QA Automation metrics"
   d) "How to link information from the presentation to Project Health Check"
   e) "How to link information from the presentation to QA Automation ROI"
   f) "How to use this presentation in different project phases"
   g) "How to use this presentation based on the main project roles"

Page 96:

What's next

• Ha
1. Update the set of criteria
2. Update the set of tools
3. Update the Presentation
4. Read the "scientific" proof of the Trend

Page 97:

What's next

• Ri
1. Re-read the "scientific" proof of the Trend
2. Update the set of criteria
3. Update the set of tools
4. Update the Presentation
5. Predict the "Trend"
6. Manage the "Trend"

Page 98:

Next iteration

• Move from static (Presentation) to dynamic (Application)
• For example, https://telescope.epam.com

Page 99:

Contact

[email protected]
semenchenko_anton_v
https://www.linkedin.com/in/anton-semenchenko-612a926b
https://www.facebook.com/semenchenko.anton.v
https://twitter.com/comaqa

Thanks for your attention

Anton Semenchenko
DPI.Solutions
EPAM Systems

www.comaqa.by
www.corehard.by