Test Automation on Large Agile Projects: It's Not a Cakewalk
DESCRIPTION
Automating regression testing on large-scale agile projects with multiple independent Scrum teams is not a cakewalk. Because there is no single "test team" that performs all the testing (each Scrum team develops and runs independent tests), gaps arise as different automation implementations spring up. One team adds a new function that breaks automated tests, setting back the progress of other teams. Scott Schnier reviews one organization's journey developing a "test community of practice" to coordinate test development and maintenance across Scrum teams. Scott shares the lessons they learned, particularly about selecting tools compatible with both developer and tester needs. Learn how Scott extended the JUnit framework to support automated functional testing and how his teams uphold the standard that a user story is not really done until all its tests are "green" in the continuous integration regression test pipeline. Take back a new appreciation for the challenges, and the solutions, of automating testing on really big agile projects.
TRANSCRIPT
AW12 Concurrent Session 11/7/2012 3:45 PM
"Test Automation on Large Agile Projects: It's Not a Cakewalk"
Presented by:
Scott Schnier Agilex Technologies
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073 888‐268‐8770 ∙ 904‐278‐0524 ∙ [email protected] ∙ www.sqe.com
Scott Schnier Agilex Technologies
Scott Schnier is currently a senior agile practice manager at Agilex Technologies working with other courageous and passionate people to bring agile development practices to the Federal Government. He has held positions of software engineer, director of development, mentor, architect, director of quality assurance, project manager, program manager, agile coach, ScrumMaster, and proxy product owner. Scott’s varied work experience has found him in small startups, Fortune 500 firms, and as a government contractor. A founding member of the Cincinnati chapter of ALN, Scott is currently active in the Washington DC chapter. Scott takes special pleasure in and has a passion for helping people work better together.
9/3/2012
1
Test Automation on Large Agile Projects, It’s not a Cakewalk
Scott Schnier, Agilex Technologies, [email protected]
1
The Story
Test Automation
◦ Growth and division of work and responsibility
◦ Support of Agile values
◦ Lessons learned and victories
There are no “best practices”
2
It’s not a cakewalk
3
The Setting
◦ Core group of Agile/Scrum practitioners
◦ Many staff new to Agile/Scrum
◦ Customer willing to try something new, frustrated with past failures
◦ Government contract, multiple vendors
◦ Driven by legislation
4
Geography
5
Scrum Team Evolution
6
[Diagram: thirteen numbered Scrum teams plus teams labeled R, D, and S, added incrementally over 2½ years]
Test Automation: Why?
7
Keep Development on Track
8
Capture Knowledge
9
Automate repetitive work
10
“Working software over comprehensive documentation”
11
How do you know it still works today?
Why Automate Tests?
◦ Move discovery of defects to the left (earlier)
◦ Respond to emergent requirements
◦ Capture intellectual property (test skill)
◦ Enable test engineers to focus on creative work, not repetitive testing tasks
◦ Make specifications executable and a trusted way to understand the impact of change
12
Test Vision
◦ The scrum team is the value creation engine
◦ Tests are best created in the Scrum team
◦ Test is a skill, not a role
◦ Need to support test while making the scrum team primary
13
Issues on the Journey
◦ Test debt accumulation
◦ Specialized testing tools contribute to segmentation of responsibility
◦ People who do functional testing straddle more than one team
14
Managing Test Debt
◦ Organization
◦ Definition of "Done"
15
How do we get test debt?
16
"How many points is that story?"
"Oh… 8 points plus testing"
Test Debt Avoidance
Size of a story is more accurate with an integrated discussion regarding all of the test and product work.
17
Regression test debt
The story is complete but 30% of the regression tests are broken.
18
Definition of "Done"
◦ To be complete, tests must be done and running green on the Continuous Integration pipeline.
◦ If we make an exception, then "test fixing" stories should be estimated and in the backlog so POs can agree to the exception.
19
20
Testing work straddling teams
Traditional SDLC Workstreams
21
A more Agile Organization
22
Where is testing done?
◦ In the scrum team
◦ Recall definition of done
◦ What happens to the regression tests that accumulate?
23
Organization for Test Management
◦ Partitioned, rotating triage/leadership
◦ Completely partitioned autonomous teams
◦ Scrum teams plus System Test Integration
◦ Dedicated maintenance team
24
End of sprint handoff
[Diagram: at the end of each sprint, Scrum teams 1 through 9 hand regression tests off to the System Test Integration (STI) team and then to a Maintenance team]
Lessons
◦ Organize test sets to support scrum team affinity.
◦ With more than ~5 scrum teams, an integration or "DevOps" team is necessary and good.
◦ Listen for and stamp out opportunities for debt to accumulate.
◦ A Test Community of Practice is valuable and takes work to be effective.
25
Key challenge
26
Design organization/responsibility so that test debt does not accumulate.
Shared responsibility
Functional scrum team
◦ Develops tests
◦ Executes acceptance tests
◦ Promotes to regression
◦ Monitors selective regression projects
◦ Performs impact analysis of new functions/fixes on regression
System Test Integration
◦ Maintains test framework/standards
◦ Executes regression tests (all)
◦ Creates and maintains reusable test components
◦ Helps functional scrum teams with massive breakages
27
Development workflow
1. Update personal workspace
2. Implement change (story/task, defect, test)
3. Test locally: unit, integration, smoke
4. Update personal workspace
5. Resolve conflicts
6. Commit change to repository & test pipeline
7. Monitor key test projects
8. Revert or fix any problems
28
A slice of life
29
Skype chat snippet
[7/2/2012 12:23:28 PM] Dan: Trunk-Dev is broken
[7/2/2012 12:23:45 PM] Dan: anyone working for a fix?
[7/2/2012 12:24:11 PM] Dan: …. ERROR :….….
[7/2/2012 12:27:39 PM] Steve: I'm working with Mike on resolving it
[7/2/2012 12:30:10 PM] Dan: Thanks
Continuous Integration status
30
Continuous Integration
◦ Release build: integration and unit tests (1000's) (commit package)
◦ Rapid smoke test and Regression Lite (20 min) (commit package)
◦ Smoke on each functional test server (>= daily)
◦ Regression Lite on other browsers (>= daily)
◦ Regression Heavy (2 hours) (>= daily)
◦ All other automated regression tests, 1000's (daily)
◦ Semi-automated (100's) & manual tests (10's) (at least each delivery)
◦ Ad-hoc testing, 10's of hours (each delivery): human intuition, UI (CSS & other risks)
31
Ongoing challenge
32
• As the system gets larger, individuals are less likely to feel responsible for, or capable of, fixing broken tests.
• Start with one, two, then three functional scrum teams.
• Months later, a team forms that specializes in a particular component of the system.
• A few more months, another component team arises….
• It becomes harder to maintain the social norm of "Stop and Fix."
• The technical challenges also increase as system complexity grows.
• The need for integration and the risk of integration problems grow at the same time.
Challenges of Scale
◦ People are socially more distant
◦ Technical skills become more focused
◦ Accountability becomes more elusive
◦ One mistake can impact more people, which makes actions more conservative and slows velocity
33
Wisdom
◦ When the number of teams exceeds 7 +/- 2, you need a system-level team focused on test assets/regression
◦ A Test Community of Practice is essential
◦ Organize test sets with an affinity for teams or system components
◦ As the program gets larger, it needs a team with gentle authority to ensure consistency
◦ Ultimately, with > 10 teams you will need to consider multilevel integration; performance/stress testing is a separate team
34
Managing: Tool Specialists
◦ Common tool platform for test developers and product developers
◦ Increase the pool of people who can create or fix a test
35
Why a “new” framework
Conejo Test Framework motivators
◦ Eliminate barriers to "everyone is a tester"
◦ Enable data driving for more resilient tests
◦ Integrate multi-modal testing into one coherent framework: web UI, component, web services, manual
◦ Support the workflow from acceptance test development to regression testing to obsolescence
◦ Integrate all test assets
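The "data driving" motivator above can be illustrated with a minimal sketch: one piece of test logic runs over a table of inputs and expected results, so a requirement change means editing data rows rather than test code. The quantity-validation rule and all names below are hypothetical, not part of the Conejo framework.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal data-driven test sketch: the same assertion logic is driven by a
// table of cases. The business rule here is purely illustrative.
class DataDrivenSketch {

    static boolean isValidQuantity(int qty) {
        return qty > 0 && qty <= 100;   // hypothetical rule: 1..100 allowed
    }

    public static void main(String[] args) {
        // Each entry: input quantity -> expected validity.
        Map<Integer, Boolean> cases = new LinkedHashMap<>();
        cases.put(1, true);     // lower boundary
        cases.put(100, true);   // upper boundary
        cases.put(0, false);    // just below range
        cases.put(101, false);  // just above range

        for (Map.Entry<Integer, Boolean> c : cases.entrySet()) {
            boolean actual = isValidQuantity(c.getKey());
            if (actual != c.getValue()) {
                throw new AssertionError("unexpected result for qty=" + c.getKey());
            }
        }
        System.out.println(cases.size() + " data-driven cases passed");
    }
}
```

Adding a new case is one new table row, which keeps the test resilient as the rule evolves.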
36
Test Design Goals
◦ Minimize collateral code; focus on the test target
◦ Enable tests to quickly respond to changes in the Application Under Test
◦ Easy to understand when it breaks (reuse common patterns, canonical test classes)
37
Test types

                                 Unit               Integration        Functional
Scope                            Class              Component(s)       System
Persistence                      No                 Maybe              Yes
Author                           Self               Anyone             Not the author
Tests system interface           No                 No                 Yes
Traceable to epic/story/defect   No                 Maybe              Yes
Execution                        Pre-release build  Pre-release build  Post-release build*
38
Test Architecture
39
[Diagram: layered test architecture: Tests and Utility Classes call Interface Classes, which build on the Test Framework (Se, i.e. Selenium, and JUnit), which exercises the Application Under Test]
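The "Interface Classes" layer in the architecture above can be sketched page-object style: tests express intent (log in), while only the interface class knows the UI locators, so a change in the Application Under Test is absorbed in one place. FakeUi below is a stand-in for a real Selenium driver, and every class, method, and locator name in this sketch is illustrative, not from the talk.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of an interface class insulating tests from UI details.
class InterfaceClassSketch {

    // Minimal stand-in for a browser driver such as Selenium WebDriver.
    static class FakeUi {
        Map<String, String> fields = new HashMap<>();
        String submittedUser;
        void type(String locator, String text) { fields.put(locator, text); }
        void click(String locator) { submittedUser = fields.get("id=username"); }
    }

    // Interface class: the only place that knows the locators, so a UI
    // change to the login form touches this class and no tests.
    static class LoginPage {
        private final FakeUi ui;
        LoginPage(FakeUi ui) { this.ui = ui; }
        LoginPage enterCredentials(String user, String pass) {
            ui.type("id=username", user);
            ui.type("id=password", pass);
            return this;
        }
        void submit() { ui.click("id=login-button"); }
    }

    public static void main(String[] args) {
        FakeUi ui = new FakeUi();
        // A test reads as intent, not as a sequence of locator operations.
        new LoginPage(ui).enterCredentials("buyer1", "secret").submit();
        System.out.println("submitted as " + ui.submittedUser);
    }
}
```

This is one way to meet the design goal above of letting tests "quickly respond to changes in the Application Under Test."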
Test Execution
40
@Test
@TestHeaderInfo(description = "Test Buyer Sequence", …
    functionalArea = {"Buyer", "Order"})
public void testBuyerSequence() {
    // Test code goes here....
}

mvn test -Dconejo.filter.functionalAreas="Seller"

@Test
@TestHeaderInfo(description = "Test Seller Sequence", …
    functionalArea = {"Seller"})
public void testSellerSequence() {
    // Test code goes here....
}
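The filtering shown above can be sketched in plain Java. The slides do not show how @TestHeaderInfo is defined inside the Conejo framework, so the retention policy, the field names beyond those visible in the examples, and the FunctionalAreaFilterDemo/select helper below are assumptions for illustration only.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of a @TestHeaderInfo-style annotation: runtime retention so a
// filter can inspect it reflectively before deciding which tests to run.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface TestHeaderInfo {
    String description();
    String[] functionalArea();
}

class FunctionalAreaFilterDemo {

    static class BuyerSellerTests {
        @TestHeaderInfo(description = "Test Buyer Sequence",
                        functionalArea = {"Buyer", "Order"})
        public void testBuyerSequence() { /* test code goes here */ }

        @TestHeaderInfo(description = "Test Seller Sequence",
                        functionalArea = {"Seller"})
        public void testSellerSequence() { /* test code goes here */ }
    }

    // Mimics -Dconejo.filter.functionalAreas="Seller": keep only the test
    // methods whose functionalArea array contains the requested area.
    static List<String> select(Class<?> testClass, String area) {
        List<String> matches = new ArrayList<>();
        for (Method m : testClass.getDeclaredMethods()) {
            TestHeaderInfo info = m.getAnnotation(TestHeaderInfo.class);
            if (info != null && Arrays.asList(info.functionalArea()).contains(area)) {
                matches.add(m.getName());
            }
        }
        return matches;
    }

    public static void main(String[] args) {
        System.out.println(select(BuyerSellerTests.class, "Seller"));
        System.out.println(select(BuyerSellerTests.class, "Order"));
    }
}
```

In the real framework the filter would plug into JUnit's execution rather than run standalone, but the selection idea is the same: the annotation carries metadata, and a system property chooses the slice of tests to run.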
Product and “Anti-Product”
41
[Diagram: Product and Test mirrored as product and "anti-product", each with its own DBA, Operations, and Architecture concerns]
It’s not a cakewalk
42
Wrap Up
◦ Multiple motivations
◦ Test is the product's anti-matter
◦ Needs to be approached as a first-class component of the solution
◦ Complex organizational and technical concerns
◦ Part of the secret sauce of a successful Agile effort
43
Questions? – Ideas!
44