Agile DW/BI Testing -- Just Get Started!
When quality and testing are moved up front on a project, everyone enjoys a higher-quality result. Here's how your team can be happier with an agile approach to testing, even before implementing key automation tools.
By Lynn Winterboer and Ken Collier
[Editor's note: Lynn Winterboer and Ken Collier are leading a session at the TDWI World Conference in San Diego (August 18-23, 2013), Agile BI: Test Automation and Test-Driven Database Development. We asked them to share a realistic scenario of how agile testing works in the real world.]
Rather than explain the well-understood theory and benefits of agile, let's examine how agile testing works in the real world and see how agile business and delivery teams reap the benefits of the methodology and have more fun in the process.
Beth, a sales analyst, has an existing report showing the total number of orders and related income by sales rep by month. The director of sales, a key user of this report, has noticed that over the past six months the number of orders increased an average of 1.5 percent each month while total income increased only 0.75 percent per month. He hypothesizes that customer discounts may explain this and asks Beth to provide insight into discounts.
Because the present report does not contain discount data, Beth asks the BI product owner for an enhanced report with two additional fields. The new user story reads, "As director of sales, I need to see the total value and percent of discounts applied to orders by sales rep each month, so that I can determine if reps are applying discounts outside normal sales guidelines."
This BI team uses an agile approach to deliver frequent value to business stakeholders, which involves several important practices including behavior-driven development, representative data sets, and frequent and early testing. This team is still establishing a robust agile development environment with rigorous version control, test automation, and a dedicated continuous integration server. Nonetheless, they are eager to get started with agile testing practices and then evolve toward better automation.
Behavior-Driven Development (BDD)
Before the team begins development on the new story, Beth pairs with the team's lead tester to define how the team will know they are done and have met the director's needs. Together they craft a set of acceptance criteria that will determine when the objective has been met. This team uses a behavior-driven development approach for story specification. BDD tests follow the pattern, "Given [some initial context], when [action performed or event occurs], then [expected result occurs]." Examples of acceptance criteria include:
- Order discounted: Given an order for $200, when the discount is $50, then the report should reflect a discount of $50 and 25% as new fields appended to the right of the existing columns.
- Order not discounted: Given an order for $300, when there is no discount applied (i.e., the discount value is $0 or [null]), then the report should reflect a discount of $0 and 0% as new fields appended to the right of the existing columns.
- Negative discount: Given an order for $100, when the discount is -$1.00, then the report should reflect a discount of -$1.00 and -1% as new fields appended to the right of the existing columns, as well as highlight the order in red and show a red flag next to any roll-ups that include that order. (Note that the BI team would not have known about this specific need from Beth's initial requirements alone. The discussion with the test lead, who is trained to look for oddities and anomalies in data, prompted Beth to consider a case she normally wouldn't have thought about until she saw the issue in the final report.)
When acceptance criteria are formulated in this BDD style, they are easily captured as automated test cases using testing frameworks such as the open source Cucumber and others.
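To make the pattern concrete, here is a minimal sketch of those three acceptance criteria as executable given/when/then checks in Python. The `report_row()` helper and its field names are hypothetical stand-ins for whatever the team's reporting layer actually exposes; a Cucumber team would express the same criteria in Gherkin feature files instead.

```python
# Hypothetical helper standing in for the enhanced report's discount logic.
def report_row(order_total, discount):
    """Compute the discount fields the enhanced report would append."""
    discount = discount or 0  # treat a null discount as $0
    pct = round(discount / order_total * 100, 2) if order_total else 0
    return {
        "discount_amount": discount,
        "discount_percent": pct,
        "flagged": discount < 0,  # negative discounts get the red flag
    }

def test_order_discounted():
    # Given an order for $200, when the discount is $50 ...
    row = report_row(200, 50)
    # ... then the report reflects a discount of $50 and 25%
    assert row["discount_amount"] == 50
    assert row["discount_percent"] == 25

def test_order_not_discounted():
    # Given an order for $300, when the discount is $0 or null ...
    for discount in (0, None):
        row = report_row(300, discount)
        assert row["discount_amount"] == 0
        assert row["discount_percent"] == 0

def test_negative_discount():
    # Given an order for $100, when the discount is -$1.00 ...
    row = report_row(100, -1)
    assert row["discount_amount"] == -1
    assert row["discount_percent"] == -1
    assert row["flagged"]  # the order is highlighted for review
```

Because each test reads directly from an acceptance criterion, Beth can review the test names and comments and confirm they match her intent without reading the implementation.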
Representative Data Sets
To automate these test cases, the team needs a test data set that is as small as possible while still containing a representative sample of actual production data. Because these functional tests are not testing performance and load, they only need a small test set to reflect the spectrum of data found in production. The resulting test data includes at least one order representing each of the following discounts: $0, $1, $1,000,000, [null], and -$1. There may very well be additional data sets Beth and the team decide to include.
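Such a data set can be tiny. The sketch below shows what it might look like as plain Python records; the column names are illustrative assumptions, and a real order table would carry many more fields.

```python
# A small, representative test data set covering the discount spectrum:
# $0, $1, $1,000,000, null, and -$1.
TEST_ORDERS = [
    {"order_id": 1, "order_total": 500, "discount": 0},          # no discount
    {"order_id": 2, "order_total": 500, "discount": 1},          # smallest discount
    {"order_id": 3, "order_total": 2_000_000, "discount": 1_000_000},  # very large
    {"order_id": 4, "order_total": 500, "discount": None},       # null discount
    {"order_id": 5, "order_total": 500, "discount": -1},         # negative (anomaly)
]

def anomalous_orders(orders):
    """Return orders with a negative discount -- candidates for the red flag."""
    return [o for o in orders if (o["discount"] or 0) < 0]
```

Five rows are enough to exercise every branch of the discount logic, which is why these functional tests can run in seconds rather than against a full production load.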
Once the tests and test data are defined, the developers can start working on the request. They ask Beth if she can plan on coming in that afternoon to take a look at their results.
Two of the team's developers will bring the discount data from the order management system into the data warehouse (DW), add the discount data to the reporting infrastructure, and update the sales report. They have a short meeting with the team's lead data modeler to determine where the new fields should reside in the DW and BI data models.
The developers take a short time to do their work and immediately unit test their code. They discover that one of them set the discount-percent field type incorrectly, and fix it immediately. Once the code has passed unit testing, they integrate their code on the development server and run the pre-defined test cases using the data sets Beth and the test lead provided. The tests pass. (Note that BI teams can leverage automated unit testing tools to save developer time. We advocate using version control and a dedicated integration sandbox; however, teams without this agile infrastructure can still get started.)
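The field-type mistake is the kind of defect a tiny unit test can guard against. The sketch below assumes a hypothetical `STAGING_SCHEMA` dictionary standing in for the warehouse's actual column metadata; the point is that an integer-typed percent column would silently truncate values like 0.75%.

```python
# Hypothetical DW column metadata for the new discount fields.
STAGING_SCHEMA = {
    "discount_amount": "DECIMAL(10,2)",
    "discount_percent": "DECIMAL(5,2)",  # the bug: this had been INTEGER
}

def test_discount_percent_is_decimal():
    """An INTEGER discount_percent would truncate 0.75% to 0%."""
    assert STAGING_SCHEMA["discount_percent"].startswith("DECIMAL")
```

A check like this takes minutes to write, and once it joins the suite the same mistake cannot silently recur in a later change.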
Later, Beth smoke tests the updated report on the demo environment and notices that some anomalous negative discounts are appearing. She asks the team to suppress them from the report and provide an exception report for these records so they can be addressed separately.
The team is able to quickly suppress these negative discount records, re-run the tests, and show Beth the results. However, they agree that the exception report request represents a new user story, which should be prioritized into the backlog for a future iteration.
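The suppression itself can be a simple filter in the report's query layer. This is a hypothetical sketch, not the team's actual code; it splits rows into what the report displays and what the future exception report would receive.

```python
def suppress_negative_discounts(rows):
    """Split report rows into (displayed, exceptions).

    Rows with a negative discount are withheld from the main report
    and routed to the exception list for separate handling.
    """
    displayed = [r for r in rows if (r["discount"] or 0) >= 0]
    exceptions = [r for r in rows if (r["discount"] or 0) < 0]
    return displayed, exceptions
```

Keeping the suppression separate from the exception report lets the team ship the quick fix now while the new user story waits in the backlog.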
Beth stops by the director's office to let him know the Discount report will be in production with the next release. He is relieved to hear that, and mentions how nice it is to get answers so quickly these days. It's so much easier to do his job now that the BI team doesn't need six months to deliver anything useful!
As this scenario demonstrates, the benefits to DW/BI teams of these agile testing techniques include:
- Clarity: Tests written at the beginning of the project provide clarity to all involved about where the "goal line" is for each requirement:
- Acceptance criteria are the definition of "done"
- Passing tests are the measure of "done"
- Regression tests are the measure of "still done"
- Quality: Focusing on quality and testing up-front ensures a higher level of quality in the resulting product by eliminating the tendency to "squeeze" tail-end testing when schedules get tight
- Feasibility: Testing with small, representative data sets reduces the complexity of large data loads and related server infrastructures
- Alignment: Frequent testing provides timely results to users and quick feedback to developers, reduces the time it takes a developer to respond to defects, and keeps users and the delivery team closely aligned on delivering value together
- Regression Testing: Each set of tests is added to the project's test suite and re-run for regression testing with every new development delivery; the ability to regression test confidently reduces the risks of future changes and enhancements
Furthermore, by driving testing into the story specification conversation early, testers become better partners with product owners and developers.
There's one final benefit we've seen in our work: those business and delivery teams surrounded by passing tests have more fun!
Lynn Winterboer teaches and coaches DW/BI teams on how to effectively apply agile principles and practices to their work. She is the founder of Winterboer Agile Analytics, and can be reached at Lynn@LynnWinterboer.com.
Ken Collier is the agile analytics practice lead for ThoughtWorks, a company specializing in custom software and solutions development. Ken can be reached at KenCollier@theagilist.com.