
Test Automation: Let Service Be Your Middle Man

Tests should be automated. You know it. I know it. Agile methods insist on it. Yet, all too often we don't do enough of it, don't do it soon enough, or, worse, don't do it at all. I believe one big reason why we fall short is that we tend to automate at the wrong level. Most teams focus all their energy on unit testing and UI testing, while ignoring service-level testing altogether.

To see why this middle layer is so valuable, let’s look at each layer of the test automation pyramid a bit more closely.

[Figure: the test automation pyramid, with unit tests at the base, service tests in the middle, and UI tests at the top]
Unit Testing

Unit testing forms the foundation of the test automation pyramid. As such, it should be the largest part of a solid test automation strategy. Automated unit tests are wonderful because they give specific data to a programmer—there is a bug and it’s on line 56. Programmers have learned that the bug may really be on line 54 or 62, but it’s much nicer to have an automated unit test narrow it down than it is to have a tester say, “There’s a bug in how you’re retrieving member records from the database,” which might represent 1,000 or more lines of code. Also, because unit tests are usually written in the same language as the system, programmers are often most comfortable writing them.
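As a minimal sketch, pytest-style unit tests against a hypothetical Calculator class might look like the following. The class and its methods are illustrative stand-ins, not code from the article:

```python
# A toy Calculator standing in for production code; in a real project
# these tests would import the class under test rather than define it.
class Calculator:
    def multiply(self, a: int, b: int) -> int:
        return a * b

    def divide(self, a: float, b: float) -> float:
        return a / b

# pytest-style unit tests: a failure here points at one small,
# specific piece of code rather than a thousand-line feature.
def test_multiply():
    assert Calculator().multiply(6, 7) == 42

def test_divide():
    assert Calculator().divide(9, 2) == 4.5
```

When one of these assertions fails, the test name and line tell the programmer exactly which operation broke, which is the narrowing-down benefit described above.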

User Interface Testing

In contrast, we want to do as little automated user interface testing as possible. Why? Because it’s brittle, expensive, and time consuming. Suppose, for example, we wish to test a very simple calculator that allows a user to enter two integers, click either a multiply or divide button, and then see the result of that operation. To test this through the user interface, we would script a series of tests to drive the user interface, type the appropriate values into the fields, press the multiply or divide button, and then compare expected and actual values. It works, but it is not ideal.
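The drive-the-UI flow just described can be sketched as follows. In practice a browser automation tool such as Selenium or Playwright would drive a real screen; here a tiny fake CalculatorUI class (purely hypothetical) stands in so the shape of the test is visible without a browser:

```python
class CalculatorUI:
    """Tiny stand-in for a real GUI; a real test would drive actual widgets."""
    def __init__(self):
        self.fields = {"a": "", "b": ""}
        self.display = ""

    def type_into(self, field: str, text: str) -> None:
        self.fields[field] = text

    def click(self, button: str) -> None:
        a, b = int(self.fields["a"]), int(self.fields["b"])
        self.display = str(a * b) if button == "multiply" else str(a / b)

# Every UI-level case repeats the same type/click/read steps,
# which is what makes this style slow and brittle at scale.
def test_multiply_through_ui():
    ui = CalculatorUI()
    ui.type_into("a", "6")
    ui.type_into("b", "7")
    ui.click("multiply")
    assert ui.display == "42"
```

Note how much ceremony surrounds a single multiplication check; multiply that by a dozen cases and the cost of testing everything through the UI becomes clear.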

Plus, testing an application this way is partially redundant—think about how many times a suite of tests like this will test the user interface. Each test case will invoke the code that connects the multiply or divide button to the code in the guts of the application that does the math. Each test case will also test the code that displays results. And so on. Testing through the user interface like this is expensive and should be minimized.

Service-Level Testing

That doesn’t mean we don’t need to test this kind of feature, though. We just need to find a way to run these test cases outside of the user interface. And this is where the service layer of the test automation pyramid comes in. In the way I’m using it, a service is something the application does in response to some input or set of inputs. Our example calculator involves two services: multiply and divide. Service-level testing is about testing the services of an application separately from its user interface. So instead of running a dozen or so multiplication test cases through the calculator’s user interface, we instead perform those tests at the service level. This is much more effective and much less cumbersome than trying to perform these same tests at the user interface level.
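As a sketch, those same multiplication cases can be run as a simple table against the service directly, with no UI in the loop. The service functions here are illustrative, assuming the calculator's logic is callable apart from its screen:

```python
# Hypothetical service functions: what the application does in
# response to inputs, independent of any user interface.
def multiply_service(a: int, b: int) -> int:
    return a * b

def divide_service(a: float, b: float) -> float:
    return a / b

# A table of cases costs one line each at the service level, versus a
# full scripted UI walk-through per case at the top of the pyramid.
MULTIPLY_CASES = [(2, 3, 6), (-4, 5, -20), (0, 9, 0), (7, 8, 56)]

def test_multiply_cases():
    for a, b, expected in MULTIPLY_CASES:
        assert multiply_service(a, b) == expected

def test_divide_service():
    assert divide_service(10, 4) == 2.5
```

Adding a new multiplication case is one more row in the table, which is exactly why this layer carries the bulk of functional test coverage so cheaply.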

Automated unit testing is wonderful, but it can cover only so much of an application's testing needs. User interface testing is often necessary, but should be used only in small doses. Service-level testing fills the gap between unit and user interface testing, giving teams the test automation they need, when they need it, with minimal effort and cost.

Article Rating

Current rating: 4.5 (4 ratings)
Comments
Tim Colton
Nice article.
5/6/2016 9:21:29 AM

Mike Cohn
You're welcome, Rohan.

And absolutely. Testers should write the test cases either concurrently with programmers writing code or even before (in Acceptance Test-Driven Development). And testers and programmers can even pair occasionally to ensure a shared understanding of the functionality being built.
4/10/2015 11:39:26 AM

Rohan Bhokardankar
Thanks, Mike, for the reminder about the importance of service-level testing.

I also recommend that testers write the test cases, particularly for the service layer, from their own interpretation of the requirements, and sit beside the developers to watch the functionality develop, updating the test cases if required.

Ultimately, when the code is delivered, all the tests can be automated and run in one go. At times the developers can also run them and fix the bugs themselves!
4/8/2015 9:55:12 AM

Mike Cohn
Hi Mike--

I agree--testing at the service-level can definitely be the most rewarding. Thanks for your comments.
3/16/2015 2:48:44 PM

Mike Dwyer
Mike
Service-level testing can be the most rewarding, as it is possible to use the interface document as a very nice test architecture, and it has even produced some cool products.
Working from the simplest of associations, say the 3270 protocol and the emulators that came from it: you can trace great documentation that created code that passed a test of correspondence, which allowed software test harnesses to thoroughly exercise both sides of the conversation and led to the creation of the 3270 emulator (http://en.wikipedia.org/wiki/3270_emulator).
It was a great time and very satisfying. I hope others take a couple of steps down this path.
(YIKES is he that OLD!?!?!)
3/15/2015 6:21:18 PM

Mike Cohn
Thanks, Joel. I'm glad you liked this.

What I do is gradually get teams to try to improve their definitions of done. Maybe their definition of done doesn't include much (or any!) automated testing at first but gradually it's added at different layers until each new user story is being fully and adequately tested at all appropriate layers.
2/20/2015 11:18:27 PM

Joel Maslyn
Thank you for this article. There are many opinions, but there is no doubt that testing at what you call the service level does in fact result in less maintenance. Others may call it the sweet spot, where your return on investment is greater because there is less rework.

With that said, in our case we automated the API Layer to validate the methods/calls/packets. This helped us eliminate the issues coming from the client side when we did our UI testing.

By far, you get the most bang for your buck with unit testing.

The problem that we one day need to cover is that the acceptance criteria for any of these levels have to be clearly defined, and the task itself should be a part of the sprint.

Too many times I see code being checked in without a firm requirement that the test is also part of the acceptance criteria.

That is another story, but thank you for your article and pragmatic view to automation.
2/17/2015 3:24:18 PM

Ronald
Hi Mike, none of us has yet found the holy grail of user interface testing, as indeed it is hard. Yet on the other hand it is required, so we keep searching for it.
Even with the proper service tests in place (next to the unit tests), we find it necessary to have UI tests. As an example: the service test may produce a positive outcome, but the UI in IE8 or on an Android phone (just to name an example) may fail, while other UIs/browsers may succeed.
Automating the UI testing may prove invaluable, as it is next to impossible to perform regression testing at the UI level by hand for all supported browsers of a major company (even though this is only a small subset of all the available browsers).
Could you please elaborate on how you expect to solve this UI-testing problem?
1/16/2015 12:49:42 AM

Mike Cohn
Hi Deepak--

Thanks for your comments.

In general, yes, I think it would be wise for a tester to focus on any non-user interface tests first because the UI is the most likely to change. (Even if it changes only in small ways that can dramatically affect the tests.)

Of course, I'd like all forms of tests automated within the sprint within which the coding is done. But I'd emphasize the non-UI tests in particular.
12/13/2014 2:10:15 PM

Deepak Joshi
Hi Mike,

You are correct. I have observed that during testing, testers want to ensure that the events behind buttons/links work correctly and that the intended processes are executed through these events. They also want to verify that user inputs are captured correctly through the UI. In the same sequence they also verify the results of these events. To minimize the initial effort, UI and service-level testing are combined in one automated suite, and yes, this becomes costly and time consuming in later stages. Also, in general, testing the user interface requires human intervention, because the UI cannot be truly evaluated without it, especially to verify the look and feel. To reduce the cost, as you suggested, testers can focus more on service-level testing and then move on to the required automated user interface testing.
Do you think it is a better idea to first complete service-level testing, ensuring the functionality works correctly, and then, as a next step, extend this by adding user interface testing? This would reduce the time consumed, and the test suite could also be executed across multiple browsers.

Thanks and best regards
12/8/2014 8:59:30 AM
