Lean testing automation in 2018: what are the key challenges?

If you were to characterise lean testing with a single phrase, what would it be? I’d use a motto you’ve probably heard: think big, act small, fail fast, and learn rapidly. As an integral process of lean software development, lean testing is an approach that values effectiveness above all. It prioritizes flexibility and adaptability over schedules, documentation, and clearly-defined roles.

Given the focus on quick response to change, the lean approach to software testing is reminiscent of other QA methodologies in Agile. That said, it makes more sense to compare lean testing to a more old-school methodology like traditional enterprise testing.


Lean testing versus traditional testing

Main goal
  Lean testing: a flexible quality assurance strategy that’s compatible with quick, frequent releases.
  Traditional testing: a comprehensive, clearly defined quality assurance strategy with test policies, strategies, and best practices in place.

Preferred tools
  Lean testing: a regression testing framework or tool plus a bug tracker; the key focus is on simplicity and maintainability.
  Traditional testing: tools for test management, bug/issue management (bug trackers), and time management.

Documentation
  Lean testing: test automation precedes feature development (TDD) and/or happens in parallel with it; TDD cases become the de facto documentation.
  Traditional testing: testers write test documentation during the early stages of the project and get it approved by stakeholders.

Tester’s role
  Lean testing: dedicated QA engineers or team members with combined roles (e.g. development, customer support) who take ownership of software testing; lean teams can also resort to crowdsourced testing.
  Traditional testing: dedicated teams of engineers handling integration, system, performance, and acceptance testing; outsourcing of QA roles is a widespread practice.

Test planning
  Lean testing: ad hoc/just-in-time planning is acceptable.
  Traditional testing: scheduled, formal, and carried out by a dedicated team.

Bug prioritization
  Lean testing: frequent re-prioritization based on feedback during iterations.
  Traditional testing: a standard procedure used throughout the company.

Bug tracking
  Lean testing: bugs are filed by any team member and tracked via a dedicated tool, CRM, and/or bug reports.
  Traditional testing: bugs are filed by testers only and tracked via a dedicated tool.


So, technically, we’re talking about a software testing methodology that puts freedom over process. But how does this approach stand up to the harsh reality of testing automation in 2017–2018?

Based on what we can all observe in testing automation in 2017, the very principles of Lean will pose challenges for lean testing teams. To a large extent, it’s UI testing that could benefit most drastically from “getting leaner”, which is why this post will focus on it.


In 2018, the Lean principles reflect the key challenges of lean testing

I’m sure you’re well-acquainted with the principles of Lean development, but let’s list them here for good measure:

  • Eliminate waste by either improving or getting rid of anything that requires micromanagement or causes unnecessary work, rework, delays, or feature bloat.
  • Deliver as fast as possible to get early feedback on your product.
  • Decide as late as possible to deal with fewer assumptions and more facts.
  • Amplify learning. In lean development and lean testing, team members learn from experience with the product, not documentation.
  • See the whole by looking at how individual tools and processes influence the product outside of their narrow scope.
  • Empower the team by encouraging team members to take ownership of the product, not just write code or close tasks.
  • Build integrity by promoting clarity and simplicity via efficient goal-setting, refactoring, and automated testing.

When it comes to software testing — and UI testing in particular — these seem good on paper. But does the QA industry have the best practices and the tools to bring these principles to real-life product development? Let’s take a closer look at the four problems that prevent us from bringing these seven principles into the reality of automated software testing.


Problem: brittle Selenium tests are wasteful

Speaking of waste in software testing, manual testing is the first thing that comes to mind. It causes rework, it produces unnecessary cognitive load, and it’s too inefficient to guard your product against bugs. But are hand-written UI tests different in this respect? If Selenium and Selenium-based frameworks are so great, why do QA automation teams fall back on manual testing?

With Selenium, you end up rewriting dozens of tests whenever someone deletes an ID or changes the parent-child structure of a UI element. Given how often these things happen, brittle test code often turns into a major headache. What’s worse, you also have to keep track of what test cases get affected by every UI update. Doesn’t sound lean at all to me…

Solution forecast for 2018: a software testing tool with robust locators

What if you could capture all selectors for each element and automatically swap valid locators for broken ones? Isn’t this functionality long overdue in 2018? Screenster, the software testing solution we’ve built, has had this functionality since 2016. Knowing how important it is to make UI tests less brittle, I’m sure other tools and frameworks will prioritize this in 2018.
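To make the idea concrete, here’s a minimal sketch of locator fallback in Python. This is an illustration of the principle, not Screenster’s actual implementation: the `find` callable and the fake page below are hypothetical stand-ins for Selenium’s `driver.find_element`.

```python
# Sketch of a "self-healing" locator: record several selectors per element,
# then fall back to the next candidate when the primary one breaks.
# (Illustrative only -- not any tool's actual implementation.)

def find_with_fallback(find, locators):
    """Try each (strategy, value) pair until one matches.

    `find` is any callable that returns an element or raises LookupError,
    e.g. a thin wrapper around Selenium's driver.find_element.
    """
    last_error = None
    for strategy, value in locators:
        try:
            return find(strategy, value)
        except LookupError as exc:
            last_error = exc  # locator broke -- try the next candidate
    raise last_error or LookupError("no locator matched")


# A fake page where the element's ID was deleted but its CSS path survives.
page = {("css", "form.login > button.submit"): "<button>"}

def fake_find(strategy, value):
    if (strategy, value) not in page:
        raise LookupError(f"{strategy}={value} not found")
    return page[(strategy, value)]

element = find_with_fallback(
    fake_find,
    [("id", "submit-btn"),                    # broken: the ID was removed
     ("css", "form.login > button.submit")],  # still valid
)
```

The point is that a deleted ID no longer forces a rewrite: the test degrades to the next recorded selector instead of failing outright.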


Problem: hand-coded UI tests are too slow for short development iterations

Remember failing fast and learning rapidly? Short development iterations and frequent releases help you get early feedback on the product, which amplifies learning from real-life experience and enables fact-driven decision-making. Frequent development iterations are also integral to fast delivery.

When it comes to fast-paced development cycles, the big question is whether modern-day testing techniques can keep up. True, TDD works well at the unit level, but the scope at the GUI level is too vast for hand-coding.

Solution forecast: faster test automation makes IDEs cool again

Faster test creation is the only way to optimize the UI testing process for lean development. This pretty much rules out hand-coded tests, but does this mean that 2018 may see the comeback of record-playback solutions? Maybe, but not in the way you expect.

To entice testers to choose low-code solutions over Selenium, platforms need to offer much more than mere record-playback or screenshot comparison. Namely, UI testing will become more efficient with intelligent content comparison, DOM comparison, and automatic handling of timeouts and dynamic regions. Naturally, none of this will matter unless you can automate a real-life test in under 5 minutes.
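“Automatic handling of timeouts” mostly boils down to one pattern: polling with a deadline instead of fixed sleeps. Here’s a hedged, self-contained sketch of that core idea (the same principle behind Selenium’s `WebDriverWait`; the function name and defaults are mine):

```python
import time

def wait_until(condition, timeout=5.0, poll=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` expires.

    Replacing fixed sleeps with a deadline-bounded poll means a test waits
    exactly as long as the UI needs -- and no longer.
    """
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError(f"condition not met within {timeout}s")
        time.sleep(poll)


# Example: a value that only becomes available after a short delay.
started = time.monotonic()
value = wait_until(lambda: "ready" if time.monotonic() - started > 0.3 else None)
```

A tool that applies this pattern automatically, per element, is what removes the flaky `sleep(5)` calls that slow hand-coded suites down.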


Problem: with hand-coded UI tests, you don’t “see” the whole

I’ve talked about this in my post on happy path testing, but let me say it again: coded tests don’t really “see” the whole UI. The best you get is a fraction of the UI state under test.

Whether you’re using Selenium or a higher-level code-based web automation tool, your tests only deal with what they explicitly target. All they do is check if the key elements are there and do what they’re expected to. In the meantime, they ignore all sorts of visual bugs and issues with content and DOM. Obviously, this is far from seeing the whole.

Solution forecast: intelligent visual testing and DOM testing

While WebDriver automates user actions, visual testing catches visual bugs that indicate problems with the underlying logic. By simulating how humans see the UI, visual testing tools can significantly improve lean software testing in 2018.

There’s one thing to note, though. When I say “visual testing”, most people hear “screenshot comparison”, but these two things are different. Screenshot comparison is a must-have, but screenshot comparison tools can’t differentiate between content- and CSS-related differences or associate UI regions with DOM elements. It’s this functionality that will contribute to efficient software testing strategies in 2018 and beyond.
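To show why plain screenshot comparison needs help with dynamic regions, here’s a toy diff over pixel grids that skips masked rectangles. This is a simplified illustration of the principle, not any tool’s actual algorithm:

```python
def diff_screenshots(baseline, current, ignore_regions=()):
    """Compare two "screenshots" (2-D grids of pixel values) and return the
    (x, y) coordinates that differ, skipping any (x0, y0, x1, y1) rectangles
    in `ignore_regions` -- the dynamic regions (ads, timestamps, carousels)
    that would otherwise make every single run fail.
    """
    def ignored(x, y):
        return any(x0 <= x < x1 and y0 <= y < y1
                   for x0, y0, x1, y1 in ignore_regions)

    return [(x, y)
            for y, (brow, crow) in enumerate(zip(baseline, current))
            for x, (b, c) in enumerate(zip(brow, crow))
            if b != c and not ignored(x, y)]


# 3x3 "screenshots": pixel (0, 0) is a dynamic timestamp that always
# changes, while pixel (2, 2) is a genuine visual regression.
baseline = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
current  = [[9, 1, 1], [1, 1, 1], [1, 1, 9]]

diffs = diff_screenshots(baseline, current, ignore_regions=[(0, 0, 1, 1)])
```

Without the mask, both pixels would be flagged and the run would fail on noise; with it, only the real regression surfaces. Mapping those surviving regions back to DOM elements is the “intelligent” part that goes beyond raw pixel diffing.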


Problem: UI testing automation introduces unnecessary complexity

I’ve met quite a few sceptics who say that codeless platforms are a crutch for non-programmers. This is only partially true because the unnecessary complexity of coded UI testing doesn’t just come down to its reliance on programming.

Programming is actually the fun part. What’s not fun, though, is having to maintain a convoluted module-based infrastructure that breaks whenever you update one of the modules. What’s even less fun is having to keep WebDriver versions in sync with an auto-updating browser.
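For one concrete flavor of that maintenance tax: since Chrome 73, ChromeDriver releases track Chrome’s major version, so the most common mismatch can be caught with a trivial check. The sketch below is a simplified illustration (real compatibility rules have more nuance), with a hypothetical function name:

```python
def versions_compatible(browser_version, driver_version):
    """Quick sanity check for browser/driver drift.

    ChromeDriver's major version matches Chrome's major version (from
    Chrome 73 onward), so comparing major versions catches the classic
    "browser auto-updated overnight, tests broke" failure before a run.
    """
    return browser_version.split(".")[0] == driver_version.split(".")[0]


# An auto-updated browser with a stale driver gets flagged up front:
ok = versions_compatible("114.0.5735.90", "114.0.5735.16")
stale = versions_compatible("115.0.5790.98", "114.0.5735.16")
```

It’s a tiny check, but running it at suite startup turns a cryptic session error into an actionable one-line message.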

Solution forecast: simpler UI testing solution

In the case of UI testing tools, “simple” doesn’t just mean “codeless”. Rather, it means a solution whose core functionality requires little maintenance and works out of the box. Simplicity stems from integrity, and it lets testers focus on catching bugs rather than tinkering with automation tools. That’s exactly what lean testing in 2018 should be about.


What’s your strategy for making UI testing lean in 2018?

To make UI testing lean, you need a solution that offers comprehensive coverage of the UI while keeping the maintenance footprint low. In theory, a testing automation solution that meets the requirements of lean testing could be code-based, but our own experience has proved otherwise.

Last year, we released a low-code UI testing platform optimized for lean development environments. The roadmap of this project essentially reflects the solution forecasts mentioned in this post. So if you’re excited about the idea of robust self-healing UI tests and intelligent visual testing, check out our demo and see how these features work in real life.


Lean testing automation in 2018: what are the key challenges? was last modified: April 17th, 2018 by Ilya Goncharov
