How to Rescue Failed Test Automation?
Tips on how to rescue failed test automation technically and, more importantly, how to address the subtle human factors
Update (2022–10–04): Case Study: Rescuing an Unreliable 20-Hour Automated Regression Run in Jenkins ⇒ a Highly Reliable 6-Minute Run in the BuildWise CT Server
Over the last 10 years, I have successfully rescued many failed test automation efforts in different software projects using the same technical formula. Some I was unable to rescue, due to human factors. In this article, I will share my thoughts and experience.
First of all, compared to a green-field project, rescuing an existing failed approach is not going to be easy. Several human factors come into play.
Table of Contents:
∘ Technically Competent
∘ Human factors on existing failed automation solution
∘ Talk Test Automation in the context of "Release Often" to upper management
∘ Talk Test Automation in the context of "Automated Regression Testing" to team members
∘ People tend to be fixated on a tool; often, they want to be that way
∘ Replacing with another "Tool"
∘ Specific Advice
Technically Competent
Testing is practical; automated testing is both practical and objective (clearly measurable, with no grey area). To rescue failed test automation (and be the hero), you must first be technically competent. I mean real competence here. Try to answer these questions:
For your last project,
What's the size of the E2E test suite?
How often do you run them?
Do they all pass?
How long does it take?
How often does your team push updates to production?
I can answer the above (except the last one) with one screenshot.
My answer to the last question: once I get a green build (passing all UI tests, like the one above), I push it to production immediately. Quite often, multiple times a day.
It is perfectly okay to want to rescue failed test automation as a manager or architect without a strong technical background. If that's the case, seek help from a real test automation coach. Assess candidates objectively with the above questions, followed by real hands-on exercises.
It is important to point out that the challenges of web test automation are the same for all projects. In the software industry, web technologies (standardized by the W3C) change little. The HTML and JavaScript I learned in 1996 are still here. My point is that a real test automation engineer, who grows through experience, can get results on the first day.
Human factors on existing failed automation solution
People are attached to their decisions, right or wrong. Unfortunately, in the context of test automation, the decisions are usually wrong. Let's see some examples.
1. A senior software engineer proudly created a 'new test automation framework'.
Every self-created automation framework that I am aware of failed badly. Check out the article: Please, Not Another Web Test Automation Framework, Just Use Raw Selenium WebDriver
Commonly, ‘the creators’ of the ‘new framework’ blame others rather than themselves. One time, I heard one say: “I designed the framework, but those testers were unable to keep the test scripts updated”. (His so-called new framework was just badly implemented syntax on top of Selenium; it actually made testing harder.) He did not understand that in test automation, maintenance is much harder than test creation.
2. Manager approved the spending on the wrong framework/tool
Many IT managers know little about test automation and continuous testing. These poor managers have no choice but to approve the spending on a proposed testing tool or solution. However, once the signature is on, most managers feel obliged to defend that decision.
The more money spent, the less likely the manager will acknowledge the mistake.
Once, my test automation solution was recommended to a senior manager. In a meeting, this manager said: “The CIO just wasted 3 million dollars on the current solution (excluding staff costs); the last thing he wants to hear now is another test automation proposal, even if yours is free Selenium”.
3. Tech lead said that was “impossible”
One principal software engineer once said to others: "Selenium WebDriver cannot test a one-page (dynamic) web app". Of course, this was wrong. During our first meeting, when he expressed this view, I said, "Selenium can, I can show you now", and was about to pull my laptop out of my backpack. He quickly said, "Not necessary", and left. As you can probably imagine, he later sabotaged test automation in our project.
4. Some had commercial interests in tool vendors
I can only speak from my own experience of possibly encountering this. Once, I moved back (from an offsite placement) to the head office of a government agency. My job title was test manager. Not long after, I received an external phone call. I was very surprised to learn the call was for me: a testing tool vendor inviting me to attend "their conference", with transport and accommodation all covered.
I said "No" and hung up the phone. I wondered: how did these people get my phone number? I could not help suspecting that some bad decisions might be related to this.
5. Selenium is free, too cheap
Managers usually prefer to pay big dollars for a testing tool. If it fails, they can say, “we picked Gartner’s recommended IBM RFT”. A free, open-source framework such as Selenium, even if it ticks all the boxes (including support), makes those managers uncomfortable.
Please check out this article: Crazy Web Test Automation: “Freedom Is Slavery”
Talk Test Automation in the context of "Release Often" to upper management
Any fool can create a simple automated test using a recorder, but this approach has been proven wrong. I commonly saw programmers or testers show off a 'cool' demo of a new test automation tool. Soon, it would be forgotten; back in daily testing (not just for testers, but the whole team), the team members still tested manually. The reason is simple: unmaintainable automated tests are useless.
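To make the maintainability point concrete, here is a minimal, hypothetical sketch (the page and element names are invented; the driver is duck-typed, so any Selenium-compatible object works): a hand-written page object keeps each locator in exactly one place, whereas a recorded script repeats raw locators in every test.

```python
# Minimal page-object sketch (hypothetical page/element names) showing why
# hand-written tests stay maintainable: each locator lives in exactly one
# place, so a UI change means a one-line fix.

class LoginPage:
    """Page object: the only code that knows the login page's locators."""

    USERNAME = ("id", "username")              # if the UI changes, update here only
    PASSWORD = ("id", "password")
    SIGN_IN = ("css selector", "input[type=submit]")

    def __init__(self, driver):
        self.driver = driver                   # anything with find_element(how, what)

    def sign_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SIGN_IN).click()

# A recorded script, by contrast, hard-codes raw locators in every test, e.g.
#   driver.find_element("xpath", "/html/body/div[3]/form/input[1]")
# so one UI change breaks dozens of scripts at once.
```

In a real project the `driver` would be a Selenium WebDriver instance; the pattern itself is tool-agnostic.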
An IT executive must have seen 'fake test automation' many times. To them, test automation is an excellent idea to have. Just like normal high-school kids, they want to hang around with the cool ones. In the software industry, the cool stuff is "Agile", "DevOps", "CI/CD", "Continuous Testing" and "Release Early, Release Often". The foundation of all of these is solid end-to-end test automation.
As I wrote in my first book (published in 2009), if you want to make a change, "seek executive sponsorship". There is no point in showing a demo of test automation, talking about how "Facebook releases twice a day", and hoping that alone leads to real test automation.
“CXOs overwhelmingly overestimate DevOps and Continuous Testing maturity” — Forrester Study: Continuous Testing Separates DevOps Leaders From Laggards
Help executives realize the real state of DevOps and Continuous Testing, with numbers. For example, based on my assessment of many projects against the AgileWay Continuous Testing Grading, >90% are at either Level 1 or Level 0.
Talk Test Automation in the context of "Automated Regression Testing" to team members
People generally don't like change, even changes that might make their lives a lot better, at least initially. Fortunately or unfortunately, test automation is something that will significantly change software development for good.
“It was Scott and his team of programmers who completely overhauled how LinkedIn develops and ships new updates to its website and apps, taking a system that required a full month to release new features and turning it into one that pushes out updates multiple times per day.”
“Newly-added code is subjected to an elaborate series of automated tests designed to weed out any bugs.”— The Software Revolution Behind LinkedIn’s Gushing Profits (2013), Wired
Please note that Wired used the term 'Software Revolution' to describe how LinkedIn changed with the introduction of test automation.
Start small, and let the team realize the benefits of automation gradually. One good angle is regression testing: testers don't like performing regression testing manually, yet everyone agrees on its importance.
So pick a set of regression tests (often companies have already defined acceptance tests) and work on them gradually. Check out this article: How to Implement Real Automated Regression Testing?
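One lightweight way to grow such a set gradually, sketched below with hypothetical test names (not from any real project), is to tag the agreed acceptance tests and select the regression subset by tag:

```python
# Hypothetical sketch: grow a regression suite gradually by tagging tests.
# Start by tagging only the agreed acceptance tests as "regression", then
# promote more tests into the set sprint by sprint.

TEST_CATALOG = {
    "test_login":          {"tags": {"regression", "smoke"}},
    "test_create_claim":   {"tags": {"regression"}},
    "test_export_report":  {"tags": {"slow"}},   # not yet promoted
    "test_password_reset": {"tags": set()},      # not yet automated well
}

def select(catalog, tag):
    """Return the names of tests carrying the given tag, in a stable order."""
    return sorted(name for name, meta in catalog.items() if tag in meta["tags"])

regression_suite = select(TEST_CATALOG, "regression")
```

Most test runners offer this natively (e.g. pytest markers with `-m`, or RSpec tags); the point is the gradual, tag-driven growth, not the mechanism.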
People tend to be fixated on a tool; often, they want to be that way
In UI test automation, history tells us of a nearly 100% failure rate for proprietary frameworks/tools such as IBM RFT, Microsoft Coded UI Test, and HP QTP. These products from software giants are gone now. The failures are not exclusive to expensive tools from big companies; free ones such as PhantomJS and Protractor failed too, as I predicted years ago. Among still-popular frameworks/tools, as I wrote in my articles, Cypress and Gherkin are doomed to fail, for a simple reason: every project (that I know of) that used either of them failed.
Many IT engineers are more comfortable with a branded tool.
More pay
For example, SAP testers get paid more, and SharePoint developers often command a better rate than C# programmers.
Sounds better, maybe
I once worked in a team with the job title "EJB developer" (by the way, EJB means Enterprise JavaBeans, a bad Java technology). People (including me, shamefully; I was young) did think "EJB developer" sounded better than "Java programmer". That's why Martin Fowler invented the term "POJO": Plain Old Java Object. 😊
Privileged
I once designed a workflow system (in raw Java) to replace a failed but expensive third-party workflow engine. Later, I was shocked to learn how much this government project had paid the consultants who worked on the failed product.
Back to test automation: if a person calls himself an IBM RFT tester, he faces a much smaller competition pool, limited to those who have used this US$10,000+ tool. With Selenium WebDriver, however, anyone might have used it, as it is completely free (and proven).
So, by switching to the far better Selenium, you will take "the privileges" away from some people. It is the right thing to do; however, be tactful to reduce the resistance.
Replacing with another "Tool"
I learned from "The Power of Habit: Why We Do What We Do in Life and Business", a New York Times bestseller, that you have a better chance of getting rid of a bad habit by replacing it first. The book used nail-biting as an example: when a person is about to bite a nail (or just notices the urge), replace the action with another, such as drawing a triangle on paper. This helps break the habit.
In almost every project, once the engineers started writing real automated tests (in raw Selenium WebDriver), they referred to the test scripts as "TestWise tests". Disclaimer: TestWise is a testing IDE that I created. I repeatedly corrected them: "There is no such thing as 'TestWise tests'; you are writing raw Selenium WebDriver tests. TestWise is just a tool. If you find a more productive one, you are free to switch to it". By the same token, they liked to call the continuous testing server "the BuildWise server". As this occurred often, it made me think.
To me, Selenium is associated with success. However, many engineers have bad memories they would rather forget. They might come up with excuses for past failures, but in their hearts they know those excuses are invalid. They want a way out of the embarrassment (as 'engineers') by blaming a product. Previously, they could say, "QTP sucks, RFT is a piece of s**t, Protractor is no good…". Selenium WebDriver, however, is much harder to blame, as it ticks all the boxes: standards-based, free, industry-leading, feature-complete, reliable, proven, etc.
A common excuse is "Selenium is hard to learn", which is totally wrong. Selenium is the easiest to learn, far easier than the others. My daughter started writing raw Selenium tests at the age of 12, with proper guidance and the right tool; so could your child. Seeing is believing; check out this article with video: Step by Step showing how to learn to write raw Selenium WebDriver test scripts in minutes
Anyway, those incompetent engineers are afraid that if test automation fails again with Selenium, the root cause might be exposed: themselves.
In Chinese history, there was a famous general, Yue Fei (岳飞). He was perfect in every way, yet he was sentenced to death on the trumped-up charge of "maybe" (莫须有).
(Elon Musk tweeted a Chinese poem yesterday; it seems a fashionable thing to do.)
You may let the team call raw Selenium tests "TestWise tests", "TestCool tests" or whatever matches your choice of testing tool. What matters is that the whole team is aware that these are 100% raw Selenium tests; they are not locked into a specific tool.
Specific Advice
1. Start with training
The first impression is important. During the training, you (or the external test automation coach) can introduce new concepts or practices to the team, such as:
instead of raising defects, try to replicate them in automated test scripts
the first priority for programmers is to fix regression failures (shown in the CT server)
daily automated regression testing
…
In a training setting, you are more likely to get the audience's attention. Comparatively, email, Confluence, or Microsoft Teams are poor channels for conveying these ideas, as people tend to ignore them.
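The first practice above, replicating defects as automated tests, can be sketched in a few lines. The pricing bug below is entirely hypothetical, invented for illustration:

```python
# Hypothetical example: a reported defect ("VIP discount applied twice")
# is first captured as an automated regression test, not just a ticket.

def price_after_discount(base, vip):
    # Fixed implementation; the original bug applied the 10% discount twice.
    return round(base * 0.9, 2) if vip else base

def test_vip_discount_applied_once():
    # This test reproduced the defect (it failed before the fix) and now
    # guards against the bug ever coming back.
    assert price_after_discount(100.0, vip=True) == 90.0
    assert price_after_discount(100.0, vip=False) == 100.0
```

The workflow is: write the failing test from the bug report, watch it fail, fix the code, and keep the test in the daily regression run.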
2. Borrow external expertise
“People always value opinions from foreign sources other than their fellows’ advice.“ — a Chinese proverb (“外来的和尚好念经”)
Most software teams do need a dedicated test automation and continuous testing coach to guide them. However, IT companies usually don't know such services exist, or don't know how to find a good coach (who are extremely rare).
Coaching is to guide and help as needed, and the purpose is to let the team grow. By my definition, a success criterion for a coach is how quickly the students no longer need them.
Here is my proposed coaching schedule. After the one-day training, the team starts writing real tests for work. Surely, they need a lot of help and guidance. Still, during the first week there is a gap day in between. Why? I want them to try hard in my absence; this way, when I answer their questions the next day, they will be keener.
1st month: 3 days a week
2nd month: 2 days a week
3rd month: 1 day a week
then: as needed
Commercially, this won't be good for my business. But, I am truly happier when the team can maintain a good test suite without me.
3. Get business analysts on your side
Business analysts use the application (via the UI) a lot, and test automation can help them greatly, in particular by creating test data for them with automated scripts. Check out this article: Benefits of Continuous Testing (Part 3: to Business Analysts)
It is easy to get support from business analysts because they don't need to change their core work style; rather, they gain greater convenience with little effort. For example, instead of spending 10 minutes creating a new insurance claim, they can grab one from the latest build on the CT server or ask the test automation engineers to generate one.
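As a sketch of that test-data idea (all names are hypothetical; a real factory would drive the app via Selenium or an API rather than just building a record), a small data factory lets a BA request a fresh insurance claim on demand:

```python
# Hypothetical sketch: a small data factory so BAs can request fresh test
# data (e.g. a new insurance claim) instead of clicking through the UI.
import itertools

_claim_ids = itertools.count(1000)   # each generated claim gets a unique id

def new_claim(claimant="Test User", amount=500.0, **overrides):
    claim = {
        "id": f"CLM-{next(_claim_ids)}",
        "claimant": claimant,
        "amount": amount,
        "status": "OPEN",
    }
    claim.update(overrides)          # BAs tweak only the fields they care about
    return claim
```

Sensible defaults plus keyword overrides mean a BA (or a test script) states only what matters for the scenario at hand.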
4. Sprint showcase with automation
Many fake agile projects have no showcases as the team is not capable of demonstrating the app. Test Automation is the solution.
Check out this article: Use UI Automation to Assist Agile Showcases.
5. Run automated tests in the CT server daily
Test automation cannot succeed without continuous testing. Due to the nature of E2E tests (relatively fragile, with long execution times), you cannot run a large number of UI tests in a testing tool (or from the command line) and expect them all to pass every time.
For example, my app WhenWise's regression (UI) test suite has over 24,000 test steps; to qualify as a good build, it must pass every step. That is only feasible when executions are managed in a CT server such as BuildWise or Facebook's Sandcastle. Forget traditional CI servers such as Jenkins or Bamboo; they were built for executing unit tests and are no good at all for UI tests.
Note: this is another common technical reason that test automation fails. A real test automation engineer needs to master continuous testing (not just CI) as well.
Please check out my book: Practical Continuous Testing, make Agile/DevOps real, or the articles:
6. Choose a testing tool that every team member can use; the same tool for the whole team
Test automation and CT, as the image above shows, are whole-team activities, for everyone. Every software team member can benefit from E2E automation, naturally, as it exercises the software product they are producing.
A common mistake is a manager or tech lead mandating a testing tool that only they are comfortable with. I learned this the hard way. I was known as a 10x Java programmer and proficient with IntelliJ IDEA (I still love it). However, when I asked my team to use the same toolset (my preference), while there was no objection at first, it did not work well at all. It was simply too complex for manual testers and BAs. That's the main reason I went on to design a brand-new dedicated UI testing tool myself: TestWise.
I am not trying to sell you TestWise (which, by the way, can be used for free forever). Rather, when you introduce test automation to a team, please consider team members' different technical backgrounds.