Creating new products from nothing is a challenge. It requires planning and preparation, following step-by-step instructions, focusing creativity, staying goal-oriented, and often compromising between competing design considerations. In the case of iVolve software as the product, our developers wrap up this process with a functioning software build and a feeling of satisfaction. Until I come along!
I dig through the “finished product” and try to break as many functions as I can, using strategically designed tests to find out how it will fail. Then I get to point out the imperfections to the builders, who thought their job was done.
They chase down the root cause of the problem and fix the product, feeling satisfied again. Then we repeat the testing (trying to break it) process, so you can imagine I’m not very popular around the office!
But it’s all in a spirit of cooperation, of course. Finding issues in-house gives us a chance to fix them before customers encounter them. So wouldn’t it be nice to find all the bugs before the product goes out?
Before I joined the company last October, the team had developed some great test environments and scripting capabilities. While I’ve been getting familiar with the details of how each product works and the tie-ins between them, I’ve also been planning a framework to automate certain low-level test processes. The idea is to capture baseline test data, the expected (correct) results, against saved test datasets; these baselines become the comparators when the same data is later generated automatically. Each fix and new feature must also go through this process, be added to the test suite, and be incorporated into the automated regression testing for each release going forward.
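To make the baseline-and-compare idea concrete, here is a minimal sketch of how such a check might look. This is an illustration only, not our actual tooling: the file layout, function names, and the sample data are all hypothetical.

```python
# Minimal sketch of baseline-comparison regression testing.
# Assumptions: baselines are stored as JSON files in a local folder,
# and the "generated data" is a simple dict of summary values.
import json
from pathlib import Path

BASELINE_DIR = Path("test_baselines")  # hypothetical baseline location


def save_baseline(name: str, data: dict) -> None:
    """Capture a known-good result as the baseline for future regression runs."""
    BASELINE_DIR.mkdir(exist_ok=True)
    path = BASELINE_DIR / f"{name}.json"
    path.write_text(json.dumps(data, indent=2, sort_keys=True))


def compare_to_baseline(name: str, data: dict) -> list[str]:
    """Return a list of differences between freshly generated data and the saved baseline."""
    baseline = json.loads((BASELINE_DIR / f"{name}.json").read_text())
    diffs = []
    for key in sorted(set(baseline) | set(data)):
        if baseline.get(key) != data.get(key):
            diffs.append(f"{key}: expected {baseline.get(key)!r}, got {data.get(key)!r}")
    return diffs


if __name__ == "__main__":
    # First capture a known-good result, then compare a later run against it.
    save_baseline("cycle_summary", {"cycle_count": 42, "total_tonnes": 1280.5})

    new_result = {"cycle_count": 42, "total_tonnes": 1280.5}  # stand-in for a new build's output
    mismatches = compare_to_baseline("cycle_summary", new_result)
    assert not mismatches, "Regression detected:\n" + "\n".join(mismatches)
    print("Output matches baseline.")
```

In practice a check like this would run automatically for every saved test dataset, so each release gets the same regression sweep without anyone re-running the tests by hand.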
Creating new products is a challenge, and rigorous testing adds another layer of complexity, so why bother? Ultimately our joint efforts lead to a robust product that can withstand our harsh industrial environments.
To stay updated, follow Daniel on Twitter.