When you've finished making a change to your website, the work has only just begun. Depending on how complex your system is, you might now have to perform several layers of testing, from unit to integration to acceptance, before actually taking the new version live.
Automating your testing process takes you from "click... check... click... check... click..." to "click... done". Doesn't that sound nice? But it doesn't happen overnight. The first step towards fully automating your tests is finding out what you'd like to automate, and that's what we'll discuss here.
The Status Quo
Currently, your process might look like this:
- Open the webpage
- Check that everything looks right (or at least as right as you can remember)
- Click a few links and hope that, if there's an issue, it has affected at least one of them
- Close the browser
After this, you're pretty sure your change didn't break anything, but there's always doubt that maybe you missed something. And then there are the issues of "Am I checking the website the same way every time?" and "There has to be a faster, less tedious way of doing this".
Creating A Plan
And there is. The first step of your transition is to document everything you could test about your site - what are all the things that could break, and how could they break? It's OK if this list takes you a while to finish, and to be honest, it'll probably never be totally finished until your site is 100% perfect (by which time it'll of course be time to update it again).
Codifying Your Plan
Now that you know what to test, and what you're looking for, it's time to put it in a document, in order, so that it can be easily translated into a testing script. Your solidified plan might look something like this:
WonderShop Testing Plan
Last updated: August 1, 2019

TESTING THE HOMEPAGE
- load index.html in the latest version of Firefox
- check that all image assets have loaded correctly
- check that the title of the page is properly set
- check that when the site is loaded from the following countries, the currency displays appropriately:
  - Canada
  - Ireland
  - United States
  - Japan
- check that featured item suggestions have been loaded
- click on the first suggested item
- check that its images load correctly, and that its currency is properly set
...
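One way to keep a plan like this ready for automation is to encode each step as data, so the written checklist and the eventual test runner never drift apart. Here's a minimal sketch of that idea; the step names and structure are illustrative, not a required format:

```python
# The homepage section of the plan, expressed as plain data. A real
# runner would attach an action to each entry; here we just render
# the plan as a numbered checklist for a test report.
HOMEPAGE_PLAN = [
    {"step": "load index.html in the latest Firefox"},
    {"step": "check that all image assets have loaded correctly"},
    {"step": "check that the page title is properly set"},
    {"step": "check currency display for CA, IE, US, JP"},
]

def describe(plan):
    """Render the plan as a numbered checklist, one line per step."""
    return [f"{i}. {item['step']}" for i, item in enumerate(plan, start=1)]

for line in describe(HOMEPAGE_PLAN):
    print(line)
```

The payoff comes later: when a step changes in the document, it changes in exactly one place in the code too.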
And it will obviously go on like this at some length. Why go through all this hassle? Two main benefits: Consistency and Speed.
As we said above, if your testing strategy consists of "open a browser, check a few things on a few pages, pray that's enough", then you are headed for disaster one day. How can any human be expected to do things the exact same way, completing a list that might be hundreds of items long, every time they make a change? They can't be. But computers never forget, and computers never get bored.
Another hindrance to testing is that most of the time, you've done your job correctly, nothing's broken, and it feels like you've just wasted half an hour combing over a not-broken site looking for errors. An easy way to combat that sense of fatigue is automating things - start the script, go get a cup of coffee, and by the time you're back, a fresh report will be waiting for you.
Once you have things down in a tangible plan, assumptions stick out like sore thumbs (that's the point of the process, after all). For instance, what does "loaded correctly" mean for your site? The answer will determine how that instruction is implemented as an actual test, be it checking response codes, analyzing file contents, or comparing screenshots to a baseline.
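As a sketch of what pinning down that definition might look like, suppose you decide "loaded correctly" for images means "every image request returns HTTP 200 with a nonzero body". The fetch function is injected (in practice it might wrap urllib or your HTTP client of choice), so the rule itself stays testable without a live site; all names here are illustrative:

```python
# "Loaded correctly" as a concrete rule: each image must come back
# with status 200 and at least one byte of content.
def images_loaded_correctly(image_urls, fetch):
    """fetch(url) -> (status_code, body_bytes). Returns the URLs that failed."""
    failures = []
    for url in image_urls:
        status, body = fetch(url)
        if status != 200 or len(body) == 0:
            failures.append(url)
    return failures

# A fake fetcher standing in for real HTTP: /logo.png is fine, /hero.png 404s.
fake_site = {"/logo.png": (200, b"\x89PNG..."), "/hero.png": (404, b"")}
print(images_loaded_correctly(fake_site, lambda url: fake_site[url]))  # ['/hero.png']
```

If your definition later tightens (say, "and decodes as a valid PNG"), only this one function changes.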
Bringing In WonderProxy
Not to toot our own horn here, but we think we have some pretty great tools that could make your life a lot easier when you start global testing. For instance, in the site above, an easy way to make sure your site converts currency correctly is to use our localization service - 87 different countries, from Osaka to Belfast, ready to show you what you look like around the globe.
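A concrete shape for that per-country currency check might look like the sketch below: an expected symbol for each locale from the plan, compared against whatever the page actually rendered. Fetching the page through an in-country exit is abstracted behind `render(country)`, and the symbols are illustrative:

```python
# Expected currency symbol per country code, matching the plan's
# country list (Canada, Ireland, United States, Japan).
EXPECTED_CURRENCY = {"CA": "$", "IE": "€", "US": "$", "JP": "¥"}

def currency_mismatches(render):
    """render(country) -> symbol shown on the page.
    Returns {country: (got, want)} for every country that rendered wrong."""
    report = {}
    for country, want in EXPECTED_CURRENCY.items():
        got = render(country)
        if got != want:
            report[country] = (got, want)
    return report

# Example: a site that mistakenly shows dollars to every visitor.
print(currency_mismatches(lambda country: "$"))
```

Run against a correct site, the report comes back empty; run against the broken one above, it names exactly which countries saw the wrong symbol.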
Once you've completed your plan, send it around for review. Chances are, someone will point out a step that doesn't make sense in the order it's presented (maybe certain products aren't available in certain countries) or they'll have suggestions for things to add (like that new line of baby diapers that just got added to the site last week). With many eyes, all bugs are small, so get as many people as possible to help you out.
Now that your plan is complete, it's time to put it into action. Here are some great next steps:
- Choose whether you want to use test-driven development or behaviour-driven development.
- Run your tests at the click of a mouse with Travis CI
- Make it easy for users to submit a bug report with easy bug report templates.
Best of luck!