2 More Things Testers Forget When Testing A Website
This is part 2 of an article I wrote on the subject of "Things testers forget when testing a website". Head over here for part 1.
To recap, sometimes a tester forgets things. Sometimes there is just too much to do, so trade-offs have to be made. I call these "things we remember to forget". Sometimes it's a genuine oversight.
If you've read part 1 of this article, here's more to jog your memory. If you haven't read part 1, maybe check it out later.
4. Documentation
When you're in the zone, testing away like crazy, the last thing you want to do is write documentation. It's probably the same for developers. However, like eating your greens, it's highly recommended (and sometimes not so pleasant).
Documentation can take many forms, but I'll mention some here for starters:
Pre-flight checklists
A pilot runs through their system checks before leaving the gate. Do they have to remember it all off by heart? No, and neither should you be expected to. You're the test pilot of your project, responsible for steering it, intact, to a safe landing with its users. Checklists are just as important for you as they are for pilots.
Checklists are useful and can be referred to over and over, project after project. If you haven't got one, write one. It can contain whatever you like: things that are important to QA generally and/or to your company specifically, e.g. compliance. It will jog your memory, and you can iteratively add to it over time so that you and the team don't need to remember everything project after project, delivery after delivery.
It can be as high level or as detailed as you like. Think about things like functionality, performance, scalability, security, dependencies, SEO, copy QA, accessibility, cross-browser testing, backwards compatibility with older tools and systems, no-JS support, no-CSS support or whatever floats your boat. A checklist can also capture learnings from previous checklists and experiences, so you won't have to remember everything.
Checklists can be for go-live deployments, QA environment regression checking or critical features. No-one needs to sign off your checklist; that's not the point. It's there so that you don't have to remember everything, and to reduce the stress of feeling like you have to hold absolutely everything in your head all of the time.
You may also be able to delegate some of the tasks around the team. If it's written down, you can share it. If it's in your head, you can't. Checklists - try them sometime.
Preparing for a handover at any minute for any reason
If you are testing using stories or acceptance criteria, some of this is already done for you. Even better if it's in the "Given-When-Then" Gherkin format, as that is human-readable (with bonus points if it's written well, succinctly and avoids repetition). It's always worth reviewing it to make sure it is clear and up to date, with the added bonus that, if it isn't already, it can become the basis for automated testing.
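If you've not come across the format before, here's a minimal sketch of what a Given-When-Then scenario might look like. The feature, steps and wording below are invented for illustration rather than taken from any real project:

```gherkin
# Illustrative example only - the feature, steps and wording are invented
Feature: Account registration

  Scenario: Registering with a valid email address and password
    Given I am on the registration page
    When I enter the email "new.user@example.com"
    And I enter a valid password
    And I submit the registration form
    Then my account is created
    And I see a confirmation message
```

Because each step is plain language, anyone on the team can read it, and the same scenario can later be wired up to automation step by step.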
What if you were to be hit by a bus? It's not just you who would be in trouble, but the project. How would anyone know what the "gotchas" or risky/flakey bits of the site are? Who else knows the site's intricate details and how they are expected to work? Sure, there may be a zillion stories in JIRA, but all that knowledge is in your head, and you are the go-between for developers and the client, translating between "tech-speak" and plain language. Who better to put everything down in a meaningful format than QA?
Always be prepared for an emergency handover so that if something were to happen to you (heaven forbid), or you go on maternity/paternity/bereavement leave or whatever, there is something meaningful for the next person that is practical and understandable for them to pick up.
How long should documentation be?
Your documentation doesn't have to be an epic, The Legends of the Tester, to be told around a virtual campfire for the grandkids. I often write handovers (too late, admittedly, and usually only when handing over to someone who is taking over my project or role). They usually contain the following:
- The objective of the tool/site, i.e. what we or the client is trying to achieve
- Core journeys that everyone should be taking (can be bullet points)
- Where we have validation (but not all the rules; for those, refer to the test suite, TestRail, JIRA ticket XYZ, Google doc XYZ, etc.)
- What isn't tested yet
- What isn't testable and/or needs to be stubbed in automation, so that the suite can carry on to the next stage as if everything were working as expected
- Who the stakeholders are (internal and external)
- Any terminology/glossary. Companies love terminology that makes no sense to outsiders, especially TLAs (three letter acronyms)
- Links to credentials/sites/repos/rules documents/whatever you think is useful for someone to know or reference
- Dependencies, e.g. third-party APIs (including which endpoints are the most useful), and how reliable they are
- Descriptions of anything specifically tricky, flakey and/or unreliable, and who to contact for tech support if something requires intervention
- What the cross browser requirements are, including the browser/device priority groupings, if known
- A list of what is NOT in scope, e.g. other languages, browsers not previously listed, copy QA (if the client and/or translation agency provides the copy) and anything else you might get called out for
Where should I save my documentation?
It doesn't even all have to be in a single place. You can write the acceptance criteria in TestRail, a document or wherever. The notes about everything else can be in Confluence, in Word (but then consider who it is shared with, in case they leave too), on a network drive, in company cloud storage or wherever. So long as it's not all just in your head, it's better than nothing at all.
Ensure that a single master document references any others, as it will be the starting point that anyone new to the project should be able to pick up and follow. Reference the other documents/sites that contain the specifics such as acceptance criteria, credentials, etc.
How much detail should I go into?
Strangely, I enjoy writing documentation. It means that I don't have to remember everything long term, and I illustrate my text with marked-up images, as if writing a "Dummies Guide To …" book. I approach it assuming that the reader knows nothing about the subject and that I am training them from scratch.
I start with the aim of the project, then define any terminology. Then I go through the core concepts and journeys, then the lesser ones. If a user needs to use a particular tool or system, I show them where to find it, how to log in and its basic functionality, with illustrations so that the sequence and actions are clear. I once wrote a short manual this way for a whole team of new Sitecore CMS users who needed to add content. It seemed to work: 85 markets went live with it over time. I've done the same for other clients, so something must have worked.
5. Challenging the requirements
You are the last bastion against chaos. I have worked on a project where there were no written requirements. How do you know when you're "done", "good enough" or "meets the client's expectations" if nothing is defined? That was painful, as were the evenings and weekends spent working towards arbitrary delivery requirements that just never seemed to be satisfied.
Don't start working on something without asking what the requirements are, and without seeing them with your own eyes. I didn't, and it cost me my sanity; our team wasted many evenings and weekends of personal time that could have been avoided.
Let us assume that you have the requirements (on paper, in stories or whatever form they may take) and you think that they could be more tightly defined. Go for it! I don't recall ever having had a client that didn't like the precision and avoidance of ambiguity that my eagle-eyed input provided. Many times it led to conversations that tightened the requirements, and we ended up with a more robust product. That's what QA is all about - calling out risk and, where practical, reducing it. Maybe the fixes and enhancements didn't happen right away, but the enhanced requirements always got logged and actioned at some point, usually before go-live.
Of course, set the ticket priority to the best of your ability. Can it wait? Will it affect a lot of users? Do you think the product can go live with this defect/missing functionality or not? If not, fight for the ticket to be prioritised at the earliest opportunity by talking to whoever sets the priorities. You are the champion of the product. Just because someone didn't spot it before you doesn't mean it shouldn't get done.
Here's an example of a common user story: "As a user I can register for the site." Great. Or not so great. You could drive a coach and horses through a requirement like that. What about...
- What are you registering with - email or setting a username?
- What makes a valid email address structure?
- What error messaging do you see if you don't comply with the email structure?
- What makes a valid username? Minimum length? Are special characters allowed? Are only Latin characters allowed? Are numbers allowed?
- What error message do you get back if it does not comply?
- Does the cursor return to the first field containing the error when an error is returned on the form? (This would be an accessibility win too).
- What makes a password valid? Is there password structure error handling and messaging so the user knows what is expected? QA can bring previous good/bad experiences to conversations about this with the team.
- Can you navigate from the sign up screen to the login screen if you are on the wrong screen?
- What if the email or username is already registered? Do you want a user who has previously registered to be able to see that they have already signed up previously if they try to register again? Does the error message contain a link to the sign in screen?
- Do you want users to be able to sign up with single sign-on, e.g. Facebook, Google, etc.? If so, is it offered on the sign-up screen, and what happens if the user has previously signed up on your site using that method?
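To make that concrete, here's a rough sketch of how just a couple of those questions might be pinned down as testable Given-When-Then criteria. The specific rules, field behaviour and wording here are invented for illustration; the real values would come out of exactly the kind of conversation with the team and client described above:

```gherkin
# Illustrative example only - the validation rules and messages are invented
Feature: Registration validation

  Scenario: Rejecting a malformed email address
    Given I am on the registration page
    When I submit the form with the email "not-an-email"
    Then I see the error "Please enter a valid email address"
    And focus returns to the email field

  Scenario: Registering with an email address that is already in use
    Given an account already exists for "existing.user@example.com"
    When I submit the form with the email "existing.user@example.com"
    Then I see the error "An account with this email address already exists"
    And the error message links to the sign-in page
```

Each question in the list above could become one or more scenarios like these, and every scenario you pin down is a candidate for automation later.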
And that is just one user story. I am sure you could elaborate on just about any story, given the time, space and confidence from your employer. The result may be that the project takes longer, but you (and your client) wouldn't want to deliver a product built on ambiguous requirements and not fit for purpose, meaning days, weeks or months of fixes and, most likely, more evenings and weekends of work.
Summary
That's part 2 of my list of things that I often park or forget. There are likely to be many other topics that you may consider instead of these, and that's fine. Either way it's food for thought, and hopefully next time you'll remember (or actively decide to forget) what to test too!