This is the final blog in the ‘Assisting with inquiries’ series, in which we have been exploring skills and techniques that can help us when reporting. The series runs alongside pages which look in detail at specific questions we can ask along the way, all in service of the important task of providing valuable, timely and reliable information to the people who need it.
This concluding piece is concerned with how we go about assessing just how useful our reporting was and how it could be improved. I’d encourage you also to visit the accompanying page entitled ‘After reporting, consider how you might improve’ – it has more detail on some of the questions we can ask after reporting.
The preceding posts in the series have discussed two important considerations before reporting and a skill required whilst reporting:
- The needs of your audience
- The mechanics of reporting
- Filtering information
Questions are powerful
If you have been following this series, it won’t be a surprise to you that I believe questions are powerful. Asking the right questions at the right time is an important skill in many things that we do. For a tester, learning the art of asking questions is crucial.
Whenever we work on something which is intended to meet the needs of someone else, we can ask questions afterwards to determine how successful we have been in meeting those needs. A very visible example of this has emerged over the last decade, as companies have placed greater emphasis on building a lasting relationship with their customers.
Those organisations have been eager to find out how well they are doing and they have been evaluating this by asking their customers a question:
‘How likely is it that you would recommend us to a friend or colleague?’
You may be familiar with NPS (Net Promoter Score) but even if you don’t know the initials or the name, you are almost certainly familiar with the question above, or some variation on it. Respondents answer on a scale of 0 to 10 and are categorised as Detractors (0-6), Passives (7-8), or Promoters (9-10) according to their response. (It is worth noting that NPS is not universally liked; the methodology has some detractors of its own). The overall ‘Net Promoter Score’ is then calculated by subtracting the percentage of Detractors from the percentage of Promoters.
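For the curious, the calculation itself is very simple. Here is a minimal sketch in Python, using the standard NPS bands (Detractors 0-6, Passives 7-8, Promoters 9-10); the function name and sample responses are my own illustration, not part of any particular survey tool:

```python
def net_promoter_score(responses):
    """Compute NPS from a list of 0-10 survey responses.

    Promoters (9-10) count for, Detractors (0-6) count against,
    and Passives (7-8) only affect the denominator.
    """
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100 * (promoters - detractors) / len(responses)

# Ten responses: 4 Promoters, 4 Passives, 2 Detractors
# (40% - 20%) gives an NPS of 20
print(net_promoter_score([10, 9, 9, 10, 8, 7, 7, 8, 5, 3]))
```

Note that the score ranges from -100 (everyone a Detractor) to +100 (everyone a Promoter), which is why interpreting it sensibly needs the follow-up question discussed below.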
Whilst the question seems simple, the responses to the question and the way they are interpreted are taken very seriously by many companies. By asking customers this question about its products or services, an organisation can build a picture of how loyal those customers are.
That picture provides some useful information, but the survey can become very powerful if the company asks another, open-ended, follow-up question. The follow-up can be as simple as asking ‘WHY?’ The intent is to generate a reasoned response. What was it about the product or service which led the customer to select the number they did from the scale?
It is the responses to this question which provide the most valuable information. If your customers are telling you what they like or don’t like about your product or service, you had better listen. It doesn’t matter how clever or innovative you think you are if your customers are unimpressed with what you are doing.
In the first blog in this series, ‘The needs of your audience’, you might remember that I likened reports to products or services, and the audience for those reports to customers. Now, I’m not suggesting that you distribute an NPS survey to your colleagues when you provide them with information about testing. However, it is probably sensible to talk to them about whether your report was useful to them, and to give them the opportunity to tell you why.
You may well get some idea from their reactions, and the conversations which result from the information you provide. However, if you aren’t clear on how well you are doing, or how well you are meeting your customers’ expectations, why not ask them?
This concludes the series. If you have any comments, or questions, I’d be really happy to hear from you. As I stated in the introduction to the series, I don’t claim my mind-map to be a comprehensive list of useful questions, so I’d welcome suggestions as to how it can be improved.
I’d also like to recommend some resources which I have found very useful:
- Communicating testing during software development – a mindmap by Thomas Ponnet
- Three tips to help testers ask better questions – by Katrina Clokie for Ministry of Testing
- Six tips for Software Testers on asking questions – Sticky Minds article by Thanh Huynh
Quick links to the posts and pages in this series:
Assisting with inquiries – blog posts
- Introduction to the series
- Part 1 – your audience
- Part 2 – the mechanics
- Part 3 – filtering information
- Part 4 – how was it for you?