WTEU-70 – Measurable Quality (LIVE!)

Date: Sunday 24th July 2016
Time: 3.30pm – 5.30pm BST
Facilitators: Amy Phillips and Neil Studd
Attendees: Trisha Agarwal, Purendra Agrawal, Abby Bangser, Jamie Fairlie, Michael Hudson, Amir Khan, Christian Legget, Vijaykumar Nadeshan, Thomas Ponnet, Ram, Adina Sit, Dolores Zurdo

It was the two-year anniversary of our taking charge of Weekend Testing Europe, so we wanted to do something special! For the first time, we presented LIVE via webcam, so that facilitators Amy & Neil had the chance to handle a real-time grilling. Thankfully there were relatively few technical mishaps: our primary webcam died just as the session was starting (that’s why we had a second one in reserve!) and, more frustratingly, our local copy of the recording failed after 20 minutes (so we don’t have the full video for posterity).

For this session, we wanted to take a closer look at quality: how we define it, and how we can hope to measure it. The application we picked to test was Songkick, for several reasons. Firstly, Amy works at Songkick, so was able to bring a wealth of product knowledge and insight. Secondly, Neil is a long-time user of Songkick, and was therefore offering a user’s insight into a product which might otherwise have been alien to attendees. Finally, our webinar was coming live from Songkick’s London office, so it felt appropriate!

We set people a simple exercise: Is this a quality website? www.songkick.com. Thomas made a very good point that he would usually start by interviewing a product owner (or equivalent) to find out more about what information they wanted to receive. Purendra requested clarification of the scope of the task, and Adina sought further information about the purpose of the product. Abby made an astute counterpoint that a pair of inexperienced eyes on the product can often reveal problems which might otherwise stay hidden, so there might still be value in testing without further guidance. These are exactly the sort of questions and approaches that we like to see!

During the half-hour exercise, a lot of interesting questions were raised, and bugs were uncovered:

  • We had an in-depth discussion about geolocation (how Songkick determines your area, based on a combination of IP, ISP information and number of events in your region).
  • There were some interesting issues uncovered when resizing the browser to mobile-sized widths.
  • Some site features weren’t clear to new users (e.g. the difference between “I’m going” and “Track event”).
  • Adina and Michael observed differences in the behaviour when connecting to different third-party services (Facebook and Last.fm).
  • Abby noticed that (by design) an abbreviated site logo is shown to logged-in users, as they’re assumed to be more familiar with the short name.
  • Jamie spotted that there are certain workflows where clicking the logo doesn’t return you to the homepage; Amy explained that these are in areas such as signup, where the desire is to prevent users from interrupting their current flow.
  • Jamie also found that 404s on secure (https) pages go to an unskinned error page, different to their http counterparts.
  • …and more besides!
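Findings like Jamie's (styled 404 pages over http, but a bare, unskinned error page over https) lend themselves to a quick scripted check. Below is a minimal sketch of one way a tester might flag whether an error page shares the site's normal chrome; the marker strings are illustrative assumptions, not Songkick's actual markup.

```python
# Heuristic sketch: decide whether an error page is "skinned" (i.e. uses
# the site's normal template) by counting template markers in the HTML.
# The markers below are hypothetical examples, not real Songkick markup.

TEMPLATE_MARKERS = ["<nav", "site-logo", "footer"]

def looks_skinned(html, markers=TEMPLATE_MARKERS):
    """Return True if the page contains enough template markers to
    suggest it shares the site's normal navigation and footer."""
    html = html.lower()
    hits = sum(1 for marker in markers if marker in html)
    return hits >= 2  # arbitrary threshold for this sketch

# Example: compare a styled 404 page against a bare server error page.
styled = '<html><nav class="site-logo">...</nav><footer>...</footer></html>'
bare = "<html><body><h1>404 Not Found</h1></body></html>"

print(looks_skinned(styled))  # True
print(looks_skinned(bare))    # False
```

In practice you would fetch the same broken URL over both http and https and compare the two results, but even this kind of crude marker count can turn a one-off observation into a repeatable check.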

After the exercise, we discussed whether each issue would be likely to affect a user’s opinion of the site’s quality. This led us deep into a discussion of what quality even means. Several people commented that quality involves delivering value to a customer, to the extent that they want to return for future business. We discussed some places (other than the website itself) where we might find information to drive quality measurement; Thomas said he would check haveibeenpwned.com for details of data breaches, and Neil suggested that a site such as currentlydown.com could provide an indicative measurement of site uptime. There was also some chat about business measurements of quality; for instance, code quality, or anything which impacted Songkick’s ability to be a leader in the use of continuous delivery (e.g. flaky pipelines, poor messaging, inability to roll back), would massively affect their vision of quality.

Here’s a video of the session introduction! Sadly, our local backup copy failed after this point, but it’ll give you a flavour of the session:

You can find the session transcript below. Because we hosted most of it on video, the chat is a bit quieter than normal, but hopefully the description above (and the other resources below) will make up for it!

Session Resources:

About the Author

Neil is a tester from the United Kingdom who has been testing desktop, mobile and web applications for the past ten years, working in a range of agile roles for organisations as varied as Oracle and Last.fm. In his spare time, he participates in freelance and beta testing projects, as a way of learning and developing new approaches to testing.