How to Write a Quality Bug Report

I worked with Aaron Weintrob to put together a uTu course on how to write quality bug reports. The intended audience of this course is new uTesters with little or no testing experience. We kept it relatively short and simply highlighted the key areas. We may create additional courses for various sections where we can dig in a little deeper. Here is a link to the course. Below is the course’s content:

INTRODUCTION

Writing a good bug report is one of the most talked-about topics in the testing world. The art of creating a well-written bug report requires a balanced combination of testing and communication skills. This course provides advice and tips geared towards helping you create bug reports that are informative and actionable, thus improving their value to the customer.

WHAT MAKES A GOOD BUG REPORT?

Most testers understand that the role of a bug report is to provide information; however, a “good” or valuable bug report takes that a step further and provides useful information in an efficient way.

To help us get started writing valuable bug reports, we are going to focus on a few key areas:

  • The Title
  • Actions Performed (Steps)
  • Expected and Actual Results
  • Attachments

THE TITLE – THE GOOD, THE BAD, THE UGLY!

The title is the face of your report. It’s the first thing anyone sees, and its importance cannot be overstated. A good title helps reduce duplicate issues and can quickly convey a summary of the bug.

It’s best to avoid generic phrasing in the title. For example, titles like these should never be used:

  • XYZ is not working properly
  • Issue with XYZ
  • XYZ is corrupted/does not look right

The above example titles add little value in describing the problem. By nature, every report is describing something that is not working as it should. Be specific about what makes it “not working.”

Instead of: Sorting is not working properly.
Try: Sorting is happening in reverse order.

Instead of: Issues with GUI on navigation bar.
Try: Navigation bar is wrapping to a second line.

Oftentimes, bugs are migrated into the developer’s database, which may contain hundreds, if not thousands, of other issues. Imagine trying to search this database for “navigation bar”. That search will return every issue related to the navigation bar. Searching for “wrapping to second line” is much more specific, making it easier to find the bug. Your bug report needs to survive (and be useful) beyond the current test cycle; a strong title will help it through its journey.

ACTIONS PERFORMED – ADVICE FOR EXPLAINING YOUR STEPS

This is the body of your report. The goal of this section is to show the reader how to reproduce the bug. Since this area usually contains the most information, it’s important to keep it concise and easy to read. Always number your steps and keep them short and to the point.

Tip: Using a prerequisite can reduce the number of steps.
Instead of listing out every step to log in, start your steps with: “Prerequisite: User is logged in.”

Tip: Find the direct path to the bug
Oftentimes, testers will stop at the point where they found a bug and log their last few actions. However, the most helpful bug reports are those that are distilled down to the core reproduction steps.

It’s a good exercise to reproduce the bug by following the steps you’ve just outlined. This will help ensure you’ve included everything the customer will need to reproduce it as well.

Sometimes digging a little deeper below the surface of the bug can add additional value. Here are some examples of how adding a bit more effort or thought will produce a higher quality report.

Example 1: Provide additional useful information
Scenario: You find that a video does not play.
Good: Mention whether it happened with all videos, not just the one mentioned in the report.
Better: Specify if the issue is reproducible on more than one browser or device.
Best: Upload a speed test showing that bandwidth was adequate at the time of testing.
Lesson: Try to identify and answer follow-up questions before the customer asks them.

Example 2: Report the bug, not a symptom of the bug
Scenario: We are testing an Address input field. We find that the Address field allows “1234567890” and it also allows “!@#$%^&*()_+”.
Lesson: These are two different symptoms of the same bug. Closer inspection would reveal that the real issue is that the Address field isn’t being validated at all. The problem may be more serious than the first symptom you find.
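To make that lesson a bit more concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the validate_address function and the probe inputs are invented for this illustration, not taken from any real application); the point is simply that several “different” symptoms can be probed together and traced back to one missing validation check.

```python
# Hypothetical illustration: validate_address and PROBES are invented for this
# example. Every "symptom" fails for the same underlying reason: the field is
# not validated at all.

def validate_address(value: str) -> bool:
    """Stand-in for the application's address validation.
    In the buggy build this check is effectively missing."""
    return True  # bug: any input is accepted

# One probe per symptom a tester might otherwise report as a separate bug.
PROBES = {
    "digits only": "1234567890",
    "symbols only": "!@#$%^&*()_+",
    "empty string": "",
    "extremely long input": "A" * 10_000,
}

if __name__ == "__main__":
    accepted = [name for name, value in PROBES.items() if validate_address(value)]
    if len(accepted) == len(PROBES):
        # Every invalid input is accepted, so the bug to report is
        # "Address field is not validated", not each symptom individually.
        print("Root cause: the Address field accepts any input:", ", ".join(accepted))
```

A quick probe like this is no substitute for the report itself, but it shows why one well-described root cause is worth more than four separate symptom reports.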

EXPECTED AND ACTUAL RESULTS – WOULDA, COULDA, SHOULDA

Now that you have described how to reproduce the bug, you need to explain the problem and the desired behavior.

Tip: When describing expected results, explain what should happen, not what shouldn’t happen.
Instead of: The app shouldn’t crash.
Try: The user is taken to XYZ screen.

Tip: When describing actual results, describe what did happen, not what didn’t happen.
Instead of: The user wasn’t taken to the XYZ screen.
Try: The user remained on the ABC screen.

ATTACHMENTS – WHAT TO DO AND WHAT NOT TO DO

Attachments add to the bug’s value by offering proof of the bug’s existence, enabling the customer to reproduce it or helping the developer fix it. Each attachment should add to the value of the bug in at least one of these three ways.

The following are some tips and guidelines to keep in mind when adding attachments:

IMAGES

  • Adding images is a quick way to add context to your bug. Consider adding an image even if you also have a video.
  • Highlight the area(s) of interest in your image.
  • Attach the image files directly to the report. Don’t put images in a Word document or a zip file.
  • Use images to illustrate static issues.

VIDEOS

  • Video confirms your steps were accurate at the time the issue was created. For example, a screen grab of an error message isn’t as useful as seeing what went into creating that error message.
  • Actions in the video should match the steps listed in the bug report.
  • Videos should be trimmed to only show the bug.
  • Provide video if the steps are complex.
  • External/live videos can be more impactful than mirrored videos because they show hand gestures or a finger actually touching a button on the screen.

LOG FILES AND OTHER TIPS

Avoid proprietary file types (like .docx). Use .txt instead.
Avoid compressed (.zip) files unless specifically asked for or approved by the TTL, PM, or customer.

UTEST ETIQUETTE – GENERAL OVERVIEW OF PROPER BEHAVIOR

It is important to remember that you are representing the TTL, the test team, and all of uTest when you work on the test cycle. Your fellow testers rely on you to write a good title for your bug report so they won’t file a duplicate bug. TTLs depend on clean, good reports to ensure the customer receives value from the cycle. uTest needs quality work from everyone so we can continue to work in the field we all love.

ADDITIONAL READING

Here are some valuable discussions about bug reports from the uTest Forums:

Two contributions to the uTest University

Back in December of 2013, uTest officially launched the uTest University (blog post), which is intended to be a single source for testers of all experience levels to access free training resources. This is a neat opportunity for testers to contribute to the growth and development of the testing community by creating courses and writing articles. The university also gives each course author an Author Page.

My first course was derived from a uTest forum post I wrote back in June of 2013. I was on a cycle where the customer required that logs from Charles Web Debugging Proxy be attached to every bug report, but none of the testers (myself included) knew what that was or how to use it. I spent some time learning how to use the tool and then put together a tutorial to share with the rest of the team. Fast forward eight months, and several other customers started requiring the same thing. To make the information a bit easier to find, the tutorial was turned into a uTu (uTest University) course:
How to Set Up Charles Web Debugging Proxy for iOS Devices and Windows 8

My second course came at the request of the uTest Community Management team. They needed a tutorial for new testers to show them how to create videos (screencasts) of their bugs. They specifically wanted it based around the free tool Screencast-O-Matic. I had never actually used that tool before, so I spent some time getting familiar with it. I also compiled a list of suggestions and tips based on things I frequently see in other testers’ videos. The result is:
How to Set Up and Use Screencast-O-Matic

 

My First Testing Experience – Part 2

Continued from Part 1

I can vividly remember lying on my couch in my living room staring at the ceiling, my stomach in knots. I felt this huge weight on top of me making it hard to breathe. I was pretty close to panicking. “How can I possibly do this? I have no idea what I’m doing. Why did I agree to this project? Do they have any idea what they’re asking? This is so unfair. This is CRAZY!” These were the thoughts racing through my mind. Before I could think one through, another would jump on the pile. That feeling of hopelessness, of being completely overwhelmed, was my first and worst moment as a software tester.

I’d like to think the PM understood the magnitude of the project, that somehow she saw this amazing potential inside me just waiting for an opportunity to prove itself. But I knew she didn’t really understand what she was asking. In her mind this was a small task that any mid-level developer should be able to do. After all, it’s just testing, how hard can that be? That just added to the pressure. I was dealing with unrealistic expectations about an undefined project that required an under-appreciated amount of skill and effort.

After breathing into a paper bag for 10 minutes or so, I calmed down. I reminded myself the best way to climb a mountain is one step at a time. I started by making a spreadsheet documenting all the different migration tools we had built. I identified the base data type, the destination data type and a description of the transformation needed. One by one I went down the list, talking to the developers, searching for any documentation, looking at the old system, learning about the new one, basically doing the work a BA would have done.

Once I had a solid understanding of the specifications I was working with, I started testing. I began with the simplest cases and the ones I was most familiar with. You are probably thinking I should have done the opposite and started with the hardest, most complicated, and riskiest transformations. If I were doing it over today, that is the approach I would take, but considering I was a complete rookie tester, I needed to start with a small, achievable task to build some confidence and establish my testing strategy.

My strategy was pretty simple. I gathered sample source data, ran it through the migration, and verified the results on the other side. Then I started adjusting the source data, trying every scenario I could think of, looking for ways to make the tools fail. As I worked through my list, reporting bug after bug, it became clearer to everyone just how large this task was. Expectations slowly became more aligned with reality and the measurement for success became more reasonable.
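For anyone curious what that kind of verification can look like, here is a rough sketch in Python. The record layout and the checks are made up for illustration (the actual project used its own data and internal tooling); it simply shows the pattern of comparing each source record to its migrated counterpart and flagging anything that was dropped or changed.

```python
# Illustrative sketch only: the Record layout and the mismatch rules are
# invented for this example, not taken from the actual migration project.

from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    record_id: int
    title: str
    body: str

def verify_migration(source, migrated):
    """Compare each source record to its migrated counterpart and
    return a list of human-readable discrepancies."""
    problems = []
    for src in source:
        dest = migrated.get(src.record_id)
        if dest is None:
            problems.append(f"record {src.record_id}: missing after migration")
        elif (src.title, src.body) != (dest.title, dest.body):
            problems.append(f"record {src.record_id}: content changed during migration")
    return problems

if __name__ == "__main__":
    source = [Record(1, "Welcome", "Hello world"), Record(2, "About", "Our story")]
    migrated = {1: Record(1, "Welcome", "Hello world")}  # record 2 was dropped
    for problem in verify_migration(source, migrated):
        print(problem)
```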

After a few months of work, we were ready to start migrating content. It wasn’t perfect and several bugs made it through, but they were minor and due to junk source data that we didn’t care about. The migration ran for months as different sites came online. The migration tools did their job and by all accounts, the testing aspect of the project was successful.

This experience gave me an unexpected appreciation for what testing is and how valuable it can be. It gave me confidence in myself that I could accomplish a task that I really had no business even attempting. It’s an experience that I can now look back on and be proud of.

Webinar – Should testers report every bug they find?

In December of 2012, Ryan Lamontagne and I got into a good discussion on the uTest forums about whether testers should report every bug they find. We decided to kick it up a notch and debate it live in a webinar!

http://forums.utest.com/viewtopic.php?f=13&t=4430

Webinar – Finding bugs in mobile devices

I was able to join Kayla Cox and Todd Smith for a uTest webinar to talk about testing mobile devices and how to find high-value bugs.

Since my microphone was terrible (and I might have been mumbling a little), here is a summary of the points I made in our discussion.

Crashes
Understand that not all crashes are valuable.
An out-of-memory crash may be due to other apps using up 90% of your memory, with the app you are testing just pushing you over the limit. The best way to know for sure is to have a clean test bed: restart your phone after you install a new app, and make sure no other apps are running in the background.

When you do get a memory-related crash, use a memory management app to help you see where your memory usage spikes. A reproducible memory crash is usually a high-value bug.

Connection Issues

  • Kill your connection while data is being transferred
  • Unplug your wi-fi router/modem
  • Turn on airplane mode
  • Turn off wi-fi on your device
  • Turn off cellular data on your device
  • Find places near you that have low or no signal and test there

Interaction with native and popular apps

  • Share something via email with no email set up
  • Log in using Facebook account with/without the Facebook app installed
  • Interrupt testing with phone calls, text messages, FaceTime calls, etc.
  • If the app changes the phone’s settings, make sure it does so correctly. Change the setting back manually and see how the app responds

Investigation and Documentation
There are many topics on how to write good bug reports, but a few points are worth reiterating:

  • Provide exact reproduction steps
  • Do root cause analysis – don’t report symptoms. I once saw three testers report three different symptoms of the same bug. On the surface they all looked like different bugs, but a little analysis showed they were all caused by the same step each tester had overlooked.

Some Thoughts on Functional Testing

I wrote this as a Guest Blogger for the uTest blog.

There is an age-old expression that says “You only have one chance to make a first impression.” This is a hard truth in today’s world of instant gratification. If your product fails to deliver the first time, your customers will simply move on to the next thing. In-the-wild functional testing, as provided at uTest, is similar to a dress rehearsal for your application. Your application is exposed to a group of people who accurately represent your potential user base. They can identify and report the issues (that would have negatively impacted your customer’s first impression) before your customer ever has the chance to see them.

A functional tester has the ability to evaluate individual features of an application. They are familiar with typical application behavior and have the skills needed to look objectively at a feature and see what’s wrong.

Perhaps even more valuable is a functional tester who is able to analyze individual pieces of an application within the context of the entire application. A functional tester looks at a particular item, identifies integration points between that item and other parts of the application, and then formulates a plan of how to inspect those touch points. Applications are usually weakest in places where different parts come together. A strong functional tester knows this and knows how to exploit those weaknesses to identify any lurking bugs.

Functional testing will only be successful if an organization’s underlying quality fundamentals are solid and everyone clearly understands how testing helps achieve the goals of the business. Functional testing is only one of many activities that collectively comprise a comprehensive testing strategy. Depending on the needs and expectations of your company, different testing activities such as performance, load, and security testing should be considered. Functional testing differs from other types of testing in that it most closely reflects the experience of the users. Performance affects the experience and security issues add risk to it, but how the application functions IS the experience.

5 Ways to Improve Your Bug Titles

I originally posted on the uTest forum here.

Bug titles are one of the most important pieces of your bug report. They are the face of your bug: they convey its value and can help or hurt the overall efficiency of the test cycle. Far too often, testers don’t give their bug titles the attention they deserve. This post will try to change that. Here are five tips to help you improve the titles of your bug reports.

Consider Your Audience

Like the bug report itself, the title is intended to convey information. The main difference is the title is more concise. A well written title will quickly and clearly summarize the bug and its value.

To communicate this information effectively, you need to consider your audience. Bug titles are read by different audiences who may use the title for different reasons. Testers have the difficult job of writing a title that satisfies the needs of two different audiences at the same time: the customer and your fellow testers.

Customer
When the customer or Test Team Lead (TTL) reviews the bug list, one of the first things they do is look at the title. As we talked about in Reporting High-Value Bugs – Part 2, part of reporting a high-value bug is “selling” it to the customer. The title of your bug is part of your sales pitch. Always keep the title short and to the point. You want to focus on the end result, not the actions. For example:

Use “User profile – Unable to link to Facebook” instead of “Clicking the ‘Link to Facebook’ button doesn’t do anything”.

Also use action words that convey importance, such as ‘prevented’, ‘does not’, ‘inconsistent’, and ‘unexpected’.

Fellow Testers
Your fellow testers use the title of your bug in a very different way. They use it to determine if the bug they found has already been reported. To help them, you need to include the key words they will be searching for.

Hopefully, before you report your bug, you search the bug list to see if it has already been reported. Make a note of what you searched for, because those are the words you should consider including in your title.

In Reporting High-Value Bugs – Part 2 we also talked about reporting the root cause of the bug. The same is true for the title. Your title should describe the underlying problem, not one of its many possible symptoms.

Follow the uTest Standard

uTest has a crash course dedicated to Bug Title standardization so I’m going to point you there first: http://help.utest.com/testers/crash-courses/general/bug-title-standardization

To summarize that post, every bug title should be broken into two distinct parts: the “Area” and the “Description”. The area is the place in the application where the bug occurs. The description is a brief summary of the bug. These two parts should be separated by a hyphen.

For example, in this bug title:
Homepage – The ‘Contact Us’ button is linking to the incorrect page
“Homepage” is the area and “The ‘Contact Us’ button is linking to the incorrect page” is the description.

This can get a little tricky when the area is deep in the application. If there were a bug in the uTest platform on the Payments screen in the Account & Settings section, how should we identify that area?

In the link above, one of the authors suggests you write it like this:
Account & Settings – Payments – Total payout amount is incorrect

Personally, I don’t like this suggestion. Testers who do this tend to put the navigation steps in the bug title, and that is not the place for that information. Plus, having more than two sections makes the title difficult to read.

I prefer to list only the broad area of the application and include the more specific area in the description. Here is how I would write this title:
Account & Settings – The total payout amount on the Payments page is incorrect

Do Not Specify the Test Environment

Many testers include the device or environment they use to test in the title of their bugs:
[iPhone 5] User profile – Unable to link to Facebook

The landscape that we test against these days is so large that it’s no wonder that this has become more common recently. Testers feel that the device they found the bug on is an important piece of information. While that is true, the title of the bug is not the right place for it.

The main reason this is a bad practice is that it gives a false impression about the scope of the bug. Generally, when testers start their bug title with the environment, they are simply stating the device they found the bug on. But the customer may interpret that to mean the bug is only present on the device listed in the title.

Unless you have tested against every other possible device/environment, don’t include this information in the title. It adds little value and can actually cause problems.

As with most rules, there are exceptions. Here are two:

Explicitly required
If the cycle specifically tells you to include environment information in your bug titles, you should follow the instructions.

For example, this is directly from a test cycle I was recently on:

NOTE – If you find an iPad bug: Please add [iPad – iOS xx] at the beginning of you bug title.

In this situation it is perfectly fine (and even required) that you include the environment in your title.

However, you may see something like this in the instructions:

BUG TEMPLATE: Please include the following info in all your bugs:
Mobile device model and OS version
Description of bug
Wi-Fi or 3G / 4G?

This does not mean that all this information should be in the title. It simply means that it should be specified in the body of the bug. Generally you should put this information in the ‘Specified Environments’ or ‘Additional Environment info’ fields. It is the “Bug” template, not the “Bug Title” template.

Environment specific bugs are allowed
Occasionally a cycle will allow the same bug to be reported for different environments. In this case, each one of these bugs is considered different by the customer. Since the only difference between the bugs is the environment, it is necessary to include the environment in the title. Otherwise you would have multiple bugs with the exact same title and your fellow testers would have to look at the contents of the bug to see which environments had already been reported.

Keep Consistent with Earlier Bugs

Sometimes a cycle will ask you to include some extra piece of information in the title. One example of this would be the build of the application that you tested. What I usually see happen in these situations is every tester comes up with their own way of including this information. The result is a messy bug list that looks something like this:

[b 123] Area – Description
build 123 => Area – Description
Area – Description {build 123 v.2.045.34}
123 Area – Description

This is difficult for the customer and TTL to read and makes it impossible for them to quickly scan through the list.

Assuming that the earlier bugs followed the uTest standard and everything we addressed above, you should follow the pattern established in the first few bugs. Don’t worry about being original or sticking to your own personal preference; the goal is consistency. This will make the customer’s and TTL’s jobs much easier. See how much easier this is to read?

[b 123] Area – Description
[b 123] Area – Description
[b 123] Area – Description
[b 123] Area – Description

Learn From Other Testers

You can learn quite a lot from reviewing the bug reports of your fellow testers. You can see different styles of reporting the reproduction steps, come up with new ideas of how to test, and see which types of devices are the most common.

The same can be said for the bug’s title. When you are reviewing bugs, don’t just skip over the title. Instead, take advantage of the opportunity to learn from the mistakes and successes of others.

First, evaluate the title of the bug on its own:
Does the title follow the standard? Does it include appropriate key words?
Then look at it in the context of the entire report:
Does the title accurately and efficiently summarize the bug? Does it “sell” the importance of the bug?

As you pay more attention to your own bug titles as well as the titles of other bugs, you will start to see the patterns we have just talked about. It will become apparent that the testers who do these things are the ones who stand out from the crowd. Bug titles are extremely important and should be treated that way. Keep these tips in mind and you will be one step closer to writing the perfect bug report.

I’d love to hear your thoughts on this topic. What other tips can you give your fellow testers?

Is uTest a Scam?

There are some reviews out there from testers claiming uTest is a scam and that you can’t make any money. I’ve also seen a few uTest customers complain about the quality of the testing and uTest’s sales/negotiating practices. These concerns are valid, and I can understand why some people have the impression that uTest is not what it claims to be. In order to address these perceptions, we need to look at them from two points of view: the view of the customer and the view of the tester.

The uTest customer

Complaint #1: The quality of the testing was lower than expected

I have no problem saying that there are a lot of bad testers at uTest. It’s true; there is no point in denying it. These testers only report low-value bugs, they don’t follow instructions, and they generally disrupt the test cycle. Sometimes it is because they’re inexperienced testers; sometimes they’re just plain bad. Unfortunately, this is one of the downsides of crowdsourcing. To uTest’s credit, they do realize this and are continuously working to improve the skills and abilities of uTesters. They also identify and remove problem testers.

On the other hand there are some absolutely awesome testers at uTest. These top testers consistently provide the customer with excellent service and high-value bugs. uTest does a pretty good job of identifying the strong testers and ensuring that they are the ones working on your projects. Keep in mind that there are literally hundreds of projects, dozens of Project Managers and thousands of testers from all over the world, so every test cycle is going to be different.

So what’s a customer to do? First you need to manage your expectations. Understand the limitations and benefits of uTest and make sure they align with your testing needs. Second, you need to be involved. Yes, uTest is a service, but the quality and success of your project is a direct result of your participation and influence.

Here is an excellent article from Elena Houser on how customers can get the most from their uTest (or any crowdsourced) testing service. It is a MUST read for any potential uTest customer: http://trancecyberiantester.blogspot.com/2012/10/crowdsourced-testing-lessons-learned.html

Complaint #2: The uTest sales process is shady

In full disclosure, I’m not a uTest customer, so I’ve never gone through this process myself. However, I am a uTest TTL (Test Team Lead), so I have worked with many different customers. I’ve seen customers who are extremely satisfied and those who constantly complain. It doesn’t take long before you start to see a pattern.

I really could just do a copy/paste from above. Again, this comes down to having correct expectations and being involved. Customers who follow Elena’s advice will find that uTest’s services are well worth the money and effort. Those who don’t will be disappointed with their results.

Another point worth mentioning is that uTest is a start-up company. They have only been around a few years, yet they are growing and changing incredibly fast. In just the last year I’ve seen some impressive improvements. I have no doubt that as the company matures and customer needs and expectations are better understood, the sales experience will improve and mature as well.

The uTest tester

Complaint #1: uTest is a scam

As I mentioned above, uTest is a start-up. The company grew faster than many people expected and as a result it went through some obvious growing pains. Everything about the company was (and still is) evolving. The payment process was still being worked out, bug reporting and evaluation was confusing, and in general the tester experience gave some the impression that uTest was either a scam or just unprofessional.

Admittedly, the first tester interface was terrible. It was slow, buggy, and difficult to use, which is quite ironic for a software testing company. This problem has now been addressed: uTest recently launched their new tester platform and it is so much better (read more here). There are now several reliable ways for testers to receive payment, and there is an entire team of employees solely dedicated to the welfare of the testers. These are just a few examples of how uTest is working to improve its image and show testers that uTest is a legitimate company and a great place to work.

Complaint #2: You can’t make any money

I recently read a review from a uTester who said he had reported 87 bugs but was paid for only 16 of them. The other 71 bugs were rejected. He felt that bugs were intentionally rejected in order to avoid paying the testers. He’s not the only one to complain that testers are not adequately compensated for their efforts. Fortunately, these complaints come down to a few misconceptions.

Testers need to understand the uTest bug payment model. Customers pay uTest a set price for an agreed upon amount of work. It is up to the customer to accept or reject the bugs reported by the testers. uTest then pays the testers for the bugs (and other work) the customer accepted. Since the customer pays a flat fee no matter how many bugs are reported or accepted, they have no financial incentive to reject individual bugs. Bugs are rejected for valid reasons, not to avoid paying for them.

The other important point is that testers are not paid for their effort (there are some exceptions); they are paid primarily for the value they provide. Testers who provide the customer with high-value bugs make a lot of money. Testers who report low-value or “junk” bugs make very little money.

The bottom line is good testers can make good money working at uTest. Poor testers will be frustrated.

Conclusion

uTest is not a scam. It is a legitimate company and an amazing one at that. While uTest is not perfect, most criticisms can be answered if you look at the entire situation objectively.