All software companies establish a QA process to ensure that their applications are always working. As applications evolve and acquire new functionality, that functionality needs to be tested for quality and correctness in its own right. However, existing functionality also runs the risk of breaking as a result of changes—both internal and external. Thus, existing functionality needs to be tested as well to ensure that it has not regressed. Your QA process lays out how your company ensures that it meets both of these requirements.
QA processes vary widely between companies, and there is no one-size-fits-all approach. We know this well, and that's why Reflect accommodates all sorts of QA approaches. Whether you're a one-person operation that just needs to offload some testing, or you have a full team of dedicated test engineers, Reflect is a tool that anyone can wield. In the example below, we present a simple QA process for a small software company and show how Reflect seamlessly fits into this workflow.
Let’s imagine an example QA process where a software company has a handful of developers building the product. Another person spends part of their time testing the application, especially around the times when the development team is deploying new versions of it. So, there are several full-time developers, and one part-time tester.
The developers build the product throughout the week and deploy their changes to a staging environment whenever they have completed a functional change or a new feature. The staging environment mirrors the configuration of the production environment that your customers use, but it contains different account data (i.e., fake or test accounts) and usually runs on less infrastructure to keep costs down.
The staging environment allows everyone to test new application changes end-to-end using real (non-customer) data without impacting the production environment. Broadly, you want to test both your staging and production environments, though with a different focus in each. Testing in staging leans toward new features and the parts of the application that are changing, while testing in production covers all functionality as a whole, with an eye toward ensuring that critical functions are always working. Now, let's establish the testing procedures for this hypothetical company.
The most important thing is to actually establish the QA process. This means that everyone should know (and ideally agree on) who will test the different parts of your application. In our scenario, let's say that developers are required to test their changes in staging. This means that developers merge and deploy their changes to the staging environment and then create tests for those changes in Reflect. While the developers can run these tests on demand, they'll most commonly set up a schedule to run them every day against the staging environment for as long as they are working on the new feature. In this way, the developers can focus on building the product with the assurance that each new piece of the feature doesn't break earlier parts.
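To make the nightly-schedule idea concrete, here is a minimal sketch in Python. The schedule entry's field names (`suite`, `environment`, `run_at_utc`) and the staging URL are hypothetical stand-ins, not Reflect's actual configuration format; in practice you would configure schedules through Reflect's UI.

```python
import datetime

# Hypothetical schedule entry for a suite of staging tests.
# These field names are illustrative only, not Reflect's real config.
STAGING_SCHEDULE = {
    "suite": "new-feature-checkout",
    "environment": "https://staging.example.com",
    "cadence": "daily",
    "run_at_utc": "06:00",
}

def next_run(now: datetime.datetime, run_at_utc: str) -> datetime.datetime:
    """Return the next daily run time strictly after `now` (UTC)."""
    hour, minute = (int(part) for part in run_at_utc.split(":"))
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        # Today's slot has already passed; run tomorrow instead.
        candidate += datetime.timedelta(days=1)
    return candidate
```

Once the feature ships, the same entry can simply be pointed at the production URL, which matches the hand-off described later in this process.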
Over time, the new feature is released to all customers and it now runs in production. At this point, the developers can update those tests to run against the production environment rather than the staging environment. Additionally, in our hypothetical scenario, this is the point at which the developers transition ownership of the tests that they wrote to the part-time tester. The part-time tester can then modify the schedule on which the tests run, or classify them into new logical groups with Reflect’s tags.
In addition to managing tests against the production environment, the part-time tester will also periodically perform exploratory testing against the staging and production environments. As popular new use cases emerge over time, the tester can create tests to verify these experiences are always working. In our scenario, the tester often moves tests from an “experimental” tag into a “nightly regression” tag as the tests prove useful.
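The "experimental" to "nightly regression" promotion amounts to a simple retagging operation, sketched below. The data shapes here are hypothetical, purely for illustration; in Reflect itself the tester would move tests between tags through the UI.

```python
def promote_tests(tests, from_tag, to_tag):
    """Return a new list of test records with `from_tag` replaced by `to_tag`.

    Each test is modeled as a dict with a "tags" set. This mirrors the
    tester's workflow of promoting proven tests into the nightly regression
    group; the structure is an assumption, not Reflect's data model.
    """
    promoted = []
    for test in tests:
        tags = set(test["tags"])
        if from_tag in tags:
            tags.discard(from_tag)
            tags.add(to_tag)
        promoted.append({**test, "tags": tags})
    return promoted
```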
Inevitably, some of your tests will fail over time. Whether the cause is an intentional application change or an unexpected side effect of adopting a new layout framework, you'll receive email or Slack notifications for failing tests. A primary goal of Reflect is to make it easy to respond to such failures. Reflect's test result view includes all of the information you need to confirm the issue, reproduce it, and identify the root cause.
In fact, the test result view is the most efficient way for the tester and developer to communicate about a website bug. The tester first validates that the error is legitimate by viewing the test steps and the video of the test execution in Reflect. The tester opens a new ticket in the company’s issue tracking system, pastes the URL of the test result view, and notifies the developers. (Developers can also be notified directly from Reflect if you want.) Finally, the developers view the failure and work on the fix, optionally debugging the issue using the console and network logs in Reflect.
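The hand-off step above can be sketched as assembling a ticket that carries the test result URL. The payload fields below are generic illustrative names, not tied to Reflect or to any particular issue tracker, and the URL is a placeholder.

```python
def build_bug_ticket(test_name: str, result_url: str, summary: str) -> dict:
    """Assemble a minimal issue-tracker payload linking back to the test result.

    Field names ("title", "body", "labels") are generic assumptions; adapt
    them to whatever your issue tracker's API expects.
    """
    return {
        "title": f"Test failure: {test_name}",
        "body": (
            f"{summary}\n\n"
            "Test result (steps, video, console and network logs):\n"
            f"{result_url}"
        ),
        "labels": ["bug", "regression"],
    }

# Hypothetical usage with a placeholder result URL.
ticket = build_bug_ticket(
    "Checkout flow",
    "https://example.com/test-results/123",
    "Checkout button is no longer clickable on the payment page.",
)
```

Because the test result view already holds the reproduction steps and logs, the ticket body itself can stay short: the link does the heavy lifting.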
When the failure reflects a valid change to the application, Reflect supports several ways to edit your tests. For the purposes of this hypothetical example, let's consider the simplest case: the visual state of some component in your application has changed. If you had previously created a visual assertion for that component, your test will fail when you deploy changes to its appearance. Accepting the new visual state of the element is as easy as clicking Accept Changes above the image in the test step detail view.
QA processes take many shapes and forms at software companies, but the important thing is that you have some process to rely on and that it facilitates communication. Reflect serves the individual tester as well as a team of developers by unifying their testing efforts through a common interface: the browser. This is also the same interface your users interact with when using your application. Communication is key to your QA process, and Reflect makes sure everyone speaks the same language.