WeTest QA Case Study: A casual mobile game

WeTest helps companies flexibly build their own performance cloud platforms, with local performance monitoring, automated cloud testing services, custom web-based data visualization, and much more.

The client is a popular global mobile game maker that encountered testing issues in March 2020.

Due to COVID-19 quarantine policies, its external compatibility testing supplier failed to provide effective services, mainly in three aspects: a lack of internal QA expertise, slow server response, and insufficient testing models.

Each of their tests covered only 20 device models, meaning potential compatibility issues on hundreds of other phone models went unexamined and unresolved.


To solve these problems, we created a roadmap with six steps:

First, we collect the materials required for testing.

Second, our technical experts confirm requirements, customize the test plan, and complete test-case design.

Third, we select the test models. In this case, we used the TOP 1,000 Models set, which covers 1,000 popular phone models worldwide, selected through big data and user analysis.

After selecting the test models, we use automated testing and AI testing to identify issues in specific game scenarios.

Then, issues are monitored on mobile devices in the cloud lab, which capture screenshots and real-time logs for further debugging.

Lastly, we classify issues, debug problems, conduct regression testing, and generate the test report.

This process is supported by millions of registered players, a management distribution platform, and a professional game assessment team.

Comprehensive Solutions

This is an overview of the comprehensive solutions that WeTest uses to test mobile games.

During the DEMO period, WeTest provides word-of-mouth trend tracking, identifies the causes of changes, surfaces recent hot events with key communication nodes, and analyzes industry trends. This information helps the team understand how the game is being received in the market.

Across the full chain from R&D to operation, a variety of testing categories are activated step by step, including localization quality assurance for the target market, functional testing, security testing, performance testing, network testing, compatibility testing, and payment testing.

Before the game is released, app store compliance and regulatory issues are addressed.

In the operation & growth phase, the performance of the game or app is monitored continuously through social listening, game crowd testing, crash reporting, and Client Application Performance Monitoring (APM).


The test models we selected cover 80% of global mobile users.

After many rounds of trials and comparisons, the client saved more than 50% of its testing costs thanks to improved expertise and an updated device pool.

As a result, both the quality and the profitability of the game improved significantly.

This case is a typical example of how we help a mobile game studio overcome a lack of capabilities and key resources while saving costs, especially during the COVID crisis.

Our QA team, with more than 10 years of experience, worked with the client to rebuild its global quality assurance service strategy.

Overall QA Strategy

What is a high-quality game? Everyone may have a different answer. WeTest defines a high-quality game along 8 dimensions:

  • Novel gameplay
  • Rich game scenarios
  • Logical numerical design
  • Good client performance
  • Excellent compatibility
  • High security
  • Strong network capability
  • Good server stability

How do you achieve a high-quality game?

The answer: one-stop quality assurance from pre-production to post-production.

During the pre-production and production phases of a game, playability evaluation, security testing, and compatibility testing should start early to minimize the time and cost of spotting and fixing problems.

Then, Client/Server performance testing should be conducted to test the speed, stability, and reliability of the game before release.

Even after the release and distribution of a game, the testing process is not done yet.

Compatibility testing, security testing, and social listening should be repeated for each game update (patch) to ensure a consistently high-quality gaming experience for players.

Cloud Testing Services

During the overall testing process, cloud testing is a very important part of the solution.

It divides into two types of testing: compatibility testing and functional testing.

Let’s talk about compatibility testing first.

WeTest’s automated compatibility testing quickly discovers compatibility problems in games, such as abnormal display, crashes, ANR (Application Not Responding) errors, process exits, and installation failures.

During the compatibility testing, we want to cover as many phone models as possible.

One of the key advantages of the WeTest platform is broad model coverage: our Global TOP 500 model set covers over 1,000 popular mobile devices from more than 30 countries and regions.

Besides mobile models, we also offer PC and console terminals.

To keep the testing models cost-efficient and up-to-date, we purchase new devices and update our model selection monthly, based on the latest big data analysis and user insights.

Compatibility Testing

WeTest provides three types of compatibility cloud testing services.

The fastest one is standard compatibility testing, also known as the monkey test. It takes only 5 minutes per device. This test can be submitted online with one click on our official website, and the report is available within 4 hours.
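For context, a standard Android monkey test is driven by the stock `adb shell monkey` tool, which injects a stream of pseudo-random UI events into one app. A small helper like the following (the package name and parameter values are hypothetical, not WeTest's actual configuration) can build the invocation for each device:

```python
def monkey_command(package, events=500, throttle_ms=100, seed=42):
    """Build an 'adb shell monkey' invocation for one device.

    Uses the stock Android monkey tool: -p restricts events to the given
    package, -s fixes the random seed so a crashing run can be replayed,
    --throttle inserts a delay (ms) between injected events, and the
    trailing number is the event count.
    """
    return (f"adb shell monkey -p {package} -s {seed} "
            f"--throttle {throttle_ms} -v {events}")

# Hypothetical game package name, for illustration only.
cmd = monkey_command("com.example.game", events=1000)
```

In a cloud lab, a command like this would be dispatched to each connected device while logcat output and screenshots are collected for the report.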

The second one is test automation with a script, where users can customize their own test cases and scripts. It takes 30 minutes per device and supports mainstream frameworks such as Appium, Robotium, UI-Automator, and G-Automator.
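A minimal sketch of what such a scripted case can look like in the Appium style is shown below. The package, activity, and element identifiers are placeholders, not a real game's values, and the `smoke_test` function assumes an Appium WebDriver session has already been created against a device:

```python
# Desired capabilities tell the Appium server which device to drive and
# which app to launch (W3C-style, vendor-prefixed keys; values are
# hypothetical).
caps = {
    "platformName": "Android",
    "appium:deviceName": "cloud-device-01",
    "appium:appPackage": "com.example.game",
    "appium:appActivity": ".MainActivity",
    "appium:automationName": "UiAutomator2",
}

def smoke_test(driver):
    """Launch into the game and verify the main scene appears.

    'driver' is an Appium WebDriver session created with the capabilities
    above; the accessibility-id locators are illustrative placeholders.
    """
    start = driver.find_element("accessibility id", "start_button")
    start.click()
    assert driver.find_element("accessibility id", "main_scene") is not None
```

The same script can then be fanned out across every device in the selected model set, which is what makes scripted cloud testing scale.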

The last one is the most commonly used: manual testing, applied to almost all of Tencent’s game projects. This test is more in-depth, taking 20 to 50 minutes per device. Throughout the process, testing experts provide professional test-case design covering nearly 100% of main game scenarios. The report provides multi-dimensional information on each bug, analyzed through logs, screenshots, and more.

Automated Functional Testing

Automated functional testing verifies the integrity and accuracy of each functional module in a game, such as the numerical accuracy of each level or of item management.

Like compatibility testing, WeTest’s automated functional testing supports various mainstream testing frameworks, and it scales script-testing capacity elastically in the cloud.

After the test, a detailed report is generated, including test result feedback, use-case analysis, performance reports, and device analysis.

Here are some examples of a Functional Testing report.

Each test case is listed with its device name, status, pass rate, and issue details, supported by screenshots and logs.

In addition, an issue cluster-analysis function summarizes and classifies large numbers of test cases by device, brand, and version.
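The core of such clustering can be sketched as a simple grouping of failed cases by device attributes. The field names below are assumptions for illustration, not WeTest's actual report schema:

```python
from collections import defaultdict

def cluster_issues(results):
    """Group failed test cases by (brand, os_version).

    'results' is a list of per-case dicts; the keys used here
    ('status', 'brand', 'os_version', 'case') are illustrative.
    """
    clusters = defaultdict(list)
    for r in results:
        if r["status"] == "fail":
            clusters[(r["brand"], r["os_version"])].append(r["case"])
    return dict(clusters)

# Tiny sample result set (made-up brands and cases).
sample = [
    {"case": "login",  "status": "fail", "brand": "BrandA", "os_version": "10"},
    {"case": "battle", "status": "pass", "brand": "BrandA", "os_version": "10"},
    {"case": "login",  "status": "fail", "brand": "BrandB", "os_version": "9"},
]
```

Grouping this way turns thousands of individual case results into a short list of device clusters, so engineers can see at a glance which brands or OS versions concentrate the failures.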

Client Performance Testing

Client performance testing serves to quickly locate and analyze performance issues and improve the performance and quality of games by testing the client side. It is a critical step in optimizing a game’s performance.

PerfDog is an excellent example: a tool for testing and analyzing the performance of a mobile game from the client side (also known as end-to-end testing). It quickly locates and analyzes performance issues and improves the quality of games.

PerfDog can test all types of applications across the mobile platform, including games, apps, browsers, mini-programs, mini-games, H5 pages, and background system processes. It requires no changes to hardware or apps: no root for Android devices and no jailbreak for iOS. Simply install it on Windows or Mac, connect the mobile phone, and run the application; you then get the performance data of the application under test.

PerfDog’s Client Interface on PC

Here are two sample images of PerfDog’s interface. This image shows PerfDog’s client interface on PC. There are two charts: the top one presents the real-time FPS rate, and the bottom one presents CPU usage and memory usage.

PerfDog’s Mobile App Interface

This image here instead is a sample of PerfDog’s Mobile App Interface. At the top left corner of the image, there are several performance indicators such as FPS, CPU, GPU, Memory Usage, Network Status, and Temperature.

Report in Cloud & Local Report

This graph is an example of PerfDog’s test report.

It provides detailed statistics on FPS, CPU occupancy, and memory usage. More specifically, it includes the proportion of high FPS, average FPS, FPS standard deviation, FPS jitter, average CPU occupancy, proportion of low CPU occupancy, peak memory, and average memory usage.
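To make the FPS metrics concrete, here is one way such statistics could be computed from raw per-second samples. The jitter and "high FPS" definitions below are common conventions, not necessarily PerfDog's exact formulas:

```python
import statistics

def fps_summary(samples, high_fps_threshold=25):
    """Summarize a list of per-second FPS samples.

    - avg_fps / fps_stdev: plain mean and population standard deviation
    - fps_jitter: mean absolute change between consecutive samples
      (one common definition; PerfDog's exact formula may differ)
    - high_fps_ratio: share of samples at or above the threshold
    """
    deltas = [abs(a - b) for a, b in zip(samples, samples[1:])]
    return {
        "avg_fps": statistics.mean(samples),
        "fps_stdev": statistics.pstdev(samples),
        "fps_jitter": statistics.mean(deltas) if deltas else 0.0,
        "high_fps_ratio": sum(s >= high_fps_threshold for s in samples) / len(samples),
    }
```

A perfectly flat 30 FPS trace, for instance, yields zero jitter and a 100% high-FPS ratio, while the same average with large frame-to-frame swings produces high jitter: the distinction that makes jitter a stutter indicator, not just a speed one.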

After testing these metrics with different load scenarios and machine models, analytical data is generated with corresponding severity levels, defect descriptions, and defect categories.

Overall, PerfDog helps companies flexibly build their own performance cloud platforms, with local performance monitoring, automated cloud testing services, custom web-based data visualization, and more, allowing enterprises to complete client performance testing efficiently.

WeTest: The one-stop platform for all your quality assurance needs. More at https://www.wetest.net/