• How to Get 20 Testers & Pass Google’s New App Review Policy

    At BetaTesting, we’re happy to announce our new pre-submission test type, designed to help developers recruit 20 testers for 2 weeks of app testing so you can pass the Google Play app store submission requirements with flying colors!

    Check out our new Google Play 20 Tester Pre-Submission Beta Testing test type.

    Here’s what you’ll learn in this article:

    1. How to recruit 20 targeted testers for your Google Play app
    2. How to get 2 consecutive weeks of testing
    3. How to collect survey feedback and bug reports to improve your app and pass the Google Play app submission requirements
    4. How to apply to publish your app in production on Google Play and how to answer questions to pass your app review!


    About the New 20 Tester Google Policy

    Where can I learn about this policy? Check out the updated developer app submission requirements on Google Play. These requirements are designed to improve the quality of apps within Google Play, and they include app testing before you submit your app for review to the Google Play store. This is required for all new personal developer accounts, and recommended for all developers and organizations!

    New requirements include the following:

    • Personal developer accounts must first run a closed test with at least 20 testers who have opted in for 14 days or more before submitting the app for approval.
    • After meeting these criteria, you can apply for production access in the Play Console.
    • During the application process, you’ll be asked questions about your app, its testing process, and its production readiness. In this process, you can reference your testing process with BetaTesting to collect bugs and feedback.


    How to recruit 20 testers for your Google Play app

    At BetaTesting, we have a community of over 400,000 testers around the world, with roughly 200,000 available in the United States. We allow targeting in a wide variety of ways:

    • Demographic targeting: Age, gender, income, education, and more.
    • Device targeting: Target users on Android phones, Android tablets, and many other device platforms (iPhones, iPads, Windows, Mac, etc.).
    • Custom screening surveys: Ask testers questions, and only those who answer the right way will be accepted.

    You can learn all about our tester recruiting here. Check out our testing package for Google Play 20 Tester Pre-Submission Beta Testing.

    Collect emails and distribute your app for closed testing through the Google Play Console

    Through our pre-submission test template, you can collect emails from test participants when they join your test. You can then view or download the list of applicants so you can easily distribute your app through the Google Play Console. When designing your test, we allow you to collect emails when choosing which platforms you’d like to distribute your app to.
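    If you download your applicant list as a CSV, a few lines of Python can turn it into the comma-separated address list that the Google Play Console accepts when you create a closed-testing email list. Here’s a minimal sketch (the file and column names are hypothetical):

    import csv

    # Hypothetical export from your BetaTesting test, with an "email" column
    with open("betatesting_applicants.csv", newline="") as f:
        emails = [row["email"] for row in csv.DictReader(f)]

    # The Play Console accepts comma-separated addresses for an email list
    print(", ".join(emails))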



    How to get testers to engage for 2 consecutive weeks of testing

    Through the BetaTesting platform, we have multi-week test options available, designed specifically to recruit and engage testers over multiple weeks.

    We provide two options for setting up your test:
    1. Use our pre-designed template, which requires ZERO setup and can be launched within minutes.
    2. OR customize your test instructions and survey questions according to your app.

    If you choose to customize your test design, you can build your own test workflow using our custom test builder.

    Video: how multi-day testing on BetaTesting works.


    Want to learn more about our pre-submission Google Play app testing?


    How to get user feedback & bug reports to improve your app before submission

    Through the BetaTesting platform, you can provide testers with tasks/instructions, get bug reports for technical issues, and collect feedback through a survey.

    All your test results are available in real-time through the BetaTesting dashboard and can be viewed online, exported, and downloaded.

    You can even link your bug reports to Jira or download them to a spreadsheet.

    View & manage your bugs through two different views: the bug management “Grid View” and the bug management “Detail View.”

    View survey results along with automated AI analysis to save you hours.

    How to apply to publish your app in production on Google Play and how to answer questions to pass your app review!

    When applying for access to Google Play Production, you’ll be asked the following questions:

    Part 1: About your closed test

    This section is used by Google to determine whether your app has met the requirement to be tested before it is published on Google Play. The purpose is to allow Google to validate that your app is high quality and to prevent low-quality apps from being published in the Play Store. In this section, you should provide the following information:

    1. Provide information on how easy or difficult it was to recruit testers for your app. Hopefully, this was very easy for you if you used BetaTesting for your pre-submission test!
    2. Provide details about how testers engaged with your app. You can include details about any tasks or instructions you provided, and which features testers engaged with. As evidence for this section, you can also provide details on the bug reports you collected and resolved, along with specific feedback and charts from your final feedback survey collected through BetaTesting. You can also indicate whether testers were able to engage with your app similarly to how a production user might in the real Play Store. With BetaTesting, we allow our testers to use your app organically over two weeks, so the types of usage and interactions should be close to (if even more rigorous than) those of normal app users.
    3. You’ll be asked to summarize the feedback you received from testers. This is a great spot to provide details from the various charts and feedback you received through your final feedback survey report. Common metrics to reference include your NPS score or star rating, and you can also share the automatically generated AI summaries of testers’ feedback for each question.

    Part 2: About your app or game

    This section is used by Google to understand a bit more about your app or game as part of their consideration during your app review process. You should provide the following info:

    1. Describe the intended use of your app or game and what type of user you are targeting. Be specific but also keep it short and sweet.
    2. Describe what makes your app or game stand out, and what value you provide to users. What is unique about your product and why will users use it? In this section, you can point to positive feedback you received during your beta test, and specifically reference answers to the questions in your feedback survey.
    3. Estimate how many installations you expect in the first year. Try to make the estimate reasonable, but don’t underestimate! With very high numbers, though, your app may receive a stricter review from Google, since they want to ensure your app is high quality if you expect so many people to install it.

    Part 3: Production Readiness

    This section is used to provide evidence that your app is ready for production. You should provide the following info:

    1. Describe the bugs you fixed during the testing period and any improvements you made to the app during the closed testing process.
    2. Describe why you are confident that your app is ready for production. Good examples would be positive feedback, a good NPS score (computed as shown below), a low number of new bug submissions (or validation that many key bugs have been resolved), or running multiple iterative tests over time and showing improvement.
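    If you want to cite a concrete NPS figure, the standard calculation is the percentage of promoters (ratings of 9–10) minus the percentage of detractors (ratings of 0–6). A quick sketch with illustrative ratings:

    def nps(ratings):
        # Standard NPS on 0-10 ratings: % promoters (9-10) minus % detractors (0-6)
        promoters = sum(r >= 9 for r in ratings)
        detractors = sum(r <= 6 for r in ratings)
        return 100 * (promoters - detractors) / len(ratings)

    print(nps([10, 9, 8, 7, 6, 10, 9, 3]))  # 25.0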

    Get in touch today to run your pre-submission test to ensure you pass the Google Play review!

    Have questions? Book a call in our call calendar.

  • AI Video Analysis for Usability Videos

    At BetaTesting, we’re happy to announce our AI video analysis tool, which saves you time and allows you to more easily understand key feedback within your usability videos.

    The AI Video Analysis tool will be available for any test that includes video-based feedback, including Usability Videos, Selfie Videos, or Multi-Day longitudinal or beta tests with usability videos as part of the process.

    • Automated transcriptions w/ timestamps
    • Key topic detection, which automatically surfaces the key topics within the video and their associated sentiment.
    • Video summary
    • Sentence sentiment
    • Video overlays for sentence sentiment and key phrase sentiment
    • Copy/share links to specific points in the video


    Transcripts w/ Timestamps

    Why it helps: Seamlessly navigate to the most important sections of your video without watching the whole thing!

    As you play your video, you’ll see the transcript automatically move to the appropriate section (highlighted). You can click and instantly navigate to any section of the video.


    Key Topics Detected

    Why it helps: See the sentiment around key topics (automatically grouped by AI) that the user discussed in the video, and click to jump to clips within your video where the user is discussing a specific topic.

    Every transcript includes AI-driven key phrase detection that identifies the most important topics discussed in the usability video, along with their overall sentiment. You can click on any of these topics, and the transcript will automatically be filtered to show only the parts of the video that discuss that topic. You can also view every mention of that topic as an overlay in the video, along with the individual sentiment of that mention.


    Video Summary

    Why it helps: Get a summary of your video transcript driven by ChatGPT. This does a great job at distilling the key feedback within the transcript.

    Every usability video analysis includes an AI summary generated by ChatGPT.


    Sentence Sentiment

    Why it helps: We save you time by pulling out the important moments automatically. See positive and negative feedback in the transcript or scan the video timeline to find all of these moments directly in the video.

    We detect sentiment within the transcript and automatically highlight the most important positive and negative areas. You can also view the video timeline to see each of these moments in the video.


    Video Overlays for Sentence Sentiment and Key Topic Sentiment

    Why it helps: Just scan the video timeline to see every time a certain key phrase was mentioned, or all of the most important positive and negative moments in your video.

    Just hover over your video to see all the most important positive and negative moments, and click to jump to that location. If you have a specific key topic phrase highlighted, you’ll see every mention of that topic and the associated sentiment for that specific mention in the video overlay, even if the tester used a different word. For example, if the tester says “it needs a lot of work,” the analysis is smart enough to detect context, understand that they were speaking about the app, and group this with the “app” key topic.


    Copy / Share Links to Any Point in the Video

    Why it helps: This makes it easy to share a link with your co-worker to the exact point in the video where the user is discussing a pain point or giving encouraging feedback!


    We integrate with the best AI language models like ChatGPT (among others) to automatically analyze feedback with the goal of making it easy to surface key insights and make sense of complex data.

    Have feedback or questions? Let us know! We look forward to continuing to build and improve great AI-driven beta testing and user research functionality!

    Don’t have an account yet? Check out our plans and sign up for free here.

    Have questions? Book a call in our call calendar.

  • How Sengled Beta Tested IoT Smart Lights in the Real World: Case Study

    The smart light market is currently worth over $8.5 billion and is growing over 22% each year. With new products, apps, and integrations available every day, it is important for smart light manufacturers to create amazing customer experiences to get consumers to buy and repurchase bulbs.

    So when Sengled needed to test a new smart bulb model prior to public release, it was important to select the right beta testing partner. Sengled turned to BetaTesting due to our ability to handle logistics, shipping, and multi-week tests with a community of targeted real-world users.

    BetaTesting helped Sengled get their smart bulb into the hands of real-world testers to provide feedback on the user experience, bugs, and integrations with other devices.

    Sengled bulbs were shipped to over 100 testers across the US, selected based on a series of screening criteria, such as their previous experience with IoT products and whether they had a Google Home or Amazon Alexa product in their homes. Testers agreed to test the bulbs for a month, and were guided through a series of tasks and surveys to collect their feedback each week.

    First, testers were asked to share their feedback about the packaging, unboxing, and setup experience. This included installing the bulb, connecting it to wifi, and completing the setup process through the Sengled app.

    Through this process, Sengled received valuable insights into several specific technical issues affecting customers during the setup process. At times, bulbs were not connecting to certain routers and wifi networks, and users were confused at certain points of the onboarding process. 

    Finding and correcting these issues before launch proved very valuable. Even one or two small improvements early in the setup process can lead to significant decreases in returns and support costs, and increases in customer satisfaction when the product is live. 

    “The BetaTesting team and their beta testers have passion in new products, and were willing to work with our special requests. We’ve been very fortunate to work with this group. And certainly hope to work with Beta Testing group in the future.” said Sengled Product Manager Robert Tang. 

    As the month-long test continued, Sengled’s team worked with BetaTesting to design new surveys to test various features of the app and bulb: dimming, power reset, creating schedules and scenes, and more. Sengled was able to collect qualitative and quantitative feedback about which features users loved, where there was confusion in the app’s user flow, and much more.

    For example, one important feature that was tested was the voice control option through Google Home and Amazon Alexa. Sengled was able to collect hundreds of voice commands and feedback from testers about the voice control experience, and which phrases Google or Alexa understood. 

    Real User Feedback:

    I asked Google home to “turn on living room light on” and it worked every time. I asked google to “turn living rooms lights off” and it worked every time. I asked google to “Turn Living room light bulb brightness to 50%” and it worked every time. The only command I could not get to work was to set a schedule using voice commands. I tried all kinds of variations like “hey Google set living room light to come on at 7pm” or “hey Google can I set a schedule for living room lights?” The most common answer Google would give was “I am sorry I can not do that yet”. – Dave Oz

    Through the BetaTesting summary reports, Sengled received segmented analysis of Google users versus Amazon users, and how each group perceived the product after using voice commands in their home for weeks. 

    Sengled also received over 60 bug reports during the test, detailing issues related to connectivity, creating an account, app design, bulbs not working correctly, and much more. Each bug report included the device and operating system of the tester, and many had photos and videos to make it easy to understand the issue and let Sengled’s development team address them quickly. 

    Sengled found that the new features they had created for this product, such as the ability to set up light schedules and use voice controls, were extremely popular with users. The net promoter scores for the product were very high, and Sengled received a range of feedback to continue improving their bulbs in the future.

    Learn about how BetaTesting can help your company launch better products with our beta testing platform and huge community of global testers.

  • McAfee + BetaTesting Partner to Beta Test a New Antivirus Product for PC Gamers

    Recent studies show that the gaming industry is expected to grow 12% per year from 2020-2025, with North America being the fastest growing market. This is driven by the rise of new gaming and technology platforms, readily available and cheap internet access across the globe, and the advent of new technology like augmented reality. 

    McAfee, the market leader in antivirus software, wanted to develop a new product focused on the unique security needs of gamers. Gamers have very different technology needs in comparison to other consumer groups. Their gaming systems are often complex, and include cutting edge hardware, memory, graphics cards, and processors to give them a leg up during gaming. Gamers demand high performance, and “high performance” is not typically a phrase used to describe antivirus software, known traditionally as being bulky and slow.

    “We were in the need of a unique audience to satisfy the needs of the gamer security product, and BetaTesting did a great job in finding the right cohorts at the intended scale,” said Rajeshwar Sharma, Quality Leader at McAfee.

    McAfee set out to develop an antivirus product that could provide maximum protection without affecting gameplay for this market. The McAfee team worked with BetaTesting to design a multi-month beta test before the Gamer Security product launched, for the purpose of collecting user experience feedback and functional/exploratory bug testing over multiple test cycles.

    First, BetaTesting recruited over 400 PC gamers matching specific demographic and interest criteria required by McAfee: A unique mix of real-world operating systems, devices, and locations. Participants were recruited through the existing BetaTesting community of 250,000 testers and supplemented with custom recruiting through BetaTesting market research partner networks.

    McAfee’s product team and user research teams worked closely with BetaTesting to design multiple test flows and collect feedback from hundreds of gamers. The test included feedback around the installation process, the user interface, and the back-end telemetry data of the antivirus product being used in the real-world. 

    Testers also provided information about which games they played, their hardware configurations, operating systems, and more, in order to connect all of that data to the performance of the Gamer Security product. 

    During the final rounds of testing, McAfee also provided marketing materials and screenshots to testers to get feedback about their messaging and content to see what resonated best with their target market.  

    The research team also set up moderated interviews with testers who had provided the most insightful feedback. Testers connected via video chat with the McAfee team to discuss their in-depth opinions about the product, their willingness to pay, and much more. 

    Overall, the test helped McAfee understand the value of the product to their target market. The feedback was overwhelmingly positive about how the antivirus software didn’t affect gameplay, and testers loved the simple installation, straightforward UI and performance data. The McAfee team was able to launch the product on time – with good reviews online. 

    “BetaTesting is an amazing team, who were very helpful, diligent and thorough from immediately understanding the requirements of our unique software to contract negotiations, and to overall execution of the program to the scale of our expectations. They were available, prompt and clear in all communications. I would highly recommend them!” – Rajeshwar Sharma, Quality Leader at McAfee.

    McAfee received over 200 bug reports during the test, detailing issues related to installation and updates, crashes, performance during certain games, data mismatches, and more. Each bug report included the device and operating system of the tester, and many had photos and videos to make it easy to understand the issue and let McAfee’s development team address them quickly.

    The McAfee Gamer Security test was a comprehensive test that underscores the capabilities of the BetaTesting platform and managed services. The BetaTesting team coordinated with different departments and stakeholders within the McAfee team, and the test design covered everything from onboarding to back-end data collection to marketing content. Finally, testers provided hundreds of bug reports, qualitative and quantitative data, and moderated interview feedback to make this test – and the new product launch – a success for McAfee.

    Learn about how BetaTesting can help your company launch better products with our beta testing platform and huge community of global testers.

  • Test Anything! See some recent examples of our beta tests.

    We’ve run a lot of tests recently across a wide range of industries and products, and wanted to share some of them to highlight how you can test anything with our platform and community:

    📺 a streaming media device with testers across the US

    📱 a small business CRM and business phone line app for freelancers and business owners

    📷 a drone + camera with automated tracking for existing power users

    🎮 a new game on Steam with hundreds of players worldwide

    👶🏼 an iOS app to help kids with speech therapy

    🛡 a desktop antivirus product designed for gamers

    🔒 a desktop VPN product tested in 10 different countries for usability and security

    💵  an online web app for a business funding and lending platform

    🚗  a machine learning app that crowdsourced tester data to track travel behavior

    … and many more.

    We’d love to help you collect feedback fast and improve your products. Get in touch with our team and we’ll help you design your first test.

  • Jira Integrations are now live!

    We’ve been getting this request for a long time and it’s finally live! BetaTesting is integrated with Jira to push bugs from your tests directly into your Jira Server or Jira Cloud account.

    It’s simple to map fields from our bugs to various fields in your Jira account, such as title, description, priority level, attachments, and more.

    Within each test, you can choose your own integration, so different tests can connect to different development teams.
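    For teams that want to script something similar themselves, here’s a rough sketch of what that kind of field mapping looks like against Jira Cloud’s public REST API. This is not our integration’s internals; the bug record, project key, and credentials below are all hypothetical:

    import requests

    # Hypothetical bug record exported from a test
    bug = {"title": "App crashes on login", "description": "Steps: ...", "priority": "High"}

    # Map the bug's fields onto Jira issue fields (create-issue endpoint per Atlassian's docs)
    payload = {
        "fields": {
            "project": {"key": "APP"},        # your Jira project key
            "issuetype": {"name": "Bug"},
            "summary": bug["title"],
            "description": bug["description"],
            "priority": {"name": bug["priority"]},
        }
    }
    resp = requests.post(
        "https://your-domain.atlassian.net/rest/api/2/issue",
        json=payload,
        auth=("you@example.com", "API_TOKEN"),  # email + Atlassian API token
    )
    resp.raise_for_status()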

    Contact us for a demo to learn more!

  • Beyond Beta Tests: Collect Data for AI and Machine Learning

    Over the past few months, we’ve seen more clients leverage the BetaTesting community (200,000+ participants) to collect real-world data to improve AI and machine learning within their product:

    Crowdsource pictures to improve image detection algorithms.

    A large car parts company in Europe crowdsourced pictures of car interiors from BetaTesting users to tag images for debris, helping their machine learning algorithm learn if a car was clean or dirty. Within days, they were able to collect thousands of pictures from different vehicle types around the world to improve their product.

    Track phone sensor data (e.g. locations, battery, acceleration). 

    Another customer had users share their schedules throughout the day (at home, driving to work, at work, sleeping, etc) and matched it to their phone sensor data to learn more about their customers and improve their AI algorithms to anticipate customer behavior.

    Gather user ratings to improve recommendation engines.

    A number of companies have improved their recommendation engines for content by asking a wide variety of testers to rate content and give user feedback around preferences and categories to make their products more personal and targeted for each customer.

    Collect software telemetry for a PC app over several weeks.

    An antivirus product for desktops asked users to beta test their product and provide back-end log files to track the software telemetry internally within their team and learn more about how the product behaved across various devices and operating system versions.

    ----

    AI and machine learning products need more data than ever, and our platform + community is the perfect mix to source large datasets or hard-to-find niche users to make your products more accurate and powerful. This includes pictures, speech inputs, mobile usage, location data, and much more. 

    As a team, we are excited to see these opportunities to connect with users in different ways to improve products beyond beta tests. Our community of over 200,000 people is perfect for gathering any data you need to improve your products. With our tools, you can find the right demographic anywhere around the world and get the exact data you need – within hours or days.

  • BetaTesting Test Design: How to Set Up Your First Test Process

    One of the most common questions we get asked about our test design process is: “How should I set up my beta test?” How many tasks should I give testers? How many surveys should I set up?

    There are a lot of different ways to setup your beta test, and it is usually driven by what you hope to achieve from your test.

    QA / Bug Reports / Functional Testing

    For technical testing, where you are trying to identify bugs, specific test design scenarios that cover your entire product are usually best. For example:

    1. Create an account in our app and fill out your profile.
    2. Go to Settings, click on “invite a friend”, and add their email address to the app.
    3. Go to ‘Newsfeed’, upload your first picture, and add location tags to it. Submit a bug report if you are unable to complete any of these tasks.

    Asking for specific tasks helps testers stay focused and helps our clients find bugs across different operating systems, phones, and browsers that can be resolved quickly.

    Usability or UX Testing

    If you are more interested in general usability and UX/UI testing, we recommend keeping the test design process more high level and letting users explore your product more organically. For example:

    1. Create an account and begin using our app.
    2. Try to connect our app to your Google Home or Alexa.
    3. Use voice commands to control the settings in the app and share what voice commands you tried.
    4. Continue using the app every day for the next week and then answer our final survey.

    Higher-level test tasks give testers more freedom to explore the app on their own. This provides a larger variety of use cases and opinions from testers. When you want to learn about your app’s design, ease of use, or net promoter score, a test designed to let users figure things out on their own will help you get more accurate scores.

  • Professional Beta Testers vs Real-world Beta Testers

    When you run a beta test for your product, there are tradeoffs to consider between professional testers and real-world beta testers. A lot of companies use professional beta testers to test their products, and those testers have deep experience with technical tests that require a deeper dive into a product to find issues.

    Before we changed our name to BetaTesting, we built Erlibird as a community of early adopters interested in finding new tech products and providing feedback on them. This made our community a diverse mix of tech-savvy early adopters and regular people. Since then, we have focused on building a community of real-world testers across the world with a variety of tech experience.

    We believe using real-world testers leads to better feedback and results for most tests. We are able to recruit testers that actually match the audience for your product, with a true mix of platforms, operating systems, connection speeds, and more that help customers find real bugs and get more accurate UX and UI feedback.

    In most cases, companies need quick feedback to iterate their products, see usability videos to learn where users might be getting confused, and test the validity of their new features and products. In those instances, a real-world community makes more sense than a community of professional testers.

  • Sending Effective Beta Testing Invite Emails

    At BetaTesting, we’ve sent out close to 5,000,000 beta testing related emails. Many of these have been beta testing invite emails designed with the specific purpose of communicating test details concisely and getting high levels of engagement from our community of testers. We’ve A/B tested, measured results, and redesigned our emails countless times based on those results.

    With all of this data, we’ve learned that there are several factors that can influence the effectiveness of your beta testing invite emails. The pay-off is clear: sending better invite emails will help you attract the right testers, who are more likely to become active users of your product now and in the future. In this article, we’ll take a look at some things you can do to improve the effectiveness of your invite email:

    1. Work Within Bounds

    Understand that your beta testing invite email open and click rates, and your larger beta participation rates, are primarily driven by the value of your product.

    This is an obvious one (and maybe not the type of advice you’re after right now) but it’s very important to keep in mind that all the marketing/messaging/magic in the world can’t make up for a product that simply isn’t valuable / fun / interesting. If you build an exciting product that is fun and useful, some people are going to get in line to test it. On the other hand, if people aren’t intrigued by your product, the invite email isn’t going to change that. So, when designing the invite email, the goal is to fine-tune the messaging to simply and quickly communicate your product’s value. If you find yourself instead communicating something else (e.g. value your product doesn’t really offer) or leaving important information out in order to increase open/click rates, it’s a sign that you may need to change your product.

    2. Communicate Your Product’s Value in One Short Sentence

    When communicating your product, FEWER WORDS > MORE WORDS

    Think of this as the email version of your elevator pitch. Invest some time and effort in coming up with one simple sentence that accurately describes your product and catches people’s attention. Some helpful tips:

    • Mention what problem your product solves, or how it might benefit the user.
    • Avoid marketing lingo.
    • Run A/B tests with different language.

    When working with our clients, we have seen that when faced with the challenge of communicating something clearly, everyone’s intuition is to communicate more: more words, longer sentences, links, screenshots, etc. Unfortunately, this has the opposite effect from what is intended, because most people are not going to spend the time to read every word you’ve written. More words end up being more confusing (not more nuanced and complete, as you expected).

    3. Provide Key Information

    What you should include in every beta testing invite email.

    Besides communicating a short exciting sentence about your product, here are some other things to include:

    • Expectations: Be upfront about how much time you expect the testers to spend engaging / providing feedback. Most people are busy, and they want to know the required time commitment before making a decision.
    • Incentives: Let users know what’s in it for them. Although the primary motivation for most testers is to help you launch successfully, they still expect to be compensated for their time and effort. Knowing the incentives upfront will help them decide if the test is a good fit for them.
    • Exclusivity: If you are running a private beta with limited invites, mention how many spots are available. The opportunity to be one of the few people that get early access to your product might make it more attractive for potential testers.
    • Prerequisites: If there are any prerequisites to apply for your test, you should mention them in the email. (At BetaTesting, we help our clients filter for prerequisites through demographic targeting, and an additional screening survey for refined targeting on any criteria like interests, lifestyle, etc.)
    • Call to action: Be clear about what you want people to do, and include a “Call to Action” button in your email. For example, if you want people to apply for your private beta, your CTA button could be “Apply for Early Access” (which links to a landing page where users can apply for the test).

    4. Measure The Results

    A/B test several versions of your invite email to optimize your results.

    It is unlikely that you will get the results you want on your first try. You will probably need to refine / re-design your emails several times until they are optimized. If you have a large enough list (thousands of people), you can send a few different versions of your invite to different groups. Each group should contain hundreds of people at a minimum, and ideally thousands, in order to obtain reliable data. After you run a few of these tests, you’ll be able to see what works best, and you can stick with that version (a quick significance check like the sketch below helps). If you don’t have a big enough list, run some tests with Google Consumer Surveys, or hit the pavement the old-fashioned way and ask people on the street.
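    For example, here’s a minimal sketch of how you might check whether the difference in click rates between two invite versions is statistically meaningful. The numbers are illustrative, and it assumes the statsmodels Python package:

    from statsmodels.stats.proportion import proportions_ztest

    # Illustrative numbers: version A got 58 clicks out of 1,000 sends,
    # version B got 41 clicks out of 1,000 sends.
    clicks = [58, 41]
    sends = [1000, 1000]

    # Two-proportion z-test: is the difference in click rate real or noise?
    stat, pvalue = proportions_ztest(count=clicks, nobs=sends)
    print(f"p-value: {pvalue:.3f}")  # below ~0.05 suggests a real difference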

    Finally, remember that while these tips can help you craft better beta invite emails, this is only one small part of achieving high levels of beta engagement. For example, if you are building an email list before you get to the beta stage, it’s much better to stay in touch with your audience and get them excited by regularly sending them interesting content and updates (not just waiting for one big beta launch email).

    Learn about how BetaTesting can help your company launch better products with our beta testing platform and huge community of global testers.