Closely following the launch of the A/B Testing service for iOS, Android, and Fire OS apps, we have just released an update addressing one of our most popular feature requests. You can now track up to ten goals in a single A/B test, which means you can see how your experiment affects up to ten metrics at once. This is especially powerful when the metrics aren’t entirely independent and it would be difficult to create A/B tests to isolate them from each other. Let me illustrate with an example.
Say you have a mobile game that generates revenue using a combination of in-app purchasing (IAP) and mobile ads. You know that player engagement is the key to monetization, so you decide to test a hunch that more challenging levels will keep players in the game longer.
You create an A/B test project for your app, adding an experiment that allows you to adjust the overall difficulty of each level. Since you can have up to five variations for each test (see A/B/n testing for more information), you decide to measure player engagement when the game is much harder, slightly harder, slightly easier, and much easier than normal. “Normal” will be a variation of its own, called the Control.
In this case, you create a test variable called difficultyMultiplier, which your code can access and use to modify its behavior for each user. For the control group (60% of players in this example), difficultyMultiplier is 1.00, indicating no change from the default difficulty. The other groups see a slightly different value for difficultyMultiplier, depending on how hard the game should be for those players.
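In code, that might look something like the sketch below. This is only an illustration: the Variation type and its getVariableAsFloat helper are placeholders, not the actual A/B Testing SDK API.

```java
// Minimal sketch of game code consuming the difficultyMultiplier test variable.
// Variation and getVariableAsFloat are placeholders, not the real SDK API.
interface Variation {
    float getVariableAsFloat(String name, float defaultValue);
}

public class DifficultyTuner {

    // Used for the Control group and whenever no variation has been assigned.
    private static final float DEFAULT_DIFFICULTY_MULTIPLIER = 1.00f;

    private float difficultyMultiplier = DEFAULT_DIFFICULTY_MULTIPLIER;

    /** Reads the multiplier assigned to this player's variation. */
    public void applyVariation(Variation variation) {
        if (variation != null) {
            difficultyMultiplier = variation.getVariableAsFloat(
                    "difficultyMultiplier", DEFAULT_DIFFICULTY_MULTIPLIER);
        }
    }

    /** Scales a level's base difficulty by the assigned multiplier. */
    public float effectiveDifficulty(float baseDifficulty) {
        return baseDifficulty * difficultyMultiplier;
    }
}
```

Falling back to 1.00 whenever no variation is available keeps players on the default difficulty, exactly what the Control group sees.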
To measure the effect of changing this variable, you define a view event and a conversion event, which your code records as they happen and reports to the A/B Testing service. For the purposes of this test, you count a view whenever a player starts a new game session. A conversion is registered when a player goes on to play for five minutes or more. The A/B Testing service tabulates these events by variation and reports the conversion rate for each group of users.
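Wiring that up in the app could look roughly like the sketch below. The ExperimentClient interface and its recordViewEvent/recordConversionEvent methods are stand-ins rather than the real SDK calls; only the five-minute threshold comes from the test definition above.

```java
// Illustrative only: ExperimentClient is a placeholder for whatever object
// actually reports events to the A/B Testing service.
public class SessionTracker {

    /** Placeholder for the event-reporting client. */
    interface ExperimentClient {
        void recordViewEvent();
        void recordConversionEvent();
    }

    private static final long CONVERSION_THRESHOLD_MS = 5 * 60 * 1000; // five minutes

    private final ExperimentClient client;
    private long sessionStartMs;
    private boolean conversionRecorded;

    public SessionTracker(ExperimentClient client) {
        this.client = client;
    }

    /** A view is counted whenever the player starts a new game session. */
    public void onSessionStart() {
        sessionStartMs = System.currentTimeMillis();
        conversionRecorded = false;
        client.recordViewEvent();
    }

    /** Call periodically, for example once per game tick. */
    public void onTick() {
        long playedMs = System.currentTimeMillis() - sessionStartMs;
        if (!conversionRecorded && playedMs >= CONVERSION_THRESHOLD_MS) {
            // The player has stayed in the session for five minutes or more.
            client.recordConversionEvent();
            conversionRecorded = true;
        }
    }
}
```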
Say you run the experiment and discover your hunch was right: harder levels are played longer, leading to an increase in the average amount of time players engage with your game. The logical next step would be to ratchet up game difficulty. But what if improved engagement isn’t the whole story? Changing the difficulty may affect other metrics you care about, but you can’t always tell based on a single type of conversion event. For example, how does this change the way people share their progress on Facebook, a major customer acquisition channel? How does it impact ad click-through rates? Does it affect how users rate the game? Setting multiple goals can help you detect such unintended consequences and choose the variation that delivers balanced results.
Now that the latest version of the A/B Testing service allows a single view event to be associated with up to ten different conversion events (goals), you can measure and compare the impact of each variation along more than one axis. Each goal can be maximized or minimized independently. For example, here you are trying to maximize game sessions, in-app purchases, ad clicks, and Facebook shares while minimizing one-star reviews, all in the same experiment.
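From the app’s side, all of that reduces to recording the right conversion event when each of those things happens; whether a goal is maximized or minimized is part of the experiment setup, not the code. Here is a rough sketch, with the goal names and the GoalSink interface invented for illustration rather than taken from the service’s API.

```java
// Illustrative only: goal names and GoalSink are placeholders.
public class GoalReporter {

    /** Placeholder for the object that forwards goal events to the service. */
    interface GoalSink {
        void recordGoal(String goalName);
    }

    // Up to ten conversion events (goals) can hang off the same view event.
    static final String GOAL_IAP      = "inAppPurchase";   // maximize
    static final String GOAL_AD_CLICK = "adClicked";       // maximize
    static final String GOAL_FB_SHARE = "facebookShare";   // maximize
    static final String GOAL_ONE_STAR = "oneStarReview";   // minimize

    private final GoalSink sink;

    public GoalReporter(GoalSink sink) {
        this.sink = sink;
    }

    public void onPurchaseComplete() { sink.recordGoal(GOAL_IAP); }
    public void onAdClicked()        { sink.recordGoal(GOAL_AD_CLICK); }
    public void onFacebookShare()    { sink.recordGoal(GOAL_FB_SHARE); }
    public void onOneStarReview()    { sink.recordGoal(GOAL_ONE_STAR); }
}
```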
When generating reports, the A/B Testing service includes the results for all goals associated with an experiment, organized by variation. The service highlights the “best” variation with respect to each goal, so you can tell at a glance which one resulted in the most game sessions, for example (Variation C), or maximized shares on Facebook (Variation A).
When goals overlap or depend on one another, as they do here, there may be no single variation that definitively “wins” every goal. A report like the one above, however, can help you make an educated choice, weighing the trade-offs of each alternative. In this case, Variation B looks like a good candidate since it succeeded in minimizing one-star reviews and came close to winning several other goals as well. When you look at the big picture, Variation B appears to have the best performance overall.
The orange checkmarks indicate which results achieved statistical significance—that is, where there are enough measurements to be confident that the observed change is actually due to the test variation. More details are available for each individual goal, so you can drill down on the ad clicks, for example, associated with each variation:
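The report doesn’t spell out the statistics behind those checkmarks, but if you want an intuition for why the number of measurements matters, a common back-of-the-envelope approach is a two-proportion z-test comparing a variation’s conversion rate against the Control’s. The sketch below is my own illustration, not necessarily the calculation the service performs.

```java
// A rough two-proportion z-test: an intuition for why more measurements make
// an observed difference "significant", not the service's actual calculation.
public class SignificanceSketch {

    /** z statistic for the difference in conversion rate, variation vs. control. */
    static double zScore(long controlViews, long controlConversions,
                         long variationViews, long variationConversions) {
        double p1 = (double) controlConversions / controlViews;
        double p2 = (double) variationConversions / variationViews;
        // Pooled rate under the null hypothesis that the variation changed nothing.
        double pooled = (double) (controlConversions + variationConversions)
                / (controlViews + variationViews);
        double stdErr = Math.sqrt(pooled * (1 - pooled)
                * (1.0 / controlViews + 1.0 / variationViews));
        return (p2 - p1) / stdErr;
    }

    public static void main(String[] args) {
        // |z| >= 1.96 corresponds to roughly 95% confidence in a two-sided test.
        System.out.println(zScore(10_000, 1_200, 10_000, 1_320)); // ~2.56: significant
        System.out.println(zScore(100, 12, 100, 13));             // ~0.21: too little data
    }
}
```

The same 12% versus 13.2% lift is convincing with ten thousand views per group and indistinguishable from noise with a hundred, which is exactly what the checkmarks are there to flag.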
With the addition of up to ten goals for a single experiment, the A/B Testing service expands its flexibility and becomes an even more powerful tool for refining your app and optimizing it based on customer behavior. For more information on A/B testing, multiple goals, and how you can incorporate them into your mobile app or game, check out the online documentation.
We recently released an update to the Amazon Mobile App SDK that includes improvements to GameCircle and some related components on both Android and iOS. It updates GameCircle’s dependency on the latest version of Amazon Insights, corrects a few bugs, and expands API coverage of the Unity3D plug-in. For all of these reasons, we recommend migrating to the latest SDK when convenient.
Both GameCircle and its Unity3D plug-in now take advantage of the latest version of Insights, part of Amazon’s Analytics service and the piece responsible for generating Achievements Reports. On iOS, the Insights SDK no longer depends on CoreTelephony.framework; on Android, it corrects an issue specific to certain device Locales and improves support for other IAP frameworks. In addition, session timeouts now behave more consistently across platforms.
Check out these latest changes to the SDK, and watch this space for future updates. We’re always working to improve the performance, usability, and reliability of our services for mobile apps and games.
Join me Thursday, December 5th at 10:00am PST for a live webinar demonstrating Amazon’s A/B Testing service on iOS.
If you have ever wondered why A/B testing is good for your mobile apps and games, or simply been curious about how Amazon’s service works on iOS, sign up for the webinar today.
I had to take a class in accounting before I understood the difference between sales and earnings. Fortunately, learning about Amazon Mobile App Distribution Program sales and earnings reporting is quite a bit simpler than that accounting class.
While there are seven possible reports to choose from, in this blog post we’ll cover three popular financial reports:
Sales Reports show trends in sales and returns over standard or custom time periods. You can get daily breakdowns in CSV reports, and you can see sales broken out by individual marketplace or by the almost 200 countries in which your app could be sold. Data in sales reports is updated every few hours but does not reflect processed financial data for the current month; it does not include adjustments and other data that will affect your payment.
Below are two examples of reporting that can be customized by app and international marketplace. The first is a map view. Note the drop-down buttons for ‘Date’, ‘All Apps’, and ‘All Marketplaces’. These let you show a specific date range and view sales by marketplace (such as amazon.co.uk or amazon.fr) or by country (such as Brazil or Norway).
The next example shows a tabular view of the Sales report with detail such as Units Sold, Units Returned, Units Refunded and Gross Revenue.
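If you export the CSV and want to slice it yourself, something like the short sketch below would do. The column layout here is only assumed to mirror the table above (marketplace first, then Units Sold), so check the real export’s header row before relying on the indices.

```java
// Assumed layout only: marketplace in column 0, Units Sold in column 1.
// Verify against the actual export's header row before using these indices.
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class SalesCsvSummary {
    public static void main(String[] args) throws IOException {
        Map<String, Long> unitsByMarketplace = new HashMap<>();
        try (BufferedReader reader =
                     Files.newBufferedReader(Paths.get("sales-report.csv"))) {
            reader.readLine(); // skip the header row
            String line;
            while ((line = reader.readLine()) != null) {
                String[] cols = line.split(",");
                unitsByMarketplace.merge(cols[0].trim(),
                        Long.parseLong(cols[1].trim()), Long::sum);
            }
        }
        unitsByMarketplace.forEach((m, units) -> System.out.println(m + ": " + units));
    }
}
```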
Earnings Reports represent the sales, refunds, app earnings and adjustment data used to calculate royalties earned during a given month. Data in earnings reports are released on a monthly basis once royalties earned in the previous month have been processed and approved. Monthly summary information is available in the Amazon Mobile App Distribution Portal and daily breakdowns are available through CSV export.
Payment Reports represent the actual disbursement of funds from Amazon. Monthly summary information is available in the Payment Reports section of the Amazon Mobile App Distribution Portal.
Accessing Reports
Getting these reports is easy. Log in to the Amazon Portal and click on the Reporting link (shown below). Then select the report you want and apply your customization filters below it.
(Note: of the seven reports shown in the screenshot below, all but the Beta Engagement report now allow reporting by country of sale.)
The Amazon Mobile App Distribution Program gives you useful reporting data that is pretty easy to access and understand. My accounting teacher would be proud. Try pulling a few reports and see how easy it is to get data on your apps today.
Amazon is excited to announce an update to reports within the Amazon Appstore Developer Portal. Reports provide developers with important historical and trend data for sales and earnings. For this update, we focused on improving the developer interface and strengthening the capabilities of the service.
During the past two months, we beta tested the update with more than 500 developers. Many of these developers provided valuable feedback that we incorporated into the final design.
Starting today, we are rolling out this update to all developers on the Amazon Appstore Developer Portal. You will notice the following changes:
We encourage developers to explore the new reports and provide feedback via the Contact Us link on the Amazon Appstore Developer Portal homepage or by clicking on the Submit Feedback flag on your Reports page.