At SOASTA, we care about ensuring that your sites and apps are always up and running. We help you get the visibility into web performance you need from a variety of perspectives — IT, business, and user experience — in order to understand how your site’s speed and availability affect your users, and ultimately your business.
That’s why we’re excited about the new features in the Fall 16 release of the SOASTA Digital Performance Management (DPM) platform. This release is packed with big (and small) features that our customers — including 53 of the top 100 retail sites and 6 of the top 10 media companies — have been requesting. There are also many improvements under the hood, as well as much tighter integration of all of our tools built on the DPM platform.
Let’s take a tour of some of the exciting features inside this release.
Some of our competitors claim to provide a level of performance analytics on par with SOASTA. While performance analytics as a discipline may become commoditized in the future, SOASTA prides itself on providing its customers with true insights using our DataScience capabilities. This is analysis that goes above and beyond analytics and allows our customers to dig deep into their data (all of which, by the way, we collect and store forever).
mPulse, our Real User Monitoring (RUM) solution, has been the market leader in enabling customers to connect the dots between performance and business metrics. It has, for some time, provided elements of data science, such as the predictive analytics of the What-If dashboard. But with this release, mPulse will be formally powered by DataScience, our proprietary data science platform. This means expanded and integrated DataScience capabilities right inside of mPulse.
Here’s a sneak peek at what DataScience capabilities will be included with all mPulse licenses going forward:
What-if analysis offers predictive analytics beyond ecommerce
Already a key part of mPulse predictive analytics and a hit with all of our customers, the What-If dashboard has expanded functionality that enables users to select any conversion, revenue, or engagement metric, such as Session Length or Session Duration. This takes predictive analytics to a whole new level for verticals outside of ecommerce.
Tree maps give you faster insights into your data
These give you the ability to slice and dice your data visually by various user dimensions — such as device, geography, and OS — to get a quick glimpse of the performance cross section for all your users.
Conversion and Activity Impact Scores tell you which pages to optimize
These scores use your own data to show you the correlation between specific page types and performance. Impact scores quickly enable you to decide which pages are most important to your business’s bottom line, and to focus on remediation for those specific pages.
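The scoring model itself is proprietary, but the underlying idea — correlating a page group's performance with a business metric, weighted by how much traffic that page group receives — can be sketched roughly. Everything below (the helper names, the weighting, the toy numbers) is an illustrative assumption, not SOASTA's actual formula:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def impact_score(load_times, conversion_rates, traffic_share):
    """Hypothetical 'impact' score for one page group: how strongly load
    time correlates (negatively) with conversion, scaled by the share of
    traffic the page group receives."""
    r = pearson(load_times, conversion_rates)
    return max(0.0, -r) * traffic_share  # slower pages hurting conversion

# Toy data: per-day median load time (s) and conversion rate for one page group
load = [2.1, 2.4, 3.0, 3.6, 4.2]
conv = [0.051, 0.048, 0.041, 0.036, 0.030]
print(round(impact_score(load, conv, traffic_share=0.25), 3))
```

A page group that is both heavily trafficked and strongly performance-sensitive scores highest, which matches the intent of focusing remediation where it affects the bottom line most.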
Learn more > Conversion Impact Score and Activity Impact Score
3rd party analytics help you enforce SLAs
In a typical modern web property, more than 60% of resources come from third parties. Our 3rd party analytics help you isolate performance issues to specific providers, so that you have recourse in enforcing your SLAs with them.
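As a rough illustration of how this kind of attribution works, resources can be bucketed by origin host: anything not served from the first-party domain (or its subdomains) counts toward a third party. The domains below are made up for the example:

```python
from urllib.parse import urlparse
from collections import Counter

def third_party_share(resource_urls, first_party="example.com"):
    """Count resources per origin host and report the share that are
    not served from the first-party domain or its subdomains."""
    hosts = Counter(urlparse(u).hostname for u in resource_urls)
    third = {h: n for h, n in hosts.items()
             if not (h == first_party or h.endswith("." + first_party))}
    return sum(third.values()) / len(resource_urls), third

resources = [
    "https://example.com/app.js",
    "https://cdn.example.com/hero.jpg",
    "https://ads.adnetwork.net/tag.js",
    "https://fonts.fontcdn.io/sans.woff2",
    "https://metrics.analytics-vendor.com/beacon.gif",
]
share, offenders = third_party_share(resources)
print(f"{share:.0%} third-party", offenders)
```

Grouping by hostname like this is what lets performance issues be pinned to a specific provider rather than to "the page" as a whole.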
Learn more > What you don’t know about third parties may hurt you
Tolerance bands and smart alerts enable real-time performance monitoring of marketing campaigns
We are proud to introduce a brand new capability that our data scientists and engineers have been working on for the past few months. This capability will allow you to monitor your marketing campaigns and see their performance in real time. You can use any relevant campaign metric, such as page views or revenue; the only requirement is that data from a previous, similar campaign must first be processed to create the tolerance bands.
Smart alerting is also coming soon, whereby DataScience will use machine learning to automatically alert you if a campaign is underperforming. We are particularly proud of this effort, and are looking forward to all of your use cases and feedback!
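To make the tolerance-band idea concrete, here is a minimal sketch: per-time-slot bands built from previous campaigns as mean ± k standard deviations, with live values flagged when they fall outside. This is a deliberate simplification of whatever model the platform actually fits, and all the numbers are invented:

```python
from statistics import mean, stdev

def tolerance_bands(history, k=2.0):
    """Build (low, high) bands per time slot from previous campaigns.
    `history` is a list of runs, each a list of metric values per slot
    (e.g. page views per hour)."""
    bands = []
    for slot_values in zip(*history):
        m, s = mean(slot_values), stdev(slot_values)
        bands.append((m - k * s, m + k * s))
    return bands

def out_of_tolerance(current, bands):
    """Return the slot indexes where the live campaign breaches its band."""
    return [i for i, (v, (lo, hi)) in enumerate(zip(current, bands))
            if not lo <= v <= hi]

# Hourly page views from three previous, similar campaigns
history = [
    [100, 240, 300, 280, 150],
    [110, 250, 310, 270, 140],
    [ 90, 230, 290, 290, 160],
]
bands = tolerance_bands(history)
live = [105, 245, 180, 275, 150]   # hour 2 is underperforming
print(out_of_tolerance(live, bands))  # → [2]
```

This also shows why a previous, similar campaign is a hard requirement: without historical runs there is nothing to build the bands from.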
Enhanced SPA support for CloudTest lets you load test single-page apps
Single-page applications (SPAs) have become a new standard for creating fast, adaptive websites, and SOASTA has already provided universal RUM support to our mPulse customers that have any of the standard or custom SPA frameworks.
However — although monitoring sites built with a SPA framework is an undoubtedly useful and even necessary endeavor — customers have been unable to effectively load test their SPA sites.
Until now. Introducing SPA support for the CloudTest Chrome Extension.
The Chrome browser extension is a great way to record a web app to build load test clips. With the Fall 16 release, the extension records SPAs better than ever, catching the “page” changes that are no longer full HTML downloads but instead are smaller region changes that deliver a faster end user experience. In the test clip, you will also see the new bursts feature, a container that runs multiple messages simultaneously, such as downloading page resources, without an associated HTML document.
These messages will run at the same time, up to a thread limit (default is 50). You can manually create bursts, too, like other built-in containers.
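Conceptually, a burst behaves like a bounded worker pool: all of its messages are issued concurrently, but never more than the thread limit at once. The 50-thread default comes from the text above; everything else in this sketch (the resource names, the fake fetch) is illustrative:

```python
from concurrent.futures import ThreadPoolExecutor
import time

THREAD_LIMIT = 50  # CloudTest's default burst thread limit

def fetch(resource):
    """Stand-in for downloading one page resource in the burst."""
    time.sleep(0.01)  # simulate network latency
    return f"done: {resource}"

resources = [f"/assets/img_{i}.png" for i in range(120)]

# All messages in the burst run concurrently, capped at the thread limit;
# with 120 messages and a limit of 50, they complete in roughly three waves.
with ThreadPoolExecutor(max_workers=THREAD_LIMIT) as pool:
    results = list(pool.map(fetch, resources))

print(len(results))  # → 120
```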
Learn more > How to add a burst in CloudTest
Error tracking and analysis dashboards show you critical error information at a glance
In this release, we went a lot further with error tracking. Our new error tracking and analysis dashboards enable you to get critical error information at a glance, such as:
- Error rate
- Sessions experiencing errors
- Errors broken down by build
- Recent error stack
- Details on a specific error
In addition, very shortly we will be introducing alerting on various error statistics, such as when error rates go out of tolerance.
Annotations make it easier to see specific events and actions
Every tool in our Digital Performance Management platform — mPulse, CloudTest, and TouchTest — is enabled with annotations. This feature allows you to more clearly see specific events or actions, such as alerts, right on the charts within these tools.
For example, in the graphic above, you see auto-generated annotations on an mPulse widget. An annotation may represent a single point in time or a time range. By default, annotations are viewable as a chevron or a thick line on the horizontal (time) axis of any time-based chart. Annotations expand on hover and stay expanded on click.
In CloudTest, annotations are created automatically during a load test to mark virtual user changes and steady state periods. This makes it much easier to compare results across test executions when it matters most, such as during the steady-state periods of a load test. Similarly, you can measure results during an annotated period, making it easy to exclude ramp-up, ramp-down, and other changing periods so the results focus on steady-state measurements, which can then be checked against SLA (service level agreement) settings, for example.
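The steady-state measurement idea can be sketched simply: keep only the samples inside the annotated window, then check the average against the SLA. The function names, sample data, and SLA value below are all made up for illustration:

```python
def steady_state_samples(samples, annotation):
    """Keep only samples inside the annotated (start, end) time range,
    excluding ramp-up and ramp-down.  Times are seconds into the test."""
    start, end = annotation
    return [v for t, v in samples if start <= t <= end]

def meets_sla(samples, annotation, sla_ms):
    """Check the average steady-state response time against an SLA."""
    window = steady_state_samples(samples, annotation)
    return sum(window) / len(window) <= sla_ms

# (seconds into test, response time in ms); ramp-up and ramp-down are slow
samples = [(0, 900), (30, 700), (60, 310), (90, 290), (120, 300), (150, 850)]
steady = (60, 120)  # system-generated "steady state" annotation
print(meets_sla(samples, steady, sla_ms=400))  # → True
```

Without the annotation filter, the ramp-up and ramp-down samples would drag the average over the SLA even though the system performed well under steady load.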
In this release, annotations are limited to system-generated events, but soon charts may be annotated with user-generated input.
Learn more > SOASTA annotations and Analytics segmented by ramp
Enhanced data comparison lets you compare performance at different time intervals
As users of mPulse may already know, our RUM tool is extremely versatile in terms of allowing external data sources for correlation. Users are able to create custom dashboards with different widgets showing how their system behaves over a specific period of time, but with different data sets. It’s even possible to combine multiple data sets within a time series on a single widget. But what about the ability to compare how the digital property behaves at different intervals of time?
In this release, customers are able to select any widget with any metric (as long as it is a time series metric) from any dashboard within mPulse, and compare it to the same time interval at any previous point in time.
Here’s how it works:
- Open a dashboard with any time series metric widget.
- Select a data set time interval in the time picker on the dashboard, as usual.
- On the right hand side of the widget, click on the gear icon.
- Select “Compare”.
- A window will pop up with another time picker.
- Pick the point in time you would like to compare the original data set to. (Please note that the date you pick will be the end date of the comparison period.)
- Once done, you will see a new widget within the same dashboard, as shown in the graphic above.
The Y-axis will be automatically fixed between the two widgets so you can make a meaningful comparison. The new widget can be placed either in the same dashboard as the original widget or in a whole new dashboard.
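Under the hood, the steps above amount to slicing the same metric over two equal-length windows, where the picked date becomes the end of the comparison window. A minimal stdlib sketch with made-up daily numbers:

```python
from datetime import date, timedelta

def window(series, end, days):
    """Slice `series` (a {date: value} map) to the `days`-day window that
    ends on `end` -- the picked date is the end of the comparison period."""
    return [series.get(end - timedelta(days=d), 0.0)
            for d in reversed(range(days))]

# Daily median page-load times (s), keyed by date (made-up numbers)
series = {date(2016, 10, d): 2.0 + 0.05 * d for d in range(1, 15)}

current = window(series, end=date(2016, 10, 14), days=7)
previous = window(series, end=date(2016, 10, 7), days=7)

# Per-day delta between the two widgets' series
deltas = [round(c - p, 2) for c, p in zip(current, previous)]
print(deltas)
```

Aligning the two windows point-for-point like this is also why fixing the Y-axis between the widgets matters: the eye compares the same positions on both charts.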
We think this will make data comparison a lot easier, so compare to your heart’s content!
Learn more > Comparing widgets on the dashboard
Wait, there’s more!
As you can see, this release is full of amazing features, and unfortunately we can’t detail them all. But here are the additional features and enhancements that are part of this release:
- Resource groups allow you to group specific assets on a page and create custom timers for them. This is especially useful for monitoring load times of above-the-fold content for media sites.
- Timezone support at account level lets you easily change time zones across the entire account, rather than at the dashboard level. This will allow correct time-based charting inside dashboards by default, no matter where in the world you are located.
- Alerting via webhooks allows for external triggers to set alerts within mPulse.
- Java API support gives you the ability to collect mPulse beacons for any Java application.
- Long-awaited dashboard filter and time picker UI update enables simpler, more versatile filtering and time range selection across all SOASTA tools.
- JMeter test support has been enhanced with network emulation to simulate different network types, and dynamic ramp to adjust virtual user load up or down in real-time during a test.
- TouchTest now supports iOS 10 and Android 7 (Nougat) app functional and performance testing.
- Mobile device metric collection from the test device has been improved to better measure the performance of the hardware and software under test. These metrics let you see how the application impacts, and is impacted by, CPU, battery level, network traffic, and memory usage.
- Backend and architecture improvements boost the performance and reliability of all SOASTA applications.
We hope you’re as excited as we are about this release (just in time for the holidays!) that will enable you to monitor, test, and optimize your digital properties to their fullest potential. Here’s how you can start taking advantage of SOASTA DPM:
- If you’re a current SOASTA mPulse customer: You will be getting these updates automatically as we roll them out over the next several weeks.
- If you’re a SOASTA CloudTest customer: CloudTest updates are available automatically in your CloudTest appliance and can be applied on demand when you’re ready.
- If you’re a new customer: Contact us to set up your free trial.
- To learn more about this release and see a demo of the SOASTA DPM platform: Please join us for the Fall 16 release webinar on October 13th.
About the Author
Tom has more than 20 years of experience as a manager and product manager in the software development tools field. Today, Tom works in product management as Senior Evangelist at SOASTA, the leader in performance analytics. He speaks frequently at industry conferences and meetups on topics including web app performance and testing at large scale, mobile continuous integration and testing, automated mobile testing tools, and big data analytics for business value.