The Performance Beacon

The web performance, analytics, and optimization blog

In an agile world, user experience is key: Lessons learned at the South Florida Agile Association Conference

As seemingly the only person in North America who did not attend the Velocity Conference in Santa Clara, CA, last week, I was left searching for the next best thing. Well, I found it… close to home in sunny Florida. I attended — and spoke at — the South Florida Agile Transformation Summit (#ATS2015, @Southfaa), a gathering of over 300 members of the South Florida Agile community held on the campus of Nova Southeastern University in Fort Lauderdale — a campus that also doubles as the training headquarters of the NFL’s Miami Dolphins. (No relation to agile or performance there, given their track record lately.)

Session focus and common themes

With thirty different speakers, there were plenty of deep technical presentations on the mechanics of agile, best practices, and people, as well as many discussions of scrum and process. However, I noticed a recurring theme:

Testing is critical to the success of agile, and critical to the success of testing is the role of user experience in agile development.

In the traditional agile development world, the user experience (UX) role has focused on user interface design, usability, and other factors typically associated with verifying and validating that the application performs as the business owner intended it to perform — from a functional and usage perspective. On some agile teams, I have even seen development take over the UX role.

What’s wrong with this, you ask, if it speeds up the process?

Well, developers are GREAT at writing and implementing code; however, they are typically not equipped to do in-depth user research, and its absence can degrade the user experience of the application over time.

In addition, there is a new wrinkle in user experience that attendees are now realizing should be a key component of any agile team’s UX role — namely, the performance of the application from the user’s perspective.

Analyzing real user data lets you ask questions like:

  • Are users using your application in the way it was intended?
  • Or have they found a “short cut” or alternate path that was not considered a mainline path during initial UX design and testing, but now must be added to the test coverage matrix on the functional side and analyzed on the performance side?

In the image below, real user measurement data collected by SOASTA’s own mPulse solution is visualized to show the conversion paths that users of this particular application took when they completed a purchasing transaction. Shown are all the various “click-through” paths each user followed over a particular time period. As you can see, there are several different paths, and some represent paths that were deemed well beyond the edge when the UX designer selected the original test coverage model.
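To make the idea concrete, here is a minimal sketch of the kind of aggregation that sits behind such a visualization. The data and function names here are hypothetical illustrations, not mPulse’s actual API: given per-session page sequences, we count how often each click-through path ends in a conversion.

```python
from collections import Counter

def count_conversion_paths(sessions, conversion_page="checkout-complete"):
    """Count how often each click-through path ends in a conversion.

    `sessions` is a list of page-name sequences, one per user session.
    This is illustrative sample data, not real mPulse output.
    """
    paths = Counter()
    for pages in sessions:
        # Only sessions that reach the conversion page count as a path
        if pages and pages[-1] == conversion_page:
            paths[" > ".join(pages)] += 1
    return paths

sessions = [
    ["home", "product", "cart", "checkout-complete"],
    ["home", "product", "cart", "checkout-complete"],
    ["cart", "checkout-complete"],        # the "short cut" path
    ["home", "search", "product"],        # abandoned session, no conversion
]
print(count_conversion_paths(sessions).most_common())
```

Ranking the resulting counts is exactly what a sunburst chart presents visually: the common “mainline” paths dominate, and the rare spokes are the edge cases.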


Take the guesswork out of user experience:  Know your performance

By measuring the real user experience of your application in this manner, you can quickly and clearly see how your users are using your web site or application, as in the image above.

  • What are their click-through paths to conversion?
  • Which click-through path leads to the most revenue?
  • What should your test coverage model be?
  • What are the true edge cases?

What else can we learn?

One major retailer (and SOASTA customer) recently discovered that the most common user experience for its female customers was directly accessing the shopping cart and checking out. (You can actually see the path in the image above, as it is the shortest “spoke” on the sunburst chart.) Once you see the data, it’s not hard to understand why. This retailer found that its female customers are savvy shoppers. They put items in their carts to see what the final prices would be — and then went off-site (typically through a different browser tab) to access competing retailers’ sites and compare prices. If this major retailer offered the best price, the customers returned to the site to resume the purchase process and close out their transactions.

Once the retailer began testing performance in this scenario, it took advantage of the insight by offering discounts when shoppers returned to the site. This increased the conversion rate, which equates to increased revenue.

Your takeaway: Does your testing strategy encompass these five best practices?

Unquestionably, UX and performance testing with real-user data is a game-changer. Your testing strategy should encompass these five industry best practices:

  1. Develop an in-depth application performance strategy that encompasses real-world scenarios and real-world data
  2. Analyze your application back-end metrics to get a fuller picture of the sometimes-invisible factors that affect performance
  3. Analyze your front-end performance metrics to get a truer understanding of your end user’s experience with your application
  4. Tie your application’s performance with revenue-generating or cost-generating transactions
  5. Identify the “hot zones” on your sunburst chart where you can have a fast impact to most effectively improve your website or application performance
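Practice 4 — tying performance to revenue — can be sketched very simply: bucket sessions by page load time and compare the revenue in each bucket. The numbers and the three-second threshold below are illustrative assumptions, not customer data.

```python
def revenue_by_speed(sessions, threshold_ms=3000):
    """Split session revenue into fast vs. slow buckets by page load time.

    `sessions` is a list of (load_time_ms, revenue) pairs -- illustrative
    sample data, not real measurements.
    """
    fast = sum(rev for ms, rev in sessions if ms <= threshold_ms)
    slow = sum(rev for ms, rev in sessions if ms > threshold_ms)
    return {"fast": fast, "slow": slow}

# Four sample sessions: two fast converters, one slow abandon, one slow sale
sessions = [(1200, 80.0), (2500, 45.0), (4100, 0.0), (6000, 20.0)]
print(revenue_by_speed(sessions))
```

Even a crude split like this makes the business case visible: when most revenue clusters in the fast bucket, the slow “hot zones” are the obvious place to invest.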

If you are a practicing agile shop, these activities and best practices are clearly must-haves for your UX team — and your performance team.



About the Author

Dan Boutin

Dan is the Vice President of Digital Strategy for SOASTA. In this role, Dan is responsible for taking the world's first Digital Performance Management (DPM) solution to market, serving as a trusted advisor for SOASTA's strategic customers and changing the way ecommerce organizations approach the marketplace.

Follow @DanBoutinGNV