The SOASTA team just spent the week down in sunny Orlando, Florida, at StarEAST’s “Quest for Test” conference. The attendees spanned the full software testing lifecycle, and the tracks covered focus areas including Agile Testing, Test Automation, and Mobile Functional Testing. Each track included deep dives into the common recurring themes that we saw in virtually every presentation.
In no particular order:
- Shorter product cycles – Gone are the days of multi-year software product lifecycles (or lengthy software development lifecycles). Today, development teams – and, by extension, their counterparts on testing teams – are under greater pressure than ever to release software far more frequently.
- More device platforms – With the emerging dominance of tablet and smartphone platforms, testers must have the expertise to test more devices – and accomplish it all in shrinking windows of time. Quite simply, it means more tests in less time.
- Expanding test plans – Software sophistication continues to grow, leading many developers to create exponentially larger test plans to cover large sets of functionality and performance issues. This expansion is outstripping the capacities and capabilities of all but the largest development teams.
- An emphasis on analysis – With these shifts, development teams want more than a list of test scores and stats. They want analyses of the results and their implications on performance and on changes in development or infrastructure strategies.
- Changing focus to the user experience – Traditionally, software testing has focused on validating, verifying, and measuring the performance of a software program, module, or function. Today, that’s not enough. Instead, forward-thinking development teams are emphasizing development and testing frameworks that evaluate software from the user’s perspective and that account for the user’s experience, irrespective of underlying technical metrics and performance stats.
Innovation is a must: It’s time to step outside the box
There was a common theme among the five keynotes: it’s time for innovation in testing. Mobile has changed the world, and that includes the world of functional testing.
And just like the La Quinta commercial, many in the testing community are looking for the right level of innovation for today’s world: the right processes, the right technologies, and the right set of metrics to continuously improve the test ENGINEERING process. But unlike the commercial, just about all of them are ready to jump outside the box.
Because of mobile, the testing matrix has become complex.
Because of mobile, the testing process has been compressed.
Because of mobile, the testing process has required a strategic use of automation and has put a bigger emphasis on testing strategy and test planning.
And because of mobile, there is now a laser-focus on the testing of non-functional requirements and performance testing.
Why? Performance is now user experience.
With the advent of mobile, every tester is a user. Gone are the days when a tester was testing an application that they either didn’t use, or couldn’t relate to as a user. Why? Today all of them carry around the platform that caused it all. Every day, they use applications on their mobile device that operate in much the same way as the applications that they themselves are working to bring to the marketplace.
Which brings me to another key point…
Takeaway: It’s all about the user experience
Today’s testers are also users, and they’re looking to bring this user expertise into their everyday testing work. So, how do they do that? With innovation, of course.
I spoke with many test engineers at the show, and was lucky enough to be able to speak to an audience of about 350 of them during my talk “Let’s put the ‘e’ back in Testing!” What’s the ‘e’? Well, it’s ENGINEERING, of course.
And to use engineering to innovate, today’s test engineers understand the need for a basic understanding of some key areas:
- Know your user – Understand how users are using – and will use – your application.
- Know your application – Does the application do what it was designed to do, AND are users using it that way? To perform user testing properly, you need to know how users are using the application after deployment – and to continually measure and monitor that usage and tweak your test coverage around it.
- Know your test matrix – You can’t test everything; 100% test coverage isn’t real world. (Unless, of course, you are building airplanes – so you can at least feel a bit safer when you use that cell phone during the take-off roll.) Test the critical brand functionality and user performance.
- Know your devices – Which devices, operating systems, and versions will you support and test? Many testers confuse this with “test coverage,” but devices are just one half of the equation. (See “Know your test matrix” above.)
- Know how and what to automate – Put together a strategy for automation: core functions vs. edge functions. Automate repetitive test cases.
- Know your performance – Analyze your user data to determine optimal performance. What are users doing? How are they doing it? And at what point does their patience run out before they complete their experience?
- Know your edge – Every tester understands edge cases. (For you newbies, edge cases are scenarios that occur much less frequently than the main scenarios.) Every tester also fears edge cases and fears that they have drawn the line too high on coverage, thus leaving them exposed. Without the right user data and metrics, this line becomes a guess. With it, testers can sleep better at night.
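To make the "know your devices" and "know your test matrix" points above concrete, here is a minimal sketch of how usage analytics can prioritize a device test matrix instead of attempting 100% coverage. The device names, OS versions, and usage shares are hypothetical illustrations, not data from any real application:

```python
# Hypothetical analytics: share of real users on each device/OS pair.
usage = {
    ("iPhone 5", "iOS 7"): 0.34,
    ("Galaxy S4", "Android 4.3"): 0.27,
    ("iPad Air", "iOS 7"): 0.18,
    ("Nexus 5", "Android 4.4"): 0.12,
    ("Older devices", "various"): 0.09,
}

def pick_matrix(usage, coverage_target=0.85):
    """Pick device/OS pairs, highest usage first, until the target
    share of real users is covered by the test matrix."""
    picked, covered = [], 0.0
    for pair, share in sorted(usage.items(), key=lambda kv: -kv[1]):
        if covered >= coverage_target:
            break
        picked.append(pair)
        covered += share
    return picked, covered

matrix, covered = pick_matrix(usage)
print(f"Test {len(matrix)} configurations to cover {covered:.0%} of users")
# With the hypothetical numbers above: 4 configurations cover 91% of users.
```

The same idea drives where to draw the line on edge cases: with real usage data, the coverage cutoff is a measured decision rather than a guess.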
And I’ll leave you with one final thought as you start your journey into test ENGINEERING with innovation: It’s all about the user experience.
Learn more about mobile testing best practices
Read our report: The 7 Steps to Pragmatic Mobile Testing
Join me in my upcoming webinar: Three Tips to Increase Your Mobile Test Coverage
And as a famous TV chemistry teacher once said, “Tread lightly.” For me, I’ll span the globe and revolve my testing strategy around user experience.
About the Author
Dan is the Vice President of Digital Strategy for SOASTA. In this role, Dan is responsible for taking the world's first Digital Performance Management (DPM) solution to market as a trusted advisor for SOASTA's strategic customers, and for changing the way ecommerce organizations approach the marketplace.