The Performance Beacon

The web performance, analytics, and optimization blog

Office Depot’s digital transformation journey: From back-room QA lab to digital performance management

Back room web performance testing lab

Recently, I’ve been spending many days on-site with Erick Leon — Office Depot’s resident performance engineering guru — at one of SOASTA’s longtime partner-customers, as Office Depot continues its evolution toward the next phase of digital transformation: digital performance management (DPM).

Naturally, over the years that we’ve worked together, Erick and I have shared stories of our mutual journeys in business and, of course, technology. After several great stories, comparisons of backgrounds, some technology trivia*, and a few glasses of Diet Coke (okay, maybe they weren’t all Diet Coke, but this is a family blog), Erick piped up, “Hey, you should write about our digital transformation story. Everyone’s trying to do this now.”

So today I’m going to share Office Depot’s journey from a back-room QA lab to where the company is headed today with digital performance management.

Let’s start at the beginning…

Before IoT, there was embedded realtime software

Performance testing — which is nothing more than testing how a system works under real-world (aka realtime) conditions — started back in the 1970s.

Though many applications were industrial and military in nature, and thus not readily available or visible to the general public, the discipline started to hit the mainstream with the introduction of things like the Atari 2600, which brought realtime software into our living rooms. Before that, testing consisted mainly of debugging (aka finding an error before a user does) rather than prevention, which is what today’s performance testing is focused on.

Enter the age of acronyms…

The era of Total Quality Management (TQM) and the dawn of the QA organization

Engineers are great at building things — especially organizations around disciplines. As software started to become mainstream (though not nearly as mainstream as it is today), organizations started to morph around disciplines.

You’d have a software organization that dealt with all things development, and a hardware organization with the appropriate disciplines organized within its structure. Both would feed into an operational group that was responsible for day-to-day operations and support.

With the advent of software products hitting the mainstream, product support suddenly became very important. This put a new focus on testing. And although the term “quality” had been around for a while, it took the Naval Air Systems Command to coin the term Total Quality Management (TQM).

Since TQM didn’t fit in any existing organization — such as software development (“It doesn’t belong with us. We don’t test the hardware.”) or hardware (“It doesn’t belong with us. We don’t test software.”) and certainly not support or operations (“We run it. We don’t test it.”) — engineers did what engineers do: they built.

Specifically, engineers built another organization to handle the testing that did not fit into any of the other organizations. This was typically systems-level testing. Why was it called that? Because it covered the entire system: hardware and software.

And so began the QA organization. And this is the QA organization that reigns today, except the products tested have morphed over the years and are not just “boxes” (i.e. tangible products like an Atari 2600), but consist of:

  • a piece a user would interact with, such as the web or mobile app, and
  • a piece behind the scenes that supports what the user interacts with, such as a datacenter or mobile device.

The QA organization took on testing every sort of application and product that had software in it — everything from back-office applications running in large datacenters to software embedded in some type of device that was an eventual product (the precursor to IoT).

With this new focus on testing, and the newly minted organizations, something was still missing: new tools.

The planets align. Enter Mercury… and “that tool out west”

As applications moved closer to the mainstream (and became more complex with the advent of Windows, which increased usability and gave users an easy window into their software), a need arose to test these new front ends quickly and repeatably — especially from a functional perspective. Enter WinRunner.

WinRunner enabled the QA organization and its newly minted testers to record and play back various interactions with the application under test via the Windows interface. This testing validated the functionality of the system and its user interactions via the Graphical User Interface (GUI).

And once all the testers had cool GUI interfaces for their functional testing, it made perfect sense that they’d want the same kind of cool tool for performance and load testing. So, in 1989 Mercury introduced “that tool out west”.

READ > New to cloud-based performance testing? Here’s a quick language lesson

As the decade of the 1990s began, the boom around GUI products grew exponentially. And so did the requirements for better testing. Faster testing. And, for the first time, “test automation”.

As more and more firms built software packages, there was a never-ending demand for testing skillsets — which usually meant a skillset aligned with a particular toolset.

And since “that tool out west” had about an 80% market share, knowledge of the tool commanded the marketplace even more than any background in testing best practices or QA in general. A cottage industry of “that tool out west” consultancies sprouted, and careers were built and carved out around the toolset. QA organizations grew in number and size with a laser focus on tool skillsets.

Enter Al Gore and his invention of the Internet

The game changer started in 1995, though no one really knew it or understood the potential impact at the time. It was the dawn of the world wide web. The dawn of Amazon and eBay. And the hockey stick rise of ecommerce.

QA testing stayed in its organizational silo. The web was just treated as another application to test in the QA lab, isolated from production and the real world. A typical QA lab might consist of 10% of a full production-level application. But early in the days of the internet, the web was static. Content was static. Pages were built out and treated much like an ad in the Yellow Pages — an advertising presence, but not much else. (Everyone remembers the Yellow Pages, right? Google it, or go watch The Intern.) And testing of these sites was fairly easy. Static content doesn’t change much, so it didn’t put a lot of undue pressure on performance in general, and revenue impact was not yet a core focus of the business.

Performance testing in this era consisted of testing in the QA lab, with simulated load typically under 1,000 virtual users (vUsers).

Cracks in this testing philosophy started to appear in the late 1990s and were highlighted by one of the first heavily promoted web advertisements for the Super Bowl — the 1999 Victoria’s Secret ad. Predictably, the site was brought to its knees when the ad ran during the Super Bowl. As we know now, Super Bowl ads are big events, and anything tied to the web will get unprecedented traffic. At the time, however, the reaction to the failure was along the lines of, “Hey, any mention of the ad is good news, no matter how the website performed.” (My, times have changed.)

There have been many other examples since then, but this one was the first high-profile example that really started to change the game and bring an initial focus onto web performance, and how that might affect the growth of the web into a revenue channel.

The initial reaction from the QA community was to throw more hardware and software at the problem within the structure that they were already testing against. “Let’s boost the lab from 10% of production capacity to 25%.” “Let’s increase our virtual user capability with ‘that tool out west’ from 1,000 vUsers to 2,500 vUsers, or more.”

Some organizations even tried testing in production, but were quickly kicked back to the lab by the operations folks when “that tool out west” brought the production environment to a crashing halt because the tests being executed did not have any realtime metrics capabilities — the equivalent of flying a plane without instruments and with a bag over your head.

Not good.

Hence the dated mantra “We don’t test in production” that I still hear from many veterans in the IT Operations arena.

But the game was about to change in many ways.

READ > Why performance testing in production is not only a best practice — it’s a necessity

Enter omnichannel, ecommerce and SOASTA

Around 2004, with the advent of Web 2.0, the game began to change into what we see today.

Web 2.0 was not a standard, per se, but rather a term that was coined to signify the web’s transition from static web content (e.g. a “Yellow Pages” format) to the beginnings of dynamic, changeable, interactive content.

This change also brought with it the cultural business change that a web presence was more than just a “Yellow Pages” static ad. Instead, it represented the entry point into a storefront where retailers, and others, could bring their wares into your home via the web.

During the early phase of Web 2.0, we saw the announcement of a series of game changers that, when taken together, drastically accelerated the web into today’s ecommerce platform:

  1. Amazon introduced its Elastic Compute Cloud (EC2) offering. For the first time, entire fleets of servers were available to anyone who wanted to “spin them up” in the cloud and give them back when they were done. The ultimate infrastructure-as-a-service (IaaS) offering.
  2. Steve Jobs introduced the iPhone. This accelerated ecommerce by putting the web in everyone’s hands. Literally putting a brick-and-mortar store — in fact, an entire mega-mall — into the palm of your hand.
  3. SOASTA introduced the world’s first cloud testing platform. CloudTest was the world’s first performance testing platform entirely hosted and executed from the cloud. Retailers and other online businesses finally had the much-needed ability to test 100% of their real-user experiences — quickly, scalably, and globally — from more than six million worldwide test servers. (This unprecedented ability to test and ensure availability is why the top 20 online retailers still rely on SOASTA.)

When taken together, all of these form the basis for the technology that enabled the boom in ecommerce — not only in retail, but in virtually every other business vertical that had a business-to-business or business-to-consumer presence.

With all of these game-changing events, Web 2.0 became the place to be. Ecommerce went from unknown to mainstream. And with it, traffic boomed. Revenue blossomed. Bottom lines grew plump. Businesses grew and flourished as ecommerce carved out a business model to augment traditional brick-and-mortar firms. The business folks moved into omnichannel go-to-market planning, while IT Operations struggled to keep up.

Everything was changing. (Well, almost everything.)

But back in the lab, not much had changed for the QA teams

After all, they had everything they needed. They had their slimmed-down production lab, their “tool out west”, and a certain career stability, because every script they wrote, they had to maintain — forever.

Then cracks in performance began to appear and garner front-page news. More and more companies began to desire an ad splash at major events, such as the Super Bowl. In 2013, this culminated with the famous Coke Chase interactive game, where viewers could vote for their favorite Coke Chase team. The voting website crashed within seconds of the first ad’s appearance, never to recover.


Even today, in 2016, with the just-concluded Super Bowl — Peyton Manning’s rodeo finale — we had a great halftime show from Beyoncé, coupled with a not-so-great performance by her website, which crashed around the time she took the stage. Not good.

So, everything has changed, right? What’s the problem?

How Office Depot has embraced the new digital world

As all of this was going on around them, the QA team at Office Depot recognized the revolution and the challenges associated with it. Online and omnichannel were the future. They had to adapt or fail. It wasn’t about what had worked since the 1990s. It was about what would work going forward.

Our partners at Office Depot recognized that the world was changing — had changed — as online ecommerce evolved continuously. Websites were changing daily. There was no longer the luxury of spending two or three weeks writing C-code scripts for a tool developed in 1989.


Back in 2012, Office Depot brought us in to address specific challenges. Challenges that the “tool out west” was not handling to their satisfaction. Challenges such as:

  • Rich relevance
  • Speed
  • Cloud testing
  • Flexibility
  • Affordability
  • Real-time analytics
  • Testing in production
  • And many more

The QA team pivoted from a back-room QA organization to part of the mainstream

Promotions and website changes were tested with every release. The performance team, as it was now called, was involved in every release cycle — prior to the release. Proactive testing, as an integrated part of the overall organization, became mainstream.

Enter user experience (UX) based testing

The next pivot came very quickly. As Real User Measurement (RUM) was introduced and put to use at Office Depot, the performance team quickly adopted UX data as the main input and driver for all of its test cases. This data also fed into the development process, and back into marketing, to drive the next promotion, ensure its performance was a success, and ensure that the ecommerce web and mobile sites were optimized to reflect the way customers actually used them — increasing revenue from those channels.
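To make the idea of “UX data driving the test cases” a little more concrete, here is a minimal sketch. It assumes a simplified, hypothetical export of RUM beacons (just a page group and a load time per beacon) rather than any actual mPulse schema, and simply weights a load-test mix by how real users distribute across page groups:

```python
# Minimal sketch: derive a load-test mix from RUM beacon data.
# The records and field layout are illustrative assumptions,
# not an actual RUM/mPulse export format.
from collections import Counter

# Each tuple is one RUM beacon: (page group, page load time in ms)
beacons = [
    ("home", 1800), ("search", 2400), ("product", 2100),
    ("product", 2600), ("cart", 1900), ("checkout", 3100),
    ("home", 1500), ("product", 2300), ("search", 2200),
]

# Count how real traffic distributes across page groups
traffic = Counter(page for page, _ in beacons)
total = sum(traffic.values())

# Weight the test scenarios by that real-world distribution
test_mix = {page: round(count / total, 2) for page, count in traffic.items()}
print(test_mix)  # e.g. {'home': 0.22, 'search': 0.22, 'product': 0.33, ...}
```

The point is simply that the test mix follows the users, not the other way around: as the real traffic distribution shifts, the load scenarios shift with it.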

Predictive analytics and “The Big Beacon”: The future is here

Naturally, now that RUM is mainstream, the question arising from the marketing and business teams is, “Now that we have all this real user data, can we use it to help us optimize our online promotions and ad campaigns?”

Yes, you can.

READ > Steering the enterprise with Digital Performance Management (DPM)

mPulse is powered by SOASTA data science, which gives you the ability to track campaign progress in realtime and, based on that real-user data, adjust the campaign accordingly (e.g., Not tracking to revenue goals? Send an email update offering an additional 10% off, or free shipping).

With the ability to marry marketing data taken from Adobe Analytics or IBM CoreMetrics or Google Analytics with SOASTA’s RUM beacon, customers have the ability to measure and monitor — in realtime — the revenue performance of each and every campaign and promotion being run by marketing at that minute. (Usually there are several being run at any one time: Sunday flyer, weekly ads, flash sales, social media campaigns, etc.) As part of this realtime revenue performance measurement, the performance beacon data is teamed with marketing/campaign beacon data to form a complete picture of revenue performance and the impact that web performance has on revenue.
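As a rough illustration of what this “marriage” of data looks like, here is a minimal sketch that joins RUM beacons to campaign attribution by session and rolls up revenue and load time per campaign. The records, field names, and session keys are illustrative assumptions, not a real mPulse or analytics schema:

```python
# Minimal sketch: join RUM beacons with campaign attribution,
# then roll up revenue and load time per campaign.
# All data below is made up for illustration.
from collections import defaultdict
from statistics import median

# RUM beacons: one per page view, keyed by session
rum_beacons = [
    {"session": "s1", "load_ms": 2100, "revenue": 0.00},
    {"session": "s1", "load_ms": 2600, "revenue": 54.99},
    {"session": "s2", "load_ms": 4800, "revenue": 0.00},
    {"session": "s3", "load_ms": 1900, "revenue": 120.00},
]

# Campaign attribution per session (e.g. from a web analytics export)
campaigns = {"s1": "sunday-flyer", "s2": "flash-sale", "s3": "sunday-flyer"}

revenue = defaultdict(float)
load_times = defaultdict(list)
for beacon in rum_beacons:
    campaign = campaigns.get(beacon["session"], "unattributed")
    revenue[campaign] += beacon["revenue"]
    load_times[campaign].append(beacon["load_ms"])

for campaign, total in revenue.items():
    print(f"{campaign}: ${total:.2f} revenue, "
          f"median load {median(load_times[campaign])} ms")
```

Once the two data sets share a key, every campaign gets both a revenue number and a performance number side by side — which is exactly what lets you see whether a slow landing page is dragging down a promotion in realtime.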

FREE DOWNLOAD: Getting started with a digital operations center ebook

In the photo below, you can see a “live” set of marketing campaigns being tracked in Office Depot’s Digital Operations Center (DOC). Expected revenue performance, based upon past promotions and campaigns, forms the “yellow” bands of expected revenue; alerts are set for readings above and below the prediction to measure campaign performance.

Office Depot's Digital Operations Center

READ > Why a Digital Operations Center (DOC) must be a central part of your digital strategy

In addition, you can drill into marketing campaigns simply by analyzing the session path data of users navigating their way through your campaign landing pages and website. Poorly performing pages can be seen and immediately acted upon in realtime: they show red when revenue and/or web performance dips below the expected levels shown in the “yellow” bands above.
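For the curious, the “expected band” idea boils down to something like the sketch below: build a band from comparable past campaigns and flag the current reading when it falls outside it. The historical numbers and the two-standard-deviation band width are illustrative assumptions, not how mPulse’s data science actually computes its bands:

```python
# Minimal sketch of an "expected revenue band" with simple alerting.
# Historical values and band width are illustrative assumptions.
from statistics import mean, stdev

# Hourly revenue from comparable past promotions (made-up numbers)
history = [9800, 10250, 9900, 11100, 10400, 9700, 10800]

center = mean(history)
spread = stdev(history)
lower, upper = center - 2 * spread, center + 2 * spread  # the "yellow" band

def check(current_revenue: float) -> str:
    """Return a status comparable to the DOC's green/red display."""
    if current_revenue < lower:
        return "RED: below expected band — investigate performance/promotion"
    if current_revenue > upper:
        return "ABOVE band: promotion outperforming expectations"
    return "GREEN: within expected band"

print(check(7200))   # well below the band -> RED
print(check(10100))  # within the band -> GREEN
```

In practice the band would be recomputed per campaign and per time-of-day, and the alert would feed the DOC dashboard rather than a print statement, but the shape of the logic is the same.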

Takeaway: Your QA team is part of the mainstream

Armed with the right technology, your QA Team — or should I say, your Performance Engineering and Revenue Analytics team — should be a key part of your GTM strategy for your business.

It’s not just about testing anymore.

If your QA team is still defending the “tool out west”, then they haven’t yet started their digital transformation journey. You need to help them take that first step.

*Here’s one for you: Back in the days where you’d bring your 20lb work laptop home to write C code, and dial in to your corporate VPN from your home phone, what was the code that you entered to disable the “call waiting” feature on your land-line so you wouldn’t get disconnected if a call came in? (Answer: *70)

Learn more about the SOASTA digital performance management platform


About the Author

Dan Boutin

Dan is the Vice President of Digital Strategy for SOASTA. In this role, Dan is responsible for taking the world's first Digital Performance Management (DPM) solution to market, acting as a trusted advisor for SOASTA's strategic customers and changing the way ecommerce organizations approach the marketplace.

Follow @DanBoutinGNV