If you're a Netflix (NASDAQ:NFLX) member, chances are, you're a part of a test and don't even know it.
The Netflix you log into isn't a static experience, but rather an ever-evolving platform, constantly improved to reflect research done by designers like Navin Iyengar. In a session at SXSW this week, he explained that "we think of product development as a series of experiments."
Those experiments might be cosmetic design changes, or revamped functionality in the app and site. Like many digital companies, Netflix makes these adjustments to improve the user experience and, in turn, boost key business metrics like customer retention, streaming hours, and log-ins.
But Iyengar and his team aren't making big-bet, sweeping changes to the platform. Each idea, each test, only makes its way out to a small portion of users through an approach called A/B testing.
Netflix's team monitors how that small group of users react to the adjustment, then compares them to a control group that received the standard experience. If the tested feature delivers better results that are statistically significant, the team then rolls the test out to the broader audience.
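The test-versus-control comparison described here is, at its core, a standard significance check. Below is a minimal sketch using a two-proportion z-test on hypothetical retention numbers; the function name and figures are illustrative assumptions, not Netflix's actual tooling or data.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-test for whether a variant's conversion rate differs from control's."""
    p_a = conv_a / n_a                         # control conversion rate
    p_b = conv_b / n_b                         # variant conversion rate
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                    # |z| > 1.96 roughly means p < 0.05

# Hypothetical example: 12,000 control users with 1,080 retained,
# versus 12,000 test users with 1,260 retained after a design change.
z = two_proportion_z(1080, 12000, 1260, 12000)
significant = abs(z) > 1.96
```

Only when a result clears a bar like this would a team roll the change out broadly; a lift that falls inside normal random variation is treated as no result at all.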
One of the most recognizable products of this process is Netflix's famed post-play feature, which queues up the next episode as the credits roll on the one viewers are watching. It seems obvious that auto-rolling the next episode would be a boon to bingeing, but Iyengar explained during the session "what we found over years of testing... is that our intuition is generally wrong." So even the seemingly no-brainer ideas go through data-based testing, so the team can be sure they won't irritate users.
Beyond boosting business metrics, the Netflix design team also uses the approach to help address user problems. Iyengar pointed to the service's decision to roll out its offline viewing feature last year — the change was aimed at addressing connectivity issues experienced by users in emerging markets, but after testing it was clear that functionality was a hit with users and should be rolled out broadly.
In his time with the company, Iyengar estimated the team has run tens of thousands of tests. Beyond the realization that they don't always know exactly what users want from the get-go, he and his team have learned a couple of other key lessons along the way.
Test before you invest -- Many of the decisions the design team makes will be rolled out across tons of site and app pages for Netflix's more than 90 million users. Working at that scale can be daunting, so the team applies the same small-sample approach it uses in A/B testing to large redesigns. When they were working on a massive site overhaul a few years ago, they used the main home page as their testing ground, then applied those changes to less-trafficked sub-pages.
Design to the extremes -- Iyengar explained that more extreme variations in testing allow the team to get to the core truth of what users are looking for, and force the team to be more adventurous. Often this means pushing more radical experiences in early testing, then incorporating elements of those renderings into more moderate final products.
Observe what people do, not what they say -- The company once asked potential customers "what one thing would you like to know more about before signing up for Netflix?" Almost half of respondents said they wanted to see the whole catalog of movies and television shows before signing up, so the company tested adding titles to the new-user page. But that experience led to fewer sign-ups. They tried building an experience based on feedback, but it turned out the best way to give users a feel for the catalog was to offer a free trial and have them actually use the service.
At the end of the day, even "failed" tests like the catalog experiment are valuable to Iyengar and the design team: they prevent the company from wasting time and resources on things that don't resonate with its customer base. They also offer up valuable user insights, and might serve as the springboard to the next big idea.