How Digital Design Drives User Behavior
Decisions of all kinds are increasingly made on screens — and with that shift comes an often-ignored consequence: the design of the digital world can profoundly, and often unnoticeably, influence the quality of our decisions.
A review of recent research provides clear evidence that many organizations are currently undervaluing the power of digital design and should invest more in behaviorally informed designs to help people make better choices. In many cases, even minor fixes can have a major impact, offering a return on investment several times larger than that of conventional tools such as financial incentives or marketing and education campaigns.
In our recent working paper, written with Lynn Conell-Price at the University of Pennsylvania and Richard Mason at City, University of London, we collaborated with Voya Financial, a leading retirement service provider, to investigate how variation in the digital design of online enrollment interfaces could influence the initial contribution decisions of employees in 401(k) plans. The research involved more than 8,500 employees across a few hundred plans who, prior to being automatically enrolled, visited a standardized online enrollment interface where they could actively confirm their enrollment at the default rate, personalize their enrollment at a different rate, or decline enrollment altogether by selecting one of three horizontally arranged options. Our goal was to get employees to consider a deferral rate higher than the default, which is often too low to achieve financial security in retirement.
To understand the impact of digital design on these initial enrollment decisions, we randomized employees to one of two versions of the enrollment interface — the original commercial design used by Voya or an “enhanced” design that incorporated three small changes:
- We changed the color scheme of the options, shifting from a single color (all orange) to a traffic light arrangement of green (personalize), yellow (confirm), and red (decline).
- We displayed the plan’s default rate directly on the enrollment screen, thus making it easier for workers to take this information into account.
- We simplified and standardized the language used to describe the alternatives by removing uninformative fine print and simplifying headlines describing each option (e.g., “I want to enroll with different choices” became “Do it Myself,” while “I do not want to enroll” became “I Don’t Want to Save”).
What did we find? For starters, these simple design changes increased the rate of personalized enrollment by 15% in relative terms, a nine-percentage-point increase from a baseline of 60%. Because employees who personalize enrollment tend to contribute at a rate (7.8%) more than twice as high as those who automatically enroll (3.4%), the shift toward personalized enrollment significantly increased savings for those employees and reduced the share of workers headed for “retirement poverty.” Ultimately, we estimate that the design changes led to an increase in overall contributions equivalent to increasing the typical plan match by over 60%. Of course, changing colors or a few words on a screen is a lot cheaper than sharply increasing matching incentives.
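For readers who want to see the arithmetic, the short sketch below works through the figures reported above: the 60% baseline, the nine-percentage-point lift, and the 7.8% versus 3.4% deferral rates. It is a back-of-the-envelope illustration, not a calculation from the paper; in particular, treating every non-personalizer as staying at the plan default is a simplifying assumption.

```python
# Back-of-the-envelope arithmetic for the enrollment result described above.
# The 60% baseline, nine-point lift, and 7.8%/3.4% deferral rates come from the text;
# assuming every non-personalizer stays at the plan default is a simplification.

baseline_share = 0.60  # share personalizing under the original design
enhanced_share = 0.69  # baseline plus a nine-percentage-point lift

relative_lift = (enhanced_share - baseline_share) / baseline_share
print(f"Relative lift in personalized enrollment: {relative_lift:.0%}")  # ~15%

personalized_rate = 0.078  # average deferral rate among personalizers
default_rate = 0.034       # average deferral rate among auto-enrollees

def blended_deferral(share_personalized: float) -> float:
    """Average deferral rate if everyone who doesn't personalize stays at the default."""
    return share_personalized * personalized_rate + (1 - share_personalized) * default_rate

print(f"Average deferral rate: {blended_deferral(baseline_share):.2%} -> {blended_deferral(enhanced_share):.2%}")
```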
This field study extends previous research showing that major financial decisions can be shaped by seemingly modest design elements. In one experiment, conducted on Morningstar.com by Professor Richard Thaler and one of us (Shlomo), we asked two groups of Morningstar subscribers to allocate their retirement savings among eight different funds. The first group was presented with a website that had four blank lines on it, along with a highlighted link they could use to select additional funds. The second group was shown the exact same site, except their version had eight available lines.
This might seem like a trivial change. But the research found that the precise number of lines on a website dramatically affected the level of diversification. While only 10% of people shown four lines selected more than four funds, that number quadrupled, to roughly 40%, among subjects given eight lines. In other words, the level of diversification was driven, in large part, by a seemingly irrelevant detail of design.
Another example of digital design shaping behavior comes from new research by NEST, a large workplace pension provider in the United Kingdom. Given the abundance of information in the digital age, an important aspect of improving design is user engagement: getting people to pay attention to your message. To boost engagement, NEST assigned workers who had yet to activate their pension accounts to receive either a personalized email or a personalized video. In both cases, the content was tailored to include the person’s name, account size, and account age.
The results were a clear demonstration that the form of the message can shape outcomes. In the control group, which didn’t receive any digital contact, 1.3% of people activated their account. In the email condition, that number increased to 4.3%. The personalized videos, however, nearly doubled that rate again, with 8.2% signing into their retirement accounts. The takeaway is that designs that increase engagement — and cut through the clutter of information — are much more likely to lead to behavior change.
Digital design also affects other important decisions, such as choosing health insurance. Consider that in the last several years, there has been an explosion in the choice of health plans available to U.S. consumers. Whether through public exchanges associated with Medicare Part D or the Affordable Care Act (ACA) or through employer-sponsored exchanges, people are increasingly choosing health plans from a (sometimes daunting) menu of online options.
In a study with George Loewenstein at Carnegie Mellon University, we sought to understand whether the digital design of plan menus affected the financial efficiency of people’s decisions. Specifically, we ran a series of choice experiments in which we asked online subjects to select a health plan from a menu of three alternatives whose prices and cost-sharing features were informed by actual plans available through the ACA.
A notable feature of the ACA was that, in recognition of the complexity of health insurance, it sought to organize health plans into distinct “metal” tiers defined by the generosity of their actuarial coverage, ranging from bronze plans (higher deductible, lower premium) to platinum plans (lower deductible, higher premium). In our study, we varied whether the plans were categorized by the metal tiers presently used by the ACA (e.g., bronze, silver, platinum) or by alternative labeling regimes involving either generic descriptors (plan A, plan B, plan C) or descriptors that emphasized the degree of expected medical use most appropriate for each plan (high use, medium use, low use).
The first thing we found, confirming several other recent studies, is that many people make poor, and costly, decisions under all of the labeling strategies: they select plans that aren’t commensurate with their expected health needs or their tolerance for financial risk. Insurance is complicated.
Unfortunately, our research suggests that the current design of the ACA, however well intentioned, may have led to the most inefficient plan choices. On average, our experimental subjects overspent by an amount equivalent to 24% of the typical premium. We speculate that when faced with the tiered labels (bronze, silver, etc.), consumers selected plans based not on expected utilization but on inferences about the quality of care they would receive. Related research by Peter Ubel, David Comerford, and Eric Johnson shows that people prefer the fancier metallic labels (such as gold) regardless of what insurance is actually offered.
On a more promising note, we found that the labels emphasizing a consumer’s expected medical use led to significantly better choices than the metal labels. While many consumers still chose sub-optimal plans, the medical use labels reduced average excess spending by 37% relative to the metal labels. The lesson here is that the labeling strategy one adopts for (digital) menus should be informed by the trade-offs that ought to govern how people make their decision. In the case of health insurance plans, labels should serve as signifiers of expected utilization rather than quality.
Enhanced design can also improve our medical choices, even when they have nothing to do with finance or insurance. As noted by Gretchen Chapman, a psychologist at Carnegie Mellon, many of our health choices aren’t rooted in deeply held beliefs, but are instead shaped by design cues and context. For instance, simple nudges like forcing doctors to make an active choice about ordering a flu vaccine when accessing a patient’s digital health record can dramatically increase vaccination rates.
The growing evidence on the potency of design raises the question of whether the managers, executives, and policymakers tasked with overseeing decision environments understand the importance of good design. To investigate this question, let’s return to the study of design and employee savings. As part of the research, we surveyed several hundred 401(k) plan administrators and HR executives, asking them to predict how employees would respond to the design improvements we had tested in the field. We also asked them to identify which specific design changes would drive any employee response. The results were striking: 88% of those surveyed underestimated the influence of design on enrollment, and only 12% of respondents correctly identified which of the three design changes most strongly drove the change in employee behavior. Furthermore, the administrators and executives who were most confident about their forecasts were no more accurate than the rest.
Too often, people see design as a visual garnish, a digital element that prettifies but adds little value. But this research shows that digital design is so much more than that — it is an integral part of any product or service offering. And it’s possible to navigate a path to behaviorally informed designs using the following five steps:
- Gather behavioral insights on screen behavior, benefiting from the growing academic literature.
- Conduct a behavioral audit of the existing digital designs, identifying gaps between the status quo and new designs that incorporate the latest insights on screen behavior.
- Test these new designs against carefully selected control conditions (a minimal analysis sketch follows this list).
- Once a winning design is identified, scale it up, implementing it across as many digital journeys as possible.
- Keep a searchable “results library” of all experiments.
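To make the third step concrete, here is a minimal sketch of how a test-versus-control comparison might be analyzed: a two-proportion z-test on a conversion rate (for example, the share of users who personalize enrollment) across the two arms. The function name and the counts are purely illustrative assumptions, not figures from any of the studies described above.

```python
# Minimal sketch: compare a conversion rate between a control design and a new design.
# Counts are hypothetical; in practice they would come from the experiment's logs.
from math import sqrt, erfc

def two_proportion_ztest(successes_a: int, n_a: int, successes_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for the difference in two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value under the normal approximation
    return z, p_value

# Hypothetical arms: 4,250 users each; control converts at ~60%, new design at ~69%.
z, p = two_proportion_ztest(successes_a=2550, n_a=4250, successes_b=2930, n_b=4250)
print(f"lift: {2930/4250 - 2550/4250:.1%}, z = {z:.2f}, p = {p:.3g}")
```

In a real deployment, the analysis would typically also account for clustering (for example, by plan or business unit) and for multiple outcomes, but the core comparison is this simple.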
Improving digital design using an evidence-based approach should be a strategic imperative for any organization aspiring to engage customers and help them make better decisions. Design isn’t the garnish — it is often the key ingredient.