
Three Things I Learned From 100 Package Redesigns

May 4, 2020

By Steve Lamoureux, founder of Designalytics.

Designalytics recently crossed a notable milestone: we have now analyzed one hundred package redesigns that have been launched to market. Looking across these “in the wild” efforts, we discovered a number of surprising things worth sharing with the CPG and design industries.

For those of you unfamiliar with what we do, Designalytics is effectively the “scanner data of design performance”—that is to say, we provide an objective, syndicated, data-based feedback loop for CPG brand design performance. We do this so that manufacturers can understand how effective their designs are and what might be holding them back. As part of our “always-on” system, we automatically test new redesigns that have been launched to market across CPG categories, which gives us a representative sample of redesigns.

Here are the three most surprising discoveries from our first 100 measured redesigns:

#1 - Most redesigns fail

I know. That sucks, right? As a one-time brand manager, I know how much effort and expert opinion goes into a redesign—all the qualitative research, creative briefs, design route exploration, and quantitative validation. All the thoughtful input from brand, design, manufacturing, insights, sales, retailers, executives, roommates, and spouses. Yet 63% of the redesigns we’ve measured show no improvement in key performance metrics; worse still, 50% underperform the original design.

You might be thinking that consumers simply favor the familiar, and that any new design would therefore test poorly against the one they already know. However, consider that nearly 40% of the measured redesigns produced significant design performance improvement across several measures. Further, of the dozen redesigns for which we have analyzed post-redesign sales data, our results consistently align with in-market performance.

#2 - The secret: communicate important things better

There are a number of design performance metrics to consider with a redesign—standout, findability, communication, mental availability, design resonance, and so on. However, effective communication of key decision drivers in a particular category seems to have a uniquely strong relationship with purchase preference. (Designalytics identifies decision drivers independently through multi-stage choice-driver research.) In fact, in 95% of cases, the design that communicates key drivers most effectively also performs better on purchase preference. In the remaining 5% of cases, redesigns communicate some attributes better than the previous design, and some attributes worse, in roughly equal measure.

#3 - Liking is not the same as buying

Other design performance metrics are less strongly correlated with purchase preference and sales gains than communication is. For example, a clear majority of redesigns produce more positive sentiment responses and higher overall resonance scores (i.e., consumer “liking”). In fact, among the redesigns that failed in Designalytics’ testing, resonance still improved for 35% of them. Conversely, 61% of the redesigns that consumers preferred over the original saw a decline in resonance.

That isn’t to say resonance and sentiment aren’t important; they are. But the things we subjectively appreciate about a design aren’t a substitute for effective communication and purchase preference. Given the amount of subjectivity ingrained in the design process—and the absence of reliable design performance data—you can begin to see how the industry produces such a high redesign failure rate. To combat this, we need to keep opinions in check with reliable, objective consumer metrics.

How do these discoveries line up with your experiences? I’m sure you’ve witnessed design misfires in your career—what do you think contributed to those outcomes? Without addressing the system that produces these variable outcomes, I fear that manufacturers will shy away from making the investments needed to improve design performance.

For more data-driven insights and best practices around design performance, check out our report: 16 Data-Driven Ways to Win With Design.