June 27, 2023

Spotlight Series: Joe Arnold

A veteran creative discusses the power of testing your assumptions and the value of letting data and consumer feedback guide the redesign process from the start.


Like many of us, Joe Arnold, creative director for Purity Products, finds contentment in the simple things. Hanging out with his two young sons, for example. “Currently, we’re spending a lot of time with swimming, karate, football… They keep me pretty busy,” he says with a laugh. 

Or heading out to his family’s cabin in the woods of Long Island. “There have been times that I'll be there and go: ‘You know what? I don't need anything else right now. I just wanna go sit in nature for a few hours, and just see what I see.’”

And like many designers, Joe is drawn to minimalist package designs. “Personally, if I'm designing something for myself, I'd like something clean and simple,” he notes. “I like that aesthetic.” When it appeared that many of Purity’s competitors were redesigning their packages to follow this streamlined approach, Joe saw an opportunity he was excited about: simplifying Purity Products’ package design. 

It turned out it was an opportunity. Just not the one Joe expected. 

Before committing to a new direction, he wanted to test the simplified design concepts with consumers. He reached out to Designalytics to use our quick, cost-effective design screening and in-depth design testing tools to measure how the new concepts would perform compared to the current design.

The results? The in-market design bested all nine of the new concepts in consumer purchase preference. In other words, every one of the new designs was less effective than what was already on the shelf. 

We talked to Joe about what he decided to do next—something that completely turned the project around and set Purity Products up for success. 

How did it feel when you saw that all of the new, streamlined design concepts you created were outperformed by your current design? 

Honestly, it was a bit of a punch in the gut [laughs]. It hurt a little bit for everyone on the design team, because the current design, which we thought needed to change, destroyed the ones we were more excited about. But it was a really good thing, because it made us look at our current packaging again and rethink our strategy a bit. 

Before you created another round of concepts to test, you ran a baseline assessment of your current design. Why? 

Well, the Versus tests made it clear that we didn’t fully understand the design’s value. The baseline allowed us to get a clear sense of consumer reaction to the current design, and it turned out that there were a lot of elements that were working well. And there were others we assumed were important that just weren’t. Each member of my team combed through all the data and consumer responses, and then we each created new concepts independently. 

What’d you come up with the second time? 

We all have different approaches to design, but because we read the same data—I was looking at the same information as my associate art director Josh was, for example—the second round of design concepts were a lot more similar than they were different. Which makes sense, because the direct consumer responses we got were pretty clear. It really helped us zero in on a new aesthetic that was appropriate for our customers.

It sounds like the open-ended responses from consumers were helpful in refining those original concepts.

Absolutely. The data revealed what we needed to address, and the consumer responses helped show us how to do it. We just had to listen to what consumers were telling us. It was so helpful that we even shared the information with our digital marketing and sales teams.

Were the results useful for those teams?

Yes, especially for the sales team. If they’re speaking with CVS or Walgreens, Designalytics’ data can help them show how our design performs in comparison to our competitors. They can say, for example: “Company A is one of our key competitors, and Purity is more trustworthy, healthier, and more natural according to consumers.” Having that information helps us say: “We're not making this up. This is what consumers are saying.” 

Can you give an example of consumer feedback you received that made you think differently?

On our existing package, we had an image of the actual vitamin, the pill itself. Everyone here at Purity believed it was very important, but the data showed us that consumers didn’t care about it. So as designers, we said: why are we gonna clutter the package with this if we don't have to? It's not moving the needle one way or the other, so let's get rid of it. And sure enough, when we did, the subsequent Versus results showed no one missed it. Plus, now we had data to show why we were making that change.  

Were there certain elements of the Versus results that you found particularly helpful? 

The choice drivers were so helpful for us when we were designing the work. Just knowing what was most important to consumers helped us focus on how we could best communicate it. 

Did doing early research boost your confidence as you moved forward in the design process? 

It did, and it wasn’t just in the design itself. We felt more confident convincing stakeholders that it was the right approach. When we went in front of our CEO, for example, or visited retail folks, it was easier to explain what we did and why we did it. We had the data to back up what we were saying.