Many of my clients want an "off the shelf" design and analysis for their choice-based conjoint (CBC) experiments. These standard designs and analyses work well and are pretty much the norm in the marketing research industry. But they can lack some of the bells and whistles commonly found in choice experiments reported in health economics, environmental economics and transportation economics. These bells and whistles can add some cost, but they may also improve the quality of our conjoint analysis results.
Three Advanced Techniques for Better Conjoint Analysis
The three conjoint design enhancements I usually recommend are:
- Simplify the respondent task with overlap designs
- Prevent profile dominance by eliminating "no-brainer" choices
- Reduce noise from attribute non-attendance
These changes help remedy problems that have been found to affect utility estimates and estimates of willingness-to-pay in CBC experiments, making them worth considering for your next conjoint analysis project. The first two change the way we make experimental conjoint designs and the third adds some steps to the analysis path that produces utilities.
Simplify the Respondent Task with Overlap Designs in Conjoint Analysis
Choice-based conjoint questions can be complex enough that respondents find themselves overloaded with information. For instance, here's a pretty standard sort of conjoint analysis question:
Figure 1: Standard Conjoint Design
One way of simplifying your conjoint analysis is to use an overlap design. In an overlap design about half of the attributes do not differ across the choice alternatives (they "overlap"). We can emphasize this by shading them:
Figure 2: Overlap Design
The respondent can now focus on the five attributes that differ across alternatives rather than on the four that do not. Several benefits come from simplifying your conjoint analysis in this way:
Benefits of Overlap Designs in Conjoint Analysis
In the course of their 10 or 12 CBC questions, respondents end up paying attention to more of the attributes in your conjoint analysis. When faced with all nine attributes, some respondents simplify on their own by looking at only one or two or a handful of the attributes. Jonker et al. (2018) and Chrzan and Yardley (2024) both found this increase in attribute attendance when using an overlap design in conjoint analysis.
Let's say respondent Jones is a huge fan of Battery Life, so that Battery Life is more important to him than all the other attributes in the conjoint analysis. Jones might rip through a dozen questions like those in Figure 1 by looking only at Battery Life, always choosing the alternative with the longest life. In the overlap design, however, some questions, like the one in Figure 2, will show ties on Battery Life, so that even a superfan like Jones will have to tell us more about what matters to him besides Battery Life. In other words, an overlap design can give us more information in the face of a dominating attribute in conjoint analysis (Chrzan and Yardley 2024).
Jonker et al. (2018) also report that, compared to standard conjoint designs, this easier task reduces the dropout rate and reduces the amount of response error in the conjoint utilities; Chrzan and Yardley (2024) found the latter result as well.
We can create overlap designs in Sawtooth's Lighthouse Studio program (Chrzan and Yardley 2024) or in the Ngene experimental design software (ChoiceMetrics 2025).
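To make the idea concrete, here is a minimal Python sketch of forced level overlap, using a made-up list of nine attributes. It is not Lighthouse Studio or Ngene code, and real design software would also enforce level balance and optimize statistical efficiency rather than drawing levels at random; this only shows the overlap mechanism itself.

```python
# Illustrative sketch of forced level overlap; attribute names and level
# counts are hypothetical, and levels are drawn at random for simplicity.
import random

attributes = {
    "Brand": 4, "Price": 5, "Battery Life": 4, "Weight": 3, "Screen": 3,
    "Storage": 4, "Camera": 3, "Warranty": 2, "Color": 3,
}

def overlap_choice_set(attributes, n_alternatives=3, overlap_share=0.5, rng=random):
    """Build one choice set in which roughly overlap_share of attributes are tied."""
    names = list(attributes)
    n_overlap = round(len(names) * overlap_share)
    overlapped = set(rng.sample(names, n_overlap))

    alternatives = [dict() for _ in range(n_alternatives)]
    for attr, n_levels in attributes.items():
        if attr in overlapped:
            level = rng.randrange(n_levels)      # same level in every alternative
            for alt in alternatives:
                alt[attr] = level
        else:
            for alt in alternatives:             # levels free to differ
                alt[attr] = rng.randrange(n_levels)
    return alternatives, overlapped

choice_set, tied = overlap_choice_set(attributes)
print("Overlapping (tied) attributes:", sorted(tied))
for i, alt in enumerate(choice_set, 1):
    print(f"Alternative {i}: {alt}")
```

The tied attributes are the ones you would shade in the questionnaire, as in Figure 2, so respondents can see at a glance which columns they can safely ignore.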
Prevent Profile Dominance by Eliminating "No-Brainer" Choices in Conjoint Design
If one product profile in a choice set is objectively superior to the others, we say it dominates the choice set. In Figure 3, Alternative B dominates the other two alternatives, because it at least ties them on every attribute and is superior on at least one:
Figure 3: Dominant Alternative B
Dominating alternatives can cause a variety of problems in conjoint analysis (Bliemer et al. 2017), including:
- They create uninformative choices that reduce statistical efficiency in your conjoint analysis, a problem that gets magnified in situations with small sample sizes
- When respondents choose dominated alternatives in conjoint analysis, it distorts utilities, which can bias estimates of willingness-to-pay
- Choice sets containing a dominant alternative are easier to answer than sets without one, creating within-respondent scale heterogeneity that affects the magnitude of the utilities in your conjoint analysis
With a little extra effort, we can prevent dominating alternatives from appearing in our CBC experiments. We can do this manually in Sawtooth's Lighthouse Studio program for conjoint analysis: we create and export a conjoint design, use Excel to identify and remove dominating alternatives and then redo the question and version numbers before importing the conjoint design back into the software (Orme and Chrzan 2021). We can also create a dominance-free conjoint design in Ngene and then import it into our Lighthouse Studio survey software.
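For readers who would rather script the screening step than do it in Excel, here is a hedged Python sketch of the dominance check itself. It assumes every attribute has already been recoded so that a higher level value is objectively better (price reverse-coded, for example); that coding step is your assumption to make, not a feature of any particular software.

```python
# Minimal dominance screen, assuming all attributes are coded so that a
# higher level index is objectively better. Not Lighthouse Studio or Ngene
# functionality, just the logic you would apply to an exported design.

def dominates(a, b):
    """True if profile a ties or beats b on every attribute and beats it on at least one."""
    at_least_as_good = all(a[attr] >= b[attr] for attr in a)
    strictly_better = any(a[attr] > b[attr] for attr in a)
    return at_least_as_good and strictly_better

def has_dominant_alternative(choice_set):
    """Flag a choice set if any alternative dominates all of the others."""
    return any(
        all(dominates(alt, other) for j, other in enumerate(choice_set) if j != i)
        for i, alt in enumerate(choice_set)
    )

# Example: B ties or beats A and C on everything, so this set gets flagged.
A = {"Battery Life": 1, "Price": 2, "Storage": 1}
B = {"Battery Life": 2, "Price": 2, "Storage": 2}
C = {"Battery Life": 1, "Price": 1, "Storage": 2}
print(has_dominant_alternative([A, B, C]))   # True -> drop or repair this set
```

Flagged choice sets can then be removed or swapped out before the question and version numbers are renumbered and the design is imported back into the survey software.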
Reduce Noise from Attribute Non-Attendance in Conjoint Analysis
By attribute non-attendance (ANA) we mean that some respondents don't consider (or attend to) some attributes when they make their choices. In a LinkedIn post I reported a 10-attribute CBC in which some attributes had as many as 57% of respondents not attending to them, and the average respondent attended to only 6 of the 10 attributes:
Figure 4: ANA in action
With standard conjoint analysis modeling, these unattended attributes usually have small non-zero utilities. Those utilities should be 0.0, however, so ANA can bias utility estimates and thus estimates of willingness-to-pay in conjoint analysis (Hensher et al., 2005).
For most of my clients who measure ANA in their conjoint analysis, an even bigger benefit is that a report like the one in Figure 4 is, all by itself, of interest to their clients.
Implementing Attribute Non-Attendance Analysis
To remedy this, we can model attribute attendance and replace unattended attributes' utilities with zeros. Different software packages handle this differently, but in Sawtooth's programs we'd first run a model to identify unattended attributes for each respondent in the conjoint analysis, using the optional draws file output by our software:
Hess and Hensher (2010) recommend computing the respondent-level coefficient of variation (CV) for each parameter and treating those with CV greater than 2.0 as unattended. Then we remove those non-attended attributes from the respondent's experimental conjoint design (by coding them as zeros) and re-estimate the model. Finally, we zero out the utilities of non-attended attributes from this second estimation run.
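Here is a minimal Python sketch of that CV screen and the final zeroing step. The file names and the layout of the draws file (one row per draw, a respondent_id column, one column per part-worth) are illustrative assumptions rather than any package's actual export format, the sketch treats each column as a single attribute coefficient for simplicity, and the re-estimation with non-attended attributes coded to zero still happens in your estimation software.

```python
# A hedged sketch of the Hess and Hensher (2010) CV screen. File names and the
# draws-file layout are illustrative assumptions, not any package's actual export.
import pandas as pd

draws = pd.read_csv("utility_draws.csv")              # hypothetical draws file
param_cols = [c for c in draws.columns if c != "respondent_id"]

def flag_non_attendance(draws, cv_cutoff=2.0):
    """Return a respondent x parameter table, True where the CV exceeds the cutoff."""
    grouped = draws.groupby("respondent_id")[param_cols]
    cv = grouped.std() / grouped.mean().abs()          # coefficient of variation
    return cv > cv_cutoff

ana_flags = flag_non_attendance(draws)
print(ana_flags.mean())    # share of respondents not attending to each parameter

# After re-estimating with non-attended attributes coded out of each
# respondent's design, zero those utilities in the final part-worth file.
utilities = pd.read_csv("final_utilities.csv", index_col="respondent_id")
mask = (ana_flags.reindex(index=utilities.index, columns=utilities.columns)
                 .fillna(False).astype(bool))
utilities = utilities.mask(mask, 0.0)
```

The printout of shares per attribute is also the raw material for an ANA report like the one in Figure 4.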
These added steps may be worth the extra work in your conjoint analysis because they should prevent some of the reversed utilities that sometimes occur and because they can improve willingness-to-pay estimates.
References
Bliemer, M.C.J., J. Rose and C. Chorus (2017) “Detecting Dominance in Stated Choice Data and Accounting for Dominance-Based Scale Differences in Logit Models,” Transportation Research Part B: Methodological, 102: 83-104.
ChoiceMetrics. (2025). Ngene 1.3 User Manual and Software [Computer software]. ChoiceMetrics Pty Ltd. https://www.choice-metrics.com
Chrzan, K. and D. Yardley (2024) “Complete Level Overlap With Color Coding: Validation, Extension and a New Superpower,” Proceedings of the Analytics and Insights Summit, 201-217. https://sawtoothsoftware.com/resources/technical-papers/conferences/analytics-and-insights-summit-2024.
Hensher, D.A., J. Rose and W.H. Greene (2005) “The implications on willingness to pay of respondents ignoring specific attributes,” Transportation, 32(3): 203-222. https://doi.org/10.1007/s11116-004-7613-z
Hess, S. and D.A. Hensher (2010) “Using conditioning on observed choices to retrieve individual-specific attribute processing strategies,” Transportation Research Part B, 44: 781-790.
Jonker, M.F., B. Donkers, E. de Bekker-Grob and E.A. Stolk (2018) “Attribute level overlap (and color coding) can reduce task complexity, improve choice consistency, and decrease dropout rate in discrete choice experiments,” Health Economics, 28: 350-363. https://doi.org/10.1002/hec.3846
Orme, B.K. and K. Chrzan (2021) Becoming an Expert in Conjoint Analysis: Choice Modeling for Pros (2nd Ed). Provo: Sawtooth Software.