8:00 am
Coffee & pastries

9:00 am
Welcome remarks

Bryan Orme
Sawtooth Software

9:20 am
How Popular Is Jurassic World? Adventures in IP Research

If you knew box office performance, Rotten Tomatoes rating, and online sentiment, could you predict the appeal of an intellectual property (IP) to the average theme park guest? Turns out, you can! Learn how UDX uses publicly available data to supplement survey research, how we deal with cross-country differences in scale use, and how we communicate the appeal of 120+ IPs to a busy executive audience.

Laura Dulude
Universal Destinations & Experiences

9:55 am
Session TBA

Details coming soon

10:30 am
Break

11:00 am
Blending Historical MaxDiff Claim Studies and Using AI to Predict Claim Success

Procter & Gamble (P&G) conducted 73 studies in oral health care, producing 1,554 unique claims. To compare these claims, a Sparse MaxDiff study was designed, using a subset of claims to create a database. AI models were then trained to predict claim success, with the most successful model using OpenAI embeddings and a neural network. The study demonstrated AI’s potential for predicting claim outcomes but highlighted the need for a robust claims database for accurate predictions.

Jeremy Christman
P&G
Liz Clevenger
P&G
Pankaj Patil
P&G
Kevin Lattery
SKIM
Nino Hardt
SKIM
Howard Huang
SKIM

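As a simplified illustration of the idea (the study itself used OpenAI embeddings and a neural network, neither shown here; the toy vectors and the similarity-weighted predictor below are stand-ins), a new claim's score can be estimated from a database of already-tested claims by weighting each known score by embedding similarity:

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def predict_claim_score(new_embedding, claim_bank, sharpness=5.0):
    """claim_bank: list of (embedding, maxdiff_score) pairs from prior studies.
    Returns a similarity-weighted average of known scores; `sharpness`
    controls how strongly the nearest claims dominate the prediction."""
    weighted = [(math.exp(sharpness * cosine(new_embedding, emb)), score)
                for emb, score in claim_bank]
    total = sum(w for w, _ in weighted)
    return sum(w * s for w, s in weighted) / total
```

A claim embedded close to high-scoring claims inherits a high prediction; with a sparse or unrepresentative claim bank the weights degenerate, which mirrors the abstract's point about needing a robust claims database.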
11:30 am
AI at Microsoft: Enhancing Conjoint and Scaling In-Depth Interviews

The Microsoft Research + Insights team, working with PSB, has been exploring the use of AI and unstructured data to improve data quality in conjoint studies. Initial experiments used AI to create or augment samples, and conversational AI to probe respondent choices and use the answers as covariates. Results suggest AI enhances respondent engagement and model strength. Building on this, we report on an experiment comparing voice and text responses, and on other work using AI to scale small-sample research for more robustness.

Dan Penney
Microsoft
Ananya Ramje
Microsoft
Suhasini Sanyal
Microsoft
Rob Kaiser
PSB Insights

12:00 pm
Like to Get to Know You Well: Strategies for Streamlining Variables to Create Actionable and Engaging Segmentation Models

Creating actionable and engaging segmentation models is part science and part art, ideally producing a mathematically sound and easily explainable model. To avoid the overwhelm (and myriad technical problems) of too many variables, we drew on various sources to tackle two key questions that helped us streamline our approach and create a solid, actionable – and, dare I say, fun – segmentation model faster than you can say “factor analysis.” All set to music, of course.

Tracey Di Lascio-Martinuk
Bose

12:30 pm
Lunch

2:00 pm
Conjoint on Mobile Devices: Is There a Better Way?

Comparing a traditional CBC methodology to two new approaches, this paper will explore new ways to conduct conjoint exercises on mobile devices that are faster and more enjoyable than the traditional approach. Attendees will learn how well the results from the two new approaches compare to those generated by traditional choice exercises, as well as how to optimize the accuracy of the new approaches.

Paul Richard McCullough
MACRO Consulting
Dan Yardley
Sawtooth Software

2:45 pm
Token-Based Conjoint: A New Framework for Too Many Attributes

Designing experiments with a large number of product attributes can be challenging. We propose a novel approach, Token-Based Conjoint (TBC), which reframes stated choice experiments to manage complexity more effectively. TBC is especially useful for products like subscription services with many binary features. By dynamically adjusting feature selection and incorporating a dual-response likelihood question, TBC delivers deeper insights into which feature combinations drive adoption, providing marketers with actionable results.

Megan Peitz
Numerious
Trevor Olsen
Numerious

3:30 pm
Break

4:00 pm
Synthetic AI Avatars in Market Research: A Game Changer or a Mere Gimmick?

Synthetic AI Avatars are entering the survey space, bringing a new approach to boost engagement and enhance data quality. This study examines their potential across various use cases, from complete survey interactions to personalized intros and optional chatbot assistance, while addressing challenges like bias and monotony in both online and offline settings. Join us to find out if AI Avatars are the next big thing in market research or just another passing trend!

Saurabh Aggarwal
Knowledge Excel
Tarun Khanna
Knowledge Excel
Rashmi Sharma
Knowledge Excel

4:30 pm
“Transforming” Open-Ended Survey Response Analysis with AI Transformer Models

Manual coding of open-ended survey responses is a labor-intensive, error-prone task. Discover how AI transformer models like text embeddings and large language models (LLMs) can automate and enhance the analysis of open-ends. This presentation compares these advanced methods with manual coding and traditional techniques, highlighting their strengths in understanding context and semantics. Attendees will gain practical expertise for integrating these technologies into their analysis of open-ends, and a clear grasp of their benefits and limitations.

Jacob Nelson
The Harris Poll
Xander Jefferson
The Harris Poll

5:00 pm
Deepening Consumer Insights: How Generative AI Is Revolutionizing Personalized Engagement in Conversational Surveys & Beyond

This paper explores how generative AI, using large language models and multi-agent systems, is humanizing surveys through conversational interactions. By simulating natural, human-like dialogues, these AI-driven surveys enhance respondent engagement and improve data quality. A real estate app case study shows how conversational AI creates a more intuitive, personalized survey experience, allowing users to ask clarifying questions and provide contextual feedback. Our findings highlight how AI can revolutionize market research by making it more responsive, interactive, and human-centered.

Mohit Shant
Insights Curry
Mohd. Faisal
Insights Curry
Rajat Narang
Insights Curry
Sonia Sawlani
Insights Curry

5:30 pm
Day 1 ends

8:00 am
Coffee & pastries

9:00 am
Reexamining the No-Choice Option in Conjoint Analysis

The validity of using conjoint analysis rests on the inclusion of brand names, prices, and an outside “no-choice” option in the choice task. We find that the lack of knowledge of competitive offerings and prices affects the brand part-worths but not the part-worths of other product features.  We discuss how a well-designed conjoint study mitigates the effects of this type of learning in conjoint analysis.

Greg Allenby
The Ohio State University
Peter Kurz
bms marketing research + strategy
Joel Huber
Duke University
Roger Bailey
The Ohio State University
Cheng Yu Hung
The Ohio State University

9:45 am
Optimizing Product Portfolios with Discrete Choice Models

This paper explores how Discrete Choice Models (DCMs) help market researchers optimize product portfolios by simulating market scenarios to predict consumer preferences. We present a two-stage approach: First, use algorithms like simulated annealing to find near-optimal portfolios; second, refine these solutions by testing all possible remaining SKU combinations. This method balances mathematical optimization with real-world constraints, providing a practical, adaptable solution for both simple and complex market situations, and delivering actionable insights for business strategy optimization.

Maximilian Rausch
bms marketing research + strategy
Peter Kurz
bms marketing research + strategy
Stefan Binner
bms marketing research + strategy

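A minimal sketch of the two-stage idea described above, assuming toy respondent utilities and a first-choice objective (the authors' actual objective, data, and constraints are not shown here): stage one anneals toward a near-optimal k-SKU portfolio, and stage two exhaustively tests single-SKU exchanges to refine it.

```python
import math
import random

def portfolio_value(portfolio, utilities):
    # First-choice rule: each respondent "buys" their single best SKU
    # among those offered, so value is the sum of per-respondent maxima.
    return sum(max(resp[s] for s in portfolio) for resp in utilities)

def anneal_portfolio(utilities, k, steps=2000, t0=1.0, seed=7):
    """Stage 1: simulated annealing over k-SKU subsets.
    utilities: list of per-respondent utility lists, one entry per SKU."""
    rng = random.Random(seed)
    n_skus = len(utilities[0])
    current = rng.sample(range(n_skus), k)
    best = list(current)
    for step in range(steps):
        temp = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        candidate = list(current)
        pos = rng.randrange(k)  # swap one SKU out for one not in the portfolio
        outside = [s for s in range(n_skus) if s not in candidate]
        candidate[pos] = rng.choice(outside)
        delta = portfolio_value(candidate, utilities) - portfolio_value(current, utilities)
        if delta >= 0 or rng.random() < math.exp(delta / temp):
            current = candidate
        if portfolio_value(current, utilities) > portfolio_value(best, utilities):
            best = list(current)
    return sorted(best)

def refine(portfolio, utilities):
    """Stage 2: greedily test every single-SKU exchange until none improves."""
    n_skus = len(utilities[0])
    k = len(portfolio)
    improved = True
    while improved:
        improved = False
        base = portfolio_value(portfolio, utilities)
        for i in range(k):
            for s_out in set(range(n_skus)) - set(portfolio):
                trial = portfolio[:i] + [s_out] + portfolio[i + 1:]
                if portfolio_value(trial, utilities) > base:
                    portfolio, improved = sorted(trial), True
                    break
            if improved:
                break
    return portfolio
```

On a toy market of four SKUs and three respondents, annealing plus refinement recovers the portfolio that maximizes first-choice value; real applications would swap in simulated preference shares and business constraints.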
10:30 am
Break

11:00 am
Toward a Smarter MaxDiff: Rethinking Some of the Conventional Strategies

A typical MaxDiff is run sequentially: the design is fixed before data collection, and modeling comes last. Building on the ideas from Sawtooth’s Adaptive and Bandit MaxDiff and the field of computerized adaptive testing, I look for ways to model each individual respondent during the MaxDiff survey and use those interim results to inform the design in real time for performance gain. Item response theory and machine learning will be explored as model options. Simulations will be carried out for validation.

Ming Shan
Hall & Partners

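As a hypothetical sketch of real-time adaptation (not the author's IRT or machine-learning model), the next item set for a respondent could be chosen from interim estimates: show under-exposed items first, breaking ties toward items whose current utilities sit nearest the middle, where individual-level estimates discriminate least.

```python
def next_maxdiff_set(utilities, shown_counts, set_size=4):
    """Pick the next MaxDiff set for one respondent.
    utilities: dict item -> current utility estimate for this respondent.
    shown_counts: dict item -> times the item has already been shown.
    Least-shown items come first; ties go to items closest to the mean
    utility, i.e., where the interim estimate is least informative."""
    mean_u = sum(utilities.values()) / len(utilities)
    ranked = sorted(utilities,
                    key=lambda item: (shown_counts[item],
                                      abs(utilities[item] - mean_u)))
    return ranked[:set_size]
```

After each choice task, the respondent's utilities and exposure counts would be updated and the function called again, so the design adapts task by task.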
11:45 am
Deconstructing Brand Equity: Assessing the Value of Brand Associations via Conjoint

We measure brand equity (in dollars) via conjoint analysis and choice experiments. Brand associations can be integrated but may suffer from response biases. A new approach leverages an open-ended question to extract brand associations, avoiding halo effects and capturing both positive and negative perceptions. We use AI to derive a brand score that correlates highly with market share. By integrating these associations into the conjoint, we can put a dollar value on a brand association.

Marco Vriens
Kwantum
Felix Eggers
Kwantum

12:30 pm
Lunch

2:00 pm
AI Assisted Segmentation

The traditional approach to segmentation is highly iterative with a great deal of manual evaluation of competing segmentation solutions. An analyst may estimate dozens of competing segmentation solutions and each needs evaluation on statistical and interpretative considerations. We will illustrate how we create scoring rules and leverage AI to provide guidance on a more efficient process for both development and evaluation of segmentation solutions.

Jackie Guthart
Radius
Curtis Frazier
Radius
Marcos Nunez
Radius

2:30 pm
Segmentation 2.0: Redefining Segmentation for Modern Marketing

Recent marketing theories challenge traditional concepts like differentiation and niche targeting, favoring broader market penetration. Is this the end of segmentation? We explore how these theories are transforming segmentation practices and pushing analytical boundaries. New analytics are proposed to assess segment differences and similarities, effectively blending mass marketing with targeted efforts to help marketers achieve both immediate sales activation and sustained brand-building goals.

Jessica Wojtunik
NielsenIQ (formerly GfK)
Alexandra Chirilov
NielsenIQ (formerly GfK)
Catherine Gibson
NielsenIQ (formerly GfK)

3:00 pm
Modernizing Data Visualization Practices for Market Research

This work critically examines modern data visualization practices in market research, focusing on perceptual science, accessibility, and context-driven design. Attendees will explore how traditional methods can obscure insights and learn strategies for modernizing visualizations. Real-world examples in R, Python, and PowerPoint will highlight best practices for clarity and impact. The session offers actionable insights for improving data communication and decision-making in today’s business environment.

Keaton Wilson
KS&R
Ben Cortese
KS&R

3:30 pm
Break

4:00 pm
Turning It up to 11: A Practitioner-Led Comparison of Volumetric Conjoint Analysis Techniques

While common in practice, it has been a long time (Eagle 2010) since a paper outlined a clear, practitioner-focussed methodology for undertaking volumetric analysis. In this paper we undertake a series of analyses to identify an approach that maximises out-of-sample predictive validity whilst maintaining ease of use for analysts undertaking volumetric choice studies.

Dean Tindall
Sawtooth Software
Chris Moore
Ipsos UK

4:45 pm
The Will of the Many: Generating Novel Concepts Using AI-Enhanced Respondent Feedback

New product design typically begins with designers curating a limited set of stimuli based on their interpretation of market needs, consumer preferences, and feasibility constraints. This a priori approach restricts the exploration of potential designs to what the designers anticipate as viable and appealing, effectively narrowing the creative space. Inspired by crowdsourcing, we propose an AI-driven approach that expands the design space by directly incorporating respondent input throughout the new product development process.

Peter Li
SKIM
Joris van Gool
SKIM

5:30 pm
Day 2 ends

8:00 am
Breakfast

9:00 am
Better Segmentation Results with Deep Learning

This paper will present useful and reproducible feature engineering techniques utilizing R's tidyverse workflows. It will focus primarily on identifying and resolving redundant measures of underlying constructs. In addition, it will explore the use of deep learning embeddings for non-linear dimensionality reduction and anomaly detection. Embeddings also allow for complete reproduction of the data, unlike, e.g., principal components. The objective is to achieve high-quality partitions leading to more accurate predictive models for scoring.

Joseph Retzer
ACT-Solutions

9:45 am
Enhancing Cluster Ensembles with Latent Class Clustering

Internal investigation at Sawtooth Software suggests that latent class clustering (via Latent Gold) does very well on its own for clustering, but potentially even better as part of a CCEA ensemble (Sawtooth’s cluster ensemble package). We’ll investigate with numerous synthetic datasets whether adding latent class solutions to the CCEA ensemble improves the accuracy of our predictions of the known number of segments and of the membership of those segments.

Keith Chrzan
Sawtooth Software
Joseph White
InMoment

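The ensemble idea can be sketched with a consensus (co-occurrence) matrix; this is a generic illustration, not Sawtooth's CCEA implementation. Each candidate segmentation (latent class or otherwise) votes on whether two respondents belong together, and the averaged votes give a similarity matrix that a final clustering step can partition.

```python
def consensus_matrix(solutions):
    """solutions: list of candidate segmentations, each a list of segment
    labels with one entry per respondent. Returns an n x n matrix whose
    (i, j) entry is the share of solutions placing i and j in the same
    segment; labels need not agree across solutions, only within one."""
    n = len(solutions[0])
    m = [[0.0] * n for _ in range(n)]
    for labels in solutions:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0 / len(solutions)
    return m
```

Adding latent class solutions to the ensemble simply appends more label vectors to `solutions`; pairs of respondents that most methods group together end up with entries near 1, and the final segments are read off that consensus structure.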
10:30 am
Break

11:00 am
Determining the Value of Price Thresholds in Pricing Conjoint Studies

Clients across multiple industries have become intrigued by the idea of potential price thresholds, or price cliffs, in the price elasticities of their products, and these thresholds can be key to their pricing strategies. In this presentation, through testing the application of post-hoc and modelled threshold options, we will illustrate whether pricing thresholds add value to our pricing models, which type of pricing threshold (post-hoc or modelled) works best, and which types of pricing studies thresholds are most appropriate for.

Michael Smith
SKIM
Juli Pham
SKIM

11:30 am
Leveraging the 4P Marketing Framework to Calibrate Conjoint Models

A persistent challenge in conjoint analysis is the discrepancy between preference shares and actual market shares. This gap often arises from the omission of critical market dynamics and assumptions such as 100% awareness and distribution in our models. We propose an innovative approach that integrates the 4P marketing framework (Product, Price, Place, and Promotion) into the calibration process of conjoint analysis. This method offers a more holistic and accurate representation of market behavior, thereby enhancing the predictive validity of conjoint models.

Alexandra Chirilov
NielsenIQ (formerly GfK)
James Pitcher
NielsenIQ (formerly GfK)

12:00 pm
Best Paper ballots

12:05 pm
Closing remarks and best paper presentation

Bryan Orme
Sawtooth Software

12:25 pm
Conference adjourned
