Schedule

Monday, April 14, 2025

8:30 - 9 a.m.

Breakfast

9 - 9:15 a.m.

Welcome

9:15 - 9:45 a.m.

Designing for Data Excellence: Putting Humans at the Heart of Quality

Amid the heavy focus on AI and advanced technology, the critical contributions humans make in ensuring the accuracy, consistency, and reliability of data can be overlooked. In marketing research in particular, it is crucial that humans remain at the center of the process to interpret data, give it context, and draw and communicate meaningful, actionable insights. Human-centered design provides the tools to ensure that human expertise and experience remain integral to improving and maintaining data quality.

Key takeaways:

  1. Discover the benefits of human-centered design and examine the critical role of humans in improving and maintaining data quality.
  2. Learn how to facilitate at least three different human-centered design methodologies so that you can apply them right away.
  3. Hear stories and best practices about how human-centered design has improved data quality for peer organizations.

- Presented by Rebecca Farabaugh

9:50 - 10:30 a.m.

Understanding Sampling

10:30 - 11:00 a.m.

Networking Break

11:00 - 11:30 a.m.

Data Integrity Under Fire: A Live Test of Sample Quality

- Presented by Alexandrine de Montera, Full Circle Research & Vanilynne Gulla, A&E

11:35 a.m. - 12:10 p.m.

Understanding Tech-Enabled Fraud Prevention

A provocative panel discussion on cutting-edge fraud detection technologies in survey research.

Panelists: Keith Rinzler, 1Q; Rich Ratcliff, OpinionRoute; Megan Peitz, Numerious; Steven Snell, Rep Data. Moderated by Roddy Knowles, dtect

12:15 - 12:35 p.m.

Vetted Presentation / Client Case Study: TBA

12:35 - 1:20 p.m.

Lunch & Roundtable Discussions

1:20 - 1:50 p.m.

Understanding Screening & Qualification

1:55 - 2:15 p.m.

With All Eyes on Data Quality, Don’t Forget the Respondent

While researchers prioritize fraud prevention, they often overlook respondent disengagement, which can undermine data quality and business insights. A poor survey experience can negatively impact not only a single study but the broader research ecosystem, potentially leading to respondent loss. Dynata has developed a measurement system with eight key variables (e.g., interview length, abandon rate) to assess a study’s impact on respondent engagement. This system helps clients optimize their surveys for better data quality and retention. In this session, we’ll explore these measurement components, share real respondent feedback, and discuss actionable ways to enhance the survey experience.

- Presented by Tammy Rosner, Ph.D., Principal, Research, Dynata

2:20 - 2:50 p.m.

Red Herrings Aren't Cutting It: The Evolution of In-Survey Quality Testing

Data quality control questions (also known as “red herrings,” attention checks, etc.) have been used in online surveys for decades. They can improve survey data quality; however, if too convoluted, they risk “over-cleaning” the data, alienating respondents, and losing valuable responses. aytm conducted research to assess the utility of a new generation of data quality control questions, Qualchas, designed to combat not only inattentive survey respondents but also AI-generated survey responses and survey farms, while making the survey-taking experience more engaging for thoughtful respondents.

Goals of this research include: 

  • Helping other researchers design effective data quality control questions. This includes measuring attention and helping identify potential survey farms/professional respondents as well as LLM-enabled bots.
  • Providing insights into cultural differences in respondents’ perceptions of data quality control questions.
  • Teaching insights professionals how to apply an empathetic approach to quality control design, never forgetting the real respondents who are actually paying attention to the survey.

- Presented by Rossi Dobrikova, aytm

2:55 - 3:25 p.m.

DQ: The Brand-side Researcher POV

It's always critical to understand the client's perspective. Don't miss this brand-side researcher panel revealing their biggest data quality concerns and the steps they’re taking to ensure accurate, actionable insights.

3:30 - 3:45 p.m.

Networking Break

3:45 - 4:05 p.m.

Vetted Presentation / Client Case Study: TBA

4:10 - 4:30 p.m.

Unstructured Data: Quality Pitfalls & Best Practices

Understand the risks associated with poor data accuracy and trust in unstructured social media and related voice-of-customer data. AMEC (the International Association for the Measurement and Evaluation of Communication) will discuss the initiatives it is working on with the GDQ.

- Presented by Rob Key, Converseon & AMEC

4:30 - 4:45 p.m.

Coming Together to Solve the Data Quality Crisis - Featuring New Research Findings

Each player in the market research supply chain - suppliers, agencies, incentive platforms, and brands - has unique, but limited, visibility into quality signals. The lack of shared indicators, the probabilistic nature of quality metrics, and the reluctance to collaborate due to reputational risks create a systemic "prisoner's dilemma."

During this session, you'll hear the findings of a collaborative study that integrates quality signals across the entire supply chain, leveraging a superset of indicators to inform better decision-making and improve data quality.

Key takeaways:
- Practical guidance on which quality checks to prioritize and which may be unnecessary or redundant.
- Strategies to balance under- and over-removal of data, preserving both data fidelity and operational efficiency.
- A framework for exploring shared quality indicators and the feasibility of a data quality clearinghouse.

- Presented by Katie Casavant, Data Quality Co-Op

4:45 - 5:15 p.m.

IA & Data Quality - Industry Benchmarks & More

Get the latest on developments for the industry, including the Data Quality Benchmarks tracking study and the cross-association collaborative Global Data Quality (GDQ) initiative.

5:15 - 5:30 p.m.

Open Forum

5:30 - 6:30 p.m.

Reception