Schedule
Monday, April 14, 2025
Breakfast
Welcome
Designing for Data Excellence: Putting Humans at the Heart of Quality
Amid the industry's heavy focus on AI and advanced technology, the critical contributions that human intervention makes in ensuring the accuracy, consistency, and reliability of data can be overlooked. In marketing research in particular, it is crucial that humans remain at the center of the process to interpret the data, give it context, and draw and communicate actionable and meaningful insights. Human-centered design provides the tools to ensure that human expertise and experience are an integral part of improving and maintaining data quality.
Key takeaways:
- Discover the benefits of human-centered design and examine the critical role of humans in improving and maintaining data quality.
- Learn how to facilitate at least three different human-centered design methodologies so that you can apply them right away.
- Hear stories and best practices about how human-centered design has improved data quality for peer organizations.
- Presented by Rebecca Farabaugh
Understanding Sampling
Networking Break
Data Integrity Under Fire: A Live Test of Sample Quality
Take a deep dive into the evolving landscape of data quality and trust with Full Circle Research, A+E Global Media, and Procter & Gamble as they share groundbreaking insights from a recent data quality test. The test compares survey results from panel versus non-panel sources and evaluates the impact of security checks on data integrity.
Procter & Gamble will also discuss their bold move to mandate ISO 20252 certification for all online quantitative suppliers - a decision that’s setting a new standard for data quality and transparency in the industry. This session will provide actionable takeaways on how to secure data integrity, improve respondent quality, and build trust at scale.
Presented by: Alexandrine de Montera, Full Circle Research; Vanilynne Gulla, A+E Global Media; Megan Budzynski, Procter & Gamble
Understanding Tech-Enabled Fraud Prevention
Panelists: Keith Rinzler, 1Q; Rich Ratcliff, OpinionRoute; Megan Peitz, Numerious; Steven Snell, Rep Data. Moderated by Roddy Knowles, dtect.
Improving Market Research Data Quality with Human-in-the-Loop AI
AI has the potential to enhance data quality, but only when applied correctly.
When AI is thoughtfully integrated with human oversight, it not only safeguards data quality but also enhances the accuracy of insights while minimizing inefficiencies.
Join this session to learn how AI can help or hurt data quality:
- By streamlining validation and analysis, AI can save time and resources, while misuse can have the opposite effect.
- Over-filtering can shrink respondent pools, while weak tools may allow poor-quality data to slip through.
- AI-powered tools for open-ended response validation—such as text clustering, biometrics, and automated analysis—help improve the integrity of qualitative data.
- When researchers actively flag issues and suppliers refine their respondent sources, AI detection capabilities improve.
- Relying on outdated fraud detection methods puts data quality at risk.
- A structured feedback loop further enhances detection, reducing the manual workload for researchers and ensuring greater efficiency.
Presented by: Mike Herrel, Director of Data Solutions at The Directions Group & Mayank Agrawal, Co-founder and CEO of Roundtable
Lunch & Roundtable Discussions
Understanding Screening & Qualification
With All Eyes on Data Quality, Don’t Forget the Respondent
While researchers prioritize fraud prevention, they often overlook respondent disengagement, which can undermine data quality and business insights. A poor survey experience can negatively impact not only a single study but the broader research ecosystem, potentially leading to respondent loss. Dynata has developed a measurement system with eight key variables (e.g., interview length, abandon rate) to assess a study’s impact on respondent engagement. This system helps clients optimize their surveys for better data quality and retention. In this session, we’ll explore these measurement components, share real respondent feedback, and discuss actionable ways to enhance the survey experience.
- Presented by Tammy Rosner, Ph.D., Principal, Research, Dynata
Red Herrings Aren't Cutting It: The Evolution of In-Survey Quality Testing
Data quality control questions (also known as "red herrings," attention checks, etc.) have been used in online surveys for decades. They can improve survey data quality; however, if too convoluted, they risk "over-cleaning" the data, alienating respondents, and losing valuable responses. aytm conducted research to assess the utility of a new generation of data quality control questions, Qualchas, that aim to combat not only inattentive survey respondents but also AI-generated survey responses and survey farms, while making the survey-taking experience more engaging for thoughtful respondents.
Goals of this research include:
- Helping other researchers design effective data quality control questions. This includes measuring attention and helping identify potential survey farms/professional respondents as well as LLM-enabled bots.
- Providing insights into cultural differences in respondents’ perceptions of data quality control questions.
- Teaching insights professionals how to apply an empathetic approach to quality control design (we should never forget the real respondents who are actually paying attention to the survey).
- Presented by Rossi Dobrikova, aytm
DQ: The Brand-side Researcher POV
Networking Break
Vetted Presentation / Client Case Study: TBA
Unstructured Data: Quality Pitfalls & Best Practices
- Presented by Rob Key, Converseon & AMEC
Coming Together to Solve the Data Quality Crisis - Featuring New Research Findings
During this session, you'll hear the findings of a collaborative study that integrates quality signals across the entire supply chain, leveraging a superset of indicators to inform better decision-making and improve data quality.
Key takeaways:
- Practical guidance on which quality checks to prioritize and which may be unnecessary or redundant.
- Strategies to balance under- and over-removal of data, preserving both data fidelity and operational efficiency.
- A framework for exploring shared quality indicators and the feasibility of a data quality clearinghouse.
- Presented by Katie Casavant, Data Quality Co-Op