Schedule
Monday, April 14, 2025
Breakfast
Welcome
Designing for Data Excellence: Putting Humans at the Heart of Quality
Amid the industry's heavy focus on AI and advanced technology, the critical contributions that human intervention makes in ensuring the accuracy, consistency, and reliability of data can be overlooked. In marketing research in particular, it is crucial that humans remain at the center of the process to interpret the data, give it context, and draw and communicate actionable, meaningful insights. Human-centered design provides the tools to ensure that human expertise and experience are an integral part of improving and maintaining data quality.
Key takeaways:
- Discover the benefits of human-centered design and examine the critical role of humans in improving and maintaining data quality.
- Learn how to facilitate at least three different human-centered design methodologies so that you can apply them right away.
- Hear stories and best practices about how human-centered design has improved data quality for peer organizations.
- Presented by Rebecca Farabaugh
Understanding Sampling
Networking Break
Data Integrity Under Fire: A Live Test of Sample Quality
Alexandrine de Montera, Full Circle Research & Vanilynne Gulla, A&E
Understanding Tech-Enabled Fraud Prevention
Panelists: Keith Rinzler, 1Q, Rich Ratcliff, OpinionRoute, Megan Peitz, Numerious, Steven Snell, Rep Data; Moderated by Roddy Knowles, dtect
Vetted Presentation / Client Case Study: TBA
Lunch & Roundtable Discussions
Understanding Screening & Qualification
With All Eyes on Data Quality, Don’t Forget the Respondent
While researchers prioritize fraud prevention, they often overlook respondent disengagement, which can undermine data quality and business insights. A poor survey experience can negatively impact not only a single study but the broader research ecosystem, potentially leading to respondent loss. Dynata has developed a measurement system with eight key variables (e.g., interview length, abandon rate) to assess a study’s impact on respondent engagement. This system helps clients optimize their surveys for better data quality and retention. In this session, we’ll explore these measurement components, share real respondent feedback, and discuss actionable ways to enhance the survey experience.
- Presented by Tammy Rosner, Ph.D., Principal, Research, Dynata
Red Herrings Aren't Cutting It: The Evolution of In-Survey Quality Testing
Data quality control questions (also known as “red herrings,” attention checks, etc.) have been used in online surveys for decades. They can improve survey data quality; however, if too convoluted, they risk “over-cleaning” the data, alienating respondents, and losing valuable responses. aytm conducted research to assess the utility of a new generation of data quality control questions, Qualchas, which aim to combat not only inattentive survey respondents but also AI-generated survey responses and survey farms, while making the survey-taking experience more engaging for thoughtful survey takers.
Goals of this research include:
- Helping other researchers design effective data quality control questions. This includes measuring attention and helping identify potential survey farms/professional respondents as well as LLM-enabled bots.
- Providing insights into cultural differences in respondents’ perceptions of data quality control questions.
- Teaching insights professionals how to apply an empathetic approach to quality control design (We should never forget the real respondents who are actually paying attention to the survey).
- Presented by Rossi Dobrikova, aytm
DQ: The Brand-side Researcher POV
Networking Break
Vetted Presentation / Client Case Study: TBA
Unstructured Data: Quality Pitfalls & Best Practices
- Presented by Rob Key, Converseon & AMEC
Coming Together to Solve the Data Quality Crisis - Featuring New Research Findings
During this session, you'll hear the findings of a collaborative study that integrates quality signals across the entire supply chain, leveraging a superset of indicators to inform better decision-making and improve data quality.
Key takeaways:
- Practical guidance on which quality checks to prioritize and which may be unnecessary or redundant.
- Strategies to balance under- and over-removal of data, preserving both data fidelity and operational efficiency.
- A framework for exploring shared quality indicators and the feasibility of a data quality clearinghouse.
- Presented by Katie Casavant, Data Quality Co-Op