Written by Brian Lamar, Strategic Director of Survey Products at TraQline
Welcome to Samplecon: Uniting the World of Market Research
I spent last week in Lake Oconee, Georgia, at Samplecon, a marketing research conference that exists to create a collaborative environment for the transparent exchange of information and ideas to facilitate the evolution of consumer data. Basically, it’s a chance for us research nerds to get together and discuss best practices and challenges in the industry, with a focus on online sampling.
Well over 250 researchers attended this 2.5-day conference, which has become incredibly collaborative since its inception, allowing clients and vendors, many of whom are competitors, to work together to improve survey data and the insights we gain from it, ultimately giving us more confidence in decision making.
This year, the primary topics were artificial intelligence and improving data quality. I want to share some high-level information about the ongoing data quality initiatives.
The Data Quality Revolution in Market Research
Quality challenges are not new to our industry. They existed when door-to-door methodology was most prominent, continued through the era of telephone surveys, and persist even now, when market research is conducted largely online. However, the current effort appears to be the most collaborative approach yet to minimizing the problem.
The Global Data Quality partnership is a group of industry associations across the globe working together to address ongoing and emerging risks to data quality in the market and social research, consumer insights, and analytics industries. With the goal of sharing information and building trust, each organization leads a workstream that delivers global quality resources to improve the conversation and its outcomes.
2023 Achievements
The Insights Association (IA) is one of the organizations helping advance this initiative in North America, and I am fortunate to be the co-chair. We have had numerous accomplishments in the past year and a variety of workstreams that I am excited about, including:
- Creating a glossary – It is imperative that the industry speaks the same language. The Global Data Quality glossary is the first step toward ensuring the industry works from a common vocabulary when discussing research and data quality.
- Educational microsite – Hosted on the IA website, this microsite collects updates to best practices and to the toolkit.
- Updating the data integrity toolkit – This update ensures the data integrity toolkit includes the latest information.
- Journey map – Developed a visual map that defines the data quality process, requirements for validation, and how to monitor progress across the research journey.
- Pledge & badge – Created a commitment for individuals and companies to stay informed, engaged, and committed to ensuring high quality data standards.
- Data fraud & quality member survey: Conducted to understand the impact of fraud, to what degree it’s experienced, and protection measures currently used.
2024 Initiatives for Enhanced Data Quality
Additionally, there are numerous 2024 initiatives taking place to improve data quality, including:
- Establishing data quality benchmarks – For example, what is an acceptable removal rate, or dropout rate? What are the key metrics we should be looking at to measure quality?
- Efforts to improve the respondent experience to increase sample supply – One way this is being accomplished is by creating a “Respondents Bill of Rights,” a primary objective this year. What should researchers tell study participants? How should participants be treated?
- Collaboration labs – We conducted a collaboration lab at Samplecon and will offer additional opportunities for input at IA National in Atlanta in April.
- Fraud technology guidance
- Incentives guidance – Growing understanding of best practices around incentives
- Data cleaning guidance
- AI guidance – Building understanding around how artificial intelligence will impact our industry
- Data quality training module – Development of a “class” that researchers can take to learn more about best practices
Conclusion: The Quest for Improving Data Quality Continues
In summary, significant progress has been made in improving the quality of data within market research in the past year. However, there are two key things to keep in mind:
- We will never have perfect quality data.
- While these initiatives will help improve quality, it is still necessary to design a good screener and questionnaire, choose the right sample and analytic plan, and have multiple people test the survey to ensure it is programmed correctly.
I will be leading the data quality track at the Insights Association Annual Conference, April 8-10 in Atlanta, Georgia, and am excited to see our industry improve and help our clients make smart business decisions.
If you’d like to learn more about these initiatives or connect with TraQline to discuss our commitment to accurate and reliable data, contact us below.