ESOMAR 37
Your Questions, Answered
ESOMAR is the global voice of the data, research and insights community, a truly international association that provides ethical and professional guidance. Its document, Questions to Help Buyers of Online Samples, asks sample providers to disclose their consistency, reliability and commitment to transparency when it comes to data quality. We are only too happy to oblige.
QUICK NAVIGATION:
01
What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?
02
Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff?
03
What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?
04
Using the broad classifications above, from what sources of online sample do you derive participants?
05
Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer? (Assume proprietary to mean that the sample provider owns the asset. Assume exclusive to mean that the sample provider has an exclusive agreement to manage/provide access to sample originally collected by another entity.)
06
What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?
07
What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? Describe this both in terms of the practical steps you take within your own organization and the technologies you are using. Please try to be as specific and quantify as much as you can.
08
What brand (domain) and/or app are you using with proprietary sources? Summarise, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.
09
Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?
10
If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend)? Do you let buyers control which sources of sample to include in their projects, and if so, how? Do you offer any integration mechanisms with third-party sources?
11
Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop-only questionnaires? Is it suitable to recruit for communities? For online focus groups?
12
Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?
13
What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?
14
What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?
15
What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?
16
Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.
17
Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?
18
What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?
19
Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?
20
Do you have the ability to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?
21
Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?
22
Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?
23
How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?
24
What data do you maintain on individual participants, such as recent participation history, date(s) of entry, source/channel, etc.? Are you able to supply buyers with a project analysis of such individual-level data? Are you able to append such data points to your participant records?
25
Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.
26
How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?
27
Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?
28
For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) illogical or inconsistent responding, (c) overuse of item nonresponse (e.g., “Don’t Know”), (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too-rapid survey completion?
29
Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. (Note: If your company uses different privacy notices for different products or services, please provide an example relevant to the products or services covered in your response to this question).
30
How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for processing personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?
31
How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants? In your response, please address the sample sources you wholly own, as well as those owned by other parties to whom you provide access.
32
How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?
33
What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?
34
Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.
35
What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?
36
Do you certify to or comply with a quality framework such as ISO 20252?
37
Which of the following are you able to provide to buyers, in aggregate and by country and source? Please include a link or attach a file of a sample report for each of the metrics you use:
01. Average qualifying or completion rate, trended by month;
02. Percent of paid completes rejected per month/project, trended by month;
03. Percent of members/accounts removed/quarantined, trended by month;
04. Percent of paid completes from 0-3 months tenure, trended by month;
05. Percent of paid completes from smartphones, trended by month;
06. Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month;
07. Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort);
08. Average number of paid completes per member, trended by month (potentially by cohort);
09. Active unique participants in the last 30 days;
10. Active unique 18-24 male participants in the last 30 days;
11. Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview;
12. Percent of quotas that reached full quota at time of delivery, trended by month.