
As we all know, the bias mostly comes from people who won't participate or won't write a review: they differ systematically from those who do, often because they simply don't have the time. So you end up with skewed demographics and thus biased answers, with students and people with plenty of free time over-represented. In the case of product reviews, you would expect unhappy customers and paid reviewers (paid by the product's manufacturer or its competitors) to be more likely to contribute.

How do you correct for this bias? Can you simply put more weight on answers from under-represented segments, like busy people or executives? That approach sounds too simplistic to fix the problem. In the example below (Census Bureau), the survey is mandatory for all US residents, but I think that makes it worse, because people are going to lie (especially on very personal questions). What techniques do you use to detect liars? Redundant questions? Cross-validation against other data sources?
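For the weighting idea, the standard correction is post-stratification: rescale each respondent's contribution by the ratio of their segment's share of the target population to its share of the sample. A minimal sketch in Python; the segment names, shares, and satisfaction values are illustrative assumptions, not data from this discussion:

```python
# Post-stratification: reweight responses so each demographic segment
# counts in proportion to its share of the target population.
# All figures below are made up for illustration.

survey = [
    {"segment": "student",   "satisfied": 1},
    {"segment": "student",   "satisfied": 1},
    {"segment": "student",   "satisfied": 0},
    {"segment": "executive", "satisfied": 0},
]

# Known population shares (e.g., from census data) -- assumed values.
population_share = {"student": 0.30, "executive": 0.70}

# Observed sample shares.
n = len(survey)
sample_share = {}
for row in survey:
    sample_share[row["segment"]] = sample_share.get(row["segment"], 0) + 1 / n

# Each respondent's weight = population share / sample share of their segment.
for row in survey:
    row["weight"] = population_share[row["segment"]] / sample_share[row["segment"]]

# Weighted estimate vs. the naive (unweighted) one.
weighted = sum(r["satisfied"] * r["weight"] for r in survey) / sum(r["weight"] for r in survey)
naive = sum(r["satisfied"] for r in survey) / n
# Students are over-sampled here, so the weighted estimate (0.2) is far
# below the naive one (0.5).
```

As the question suggests, this only fixes the composition of the sample; it cannot fix the problem that the busy executives who did respond may still differ from the ones who didn't.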

Replies to This Discussion

This is a huge topic that can't be answered in a few comments, but let me share some thoughts.

First, we need to stop calling research respondents liars. In many cases, they are reacting to surveys that are poorly and thoughtlessly written by researchers who expect them to behave like robots. People aren't robots. People get bored, tired, offended, distracted, and have a multitude of far more important activities going on in their lives. To call them liars is to assume researchers have written a PERFECT survey. Impossible.

Second, there are many techniques for distinguishing human respondents who make genuine mistakes on fairly good surveys from the extreme minority of people who deliberately answer surveys poorly. Some of these techniques live in the design of the questions, while others involve statistical analysis of improbable combinations of responses across the entire survey. I've done many papers, presentations, and webinars on these topics and can direct people to my SlideShare if they're interested.

Good timing for this topic! Recently my grandkids' school sent the parents a survey.

Like the census, this particular "questionnaire" was deemed mandatory, and it was NOT anonymous.

Because of that, my son was hesitant to give honest feedback, and then disturbed when his carefully thought-out opinions were later questioned. Evidently he was the only respondent (out of 20) who gave anything but glowing praise. His efforts were "rewarded" with a SECOND set of questions seeking further clarification on his survey responses.

 

With technology today, we can ensure that people have responded without tying their specific responses to their names, and that is what I will recommend the school use next year.
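One way such a scheme could work, sketched under assumptions (the salt, storage layout, and function names below are hypothetical, not any particular product): store a one-way hash of the respondent's ID in a participation table, and store the answers in a separate structure with no identifier at all. The school can then verify who responded, and reject duplicates, without being able to look up what anyone said.

```python
# Sketch: record WHO responded separately from WHAT they said.
import hashlib

participation = set()   # hashed IDs only -- proves a person responded once
responses = []          # answers only -- no link back to any person

SALT = b"school-survey"  # hypothetical fixed salt, kept by the survey admin

def submit(respondent_id: str, answers: list) -> bool:
    """Accept one anonymous submission per respondent; return False on duplicates."""
    token = hashlib.sha256(SALT + respondent_id.encode()).hexdigest()
    if token in participation:
        return False            # this person already responded
    participation.add(token)
    responses.append(answers)   # stored with no identifier attached
    return True
```

One caveat worth noting: if submissions are stored in arrival order, timing can still leak identity, so a real system would also shuffle or batch the stored responses.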

Meanwhile, I welcome ideas, suggestions, and references, especially links to research articles on anonymity bias.

© 2020 Data Science Central ®
