      Addressing the Growing Problem of
            Survey Nonresponse

                David R. Johnson
        Director, Survey Research Center
  Professor of Sociology, Human Development
     and Family Studies, and Demography
• The Bad News.
• The (Sort of) Good News.
• Steps that may help stem the tide
  and maintain adequate response
  rates.
                     The Bad News

• Nonresponse to surveys has been increasing.
• There is substantial evidence that nonresponse has been
  growing at an increasing rate in the last 10 – 15 years.
• The decline in response rates has occurred for most types of
  surveys—telephone and personal interview studies lead the list.
• Increase in nonresponse is occurring across the globe.
• There does not appear to be a single or clear explanation for
  these trends.
            Examples of Trends

• Evidence is most compelling from long-term trend
  surveys that have been repeated over many years.
• Most of these long-term surveys are either telephone
  or personal interview surveys.
• Evidence of trends in mail surveys is not as clear
  because of a lack of long-term trend studies. (The
  Census is an exception.)
• Web surveys are too new to provide much trend data.
     Components of Nonresponse

• Refusals
• Non-contact
   – Household never at home or telephone never
     answered.
   – Mail surveys returned not delivered.
   – Failure to locate the sampled person/household.
• Inability to participate
   – Language, literacy, etc.
          Nonresponse Definitions

• "Response Rate" had no standardized
  definition until recently.
• Standards for reporting of response
  information have been developed by AAPOR.
• Many different definitions (and formulas for
  each) have been developed.
• The most common measure is the response rate:
  R = (sampled units completing the survey) / (all sampled units)
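The formula above can be sketched in a few lines of Python. The disposition counts below are hypothetical, and this is a simplified version of the rate; AAPOR's formal definitions (RR1 through RR6) further distinguish partial interviews and cases of unknown eligibility.

```python
# Minimal sketch of the slide's formula:
# R = (sampled units completing the survey) / (all sampled units).
# The counts are hypothetical, for illustration only.

def response_rate(completes, refusals, noncontacts, unable):
    """Simplified response rate: completes over all sampled units."""
    all_sampled = completes + refusals + noncontacts + unable
    return completes / all_sampled

r = response_rate(completes=540, refusals=310, noncontacts=120, unable=30)
print(round(r, 3))  # 540 / 1000 = 0.54
```

Note that the denominator here treats every sampled unit as eligible; in practice, units of unknown eligibility are handled differently by each AAPOR formula.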
  Response Rate Trends in Telephone
              Surveys
• Behavioral Risk Factor Surveillance System
  (BRFSS): RDD telephone survey conducted
  separately in each state with a common
  methodology.
• University of Michigan Survey of
  Consumers—telephone survey of consumer
  attitudes.
   Nonresponse in the Behavioral Risk Factor
    Surveillance System (BRFSS): Telephone Survey




Source: Groves et al. 2004
[Figure: Behavior Risk Factors Survey response rate by year, 1994–2004. Lines shown: maximum, median of all states, Pennsylvania, and minimum; vertical axis is response rate (20–100%), declining over the period.]
         Response Rate Trend in the Survey of Consumer
                  Attitudes Telephone Survey




Source: Curtin, Presser, & Singer 2005
University of Michigan: Survey of Consumers
                      Telephone Survey




Source: Groves et al. 2004
   Response Rate Trends in Personal
          Interview Surveys
• Many Federally sponsored and conducted
  surveys have been administered over many
  years.
• Response Rates have generally been very
  high in these studies.
• Federal agencies (e.g., the U. S. Census
  Bureau) have taken seriously the problem of
  increasing nonresponse rates in these
  surveys.
    Federal Government Personal Interview Studies
CPS Current Population Survey; NHIS National Health Interview Survey; NCVS National Crime
Victimization Survey; SIPP Survey of Income and Program Participation; CED Consumer
Expenditures Diary; CEQ Consumer Expenditures Quarterly




         Source: Atrostic et al. 1999
National Crime Victimization Survey (Personal
                 Interview)




   Source: Groves et al. 2004
 Mail Survey Response Rate Trends

• U. S. Census Forms mailing
  – 1980   75%
  – 1990   65%
  – 2000   66%
• In 2000 the Census Bureau went to great
  expense and efforts to stop the decline in
  response rates observed between 1980 and
  1990.
    Why people choose to participate in
                Surveys
• Motivations to participate will differ among
  respondents.
• Balance between the cost to the respondent of
  participating and the reward they will obtain.
• Rewards can be:
   –   Civic responsibility
   –   Interest in topic
   –   Low perceived respondent burden
   –   Financial reward
   –   Interest in expressing their opinions
Suggested explanations for the increases
        in Nonresponse rates
• Increased refusals.
   –   Time constraints ("too busy").
   –   Lessened sense of civic responsibility or sense of reciprocity.
   –   Too many survey requests.
   –   Concerns about safety, fraud, and misrepresentation.
   –   Human Subjects requirements.
• Declining contact rates.
   – Access issues.
      • Caller ID, Answering machines, Cell Phones, Multiple
        telephone numbers, unlisted numbers.
      • Gated communities, limited access apartment buildings.
      • Privacy regulations
                 The Good News

• Question: Are the results obtained using the
  survey data biased by the presence of non-
  response?
• Answer: No, at least the bias appears to be
  small and inconsistent for most variables.
• Some evidence: Keeter et al. 2000, Public Opinion Quarterly
        The Keeter et al. Study:
  Two Parallel National RDD Telephone
          Interview Surveys
• Standard Survey: 36% response rate
  – Calling done over 5 days.
  – Selected respondent from people at home at time of call (no
    random selection).
  – 5 callbacks, 1 callback to refusals.
• Rigorous Survey: 60.6% response rate
  –   8-week calling period.
  –   Random selection of respondent from list.
  –   Pre-notification letters with $2.
  –   Multiple attempts (including letters to refusals).
  –   Multiple Callbacks.
      Differences between findings in the
       Standard and Rigorous Surveys
• Average difference in percentages on all
  items was less than 2%.
• Largest differences were for demographic
  items.
• Small or no significant differences in:
  –   Political and social attitudes and behavior.
  –   Media use, engagement in politics.
  –   Social integration.
  –   Crime-related items.
   Conclusions about Nonresponse Bias

• The Keeter et al. findings are consistent with a
  number of other studies in finding minimal bias.
• Demographic differences can usually be adjusted
  with weighting of the data.
• Obtaining relatively high response rates can be
  expensive. Is it worth the cost, or can the same
  resources be used to improve data quality in other
  ways?
• Lack of bias and presence of high quality data should
  be a more important goal than obtaining a specific
  minimal response rate.
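The weighting adjustment mentioned above can be sketched as a simple post-stratification: each respondent in a demographic group gets a weight equal to the group's known population share divided by its observed sample share. The group labels and proportions below are hypothetical.

```python
# Post-stratification sketch: weight respondents so the weighted sample
# matches known population shares. All proportions are hypothetical.

population_share = {"under_40": 0.45, "40_plus": 0.55}  # e.g., from Census margins
sample_share     = {"under_40": 0.30, "40_plus": 0.70}  # observed among respondents

# Underrepresented groups get weights > 1, overrepresented groups < 1.
weights = {g: population_share[g] / sample_share[g] for g in population_share}
print(weights)  # under_40 weighted up (1.5), 40_plus weighted down (~0.79)
```

Real survey weighting typically combines several margins (age, sex, region, etc.) via raking, but the core idea is this ratio adjustment.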
Ways to increase response rate in today’s
           survey environment
•   Incentives
     – Pre-payment of small cash incentive
     – Post-payment (e.g., offer to pay if the respondent refuses)
     – Drawings, coupons, etc. not as effective.
•   Multiple Contacts
     – Critical in mail and web surveys
     – Many callbacks in telephone and personal surveys
     – Refusal conversions can convert 10 – 30 percent of initial refusals.
•   Pre-notification letters
     – Provides more information on the study
     – Increases respondent confidence in the validity of the study
•   Interviewer training.
     – Training in how to approach the respondent and convert reluctant
       respondents
     – Large differences in response rate by interviewer, but not always sure why.
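The refusal-conversion figure above (10–30% of initial refusals) translates into a concrete gain in the overall response rate. The counts below are hypothetical, purely to illustrate the arithmetic.

```python
# Hypothetical illustration: how converting 10-30% of initial refusals
# lifts the overall response rate (counts are made up for this sketch).

sampled   = 1000
completes = 400   # completes before refusal conversion
refusals  = 350   # initial refusals

for conversion in (0.10, 0.30):
    converted = refusals * conversion
    rate = (completes + converted) / sampled
    print(f"{conversion:.0%} conversion -> response rate {rate:.1%}")
# 10% conversion lifts the rate from 40.0% to 43.5%;
# 30% conversion lifts it to 50.5%.
```

Even the low end of the conversion range yields a noticeable improvement, which is why refusal conversion is a standard step in telephone and personal interview fieldwork.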
                   More ways…

• Sponsorship.
  – University sponsorship often helps over commercial
    organizations.
  – Government sponsorship usually the best.
• Multiple-mode survey.
  – Combining web and mail survey
  – Follow-up mail survey with telephone contact
• Reduce respondent burden.
  – Shorter survey instruments.
  – May be more important in mail surveys
What Response Rate should I expect?

• Very difficult to answer—harder to estimate for some
  modes than for others.
• RDD Telephone Survey—35 to 60%.
• Mail Survey of General Population—35 to 70%
  (assumes multiple mailings, incentive, relatively short
  survey)
• Special population mailing—20 to 80%
• Web survey of student population—30 to 60%
• Personal Interview Surveys of general population—
  60 to 80% (or more)
                  Conclusions
• Obtaining high response rates is difficult and
  expensive. The cost and effort that it takes to get an
  "acceptable" response rate has increased
  substantially in the last 20 years.
• In carefully designed studies under the right
  conditions it is still possible to obtain quite high
  response rates.
• Efforts to increase response rate need to be
  balanced carefully with the quality of the data
  obtained. Sometimes higher response rates can yield
  less representative and less valid data.
• More research is needed on identifying the factors
  that motivate people to participate in surveys.