Survey Research Papers
(These are papers that I have contributed to, co-authored, or authored)
Massachusetts Health Insurance Survey Methodology Report
Survey Years: 2008, 2009, and 2010
(Added April 2013) (Authors: Timothy Triplett, Sharon Long, David Dutwin, Susan Scher)
Abstract: The Massachusetts Health Insurance Survey (MHIS) collects information on health insurance coverage and
access to and use of health care for the non-institutionalized population in Massachusetts. It is funded by the
Massachusetts Division of Health Care Finance and Policy (DHCFP) and is conducted by the Urban Institute, along with its
subcontractor, Social Science Research Solutions (SSRS). This report provides information about the methods used to
collect and analyze the MHIS.
Can Your Spouse Accurately Report Your Activities? An
Examination of Proxy Reporting
(Added November 2010) (Author: Timothy Triplett)
Abstract: The 2008 Survey of Public Participation in the Arts (SPPA) was sponsored by the National Endowment for the Arts and conducted as a supplement to
the May 2008 Current Population Survey (CPS). A key challenge of being a CPS supplement is that the respondent selection procedures for the SPPA
differ from the CPS procedures. The CPS can be administered to any person 16 or older who is able to report employment information for all
persons 16 or older in the household, whereas the SPPA collects information on all adults 18 or older, and many of the
questions on the survey are thought to require self-reports rather than proxy reports. In 2002, Census Bureau interviewers attempted to complete the
SPPA supplement with all persons 18 or older, but accepted proxy reports after four call attempts. To make the SPPA a better fit for the
CPS protocol, rather than attempting to interview all adults in the household, the 2008 SPPA accepted proxy responses for spouses or partners (for
many of the questions). This change in design makes it much easier to measure the impact of proxy reports, since they were collected by
design rather than out of necessity. This research explores the extent to which proxy reporting may have resulted in over- or under-reporting of
participation, and whether estimates should be adjusted when differences arise. Of particular interest are comparisons between husbands
reporting on their wives' activities and vice versa. In addition, this research explores whether the quality of proxy reporting varies across
key population subgroups.
2002 NSAF Nonresponse Analysis
(Added July, 2006) (Author: Timothy Triplett)
Abstract: This report focuses on the characteristics of nonrespondents to the 2002 NSAF and assesses the impact of
nonresponse on the NSAF statistics. It includes analysis of the effectiveness of the call attempt and refusal conversion
strategies across all three rounds of NSAF data collection, providing some insights on how the level of effort affects the quality
of the data by reducing nonresponse. This report also includes a sociodemographic comparison of nonrespondents using census
block information obtained for 2002 nonrespondents and respondents.
Comparing Incentives at Initial and Refusal Conversion Stages On a Screening
Interview For a Random Digit Dial Survey
(Added March, 2006) (Authors: David Cantor, Patricia Cunningham, Timothy Triplett, Rebecca Steinbach)
Abstract: This paper discusses the results of an experiment that explored the use of incentives at different stages of an RDD
survey. A primary question addressed was the relative effectiveness of using an incentive at the initial call or during
refusal conversion. The results show that providing $2 at the initial attempt to complete the screener works about as well
with respect to response rates as a $5 treatment at refusal conversion.
Using an E-Mail Invitation to Screen Survey Respondents
(Added September, 2005) (Authors: Timothy Triplett, Adam Safir, Natalie Abi-Habib)
Abstract: Internet surveys can be designed so that a respondent can simply click on
a link to indicate that they do not want to fill out the survey. The link can be
embedded in the e-mail invitation or, for surveys with an online invitation, included on the invitation
page. The decline option is appropriate for respondents who are not actually the
end user and thus cannot answer most of the questions. This option can potentially improve
your response rate estimate as well as provide additional information about your respondents.
However, there is a concern that the decline option provides an easy out for
legitimate respondents. This paper analyzes the effect the decline option had on the
response rate and survey responses.
Determining the Probability of Selection for a Telephone Household in a
Random Digit Dial Sample Design Is Becoming More Difficult
(Added September, 2004) (Authors: Timothy Triplett, Natalie Abi-Habib)
Abstract: For many years, researchers using a RDD sample design could estimate the
total number of residential telephone numbers in a household by simply asking one, sometimes two,
and at most three questions. The 2002 National Survey of America's Families (NSAF) is a
telephone survey that relies primarily on a large RDD sample design using over 400,000 telephone
numbers. In previous rounds of the NSAF (1999 and 1997), a simple two-question approach was
used to estimate the total number of residential phone numbers that could be sampled. For the 2002 study, a
more complex set of questions was asked of each household, including questions about what
these additional phone numbers were used for. This paper compares the results of these
questions with those from other large RDD studies and from previous rounds of the NSAF, and discusses the impact these
questions have on the probability-of-selection adjustments.
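The selection-probability adjustment discussed above can be sketched as a simple base-weight divisor. This is an illustrative sketch only, not the NSAF's actual weighting procedure, which handles non-sampled line uses and other complications:

```python
def line_adjusted_weight(base_weight, residential_lines):
    """Down-weight households reachable through multiple residential
    phone lines: with k sampleable lines, the household's chance of
    selection in an RDD frame is roughly k times higher, so its base
    weight is divided by k. (Illustrative sketch only; the NSAF
    adjustment is more complex.)"""
    if residential_lines < 1:
        raise ValueError("a contacted household has at least one line")
    return base_weight / residential_lines

# A household with two residential lines gets half the base weight.
print(line_adjusted_weight(100.0, 2))  # 50.0
```

This is why the count of sampleable residential numbers matters: misclassifying a fax or business line as residential would halve a household's weight for no reason.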
Sampling Refusals: Why, When, and How Much?
(Added September, 2004) (Authors: Timothy Triplett, Adam Safir, Kevin Wang, and Natalie Abi-Habib)
Abstract: The National Survey of America's Families (NSAF) is a dual frame survey that relies primarily on a large
RDD sample. The survey consists of a short three-minute screener interview used to determine eligibility, followed by a
forty-five minute extended interview. Almost half of all potential respondents initially refuse to participate. Although
interviews are completed with close to forty percent of all initial refusals, the per-interview cost of converted refusals far
exceeds that of initial cooperators. In addition, refusal conversion extends the data collection period. Therefore,
for the last round of NSAF data collection, refusal conversion was limited to a sub-sample of refusals. The completed
interviews from this effort were given a weighting factor to adjust for the refusals not attempted. This paper analyzes the
effect of using the refusal sampling approach and weighting adjustment on the survey estimates and associated standard errors.
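The weighting factor described above can be sketched in its simplest assumed form: if only a fraction of refusals is selected for conversion attempts, completed conversions are up-weighted by the inverse of that fraction. The function below is a minimal illustration, not the NSAF's actual weighting procedure:

```python
def refusal_subsample_weight(base_weight, subsample_rate):
    """Up-weight an interview completed with a sub-sampled refusal so it
    also represents the refusals that were never re-attempted.
    subsample_rate is the fraction of refusals selected for conversion.
    (Illustrative sketch; the NSAF weighting involves further steps.)"""
    if not 0 < subsample_rate <= 1:
        raise ValueError("subsample_rate must be in (0, 1]")
    return base_weight / subsample_rate

# If only half of all refusals were re-attempted, each converted
# refusal carries twice its base weight.
print(refusal_subsample_weight(120.0, 0.5))  # 240.0
```

The trade-off the paper examines follows directly: the inflated weights reduce per-case cost but increase the variance of the estimates, which is why the effect on standard errors is analyzed alongside the point estimates.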
Effects On Survey Estimates From Reducing Nonresponse
(Added December, 2002) (Authors: Adam Safir, Rebecca Steinbach, Timothy Triplett, Kevin Wang)
Abstract: This paper presents the results of research conducted to analyze the effects of efforts
to minimize the potential for nonresponse bias in the National Survey of America's Families (NSAF) survey
estimates. We compare the characteristics of those easily interviewed with the characteristics
of those who were difficult to contact or who had temporarily refused to do the survey. We
also compare information from the sampling frame for possible differences between respondents and
non-respondents. Finally, we provide a preliminary analysis of data from an independently
conducted follow-up survey of 2000 randomly selected NSAF respondents and nonrespondents.
What is Gained from Additional Call Attempts & Refusal Conversion and What are the Cost Implications?
(Updated November, 2002) (Author: Timothy Triplett)
Abstract: This research uses data from more than 20
random digit dialing (RDD) studies that have been conducted at the University of Maryland's
Survey Research Center over the past 12 years. The research looks closely at call attempt
patterns, outcomes of call attempts, and response rates. It is useful for
making informed decisions about the best times to call certain types of households, how
useful refusal conversion is, and how best to design an auto scheduler. Future plans include adding
call attempt data from the National Survey of America's Families to the analysis.
Using a Short Follow-up Survey to
Compare Respondents and Nonrespondents (Added October, 2002)
(Authors: Timothy Triplett, Adam Safir, Kevin Wang, Rebecca Steinbach)
Abstract: The research analyzes the potential for nonresponse bias in the 1999 National Survey of
America's Families (NSAF) survey. The NSAF is primarily a random digit dial (RDD) telephone
survey, consisting of a short screener interview to determine household eligibility and a longer
extended interview during which survey items of interest are gathered for sampled household members.
In order to examine the potential for nonresponse bias, a follow-up survey of a sample of
respondents and refusals from the NSAF screener interview was conducted by a different survey
organization than the one which conducted the main survey. The follow-up survey contained key
items from the main survey, which were used to examine differences between respondents and
nonrespondents on these measures. In addition, the follow-up survey also contained questions
on subjects thought to be correlated with willingness to participate in a survey, such as attitudes
towards surveys and government, and perceptions of being busy.
How Long Should You Wait Before Attempting to Convert a Refusal?
(Added October, 2001) (Authors: Timothy Triplett, Julie Scheib, Johnny Blair)
Abstract: With the increasing difficulties obtaining high response rates in telephone
Random Digit Dial (RDD) studies, refusal conversion is becoming a more important and expensive
component of the data collection process. One issue that is not clear in the literature is just
how long you should wait before calling back a household in which someone has refused the interview.
Often you hear the phrase "cooling off period," but how long is that period, and does a cooling off
period really have the intended effect of increasing the likelihood of converting the refusal?
This paper analyzes more than 5,000 nationwide RDD refusal conversion attempts in surveys conducted
at the University of Maryland's Survey Research Center over the past five years. By combining
studies, we have enough data to analyze refusal conversion rates by how many days have elapsed
between the initial refusal and the refusal conversion attempt. In addition, the data set is
large enough to separately look at refusals by selected respondents versus refusals in which the
respondent selection was not yet completed. Separate analysis can be done by gender of the
person who refused, as well as regional comparisons. Finally, the range of survey topics and
interview lengths helps ensure that the findings may apply to other RDD surveys.
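The core tabulation described in this abstract, conversion rates by days elapsed since the initial refusal, can be sketched as follows. The record layout here is hypothetical, not the actual study data format:

```python
from collections import defaultdict

def conversion_rate_by_gap(attempts):
    """Tabulate refusal-conversion rates by the number of days elapsed
    between the initial refusal and the conversion attempt.
    `attempts` is a list of (days_elapsed, converted) pairs — an
    assumed record layout for illustration only."""
    totals = defaultdict(lambda: [0, 0])  # days -> [conversions, attempts]
    for days, converted in attempts:
        totals[days][1] += 1
        if converted:
            totals[days][0] += 1
    return {days: conv / n for days, (conv, n) in sorted(totals.items())}

# Toy data: two attempts after 3 days, two after 7, one after 14.
data = [(3, True), (3, False), (7, True), (7, True), (14, False)]
print(conversion_rate_by_gap(data))  # {3: 0.5, 7: 1.0, 14: 0.0}
```

With enough pooled cases per elapsed-day cell, as in the 5,000-attempt data set described above, the same tabulation can be run separately by respondent-selection status, gender, or region.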
The Effects of Telephone Introductions on Cooperation: An Experimental Comparison (Added November, 1999) (Authors: Nileeni Meegama, Johnny Blair)
Abstract: This paper uses two alternative introductions in a field experiment to systematically
vary the components of a survey introduction and determine which variation
produces the best cooperation rate.
The Effect of Alternative Incentives on Cooperation and Refusal Conversion in a Telephone Survey (Added November, 1999) (Authors: Martha Kropf, Julie Scheib, Johnny Blair)
Abstract: As the cost and effort to gain cooperation in telephone surveys increases,
many researchers are exploring the use of incentives to increase initial cooperation rates and as
an inducement in refusal conversion. Seldom has the combined use of incentives for multiple
purposes been used in a single telephone survey. This paper reports on an experiment where
incentives were used to try to increase initial cooperation rates or as an inducement in refusal conversion.
Verbal Reports Are Data! A Theoretical Approach to Cognitive Interviews (Added August, 1999) (Authors: Frederick Conrad, Johnny Blair, Elena Tracy)
Abstract: Respondents are, in effect, instructed to carry out a task – that is, to do whatever mental work is necessary to provide an answer. By this view, they
can get into trouble in several ways. For example, respondents can misunderstand what they have been asked to do and, as a result, carry out the wrong task. Or
they can understand the instruction just fine but find themselves unable to carry out the task they have been assigned. Or
they can successfully carry out the assigned task but find themselves unable to fit their answer into the options provided. Each
of these difficulties demands a different sort of solution, for example, making question wording clearer, simplifying the task, or better
matching the response options to the way people think about the topic of the question.
A Comparison of Mail and E-Mail for a Survey of Employees in Federal Statistical Agencies (Added February, 1998) (Authors: Mick Couper, Johnny Blair, Timothy Triplett)
Abstract: This paper reports on the results of a study comparing e-mail and mail for a survey of employees in several
government statistical agencies in the U.S. As part of a larger study of organization climate, employees in five agencies were
randomly assigned to a mail or e-mail mode of data collection. Similar procedures were used for advance contact and follow-up
of subjects across modes. This paper describes the procedures used to implement the e-mail survey, and discusses the results
of the mode experiment.
A Probability Sample of Gay Urban Males: The Use of Two-Phase Adaptive Sampling (Author: Johnny Blair)
Abstract: The major goal of the Gay Urban Males Study (GUMS) was to provide reliable (i.e., replicable) population estimates for the gay
male population of four major cities. In order to implement the probability sample necessary to achieve this objective, a
survey sample design was selected that took into account the likelihood of flaws in the data used for planning.
Initial Cooperators vs. Converted Refusers: Are There Response Behavior Differences? (Authors: Timothy Triplett, Johnny Blair, Teresa Hamilton, Yun Chiao Kang)
A Paper Describing the SRC "Fix-it Program" (Authors: Timothy Triplett, Beth Webb)
From Impressions to Data: Increasing the Objectivity of Cognitive Interviews (Authors: Frederick Conrad, Johnny Blair)
Conducting Cognitive Interviews to Test Self-Administered and Telephone Surveys: Which Methods Should We Use? (Authors: Susan Schechter, Johnny Blair, Janet Vande Hey)
Survey Procedures for Conducting Cognitive Interviews to
Pretest Questionnaires: A Review of Theory and Practice
(Authors: Johnny Blair, Stanley Presser)
All research papers on this site are Adobe PDF documents.