1976 — 1977 |
Kahn, Robert Groves, Robert |
N/A |
Measuring the Efficiency of Telephone Surveys For Social Science Data Collection @ University of Michigan Ann Arbor |
0.915 |
1978 — 1981 |
Cannell, Charles Groves, Robert |
N/A |
Experimentation With Computer-Assisted Telephone Data Collection Techniques @ University of Michigan Ann Arbor |
0.915 |
1993 — 1995 |
Groves, Robert M |
R01Activity Code Description: To support a discrete, specified, circumscribed project to be performed by the named investigator(s) in an area representing his or her specific interest and competencies. |
Survey Design Acknowledging Nonresponse @ University of Michigan At Ann Arbor
One of the most common methods of scientific inquiry in demographic research is the sample survey, which permits inference to large, diverse natural populations. One theoretical foundation of that inference rests on complete measurement of all sample units; that is, the absence of unit nonresponse. When sample persons who fail to participate differ on key health-related variables from those who do participate, nonresponse error can threaten inference for both descriptive and structural model-based estimates. The efficacy both of efforts to reduce nonresponse through data collection strategies and of statistical postsurvey adjustment for nonresponse rests on (at least implicit) theories of survey participation. The proposed project builds on four years of prior investigation of nonresponse features of several large U.S. national household surveys and on theoretical principles inductively derived from qualitative investigations of survey participation. Based on that work, the investigators have encouraged the enrichment of observational and other measures of attributes of the interviewer, neighborhood, household, and sample person, and of features of the interaction between interviewer and sample person. The collection of these data is specified at the design stage of a survey by a designer who anticipates that nonresponse will occur. The data are used to guide nonresponse reduction activities, decisions to terminate efforts to increase response rates, and statistical models for postsurvey adjustment. The proposed project evaluates the utility of some of these design features for the Health and Retirement Survey (HRS) and the upcoming Asset and Health Dynamics of the Oldest Old (AHEAD) study.
The specific aims of the project are: an examination of the characteristics of HRS nonrespondents relative to those of respondents, and of correlates of HRS nonresponse; construction of alternative postsurvey nonresponse adjustment techniques, evaluated through sensitivity analysis of important HRS analyses and the variance properties of the resulting estimators; construction of estimates of nonresponse error for key HRS descriptive statistics; development of cost and error models for nonresponse, evaluating the real tradeoff decisions investigators face between continuing to expend data collection resources to increase the response rate and limiting or eliminating further expenditures; and a set of proposed design features for the second-wave interviewing of HRS and the AHEAD project that are informative about the nonresponse error features of the studies.
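The postsurvey adjustment techniques mentioned above are often built from weighting classes. As a minimal illustration (not the project's actual estimator, and with hypothetical cell labels), a weighting-class adjustment inflates respondents' base weights so that each cell's respondents stand in for the cell's full sample:

```python
from collections import defaultdict

def weighting_class_adjustment(units):
    """Inflate respondents' base weights so each adjustment cell's
    respondents represent that cell's full sample.

    `units` is a list of dicts with keys 'cell', 'weight', 'respondent'.
    Returns a dict mapping unit index -> adjusted weight (respondents only).
    """
    cell_total = defaultdict(float)   # sum of base weights, all sample units
    cell_resp = defaultdict(float)    # sum of base weights, respondents only
    for u in units:
        cell_total[u["cell"]] += u["weight"]
        if u["respondent"]:
            cell_resp[u["cell"]] += u["weight"]

    adjusted = {}
    for i, u in enumerate(units):
        if u["respondent"]:
            adjusted[i] = u["weight"] * cell_total[u["cell"]] / cell_resp[u["cell"]]
    return adjusted

# Toy sample: two cells with different response rates
sample = [
    {"cell": "urban", "weight": 1.0, "respondent": True},
    {"cell": "urban", "weight": 1.0, "respondent": False},
    {"cell": "rural", "weight": 1.0, "respondent": True},
    {"cell": "rural", "weight": 1.0, "respondent": True},
]
w = weighting_class_adjustment(sample)
```

In the toy sample, the lone urban respondent's weight doubles to cover the urban nonrespondent, while the rural weights are unchanged; the adjusted weights still sum to the full sample size. The observational measures described above (interviewer, neighborhood, household attributes) are candidates for forming such cells.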
|
1 |
1995 — 1998 |
Singer, Eleanor Groves, Robert |
N/A |
Survey Respondents' Reactions to Persuasion Attempts: Ethical Valuations and Behavioral Response @ University of Michigan Ann Arbor
Increasingly relied on by government and the private sector, sample surveys address topics ranging from health care to prayer in schools to the design of future automobiles. They are a basic tool in commercial activity and in the statistical operations of government agencies, and they contribute to the induction and testing of theories in most of the social sciences. Increasingly, they are a common experience in the daily lives of U.S. residents. Efforts to maintain response rates, so as to preserve the representativeness of surveys and their inferential power, raise serious ethical issues for survey research. Focusing on the issue of equity, this pilot project will design and test methods to explore public reactions to ethical issues in the use of incentives and to alternative ways to reduce nonresponse, and to investigate the effect of these reactions on people's willingness to participate in surveys. The investigators want to find out under what conditions, if any, people feel an obligation to participate in a survey; under what conditions, if any, they expect a tangible reward for their participation; what strategies to increase participation offend their sense of equity or their standards of politeness; and how these various beliefs combine to affect the likelihood of participation. The pilot will examine the following methods to measure these variables: direct questioning to measure beliefs, perceptions, expectations, and standards; videotaped vignettes to measure predicted behavior in response to different combinations of survey characteristics and reward; and a small field experiment to measure actual behavior. The narrow aim of this research is to investigate the role of respondent perceptions in survey participation; it also has broader implications for understanding the motivational force of equity evaluations and the role of attitudes in influencing behavior. Several articles and plans for further research should result from this award.
|
0.915 |
2002 — 2008 |
Presser, Stanley (co-PI) [⬀] Singer, Eleanor Groves, Robert Couper, Mick (co-PI) [⬀] |
N/A |
Identifying Causal Mechanisms Underlying Nonignorable Unit Nonresponse Through Refusals to Surveys @ University of Michigan Ann Arbor
This project is a set of randomized experiments aimed at systematically both producing and eliminating unit nonresponse error in survey estimates. Specifically, three attributes of the survey request (topic interest, survey sponsor, and monetary incentives) will be experimentally manipulated in concert with choice of target population. Sampling frames containing persons with known characteristics (e.g., occupational groups, interest groups, groups of consumers of specific products or services) will be used. Randomly identified subsamples of these groups will be asked to participate in self-administered surveys on topics of relevance to the frame and topics irrelevant to the frame. Crossed with this factor, the sponsorship of the survey will be experimentally varied, with one sponsor relevant to the frame and one irrelevant to the frame. Finally, the use of a monetary incentive will be crossed with both of the other factors to measure the effects of extrinsic benefits of participation. The key hypothesis is that topic interest and sponsorship act to produce nonignorable nonresponse when the surveys contain items relevant to the frame, and that monetary incentives act to reduce the magnitude of nonresponse error by bringing into the respondent pool sample persons with low topic interest and minimal affect toward the sponsor. The effects of sponsor affect and topic interest are expected to be additive; monetary incentives are expected to counteract the nonignorability influences of both factors.
The practical importance of this work to statistics based on surveys is: a) to help agencies conducting surveys anticipate when different sponsors may obtain different results; b) to provide evidence about potentially harmful effects on nonresponse error of interviewers' emphasizing single purposes of a survey; c) to produce evidence regarding the ameliorating effects on nonresponse error of monetary incentives; and d) to test a conceptual structure that will help survey sponsors anticipate when nonresponse rates will affect error and when they will not. This research is supported by the Methodology, Measurement, and Statistics Program and a consortium of federal statistical agencies under the Research on Survey and Statistical Methodology Funding Opportunity.
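The fully crossed design described above can be sketched schematically (level names here are hypothetical placeholders, not the project's actual protocol):

```python
import itertools
import random

# Hypothetical level names for the three manipulated attributes of the request
topics = ["frame-relevant topic", "frame-irrelevant topic"]
sponsors = ["frame-relevant sponsor", "frame-irrelevant sponsor"]
incentives = ["no incentive", "monetary incentive"]

# Crossing all three factors yields a 2 x 2 x 2 = 8-cell design
conditions = list(itertools.product(topics, sponsors, incentives))

def assign(sample_ids, seed=0):
    """Shuffle sample persons, then deal them round-robin across the
    eight cells so cell sizes come out as equal as possible."""
    ids = list(sample_ids)
    random.Random(seed).shuffle(ids)
    return {sid: conditions[i % len(conditions)] for i, sid in enumerate(ids)}
```

Because every factor is crossed with every other, main effects of topic interest, sponsorship, and incentive, and their interactions, can be estimated separately, which is what the additivity hypothesis above requires.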
|
0.915 |
2005 — 2007 |
Groves, Robert M |
P01Activity Code Description: For the support of a broadly based, multidisciplinary, often long-term research program which has a specific major objective or a basic theme. A program project generally involves the organized efforts of relatively large groups, members of which are conducting research projects designed to elucidate the various aspects or components of this objective. Each research project is usually under the leadership of an established investigator. The grant can provide support for certain basic resources used by these groups in the program, including clinical components, the sharing of which facilitates the total research effort. A program project is directed toward a range of problems having a central research focus, in contrast to the usually narrower thrust of the traditional research project. Each project supported through this mechanism should contribute or be directly related to the common theme of the total research effort. These scientifically meritorious projects should demonstrate an essential element of unity and interdependence, i.e., a system of research activities and projects directed toward a well-defined research program goal. |
Core--Data Innovation @ University of Michigan At Ann Arbor
Program projects (versus separate R01s) make sense when combining separate ideas enhances the chance of significant scientific progress. We think this is an opportunity to do so, for two reasons: a) this P01 combines different disciplines, sensitive to different quality aspects of self-report data, bringing with them a set of complementary perspectives; and b) we share a set of data collection vehicles whenever possible, so that collaboration across projects is realized. The specific aims of the data innovation core are to: Aim 1: assemble the paradata analytic structure for the Health and Retirement Study; Aim 2: enhance the coordinated use of data collection vehicles among projects, especially in later years of the P01; and Aim 3: magnify the proposed effects of individual projects in the P01 by enhancing the data collection funds and consultative talent in response to workshops sponsored by the networking core of the P01.
|
1 |
2006 — 2009 |
Groves, Robert Raghunathan, Trivellore [⬀] |
N/A |
Dissertation Research: An Investigation of the Nexus of Survey Nonresponse and Measurement Error @ University of Michigan Ann Arbor
This project will examine the nexus between nonresponse and measurement errors in sample surveys. Recent research has not demonstrated a strong relationship between nonresponse rates and nonresponse bias. Nonetheless, best practices argue that researchers should attempt to maximize response rates. One voiced concern about nonresponse reduction practices is that reluctant sample persons, successfully brought into the respondent data set through persuasive efforts, may provide data filled with measurement error. However, no study has looked at the links among the propensity to be a respondent, true values on a question of interest, and the respondent's measurement error properties for that question. The research will fill this gap by addressing three questions. First, under what circumstances is nonresponse propensity related to the survey variables of interest? Is noncontact or refusal nonresponse more likely to induce nonresponse bias? Second, what is the relationship between nonresponse propensity, nonresponse bias, and measurement error? In particular, how do properties of questions and characteristics of respondents affect the nexus between nonresponse bias and measurement error? Is nonresponse propensity arising from noncontact or from refusal more susceptible to correlation with measurement-error bias? Third, can traditional statistical adjustments or analytic techniques remedy the problem? Data from two national and two regional surveys will be analyzed to answer these questions. The research will demonstrate when the gains from putting resources into nonresponse reduction are outweighed by increases in measurement error in key statistics. The research also will suggest changes to be made during field collection or in postsurvey adjustments that jointly account for the two error sources: nonresponse and measurement error.
Sample surveys are the mechanism by which key indicators of the nation's well-being are created and through which mechanisms of personal and societal change are tested. Understanding the error structures of sample surveys, both in terms of when errors occur and the mechanisms behind their occurrence, is critically important because conclusions drawn from surveys could otherwise be dramatically wrong. As nonresponse rates continue to rise, survey organizations are increasing their persuasive efforts in an attempt to maintain a representative sample on which inferences can be based. If these persuasive efforts actually lower the quality of data, then the additional funds for these efforts are misguided. The findings from this research will inform survey practitioners about how to conduct surveys that obtain better-quality data, and will inform the scientific community that uses surveys about when survey-derived findings may be at risk. As a Doctoral Dissertation Research Improvement award, this award also will provide support to enable a promising student to establish a strong independent research career. The research is supported by the Methodology, Measurement, and Statistics Program and a consortium of federal statistical agencies as part of a joint activity to support research on survey and statistical methodology.
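The link between response propensity and a survey variable can be made concrete with a standard approximation (Bethlehem's deterministic-propensity expression): the bias of the unadjusted respondent mean is roughly the covariance between propensity and the variable, divided by the mean propensity. A toy computation with made-up numbers:

```python
def nonresponse_bias(propensities, values):
    """Approximate bias of the unadjusted respondent mean:
    bias ~= cov(p, y) / mean(p), where p is each sample person's
    response propensity and y the survey variable of interest."""
    n = len(values)
    p_bar = sum(propensities) / n
    y_bar = sum(values) / n
    cov = sum((p - p_bar) * (y - y_bar)
              for p, y in zip(propensities, values)) / n
    return cov / p_bar

# Propensity unrelated to y: no bias, however low the response rate
flat = nonresponse_bias([0.5, 0.5, 0.5, 0.5], [1, 2, 3, 4])

# Propensity rising with y: respondents overstate the population mean
rising = nonresponse_bias([0.2, 0.4, 0.6, 0.8], [1, 2, 3, 4])
```

The two cases illustrate the project's first question: a low response rate alone does not produce bias (`flat` is zero); bias arises only when propensity covaries with the survey variable (`rising` is positive).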
|
0.915 |
2006 — 2012 |
Groves, Robert Tourangeau, Roger [⬀] |
N/A |
The Effects of Survey Presentation On Nonignorable Nonresponse and Measurement Error @ University of Michigan Ann Arbor
Every request to take part in a survey is framed in some way. This project consists of a set of experiments that investigate how the presentation of the survey request affects nonresponse and measurement error. The experiments are guided by a theory of survey participation (leverage-salience theory) that claims that people decide whether to take part in a survey based on whatever aspects of the survey are made salient in the presentation of the survey request and on how they evaluate those features. Two initial experiments randomly vary the description of the topic and sponsor of the survey, with hypothesized effects both on nonresponse propensities and on reporting. In the third experiment, survey design features that can mediate or reduce the error-producing influences of the survey topic and sponsor will be examined. Thus, the project experimentally tests mechanisms producing nonresponse bias and measurement errors and, once these effects have been documented, provides guidance to the survey practitioner about how to reduce their impact.
While the research is theoretically motivated and features experimental control, there are important practical implications of the work for the federal statistical agencies and the larger survey community. Sometimes estimates of key social indicators (e.g., the prevalence of rape or the frequency of defensive use of handguns) vary widely across surveys. The effects explored in this project may help explain these discrepancies. In addition, this work will a) help agencies conducting surveys anticipate when different sponsors may obtain different results, b) provide evidence about potentially harmful effects on nonresponse error and measurement error of emphasizing a single purpose of a survey, and c) produce evidence regarding design features that can reduce the effects of the presentation of the survey on nonresponse and measurement error.
|
0.915 |
2008 — 2011 |
Groves, Robert Conrad, Frederick (co-PI) [⬀] |
N/A |
Collaborative Research: Acoustic Properties, Listener Perceptions, and Outcomes of Interactions Between Survey Interviewers and Sample Persons @ University of Michigan Ann Arbor
While the sample survey is one of the cornerstones of social science research methods, the tool is experiencing large declines in participation of sampled persons. This project studies participation in cross-section Random-Digit Dialed (RDD) telephone surveys. The declines have increased research costs so significantly that terminating some basic surveys is under serious consideration.
One large source of variation in RDD cooperation rates is the interviewer. In centralized telephone interviewing facilities, interviewers are often assigned similar mixes of cases, yet they obtain very different response rates. All the stimuli yielding a decision to participate in an RDD survey must be delivered through the audio channel: through the words, pitch, inflection, and pacing of the interviewers' speech. These attributes are not the traditional objects of study of survey methodologists. Hence, the project combines insights from speech science, phonetics, psycholinguistics, and survey methodology. A central conceptual framework utilizes the companion notions of tailoring, convergence, and similarity in cooperative dyadic communication.
A set of 5,000 digital audio recordings of RDD telephone interviewer introductions (from two data collection organizations, 6 different surveys, and 165 different interviewers) is transformed into a quantitative data set suitable for dynamic and static statistical models of response propensities. Acoustic measures motivated by concepts from speech science are extracted using the Praat acoustic software; raters judge perceived attributes of speakers using concepts central both to leverage-salience theory and to social psychology; and word and disfluency rates are extracted from transcriptions of the audio files.
The resulting data set is a relational one with records at the levels of the interviewer, respondent (case), contact, and conversational turn. Hierarchical and survival models are built using the combined conceptual frameworks predicting the final participation decision of a sample case.
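That four-level relational structure might be represented, in outline, like this (field names are hypothetical, chosen only to show the nesting of conversational turns within contacts within cases, each case tied to an interviewer):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Turn:                    # one conversational turn in an introduction
    speaker: str               # "interviewer" or "sample person"
    words_per_minute: float    # speech rate from the transcription
    mean_pitch_hz: float       # acoustic measure extracted from the audio
    disfluencies: int          # e.g., filled pauses, restarts

@dataclass
class Contact:                 # one call attempt on a sampled number
    outcome: str               # e.g., "refusal", "interview", "no answer"
    turns: List[Turn] = field(default_factory=list)

@dataclass
class Case:                    # one sampled telephone number
    interviewer_id: str
    contacts: List[Contact] = field(default_factory=list)

    @property
    def participated(self) -> bool:
        """Final participation decision for this sample case."""
        return any(c.outcome == "interview" for c in self.contacts)
```

Records at each level can then feed hierarchical models (turns nested in contacts nested in cases nested in interviewers) or discrete-time survival models over successive contacts.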
While the research emphasizes basic science, it has potential practical outcomes. Academic survey research is a crucial tool for a $15 billion commercial research industry in the US; this project's success can contribute to the health of this commercial sector.
|
0.915 |
2008 — 2013 |
Schwarz, Norbert [⬀] Groves, Robert Peytcheva, Emilia (co-PI) [⬀] |
N/A |
Doctoral Dissertation Research: Language of Administration as a Source of Measurement Error: Implications For Surveys of Immigrants and Cross-Cultural Survey Research @ University of Michigan Ann Arbor
U.S. surveys of minorities and immigrants allow respondents to answer in the language of their choice. Psychological and linguistic research suggests that the language of administration may influence responses through its impact on various cognitive processes and through the cultural frame that a language brings to the conversational context. However, the extent to which the language of survey administration affects survey responding is currently unknown. The objectives of this project are to examine the extent to which language of administration influences bilingual bicultural respondents' answers to survey questions and to investigate which types of questions are most susceptible to language effects. These objectives will be addressed through secondary analyses of observational data and an experiment. Secondary analyses of the New Immigrant Survey will compare responses between those interviewed in English and those interviewed in Spanish and will identify the types of questions for which differences are particularly pronounced. The lack of random assignment to a language will be overcome by using propensity score methods. In addition, an experimental web survey of about 300 adult Arab-American immigrants from the Detroit area, randomly assigned to an English or Arabic version of the questionnaire, will be launched. The experiment will include questions with varying degrees of sensitivity and social desirability across the American and Arab cultures to assess whether the language of administration affects conformity with cultural norms. It also will include autobiographical questions to assess differences in recall as a function of the match between the language spoken at the time of initial encoding and the language at the time of recall.
This project brings basic psychological and linguistic research to bear on surveys of bilingual bicultural respondents, for whom the language of survey administration may evoke different cultural frames. It will illuminate how the language of survey administration affects respondents' answers and will highlight the methodological implications for surveys of immigrant populations. The broader societal impact of this work derives from the relevance of collecting accurate data from immigrants. If language affects socially desirable responding and the accuracy of autobiographical recall, the choice of language may be an important factor in collecting accurate data for policy decisions. At present, little is known about these possibilities, despite the considerable resources invested in surveys of immigrant populations. As a Doctoral Dissertation Research Improvement award, this award also will provide support to enable a promising student to establish a strong independent research career.
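The propensity score idea invoked above, balancing nonrandomized language groups on observed covariates before comparing their answers, can be sketched as follows. This is a minimal illustration, not the project's actual estimator: the covariates and outcome are invented, and the propensity model is a bare-bones gradient-ascent logistic regression rather than a production fitter.

```python
import math

def fit_propensity(X, z, steps=2000, lr=0.1):
    """Logistic regression by plain gradient ascent: models P(z=1 | x),
    here the probability of answering in English given covariates X."""
    n, dim = len(X), len(X[0]) + 1        # +1 for the intercept
    beta = [0.0] * dim
    for _ in range(steps):
        grad = [0.0] * dim
        for xi, zi in zip(X, z):
            row = [1.0] + list(xi)
            eta = sum(b * v for b, v in zip(beta, row))
            p = 1.0 / (1.0 + math.exp(-eta))
            for j, v in enumerate(row):
                grad[j] += (zi - p) * v
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    return beta

def propensity(beta, xi):
    eta = beta[0] + sum(b * v for b, v in zip(beta[1:], xi))
    return 1.0 / (1.0 + math.exp(-eta))

def weighted_group_means(X, z, y, beta):
    """Inverse-propensity-weighted outcome means for the two language
    groups, balancing the groups on the covariates in X."""
    num, den = [0.0, 0.0], [0.0, 0.0]
    for xi, zi, yi in zip(X, z, y):
        p = propensity(beta, xi)
        w = 1.0 / p if zi else 1.0 / (1.0 - p)
        num[zi] += w * yi
        den[zi] += w
    return num[1] / den[1], num[0] / den[0]
```

When an outcome depends only on the covariate and not on the language itself, the weighted group means converge toward each other even though the raw means differ, which is exactly the adjustment a nonrandomized language comparison needs.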
|
0.915 |
2009 — 2010 |
Groves, Robert Macwhinney, Brian (co-PI) [⬀] |
N/A |
Eager: Workshop On Shared Infrastructure For the Social and Behavioral Sciences @ University of Michigan Ann Arbor
Robert M. Groves, University of Michigan
SBE 0930160
This project supports a workshop, to be mounted in the spring of 2009, addressing the shared infrastructure needs of the social and behavioral sciences. The workshop addresses the fundamental question: what social and behavioral scientific questions of great import to human understanding cannot now be answered because researchers lack the infrastructure needed to answer them? The infrastructure needs involve both data resources and satisfactory methodological tools. The focus of the workshop centers on how human scientists can build the needed infrastructure.
Participants in this project are chosen to represent researchers at the cutting edge of individual lines of inquiry. Viewpoints include both bold new initiatives and key additions to well-established thinking about current human science domains.
The product of this workshop is a listing of key questions that can be answered with new infrastructure, explicated in a language that can be understood across the SBE disciplines represented in the Directorate. The results are potentially transformative because the discussion has broad impact for researchers doing social and behavioral research across all subfields of the human sciences.
|
0.915 |