This study investigated the presence of response heaping in survey items of the Constant Sum format. We hypothesized that participants would tend to provide responses indicative of response heaping. Results indicate that participants exhibited response heaping in an overwhelming 99.75% of responses. Further, the sample frame in this study differed considerably from those of previous studies that have investigated response heaping in other contexts, and our findings contrasted sharply with the extant literature with respect to who is likely to engage in response heaping. We conclude that no demographic group is immune to response heaping tendencies when presented with Constant Sum item formats. We encourage social researchers to remain cognizant of response heaping and to make every effort to mitigate its effects on measurement error and, ultimately, score and measure validity.
Surveys have long been one of the most popular approaches to research in the social sciences. Typically designed to reach a large audience, well-designed surveys provide an ideal methodology for learning more about a person’s opinions, attitudes, perceptions, or social behaviors. The ultimate goal of survey research is to facilitate quantitative analysis that generalizes to an entire population by measuring particular constructs within a sample of people who represent the population of interest. 1 Although there are numerous benefits to survey research methods, a considerable body of literature has also noted a number of validity threats that contribute to measurement error. For example, selection bias, coverage bias, nonresponse bias, voluntary response bias, and social desirability bias, among others, are well documented. A far less well-known type of bias is response heaping.
Response heaping occurs in survey questions that require a numeric response when respondents indicate a preference for rounded numbers, often those ending in 0 or 5. 2, 3 A classic example is the tendency for people to report their ages in multiples of 5. 4 Heaped response patterns deviate from the empirical distributions that would otherwise be expected for the underlying measurements. 4 Heaping is relatively common; it has been observed in areas including hunting and fishing, 3, 5 compensation, 6 and blood pressure assessment, 7 to name a few. Questions are more likely to result in response heaping when they require respondents to estimate an answer and less likely to do so when respondents can count to arrive at an answer. 8 Heaping is observed more often when data are self-reported (such as age), when limited-precision instruments are used (such as weights), and in rounded data (such as grade point averages). 9
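To make the notion of digit preference concrete, the short Python sketch below tallies the terminal digits of a set of hypothetical self-reported ages (illustrative values only, not data from any cited study); heaping appears as excess frequency on the digits 0 and 5.

```python
from collections import Counter

# Hypothetical self-reported ages, for illustration only (not study data)
reported_ages = [25, 28, 30, 30, 32, 35, 35, 40, 40, 40, 43, 45, 50, 50, 55, 60]

# Tally terminal digits; heaping shows up as excess mass on digits 0 and 5
terminal_digits = Counter(age % 10 for age in reported_ages)

for digit in range(10):
    share = terminal_digits[digit] / len(reported_ages)
    print(f"terminal digit {digit}: {share:.0%}")
```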
Response heaping is thought to be the result of multiple cognitive processes. Survey participants may have a cultural tendency to choose ‘round’ or ‘even’ numbers, referred to as digit preference, or they may report values as multiples of a convenient rate or timeframe. 4 Additional evidence points to uncertainty and ease of task as indicators of heaping as a response strategy, 2 and others have suggested that heaping may be a form of survey satisficing. 10
2.2. Response Heaping as a Challenge for Survey Researchers
Social research relies heavily on the use of surveys as a source of measurement. Inherent to survey research is the concept of total survey error, of which measurement error is a subtype. Measurement error can be attributed to biases and variance brought about by the respondent’s own behavior (e.g., misreporting true beliefs, failing to pay attention to a question), interviewer behavior (e.g., mis-recording responses, providing misdirection), and the questionnaire itself (e.g., confusing question wording or response options). 11
Response heaping, although not studied extensively in the social sciences, can contribute to measurement error. Often, survey research uses a ‘prototype,’ or a single value, to represent a range of responses. 12 This can be problematic, as response heaping indicates imprecision in the data, and frequency distributions with heaps can bias measures such as means or totals. Barreca, Lindo, and Waddell 9 note that in regression-discontinuity designs, bias is particularly likely in the outcome variable when there is heaping in the running variable, thus threatening the internal validity of the measure.
2.3. Survey Item Types
Item types also may play a role in response heaping. At present, we are unaware of any research that has investigated response heaping by item type. However, studies that have investigated heaping have tended to focus on open-ended data in which participants were asked to provide values (e.g., frequency of behaviors, age, etc.). Constant Sum items seem particularly vulnerable, however, because they likewise require participants to enter open-ended numeric values, with the added constraint that the values must sum to 100.
2.4. Purpose
The purpose of this study was to investigate the presence of response heaping in survey items of the Constant Sum format. We hypothesize that participants will tend to provide responses indicative of response heaping.
A convenience sample of 100 first-year students enrolled in a Doctor of Veterinary Medicine (DVM) program served as the sample frame. The sample consisted of considerably more females (n = 85, 85%) than males (n = 15, 15%), a characteristic common in veterinary medical programs. With respect to race and ethnicity, 73 (73%) self-identified as White, 9 (9%) as Asian, 8 (8%) as Black, 7 (7%) as Hispanic, and 3 (3%) as Other.
3.2. Procedures
Data were collected as part of a routine programmatic assessment, and permission to conduct the study was granted by the university’s Institutional Review Board (IRB). An electronic survey was administered to incoming DVM program students in August 2017. The larger program assessment included items about previous educational experiences, learning preferences, study strategies previously used, career goals, and more. Students were informed that participation in the study was purely voluntary, yet all 100 students completed the survey, for a 100% response rate.
3.3. Measures
A subset of four items was selected for the present study. These items were selected because they used a Constant Sum format, in which the values (each ranging from 0 to 100) entered by participants must sum to 100. That is, participants must enter values into a free text box, as opposed to selecting an already available option. A sample layout of the item format is presented in Figure 1. The four items presented participants with definitions of a “fixed” mindset and a “growth” mindset and asked students to characterize their current mindset with respect to intelligence, personality, skills/abilities, and attitude. Each participant was asked to respond to each of the four items, with each item requiring a response in two columns.
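To illustrate the constraint this format imposes, the following Python sketch checks a pair of entered values against the Constant Sum rule. The function name and the two-column representation are illustrative assumptions, not the survey platform’s actual implementation.

```python
def validate_constant_sum(fixed: int, growth: int, total: int = 100) -> bool:
    """Return True if both values fall in [0, total] and sum to exactly total."""
    in_range = 0 <= fixed <= total and 0 <= growth <= total
    return in_range and fixed + growth == total

print(validate_constant_sum(70, 30))  # True  (a heaped pair of values)
print(validate_constant_sum(33, 67))  # True  (valid, but not heaped)
print(validate_constant_sum(60, 50))  # False (exceeds the required total)
```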
Because the emphasis of this study was on participants’ response patterns to the four Constant Sum items, substantive findings were beyond its scope. Data analysis therefore consisted simply of producing descriptive statistics, namely frequency values. These values were then evaluated to determine how many were divisible by 5 or 10, thereby exhibiting potential evidence of response heaping.
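As a minimal sketch of this analysis, assuming the responses are stored as a flat list of integer percentages (an assumption about data layout, not the authors’ actual code), the proportion of heaped values can be computed as follows:

```python
# Hypothetical entered percentages (illustrative; not the study's raw data)
responses = [70, 30, 50, 50, 80, 20, 33, 67, 60, 40]

# A value divisible by 5 also covers divisibility by 10
heaped = [value for value in responses if value % 5 == 0]
proportion_heaped = len(heaped) / len(responses)

print(f"{len(heaped)} of {len(responses)} responses "
      f"({proportion_heaped:.2%}) are consistent with response heaping")
```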
A total of 100 veterinary medical students completed four Constant Sum items, each consisting of two categories (fixed mindset or growth mindset). No missing data were present in any participant’s response vector. Of the 800 responses evaluated, 798 (99.75%) contained values indicative of response heaping (values divisible by 5 or 10). Only one student provided non-heaped values, entering 33% and 67%. Table 1 presents a frequency table highlighting the percentage values students selected.
Results of this study indicate clear evidence of response heaping. Although one might reasonably assume that participants would largely select incremental values when responding to survey items of this nature, we were surprised that only one participant provided a response that did not involve an increment of 5 or 10. This individual selected values of 33% and 67%, respectively. Given that these values approximate a 2:1 ratio, it was surprising that considerably more participants did not also select this option. Additionally, we anticipated that some participants might select values of 49% and 51%, respectively, as this pair would indicate a nearly even split with a minor leaning toward one mindset type. No responses of this nature were discernible in the data.
One potential issue may be that participants were presented four Constant Sum items for which the values must total 100. It is possible that participants were more likely to engage in response heaping because of this large range. However, it should be noted that a total of 100 is the default option for Constant Sum item types in most survey software packages, which lends support to the notion that response heaping likely is very common when using this item type. Interestingly, this phenomenon also poses questions about how participants might respond if the total sum were smaller or different. One particularly interesting question is how participants might respond if presented a range whose maximum is not divisible by 5 or 10 (e.g., 0 to 33), or one whose maximum is divisible by 5 but not by 10 (e.g., 0 to 25). Future research should explore these conditions to determine whether participant behavior changes.
Perhaps the central concern with response heaping is that these behaviors tend to introduce error that contaminates measures and threatens the validity of scores and ratings. For social research, this can be a significant problem. For example, multiple instances of response heaping can have a compounding effect on error that may distort score validity. Although an occasional over- or under-estimate of a value by a single participant may not have a significant bearing on the accuracy of a measure, response heaping by most, if not all, participants is certain to significantly affect both the accuracy and the stability of a measure. In social research in which researchers ask questions about participants’ behaviors (e.g., how often a participant engages in a particular behavior), opinions (e.g., the degree to which a participant finds something acceptable or unacceptable), perspectives (e.g., how strongly a participant holds a particular view), and various latent traits (e.g., how participants might characterize themselves), response heaping may pose rather serious validity threats. Thus, social researchers must remain cognizant of these errors and consider ways to best mitigate them.
Previous research has indicated that persons of low socioeconomic status, 13 persons with lower literacy levels, 14 and persons in less modernized countries 15 are more likely to engage in response heaping. Work by Boyle and Gráda 16 found some limited evidence that women are more likely to engage in response heaping than men, and research by Coale 17 found that non-Whites were more likely to engage in response heaping than Whites. Barbieri and Hertrich 18 reported that heaping varies across countries and cultures. Participants in the present study, however, largely were highly educated females of relatively higher socioeconomic status residing in the United States. Given this contrast in demographic characteristics with the extant research and the prevalence of response heaping behaviors observed, we conclude that no demographic group is immune to response heaping tendencies.
Walejko 19 noted that response heaping may be a form of satisficing, as some participants may not want to answer questions carefully or honestly. This also is an interesting consideration for social research, as the topics studied often are of a personal nature to participants. Thus, it remains unknown whether the nature of the subject (particularly if it is personal), the social distance between the item and the participant, or other social factors might affect response heaping tendencies. These questions provide interesting pathways for future research.
This study has several limitations. All 100 participants invited to participate completed all four items, thus mitigating the effects of nonresponse bias, and no missing data were discernible, thus mitigating some elements of response bias. However, because the sample consisted of students in a health professions degree program, it is unknown whether the results generalize to other contexts, including other higher education samples. Further, because the sample frame consisted predominantly of females, we did not have sufficient numbers of male participants to test for gender effects. Similarly, the sample largely consisted of White persons between the ages of 22 and 28, so we were unable to explore other factors such as race/ethnicity and age.
Another limitation is that the data consisted of a convenience sample of incoming veterinary students at one college of veterinary medicine. The sample frame in no way resembles population parameters among members of the general public in the United States; thus, we present no evidence for how a representative cross-section of American participants might respond to Constant Sum items. Future research should focus on exploring response heaping behaviors across a variety of cultural groups and contexts.
This study investigated the presence of response heaping in survey items of the Constant Sum format, hypothesizing that participants would tend to provide responses indicative of response heaping. Participants exhibited response heaping in an overwhelming 99.75% of responses. Further, the sample frame in this study differed considerably from those of previous studies that have investigated response heaping in other contexts, and our findings contrasted sharply with the extant literature with respect to who is likely to engage in response heaping. We conclude that no demographic group is immune to response heaping tendencies when presented with Constant Sum item formats, and we encourage social researchers to remain cognizant of response heaping and to make every effort to mitigate its effects on measurement error and, ultimately, score and measure validity.
[1] Neuman, W.L. Social Research Methods: Pearson New International Edition: Qualitative and Quantitative Approaches. Pearson Higher Ed, 2013.
[2] Holbrook, A.L., Anand, S., Johnson, T.P., Cho, Y.I., Shavitt, S., Chavez, N., and Weiner, S. “Response Heaping in Interviewer-Administered Surveys: Is It Really a Form of Satisficing?” Public Opinion Quarterly, 78(3), 591-633, 2014.
[3] Vaske, J.J., and Beaman, J. “Lessons Learned in Detecting and Correcting Response Heaping: Conceptual, Methodological, and Empirical Observations.” Human Dimensions of Wildlife, 11(4), 285-296, 2006.
[4] Roberts, J.M., and Brewer, D.D. “Measures and tests of heaping in discrete quantitative distributions.” Journal of Applied Statistics, 28(7), 887-896, 2001.
[5] Hultsman, W.Z., Hultsman, J.T., and Black, D.R. “Hunting satisfaction and reciprocal exchange: Initial support from a lottery-regulated hunt.” Leisure Sciences, 11(2), 145-150, 1989.
[6] Hirsch, B.T. “Why Do Part-Time Workers Earn Less? The Role of Worker and Job Skills.” ILR Review, 58(4), 525-551, 2005.
[7] Wen, S.W., Kramer, M.S., Hoey, J., Hanley, J.A., and Usher, R.H. “Terminal digit preference, random error, and bias in routine clinical measurement of blood pressure.” Journal of Clinical Epidemiology, 46(10), 1187-1193, 1993.
[8] Burton, S., and Blair, E. “Task Conditions, Response Formulation Processes, and Response Accuracy for Behavioral Frequency Questions in Surveys.” Public Opinion Quarterly, 55(1), 50, 1991.
[9] Barreca, A.I., Lindo, J.M., and Waddell, G.R. “Heaping-Induced Bias in Regression-Discontinuity Designs.” Economic Inquiry, 54(1), 268-293, 2015.
[10] Krosnick, J.A. “Response strategies for coping with the cognitive demands of attitude measures in surveys.” Applied Cognitive Psychology, 5(3), 213-236, 1991.
[11] Krosnick, J.A., Lavrakas, P.J., and Kim, N. “Survey Research.” In Handbook of Research Methods in Social and Personality Psychology (pp. 404-442). Cambridge University Press, 2013.
[12] Beaman, J., Vaske, J.J., Schmidt, J.I., and Huan, T.C. “Measuring and Correcting Response Heaping Arising From the Use of Prototypes.” Human Dimensions of Wildlife, 20(2), 167-173, 2015.
[13] Myers, R.J. “An Instance of Reverse Heaping of Ages.” Demography, 13, 577-580, 1976.
[14] Budd, J.W., and Guinnane, T. “Age-Misreporting, Age-Heaping, and the 1908 Old Age Pension Act in Ireland.” Population Studies, 45, 497-518, 1991.
[15] Nagi, M.H., Stockwell, E.G., and Snavley, L.M. “Digit Preference and Avoidance in the Age Statistics of Some Recent African Censuses: Some Patterns and Correlates.” International Statistical Review, 41, 165-174, 1973.
[16] Boyle, P.P., and Gráda, C.O. “Fertility Trends, Excess Mortality, and the Great Irish Famine.” Demography, 23, 543-562, 1986.
[17] Coale, A. “The Population of the United States in 1950 Classified by Age, Sex, and Color: A Revision of Census Figures.” Journal of the American Statistical Association, 50(269), 16-54, 1955.
[18] Barbieri, M., and Hertrich, V. “Age Difference between Spouses and Contraceptive Practice in Sub-Saharan Africa.” Population, 60, 617-654, 2005.
[19] Walejko, G.K. “The Effectiveness of an Interactive Web Survey in Decreasing Satisficing and Social Desirability Bias.” Paper presented at the Annual Meeting of the American Association for Public Opinion Research, Chicago, IL, USA, 2010.