Author ORCID Identifier
Reiter-Palmon https://orcid.org/0000-0001-8259-4516
Document Type
Article
Publication Date
5-1-2019
Publication Title
Psychology of Aesthetics, Creativity, and the Arts
Volume
13
Issue
2
First Page
144
Last Page
152
Abstract
Divergent thinking tests are often used in creativity research as measures of creative potential. However, measurement approaches vary considerably across studies. One facet of divergent thinking measurement that contributes strongly to differences across studies is the scoring of participants’ responses. Most commonly, responses are scored for fluency, flexibility, and originality. However, even with respect to a single dimension (e.g., originality), scoring decisions vary extensively. In the current work, a systematic framework for practical scoring decisions was developed. Scoring dimensions, instruction-scoring fit, adequacy of responses, objectivity (vs. subjectivity), level of scoring (response vs. ideational pool level), and the method of aggregation were identified as determining factors of divergent thinking test scoring. In addition, recommendations and guidelines for making these decisions and for reporting scoring information in papers are provided.
Recommended Citation
Reiter-Palmon, R., Forthmann, B., & Barbot, B. (2019). Scoring divergent thinking tests: A review and systematic framework. Psychology of Aesthetics, Creativity, and the Arts, 13(2), 144-152. https://doi.org/10.1037/aca0000227
Comments
© American Psychological Association, 2019. This paper is not the copy of record and may not exactly replicate the authoritative document published in the APA journal. The final article is available, upon publication, at: https://doi.org/10.1037/aca0000227