Learning Outcomes Related to the Use of Personal Response Systems in Large Science Courses

by Jolee West, Wesleyan University


The use of Personal Response Systems, or polling technology, has been receiving wider attention within academia and in the popular press. While neither the technology nor the pedagogical goals are new, general knowledge and implementation of course-related polling appear to have recently reached a critical threshold. Between 2004 and 2005, the implementations by “early adopters”[1] began to seriously influence the “early majority,” resulting in wider visibility of the technology. This trend is illustrated by the increasing number of references to “clickers” and “personal response systems” on the EDUCAUSE website from 2004 until the present, as well as a recent spate of newspaper and e-zine articles.[2]

Many institutions, including community colleges, liberal arts colleges, and large research universities, have now adopted Personal Response Systems in their larger lecture courses across the curriculum. For example, MIT, the University of Massachusetts-Amherst, Harvard, Yale, Brown, the University of Virginia, Vanderbilt, and Duke have all implemented personal response systems for large physics and/or biology lecture courses. A number of implementations took place under the auspices of grant programs, such as the Pew Center for Academic Transformation and the Davis Educational Foundation’s Creating Active Learning Through Technology, which focus on the economics of teaching large lecture courses and on transforming these typically passive-learning courses into active learning experiences for students.

But as is often the case with the adoption of new instructional technologies, arguments for adoption rarely rest on published analyses demonstrating improvements in learning outcomes. Commonly, such assessments simply have not been performed. Nevertheless, in researching the technology for my own institution, I searched hard for learning outcome studies. I found that data abound on student satisfaction with personal response systems: whether they made class more interesting, improved attendance, and the like.[3] But reports of learning outcomes are few and far between. What follows is a discussion of four references I found reporting learning outcome analyses related to the use of interactive engagement pedagogical methods in large science courses. Only in the last two cases are personal response systems specifically mentioned. But as we will see, the technology is not really the star of this show; not surprisingly, it is the pedagogy that takes center stage.

A controlled study by Ebert-May et al. shows that student confidence in their knowledge of course material is significantly higher in courses taught with interactive engagement methods than in those taught by traditional lecture: “Results from the experimental lectures at NAU suggest that students who experienced the active-learning lecture format had significantly higher self-efficacy and process skills than students in the traditional course. A comparison of mean scores from the self-efficacy instrument indicated that student confidence in doing science, in analyzing data, and in explaining biology to other students was higher in the experimental lectures (N = 283, DF = 3, 274, P < 0.05).”[4]

A large study by Hake of 63 introductory physics courses taught with traditional methods versus interactive engagement (IE) methods examined student learning outcomes using a commonly applied pre- and post-test design based on the Halloun-Hestenes Mechanics Diagnostic test and the Force Concept Inventory. The study, which included 6,542 students, concluded that “A plot of average course scores on the Hestenes/Wells problem-solving Mechanics Baseline test versus those on the conceptual Force Concept Inventory show a strong positive correlation with coefficient r = + 0.91. Comparison of IE and traditional courses implies that IE methods enhance problem-solving ability. The conceptual and problem-solving test results strongly suggest that the use of IE strategies can increase mechanics-course effectiveness well beyond that obtained with traditional methods [original emphasis].”[5]
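The two quantities behind this kind of pre-/post-test analysis can be sketched concretely: the normalized gain commonly used with the Force Concept Inventory, and the correlation coefficient reported between the two tests. The sketch below is purely illustrative; all scores are invented, not data from Hake's study.

```python
# Illustrative sketch of a pre-/post-test analysis. The normalized gain
# <g> = (post - pre) / (100 - pre) measures what fraction of the possible
# improvement a course achieved. All scores below are hypothetical.

def normalized_gain(pre_pct, post_pct):
    """Average normalized gain for a course, from mean pre/post scores (0-100)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two sets of course averages."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented mean scores: a traditional course vs. an interactive one.
traditional = normalized_gain(pre_pct=45.0, post_pct=58.0)   # ~0.24
interactive = normalized_gain(pre_pct=45.0, post_pct=75.0)   # ~0.55

# Invented per-course averages on a conceptual test and a problem-solving test.
conceptual = [48.0, 55.0, 62.0, 70.0, 78.0]
problem_solving = [40.0, 47.0, 55.0, 64.0, 71.0]
r = pearson_r(conceptual, problem_solving)
print(round(traditional, 2), round(interactive, 2), round(r, 3))
```

Because the gain is normalized by room for improvement, it lets courses with different incoming populations be compared on the same scale, which is what makes a 63-course survey meaningful.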

The Pew Center for Academic Transformation has been interested in examining the transformation of courses from passive to active learning experiences through controlled experiments. One of its beneficiaries, the University of Massachusetts-Amherst, conducted a two-year study of courses redesigned around a Personal Response System. The Office of Academic Planning and Assessment at the University of Massachusetts concluded that in these courses, “[attendance] in the redesigned sections was consistently high, and students performed consistently better on the new problem-centered exams compared to the old exams based on recall of facts.”[6]

Lastly, a recent study by Kennedy and Cutts examined actual per-student response data over the course of a single semester. In-class questions were of two types: one asked students to self-assess their study habits, and the other focused on course content. These data were analyzed against end-of-semester and end-of-year exam performance using cluster analysis and MANOVA. Their investigation showed that students who participated more frequently in use of the personal response system, and whose responses were frequently correct, performed better on formal assessments. Students who responded infrequently, even when they did so correctly, nevertheless performed poorly on formal assessments, suggesting that the level of involvement during class is positively correlated with better learning outcomes.[7]
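The shape of this analysis can be sketched in miniature: group students by response frequency and correctness, then compare exam performance across groups. The records below are invented, and the quadrant split is a deliberate simplification of the cluster analysis and MANOVA the study actually used.

```python
# Hypothetical sketch of the kind of grouping Kennedy and Cutts describe:
# students bucketed by how often they responded in class and how often they
# were correct, with mean exam scores compared across buckets.
# All student records below are invented for illustration.

from statistics import mean

# (response_rate, accuracy, exam_score) per student -- invented data.
students = [
    (0.90, 0.80, 82), (0.85, 0.75, 78), (0.80, 0.90, 88),  # frequent & correct
    (0.20, 0.90, 55), (0.15, 0.85, 60),                    # infrequent but correct
    (0.90, 0.30, 65), (0.10, 0.20, 50),
]

def group(rate, acc):
    """Assign a student to a participation/correctness quadrant."""
    freq = "frequent" if rate >= 0.5 else "infrequent"
    corr = "correct" if acc >= 0.5 else "incorrect"
    return f"{freq}/{corr}"

by_group = {}
for rate, acc, exam in students:
    by_group.setdefault(group(rate, acc), []).append(exam)

for name, scores in sorted(by_group.items()):
    print(name, round(mean(scores), 1))
```

In this toy data, as in the study's finding, the frequent-and-correct group outscores the infrequent-but-correct group on the exam, which is the pattern that points to participation, not just knowledge, as the operative factor.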

To sum up, what my search found was that where data exist, they support not just the use of personal response systems but, more specifically, the pedagogy associated with the use of these systems. These studies suggest that better learning outcomes are really the result of a change in pedagogical focus, from passive to active learning, and not of the specific technology or technique used. This is an important caveat for interested faculty: the technology is not a magic bullet. Without a focused, well-planned transformation of the large lecture format and its pedagogical goals, the technology provides no advantage. If the manner in which the technology is used in class is neither meaningful nor interesting to students, participation lapses. Ultimately, what these studies demonstrate is that student participation is key to positive learning outcomes.


    1. See E. M. Rogers, Diffusion of Innovations (New York: Collier Macmillan, 1983).

    2. C. Dreifus, “Physics Laureate Hopes to Help Students Over the Science Blahs,” New York Times (Nov. 1, 2005), http://www.nytimes.com/2005/11/01/science/01conv.html?ex=1132376400&en=c13349a4a1f8cf78&ei=5070&oref=login; Alorie Gilbert, “New for Back-to-school: ‘Clickers,’” CNET’s News.com (2005), http://news.com.com/New+for+back-to-school+clickers/2100-1041_3-5819171.html?tag=html.alert; Natalie P. McNeal, “Latest Campus Clicks a Learning Experience,” The Miami Herald (Oct. 17, 2005), http://www.miami.com/mld/miamiherald/news/12920758.htm.

    3. Steven R. Hall, Ian Waitz, Doris R. Brodeur, Diane H. Soderholm, and Reem Nasr, “Adoption of Active Learning in a Lecture-based Engineering Class,” IEEE Conference (Boston, MA, 2005), http://fie.engrng.pitt.edu/fie2002/papers/1367.pdf; S. W. Draper and M. I. Brown, “Increasing Interactivity in Lectures Using an Electronic Voting System,” Journal of Computer Assisted Learning 20 (2004): 81-94, http://www.blackwell-synergy.com/links/doi/10.1111/j.1365-2729.2004.00074.x/full/; Ernst Wit, “Who Wants to Be… The Use of a Personal Response System in Statistics Teaching,” MSOR Connections 3(2) (2003), http://ltsn.mathstore.ac.uk/newsletter/may2003/pdf/whowants.pdf.

    4. Diane Ebert-May, Carol Brewer, and Sylvester Allred, “Innovation in Large Lectures–Teaching for Active Learning,” BioScience 47 (1997): 601-607, 604.

    5. Richard R. Hake, “Interactive-engagement Versus Traditional Methods: A Six-thousand-student Survey of Mechanics Test Data for Introductory Physics Courses,” American Journal of Physics 66 (1998): 64-74, http://www.physics.indiana.edu/~sdi/ajpv3i.pdf, 18.

    6. Office of Academic Planning and Assessment, University of Massachusetts, Amherst, Faculty Focus on Assessment, v.3(2) (Spring 2003), http://www.umass.edu/oapa/oapa/publications/faculty_focus/faculty_focus_spring2003.pdf, 2.

    7. G. E. Kennedy and Q. I. Cutts, “The Association Between Students’ Use of an Electronic Voting System and Their Learning Outcomes,” Journal of Computer Assisted Learning 21(4) (2005): 260-268, http://www.blackwell-synergy.com/doi/pdf/10.1111/j.1365-2729.2005.00133.x.