
Transcript
Engaging Students Using “Clickers” in Large Enrollment Courses
Janelle Sikorski
Department of Geology, Miami University
I. What are clickers?
Classroom response systems, or “clickers,” are an instructional technology that allows faculty of any
discipline or level to efficiently ask questions and receive student responses frequently within a single
class session. Clickers are not the proverbial “magic bullet”; they are not the solution to all the
pedagogical challenges unique to large enrollment courses, nor are they a formal pedagogy by themselves.
Clickers are, however, a tool that can be used in a variety of ways. As recently stated by Beatty
and Gerace (2009),
“don’t ask what the learning gain from classroom response system (CRS) use is; ask what pedagogical
approaches a CRS can aid or enable or magnify, and what the learning impacts of these various
approaches are.”
II. Why use clickers in a large enrollment course?
Classroom response systems provide faculty with a tool to easily collect, process, and respond to
student questions in large enrollment courses. Large enrollment courses present pedagogical challenges
because of their size and physical environment (Geski, 1992). A specific challenge for faculty at Miami
University is learning how best to incorporate and adapt more active learning methods into their large
enrollment courses. Such work is being funded through the Top25 initiative, which seeks to convert our
top 25 enrolled courses from predominantly lecture-based to more inquiry-based. The core premise of
inquiry-based learning is the requirement that learning should be centered around student questions. In
large courses with 90 to 200 enrolled students, it is unrealistic to expect unaided faculty to solicit and
appropriately respond to every student question in every class session; yet this inability could significantly
hinder the learning process for many students (Kuh, 2008).
Why university-level faculty use clickers (Bruff, 2009):
* to facilitate peer instruction
* to uncover student misconceptions
* to generate classwide discussion
* to create structure/pace to an individual class session
III. What kinds of questions can I ask with clickers?
The types of questions that can be asked using clickers are diverse, with new examples being
developed in classrooms every day. Clicker questions can be prepared ahead of a class session or
developed “on the fly” once in the classroom. In general, college-level faculty tend to use either content
or process questions. For an overview of clicker question types with some examples, review Appendix A.
Content Questions: The goal of asking content questions is to directly assess student learning in the classroom.
Content questions are often derived from lecture material, textbook reading, or some course assignment. These types
of questions most commonly take the form of multiple-choice questions and have clear correct or incorrect answers.
Content questions often fall into four main categories: recall, conceptual, application, and critical
thinking (Bruff, 2009).
Process Questions: The goal of process questions is to assess student perceptions of the course material. Process
questions often ask the students to reflect on their experience with an assignment or concept. For example, they
might solicit student opinion on their confidence level with a topic, monitor their progress on course projects, or
collect their opinions on open-ended course-related questions (Bruff, 2009). One literature review (Fies and
Marshall, 2006) found that process/reflection type questions were not used in the classroom as frequently as content
type questions.
When drafting clicker questions, ask yourself (modified from Bruff, 2009):
* Why am I asking this question?
* What do I hope to learn about my students by asking this question?
* How can I use the results of the question to help facilitate more active learning methods?
* How do I predict students will respond to this question?
* What will I say or do if the result distribution does or does not match my prediction?
IV. What are the challenges of using clickers in large enrollment courses?
Despite the perceived benefits of clicker use reported by faculty (Bruff, 2009; Caldwell, 2007), using
a classroom response system creates its own set of challenges. Five such challenges are
discussed below:
Challenge #1: A significant time investment is often required to learn how to best operate the
supporting software and how to manage collected data.
Set up the system well in advance of the start of the semester and give yourself some time to practice with
the clicker and supporting software. Keep a positive attitude. Learning how to effectively use clicker
technology and develop robust questions may take at least a year. To feel more comfortable using
clickers, try to speak with colleagues who are already using them in their classroom. Maybe together you
can build a digital warehouse of clicker questions for your discipline, as few such resources exist.
Challenge #2: Introducing classroom response systems may require faculty to modify their teaching
strategies and/or goals.
For example, clickers are often used to help facilitate peer instruction (Mazur, 1997) or generate
classwide discussions. For faculty who aren’t familiar with these methods or who are uncomfortable with
managing classroom discussions in a large classroom setting, additional preparation may be required.
Challenge #3: Using clickers to poll students multiple times in a single class session requires time.
Most faculty report the need to reduce the number of topics covered in a single class period to
accommodate the extra polling and discussion time. Faculty, however, have also reported that clicker
use allows them to more easily identify when their students have mastered a concept and are ready to
move onto a new topic. Thus in some cases, clickers can help faculty navigate course material more
effectively and better match course material to the needs of their students.
Challenge #4: Clicker use requires a large degree of instructor flexibility.
Because clickers provide faculty with the opportunity to ask a question and receive nearly instant
feedback from their students, faculty now have the ability to respond to this feedback within the same
class period. Faculty therefore need to be prepared to readdress a previously covered concept, to
move on to a new concept, or even to discuss an unexpected topic or concept as misconceptions present
themselves.
Challenge #5: Faculty need to be prepared to address student resistance to clicker use.
As more active learning methods are introduced to traditional lecture-dominated classrooms, instructors
often experience student resistance to course activities (Brudzinski and Sikorski, 2011). As clickers are
often used to facilitate active learning, student attitudes toward clicker use can likewise be negative. Trees and
Jackson (2007) concluded that “if students agree with these assumptions about the ideal learning
environment, the use of clickers may capitalize on an unmet need. If they do not, clickers may actually
increase student apathy and engender resistance.” Take the time to explain to your students what you
hope they will gain through use of clickers in the classroom. Having this discussion tends to increase
student acceptance of clicker use and helps them better connect the practice to the learning goals of the
course.
V. Do clickers help students learn?
Considering the significant commitment of faculty time to develop the use of clickers and the
financial cost to our students, it would be reassuring to know that students learning in classroom response
system-supported classrooms perform better than students in more traditional classroom settings. The
answer to this question remains unclear and is an area for additional, well-conceived research.
Specifically, Fies and Marshall (2006) identified the need to move beyond anecdotal evidence and
traditional classroom pedagogy and define what clickers add to the learning environment. They noted the
need for:
* More controlled comparisons between classrooms in which the only difference is whether or not clickers are used.
* More studies exploring clicker use in connection with diverse populations and content areas.
* More studies exploring clicker use in connection with diverse pedagogical approaches.
Some preliminary research in these areas is starting to emerge (Patterson et al., 2010; Morling et
al., 2008; Caldwell, 2007; Trees and Jackson, 2007), but infrequent use of clickers within the classrooms
of two of these studies (Patterson et al., 2010; Morling et al., 2008) complicates interpretation of the
published student performance results. The effectiveness of clicker use on student learning is probably
greatest when clicker use is connected to clear learning goals or course pedagogy. As suggested by
Patterson et al. (2010), there must be “an active dialogue among faculty and students about the [clicker]
questions and their distracters”; otherwise, they felt, we are replacing the opportunity for our students to
practice verbal communication skills with the push of a button.
VI. What do faculty need to get started? What do the students need?
In order for a classroom response system to be used, students must first purchase a response
device and then register that device online. For example, Miami students register their clickers through an
online tool currently on Blackboard. Instructors using a classroom response system will require a clicker
receiver and the appropriate supporting software. The software used at Miami University is known as
TurningPoint®. Every classroom computer on campus has or is capable of having TurningPoint®
software installed on it. Thus, there is no additional cost to faculty who want to use this software. Clicker
receivers are also provided at no cost to faculty through IT services. For students the advantage of having
a single university-supported system is that they will only need to purchase one clicker and can use it in
any of their classes over the course of their college career. New clickers currently sold online and in
Oxford cost students between $40 and $50. A more affordable option available to students is
ResponseWare, an application that can be downloaded onto a student’s smartphone and transforms the
phone into a clicker. Students can purchase a one-year license for $16 or a four-year license for $32. For
more information about TurningPoint® use and support at Miami University, please contact Ricardo
Maduro at [email protected].
VII. Want to learn more?
Beatty, I.D. & Gerace, W.J. (2009). Technology-enhanced formative assessment: a research-based pedagogy for
teaching science with classroom response technology, Journal of Science Education and Technology, 18, 146-162.
Brudzinski, M. & Sikorski, J. (2011). Impact of the COPEL (Community of Practice on Engaged Learning) on
active-learning revisions to an introductory geology course: focus on student development, Learning Communities
Journal, in press.
Bruff, D. (2009). Teaching with classroom response systems: creating active learning environments. San Francisco,
CA: Jossey-Bass.
Caldwell, J. E. (2007). Clickers in the large classroom: current research and best-practice tips, Life Sciences
Education, 6, 9-20.
Fies, C. & Marshall, J. (2006). Classroom response systems: a review of the literature, Journal of Science Education
and Technology, 15, 101-109.
Geski, J. (1992). Overcoming the drawbacks of the large lecture class, College Teaching, 40, 151-155.
Kuh, G. (2008). High impact educational practices: What are they, who has access to them, and why they matter,
Association of American Colleges and Universities, Washington, D.C.
Mazur, E. (1997). Peer instruction: a user’s manual. Upper Saddle River, NJ: Prentice-Hall.
Patterson, B., Kilpatrick, J., & Woebkenberg, E. (2010). Evidence for teaching practice: the impact of clickers in a
large classroom environment, Nurse Education Today, 30, 603-607.
Trees, A.R. & Jackson, M.H. (2007). The learning environment in clicker classrooms: student process of learning
and involvement in large university-level courses using student response systems, Learning, Media, and
Technology, 32, 21-40.
Websites: Derek Bruff’s “Clicker” website, with suggestions, examples, and instructional videos:
www.vanderbilt.edu/cft/cr/crs.htm
TurningPoint® website: http://www.turningtechnologies.com/
APPENDIX A: An Overview of Clicker Questions with Examples
In general, university-level faculty tend to use either content or process clicker questions. Below is a
summary of some ways these types of questions are used in my GLG111 course. For more thorough
coverage of these question types and more examples representing a wider range of disciplines, please
review Bruff (2009).
Content Question Examples:
I. Recall Questions: These questions are used more to assess student learning than they are to engage
students in classwide discussion. Recall questions are often used at the beginning of a course lesson and
ask students to remember facts or procedures from a reading or previous course session. Recall questions
can both help prepare students for course material and help students/faculty identify areas in need of
review or additional instruction (Bruff, 2009).
Example #1: What type of volcano is more likely to erupt explosively?
A. Composite/stratovolcano
B. Shield volcano
C. Both are equally likely to produce an explosive eruption
D. Neither is likely to produce an explosive eruption
Example #2: What type of fault is this?
A. Right-lateral, strike-slip fault
B. Left-lateral, strike-slip fault
C. Normal, dip-slip fault
D. Reverse, dip-slip fault
II. Conceptual Questions: These questions assess a student’s ability to place general course vocabulary or
principles into an appropriate context. These types of questions are useful in uncovering and addressing
student misconceptions about course content (Bruff, 2009).
Example #3:
Question #1: Overall, how much of Earth’s mantle is liquid?
A. Very little to none (n = 45)
B. About half (n = 85)
C. Most to all (n = 30)
Instructions given to students after question #1: At 15-200 km below the surface, the conditions are also favorable
for some rocks to be molten (magma formation). Most volcanic magma forms at this depth. On your diagram of
Earth (part 2), draw a star at the depth of the source of magma.
Question #2: Overall, how much of the mantle is liquid?
A. Very little to none (n = 109)
B. About half (n = 20)
C. Most to all (n = 31)
Example #4: Two people are debating tsunami formation. Which person best understands tsunami?
A. Person #1 says, “Tsunami usually only form at convergent plate boundaries because that’s the plate boundary
where the largest earthquakes happen and the seafloor moves up or down during an earthquake.”
B. Person #2 says, “Tsunami form at any plate boundary when any earthquake occurs underwater, as long as the
earthquake is large enough. For example, California gets lots of earthquakes, and it is near the coast, so the San
Andreas Fault, which is a transform fault, can cause a tsunami.”
III. Application Questions: Application questions ask students to apply their previous knowledge and
experiences to a new situation. Some application questions may require students to follow a certain
procedure to solve the problem presented to them, while others may require students to predict the
outcome of a given scenario (Bruff, 2009).
Example #5: How do you think gases impact volcanic eruptions?
A. The higher the gas content the more explosive the eruption.
B. The lower the gas content the more explosive the eruption.
C. Gas content in magma/lava has no impact on the style of eruption.
IV. Critical Thinking Questions: These questions require students to analyze the relationship between
multiple variables or evaluate a claim based on some defined criteria. As critical thinking questions tend
to take the form of free response questions on exams and quizzes, it can be challenging to convert such
questions into a clicker-friendly format. For a more detailed discussion of the challenges of, and solutions
for, using critical thinking clicker questions, please review Bruff (2009). I often use a series of
questions/images to elicit student responses and then use those responses to help facilitate a classwide
discussion. The follow-up discussion period allows me to discover the reasoning students used to arrive at
their responses. In the future, I hope to use these responses to better formulate a series of reason-focused
multiple-choice clicker questions.
Example #6:
Question #1: What characteristics or qualities make a glacier a glacier?
Students generate a list of glacier characteristics. The list is often overly vague and unfocused.
Question #2: Is this a glacier? (at least three carefully selected images are projected on the screen one at a time
with discussion occurring after each vote)
A. yes
B. no
After question #2, students are instructed to use our discussion to write their own definition of the word glacier.
Process Questions
I. Student Perception: These questions are used to gather information about students. Often these
questions take the form of opinion questions. The results of such questions allow faculty to better know
their students and, in most cases, also allow students to better understand their peers (Bruff, 2009).
Example #1: Do you think observations made by scientists are fundamentally different than observations
made by non-scientists?
A. yes
B. no
II. Confidence Level Questions: These questions are often used to help instructors interpret the results of a
previous clicker question, such as a true or false question, to better understand if students really knew the
answer or if they successfully guessed. Other questions can be used to help students develop their ability
to reflect on their decision-making process or help prepare them for an upcoming exam (Bruff, 2009).
Example #2: I can complete a velocity vs. depth chart in Excel.
A. Very Confident
B. Somewhat
C. Not at all
III. Monitoring Questions: These questions come in a variety of forms, but can be used to help faculty
better monitor students’ learning experience. For example, are students on track with a major midterm
project? How long is it taking them to complete the course assignments? Do they remember the course
procedures outlined on the course syllabus? (Bruff, 2009). I have also used monitoring questions in
my courses to determine student interest in course material and adjust as necessary.
Example #3: Which hypothesis for the formation of this feature do you want to explore today?
A. Volcanic Eruption
B. Salt Dome
C. Meteorite Impact
Example #4: I have the homework #3 chart completed.
A. yes
B. no
Example #5: Some days I find the level of “quiet” chatter in the class a distraction to my learning.
A. I strongly agree with this statement
B. I somewhat agree with this statement
C. I disagree with this statement
Example #6: I know the appropriate way to contact the instructor if I have a problem in the course.
A. I strongly agree with this statement
B. I somewhat agree with this statement
C. I disagree with this statement