How to Review for an Exam if Professor Gives a Review Sheet
A lot of what I do as a college teacher draws upon the accumulated wisdom and practice of my profession, plus my personal experience. I accrue ideas and strategies from mentors and colleagues, I read about pedagogy, I try to get a feel for what works and what doesn't in my classes, and I ask my students what is working for them. That's what I suspect most of us do, and it probably works pretty well.
But as stats guru and blogger Andrew Gelman pointed out not too long ago, we don't often formally test which of our practices work. Hopefully the accumulated wisdom is valid; but if you're a social scientist, your training might make you want something stronger than that. In that spirit, I recently ran a few numbers on a pedagogical practice that I've always wondered about. Do review sheets help students prepare for exams?
Background
When I first started teaching undergrad courses, I did not make review sheets for my students. I didn't think they were particularly useful. I decided that I would rather focus my time and energy on doing things for my students that I believed would actually help them learn.
Why didn't I think a review sheet would be useful? There are two ways to make a review sheet for an exam. Method #1 involves listing the important topics, terms, concepts, etc. that students should study. The review sheet isn't something you study on its own; it's more like a guide or checklist that tells you what to study. That seemed questionable to me. It's essentially an outline of the lectures and textbook: pull out the headings, stick in the boldface terms, and voila! Review sheet. If anything, I thought, students are better off doing that themselves. (Many resources on study skills tell students to scan and outline before they start reading.) In fact, the first time I taught my big Intro course, I put the students into groups and had them make their own review sheets. Students were not enthusiastic about that.
Method #2 involves making a document that actually contains studyable information on its own. That makes sense in a class where there are a few critical nuggets of knowledge that everybody should know, like maybe some key formulas in a math class that students need to memorize. But that doesn't really apply to most of the courses I teach, where students need to broadly understand the lectures and readings, make connections, apply concepts, etc. (As a result, this analysis doesn't really apply to courses that use that kind of approach.)
And so in my early days of teaching, I gave out no review sheets. But boy, did I get protests. My students really, really wanted a review sheet. So a couple of years ago I finally started making list-of-topics review sheets and handing them out before exams. I got a lot of positive feedback; students told me that they really helped.
Generally speaking, I trust students to tell me what works for them. But in this instance, I've held on to some nagging doubts. So recently I decided to collect a little data. It's not a randomized experiment, but even some correlational data might be informative.
Method
In Blackboard, the course website management system we use at my school, you can turn on tracking for items that you post. Students have to be logged in to the Blackboard system to access the course website, and if you turn on tracking, it'll tell you when (if ever) each student clicked on a particular item. So for my latest midterm, the second one of the term, I decided to turn on tracking for the review sheet so that I could find out who downloaded it. Then I linked that data to the exam scores.
I posted the review sheet on a Monday, one week before the exam. The major distinction I drew was between people who downloaded the sheet and those who never did. But I also tracked when students downloaded it. There were optional review sessions on Thursday and Friday. Students were told that if they came to a review session, they should come prepared. (It was a Jeopardy-style quiz.) So I divided students into several subgroups: those who first downloaded the sheet early in the week (before the review sessions), those who downloaded it on Thursday or Friday, and those who waited until the weekend before they downloaded it. I have no record of who actually attended the review sessions.
A quick caveat: it is possible that a few students could've gotten the review sheet some other way, like by having a friend in the class print it for them. But it's probably reasonable to assume that wasn't widespread. More plausible is that some people might have downloaded the review sheet but never really used it, which I have no way of knowing about.
Results
Okay, so what did I find? First, out of N = 327 students, 225 downloaded the review sheet at some point. Most of them (173) waited until the last minute and didn't download it until the weekend before the exam. 17 downloaded it Thursday or Friday, and 35 downloaded it early in the week. So apparently most students thought the review sheet might help.
Did students who downloaded the review sheet do any better? Nope. Zip, zilch, nada. The correlation between getting the review sheet and exam scores was about nil, r = -.04, p = .42. Here's a plot, further broken down into the subgroups:
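For concreteness, here is what that headline number amounts to computationally. This is a toy sketch with simulated scores, not the actual gradebook data, and the variable names are my own. With a 0/1 download indicator, the Pearson correlation with exam score is what's known as a point-biserial correlation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated stand-ins: 327 students, 225 of whom downloaded the sheet.
scores = rng.normal(75, 10, 327)      # second-midterm scores
downloaded = np.zeros(327)
downloaded[:225] = 1                  # 1 = downloaded, 0 = never did
rng.shuffle(downloaded)

# Pearson r between the 0/1 indicator and the continuous score
# (equivalently, the point-biserial correlation).
r = np.corrcoef(downloaded, scores)[0, 1]
print(f"r = {r:.2f}")
```

Because the simulated scores here are generated independently of the download indicator, r comes out near zero by construction; the point is just the mechanics of the calculation.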
This correlational analysis has potential confounds. Students were not randomly assigned; they decided for themselves whether to download the review sheet. So those who downloaded it might have been systematically different from those who did not; and if they differed in some way that would affect their performance on the second midterm, that could've confounded the results. In particular, maybe the students who were already doing well in the class didn't bother to download the review sheet, but the students who were doing more poorly downloaded it, and the review sheet helped them close the gap. If that happened, you'd observe a null correlation. (Psychometricians call this a suppressor effect.)
So to address that possibility, I ran a regression in which I controlled for scores on the first midterm. The simple correlation asks: did students who downloaded the review sheet do better than students who didn't? The regression asks: did students who downloaded the review sheet do better than students who performed just as well on the first midterm but didn't download the sheet? If there was a suppressor effect, controlling for prior performance should reveal the effect of the review sheet.
But that isn't what happened. The two midterms were pretty strongly correlated, r = .63. But controlling for prior performance made no difference; the review sheet still had no effect. The standardized beta was .00, p = .90. Here's a plot to illustrate the regression: this time, the y-axis is the residual (the difference between somebody's actual score and the score we would have expected them to get based on the first midterm):
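The regression logic can be sketched in a few lines of NumPy. Again, this uses simulated data and my own variable names (`midterm1`, `midterm2`, `downloaded`); it illustrates the method, not the actual results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for the real gradebook (N = 327 students).
n = 327
midterm1 = rng.normal(75, 10, n)                      # first-midterm scores
downloaded = (rng.random(n) < 225 / n).astype(float)  # 1 = got the sheet
# The second midterm tracks the first; the sheet adds nothing by construction.
midterm2 = 0.63 * (midterm1 - 75) + 75 + rng.normal(0, 8, n)

# Regression: midterm2 ~ intercept + midterm1 + downloaded.
# The coefficient on `downloaded` is the sheet's effect after
# controlling for prior performance.
X = np.column_stack([np.ones(n), midterm1, downloaded])
coefs, *_ = np.linalg.lstsq(X, midterm2, rcond=None)

# Residuals from a prior-performance-only model give the y-axis of the plot:
# how much better or worse each student did than midterm 1 predicted.
X1 = np.column_stack([np.ones(n), midterm1])
b1, *_ = np.linalg.lstsq(X1, midterm2, rcond=None)
residuals = midterm2 - X1 @ b1

print("coefficient on downloading:", round(coefs[2], 3))
print("mean residual, downloaders vs. not:",
      round(residuals[downloaded == 1].mean(), 3),
      round(residuals[downloaded == 0].mean(), 3))
```

A near-zero coefficient on `downloaded`, and near-identical mean residuals in the two groups, is the pattern the post describes: no suppressor effect hiding underneath the null correlation.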
Limitations
This was not a highly controlled study. As I mentioned earlier, I have no way of knowing whether students who downloaded the review sheet actually used it. I also don't know who used a review sheet for the first midterm, the one that I controlled for. (I didn't think to turn on tracking at the start of the term.) And there could be other factors I didn't account for.
A better way to do this would be to run a true experiment. If I were going to do this right, I'd go into a class where the instructor isn't planning to give out review sheets, and tell students that if they enroll in the experiment, they'll be randomly assigned to get different materials to help them prepare for the exam. Then you give a random half of them a review sheet and tell them to use it. For both ethical and practical reasons, you would probably want to tell everybody in advance that you'll adjust scores so that if there is an effect, students who didn't get the sheet (either because they were in the control group or because they chose not to participate) won't be at a disadvantage. You'd have to be careful in what you tell them about the experiment, to balance informed consent against creating demand characteristics. But it could probably be done.
Conclusions
In spite of these problems, I think this data is strongly suggestive. The most obvious confounding factor was prior performance, which I was able to control for. If some of the students who downloaded the review sheet didn't use it, that would attenuate the difference, but it shouldn't make it go away entirely. To me, the most plausible explanation left standing is that review sheets don't make a difference.
If that's true, why do students ask for review sheets, and why do they think that they help? As a student, you just have a limited capacity to judge what actually makes a difference for you, because on any given exam, you will never know how well you would have done if you had studied differently. (By "limited capacity," I don't mean that students are dumb; I mean that there's a fundamental barrier.) So a lot of what students do is rely on feelings. Do I feel comfortable with the material? Do I feel like I know it? Do I feel ready for the exam? And I suspect that review sheets offer students an illusory feeling of control and mastery. "Okay, I've got this thing that's gonna help me. I feel better already." So students become convinced that they make a difference, and so they insist on them.
I also suspect, by the way, that lots of other things work that way. To date, I have steadfastly refused to give out my lecture slides before the lecture. Taking notes in your own words (not rote) requires you to be intellectually engaged with the material. Following along on a printout might feel more relaxing, but I doubt it's better for learning. Maybe I'll test that one next time…
Students, fellow teachers, and anybody else: I'd welcome your thoughts and feedback, both pro and con, in the comments section. Thank you!
Source: https://thehardestscience.com/2009/05/28/do-review-sheets-help/