I just got back from attending The Teaching Professor Technology Conference for the first time. It was a very well-run conference of about 600 attendees. That many people passionate about teaching (and technology) energized me. Of course there were presentations about the latest tech tools, but several people, including the keynote speaker, talked about putting pedagogy before the flashy tools. Spot on.
I love it when I learn something that seems obvious in hindsight. Here's one of those for me: students (especially first-generation students like I have at EKU) don't necessarily know how much effort we expect them to put into each assignment. An assignment we think should take 1-2 hours, they might assume they can knock out in 20 minutes. I wager that faculty time estimates are inflated, so perhaps time is not the best metric to use. But imagine if faculty gave students some indication of the effort they estimate and expect an assignment will take.
One of my classes is about information literacy, and I've been thinking of indicating the difficulty (or expected effort) of each assignment using an information-source analogy. For example, easy assignments could get a rating of text or Facebook post (quick and easy to read). More difficult assignments could get blog or news article, and the most difficult could get journal article. You could also use a more familiar star rating or some other symbol. I think this might help students schedule their studying and avoid surprises. I'm a bit concerned that the difficulty ratings on the hardest assignments might scare or intimidate students. Perhaps including some language about growth mindset would be helpful ("We haven't done an assignment this tough in class -- yet").
Another cool idea is to have the students provide their own difficulty or effort ratings of the assignments (after the fact) and then report those ratings the next semester. Perhaps students will trust and take heed of other students' ratings more than professor ratings. This is sort of a TripAdvisor approach to the issue.
A similar idea is to have students write instructions for the course ("read the text first, then do the discussion boards" or "preparing for the tests took more time than I thought"). I've done this sort of thing in some of my classes by asking students at the end of the semester what they wish they had known at the start, and then giving their answers to students the next semester. I have gotten some advice that I did not pass along ("drop this class now"), but most of the advice is appropriate and helpful. Again, it has the cachet that my advice does not.
Friday, October 9, 2015
Monday, September 21, 2015
Flipping with a Jigsaw, epilogue
I'm now in my third semester teaching my flipped social psychology course. Everything is going well -- I'm now to the pleasant stage of not needing to revise my materials much -- but I did make some changes. I led a Professional Learning Community last spring about metacognition (you can see some info about that here) and one idea I got from that literature is the importance of retrieval in learning. I admit that I used to think of tests as mostly assessments of learning. But research in psychology (where most of the ideas about metacognition come from) clearly demonstrates that learning is strengthened by retrieval. So, this means that tests can (should) be thought of as formative tools that enhance learning. This was the motivation for having so many smaller quizzes that are lower stakes in my flipped course. But if taking the quiz once is good, perhaps taking it a second time would be better. So this is why I changed my class structure.
I used to have the students engage in the jigsaw, take the quiz, and then we'd go over the quiz right away. Now we do the jigsaw and take the quiz, but we don't go over it right away. Instead, the next day they come in and take the exact same quiz a second time, using the same quiz itself with all their notes and comments on it. This means that between taking the quiz the first and second time they can go back to the text or recorded lectures to look up the answers to the questions on the quiz. Or they can talk to other students about them. Either activity would be beneficial in terms of learning the material, I believe. The idea is that thinking of the answers to the questions twice is twice as much retrieval.
Correct answers are worth one point on the first version of the quiz, but only half a point if they get it wrong first and right the second time. So it's in their interest to do as well as possible the first time.
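For the programming-minded, the scoring rule can be sketched in a few lines of Python. This is just an illustration of the rule, not my actual grading process (that happens by hand, comparing scantrons):

```python
def quiz_score(first_attempt, second_attempt, answer_key):
    """Score a quiz under the two-attempt rule:
    1 point if correct on the first attempt,
    0.5 points if wrong at first but corrected on the retake."""
    score = 0.0
    for first, second, key in zip(first_attempt, second_attempt, answer_key):
        if first == key:
            score += 1.0   # right the first time: full credit
        elif second == key:
            score += 0.5   # corrected on the retake: half credit
    return score

# Example: a 4-question quiz, two right on the first try,
# the two misses corrected on the retake
key    = ["A", "C", "B", "D"]
first  = ["A", "B", "B", "A"]
second = ["A", "C", "B", "D"]
print(quiz_score(first, second, key))  # 3.0
```

The key design point is that a corrected answer still earns something, so the retake is worth taking seriously, but never as much as getting it right the first time.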
This has led to higher final grades on the quizzes. Scores on the first six quizzes are on average 10.6% higher than in the previous two semesters (combined). That's a letter grade higher -- not insignificant. So the question is: does that reflect a commensurate increase in learning? I don't have data that speak to that definitively. And I do think the students have pretty quickly figured out a strategy for the second quiz: if they were trying to decide between two options, pick the best one the first time and the second best the second time. Fair enough, and clever. But they still get more points for the first choice if they get it right, and they still have to narrow it down to two options from four. I wish I had good data about their learning that was not contaminated by this strategy, but I just don't. I just have to trust the research on retrieval, I suppose.
This has introduced some logistical complexity. Each day starts with taking the previous day's quiz a second time, then we go over the quiz, then do the jigsaw on that day's material, and finally take the quiz on that material. So we use twice as many scantron forms and set aside twice as much class time for quizzes. And the quiz grading is more complex because I have to compare both versions to give credit for getting it right the second time. (A dependable graduate assistant is worth her weight in gold!) Overall I think the increase in learning as a result of added retrieval is worth the added complexity.
My courses are always little laboratories. I'll see how this turns out.
Thursday, August 20, 2015
Two new activities based (loosely) on metacognition
It's a new academic year and I'm busy getting ready, like every other teacher. I'm always on the hunt for new ideas to improve my teaching, and a couple of seemingly serendipitous events have led to two new activities.
The first activity came as a result of my Chair forwarding this piece about the effect of asking students to set goals for their education. The author of that piece refers to research about goal setting that certainly is encouraging. Two caveats to that research: 1) the intervention in the main study he refers to (described here) is quite labor- and time-intensive, and 2) the author tried a more modest intervention and referred to it as a "failure". Despite those issues I decided to create an activity that would get at the same effect. In the past I've asked students on the first day to report their reasons for taking my class, but that usually produces non-helpful (but honest) answers like "to fulfill requirements" or "it fit my schedule". In this activity I start out asking the same question (reason for taking this class) and what grade they want to earn in the course. But I then move on to getting them to think about their goals for the course other than a particular grade. I then ask them to reflect on the habits that both get them closer to and prevent them from attaining their educational goals. Hopefully this elicits some honest reflection. Then -- and I think here is where the action takes place -- I ask them to imagine their life after graduation if they were somehow able to increase and improve their helpful habits and minimize or eliminate their unhelpful habits, not just in this course but in their life as a whole, and to describe what their life would be like if that happened. Conversely, I also ask them to consider what their life would be like if they decreased their helpful habits and increased their unhelpful habits. Finally, capitalizing on a little cognitive dissonance research about public commitments, I ask them to at least try to minimize the unhelpful and maximize the helpful habits for the next semester, and to sign and date the document.
I plan to hand out the activity either the first or second class of the term and have them return it by the next class. I'll hold on to them until perhaps midterm (this is a flipped class that has quizzes in every class; otherwise I would hold them until just after the first exam) and then return the activities and ask them to consider their progress on their habits. Reflecting on their process is a metacognitive activity. My activity can be seen here; feel free to use it as you see fit.
The other activity came about because I got an email from the Director of the Honors program at my university talking about the students in Honors electives. I teach several such courses and was a bit startled to be reminded that the students are mostly first-semester sophomores. I had thought of them as more advanced students, so the reminder was important. My Honors electives are typically taught as seminars, much like a graduate seminar, and I often assign empirical articles that would be appropriate for graduate students, but are perhaps a new type of reading for these students. Now, I don't shy away from assigning challenging readings (indeed there's research about the value of doing so), but I also don't want to overwhelm them early on. So I created an activity I plan to have students complete the first day we discuss a reading -- before we get to the discussion. I first ask if they have done the assigned reading. Then, assuming they have, I ask how they approached the reading activity -- did they read alone, was music playing, etc. Then I ask how they did the reading -- did they read it all at once, did they write in the margin, highlight, take notes, etc. Then I ask what their goal was while they were reading, and I give them a list of options based on Anderson and Krathwohl's taxonomy (similar to Bloom's taxonomy), including understanding, analysis, synthesis, etc. They can choose as many as apply. I bet this will be the first time they realize that they can read with different goals in mind. Finally, and this mirrors the previous activity, I ask them what they did that helped them understand the reading and what they did that did not help. I plan to have them share their responses with a classmate in the hopes that they will get some helpful tips from others, or at least realize that they are not alone in struggling with the reading. Again, I plan to collect the completed activities and then return them a month or so later so they can compare their current practices with what they said was helpful and not helpful. The activity can be seen here.
These are new activities for me, so I can't report here on their effectiveness. Perhaps I will comment on that in a future post. Good luck with the new year!
Thursday, May 28, 2015
The Dreaded Teaching Evaluations (good news!)
Where I work we use the Student Ratings of Instruction (SRI) from IDEA. They are useful instruments meant to be used to modify and improve instruction, but of course we mainly use them (inappropriately) to evaluate instruction. I must admit that I've never found them too useful in terms of improvement, and often found the results to be annoying (the adjusted scores drive me nuts). But that may change now that I've heard a presentation by the president of IDEA, Ken Ryalls.
Dr. Ryalls (he's a former faculty member -- in fact, a social psychologist like me) explained that IDEA has always been about improving instruction, but that they are now trying to emphasize and enhance that function of the IDEA reports that faculty get about their classes.
Faculty get to indicate how important each of 12 different objectives is for their courses (from no importance to minor importance to essential). The objectives range from gaining factual knowledge, to learning to apply course material, to developing a clearer understanding of, and commitment to, personal values. You can select all of them as essential or none of them; it's up to you. You then get a single score that indicates how well you did on the objectives you said were most important, and this number is said to be the most important in the whole report. I understood all of this pretty well. What Dr. Ryalls told me that I did NOT know was that IDEA has a set of papers that target each of the objectives (find them here). These papers are written by faculty with expertise in the relevant areas, and they are research-based. So, if you say that getting students to apply course content is important to you but your results indicate that you did not do so well on that objective, you can go to the website and learn about ways to improve in this area. That is pretty handy.
But IDEA doesn't stop there. They also have no fewer than 57(!) papers about a variety of pedagogical topics, like the flipped classroom, deep learning, team teaching, getting students to read -- the list goes on! They have also teamed up with the POD Network (Professional and Organizational Development) to write papers about the other items on the SRI questionnaire. So if your students say that you don't show a personal interest in their learning, there's a paper about how to do that better. Perhaps you want to get your students to think creatively -- there's a paper on that.
In short, there's a lot more to the IDEA organization than those pesky teaching evaluations. Used correctly (as in, as IDEA means for them to be used), the IDEA reports can really help faculty target their efforts to improve instruction on the objectives they say are the most important. I've been doing it wrong all this time!
Monday, March 16, 2015
Flipping with a Jigsaw, part 6
Part 6: Conclusions and suggestions
So that’s how I flipped my Social Psychology course. Overall it required a considerable amount of work, but no more than prepping a class for the first time. And, like most courses, now that I have taught it once, the second time is much less work. In fact I am teaching it for the second time now, and I can report that it is going just as well as the first time (and requires much less work).
Here are a few of what I think are the main points from this series of posts:
- I challenge instructors to really consider the Dee Fink question: are you creating courses that will result in significant learning experiences that will endure well past graduation? How would you change your course if every student was as eager and able as your best student?
- We preach about the importance of active learning, and many instructors insert episodes of active learning into their lectures. Flipping takes you all in: class sessions are (or can be) devoid of lecturing and given over entirely to active engagement by the students.
- Flipping requires the willingness to give up lecturing at least to some degree, and perhaps totally. I admit that this was hard for me. I was used to receiving the students’ attention, with all the power and status that affords. Giving that up is a sacrifice. But I believe the benefit to the students is more important than the boost I get to my ego or my entertainment.
- There are many things you can do during the class time instead of lecturing. I chose the Jigsaw and can enthusiastically recommend it. The Jigsaw has a proud history and impressive empirical support. It’s also messy, loud, and chaotic. And I mainly stand around doing nothing while the students are working – but that’s the point: they are engaging with the material in active ways, not me. And I think that’s the way it should be.
Here are my suggestions:
- Realize that flipping a course will require a significant amount of prep work. I started designing my flipped class and the materials months before the class started. Build in that prep time.
- Read up on and consider the Jigsaw. It can work in nearly any discipline, from sciences to the humanities. It does require that you trust your students to not only learn material but also teach it to their peers. The success of the Jigsaw relies on the materials you design. And remember that the Jigsaw demands the engagement of every student in class, something that I’ve struggled to achieve in other formats.
- Whatever you decide to do during the flipped class sessions, use existing resources. I used the essay questions I had crafted over the years and many active learning materials as the prompts for the Jigsaw. I also used many existing multiple choice questions for the quizzes. Take advantage of the work you have already done. I also suggest you look (again) at the resources that come with your textbook. In my opinion these have gotten better over the years and are worth a new look.
- And finally, don’t fear change. I had my Social Psychology course ‘in the can’. I could walk into almost any day in the semester and rattle off the lecture without any prep, if I had to. And I think I did a pretty good job of it, too (of course). But Dee Fink’s presentation made me question the effectiveness of my course in achieving a broader set of goals. I’ve enjoyed treading the line between being fearless and foolhardy. I’ve not always fallen on the right side of that line, but I also know that line will take me where I really want to be.
Monday, March 9, 2015
Flipping with a jigsaw, part 5
Part 5: Student perceptions and performance
Of course I was nervous about how students would take to this dramatic departure from traditional classrooms. If nothing else, they would not be spacing out in class. So during the first and last mental health days (I put them into the schedule to give the students a break from the Jigsaw, and to allow me to use some of my favorite class activities) I passed out a brief assessment of their attitudes toward the flipped/Jigsaw format.
The first question asked them if they would like to stay with the flipped format or go back to the traditional format (I stressed that switching was a real possibility in the first wave of data collection). Here’s what they said:
That is, all of the students said they preferred the flipped format in the first wave, and nearly all (13 of 15) said they preferred it in the second wave.
I also asked two questions comparing the flipped course with other traditional courses they’ve taken. One was about how hard they were working in this class, and the other was about how much they were learning. The results:
In words, this means that they felt this class was about the same as other courses in terms of difficulty, but a majority felt they were learning more in this class than in traditional courses.
I asked three overall evaluation questions. One was about how much they enjoyed the flipped format:
It’s hard to see in this figure, but it’s a 5-point scale. So a total of 3 responses indicated something less than 4 on a 5-point scale.
The other two questions were copied from the teaching evaluations we use at EKU (IDEA): teacher excellence and course excellence:
These data are consistent with data I typically get when teaching this course in the traditional way.
I also asked three open-ended questions: one about what they liked least about the flipped format, one about what they liked the most, and one about what they would like to change about the flipped format. The criticisms centered on the necessity of moving around so much during class, and how long the video presentations were (they’re correct; some are 25-30 minutes). Many students had no criticisms or suggestions for changes. When I showed my Chair the comments about what they liked the most, he accused me of writing them myself. Some were admittedly amazing:
“Smaller groups allow me to ask questions about something I don’t understand whereas I wouldn’t ask if I had to in front of the whole class.”
“I am actually learning quite a bit and the format keeps me motivated.”
“You control your grade, it is dependent on how you study and what you put into it.”
“I like that I am learning more and retaining more. It also takes the stress of tests away so I can focus on learning and understanding the material more.”
“Being on equal ground with others. Makes it easier to understand at times.”
Overall I am very pleased by these results. My flipped class is not seen as a blow-off, and the students seemed to appreciate the purpose of the flipped format. I was concerned that I would take a hit on my teaching evaluations because of the novel format. These informal mid-term evals were encouraging, and my official (IDEA) teaching evals were almost identical to the previous time I taught the same course in a traditional format.
Unfortunately it is not possible to make direct comparisons between the traditional and flipped versions of my course in terms of what my students learned. While many questions from the exams in my traditional course found their way onto the quizzes in the flipped course, the overlap was not complete. However, I can report that the distribution of final grades did not differ substantially:
These data contain grades from the past 6 times I taught the class using the traditional format (total number of students = 162) and just one flipped class (N = 28). Perhaps a slight skew to higher grades in the flipped course, but not dramatic (for the stats-philic, the average final percentage was 78% in the traditional course and 83% in the flipped course; t = 0.22, ns).
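For anyone wanting to run a similar comparison on their own grade data, the t-statistic for two independent groups can be computed directly. Here's a minimal Python sketch using Welch's version of the test (the score lists below are invented for illustration; they are not my actual grade data):

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples.
    Does not assume equal variances or equal group sizes,
    which suits unequal groups like 162 vs. 28 students."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Made-up final percentages for two small illustrative groups
flipped = [79, 88, 81, 84, 83]
traditional = [72, 80, 75, 85, 78]
print(round(welch_t(flipped, traditional), 2))
```

You would still need the degrees of freedom and a t table (or stats software) to get a p value, but the statistic itself is just the mean difference scaled by the pooled standard error.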
These data are not ideal for assessing what students learned in class (nor for what they might remember a year or ten later), but they are the best I can get. This is certainly a frustration when comparing teaching methods generally. And you can forget about random assignment. But overall my fears were unrealized and my hopes largely fulfilled. And that counts as success in my book.
Next up: Conclusions and suggestions