Our professors in medical school know a lot about what they teach. A lot. I still get a thrill out of the moments when professors cite their own research in their lecture slides, effectively saying, “Hey, see this mechanism I’m teaching you right now? I was the one who figured it out.” Which, honestly, is awesome. It may sound obvious, but in medical school we are taught science by some of the leading scientists in the world, which sounds like the perfect recipe for knowledge attainment from a student perspective. But does knowing something inside and out mean that you are actually the person most qualified to teach it?

I think about the relationship between content expertise and the ability to teach fairly often. I remember watching the US Open as a kid and being baffled by the fact that the world’s best tennis players had coaches. “Why,” I wondered, “does Andy Roddick have a teacher if he’s a better tennis player than his teacher? Shouldn’t he be the coach?” It turns out that expert-level ability in a particular area is not necessarily synonymous with being able to teach that skill effectively. And this is something we need to be acutely aware of, both on the tennis court and in the lecture hall.
This isn’t to say that our medical school professors aren’t effective teachers. On the whole, I would argue that the professors I’ve encountered during my first six months at Jefferson practice effective pedagogy. But the former high school teacher in me sees room for improvement, and has noticed a couple of high-yield, easy-to-implement teaching “hacks” that would improve instructional practice and student understanding if every professor used them consistently in every lecture.
I’ve seen first-hand how effective each of these pedagogical practices can be in a classroom setting: they kept me and my students afloat during the difficult adjustment period that was my first few months of teaching. Furthermore, they’re all evidence-based and backed by the education literature. And lastly, as mentioned, they are easy to implement and wouldn’t require any professor to massively overhaul already-developed lecture material (which, given the time-consuming nature of research and the sheer volume of material many professors teach, would not be a reasonable request).
- Learning objectives. A well-written learning objective concretely describes the skill or specific knowledge that students should attain by the end of the lecture. It is not a broad list of topics to be discussed in lecture. A good learning objective might look like “List the factors that influence the concentration of urate in the blood” (note the specificity and measurability), while a non-learning-objective might be “We will learn about desmosomes and gap junctions” (too vague; unmeasurable). Well-written learning objectives give students a clear and focused indication of where they should direct their energy and attention during a lecture. And when a professor revisits the learning objectives over the course of a lecture, they function as a road map that helps students navigate the often overwhelming influx of information each lecture contains. An old teaching coach of mine used to say, “If you can’t come up with learning objectives for your lesson, then you probably don’t know what you’re actually trying to teach.”
- Checks for understanding (CFUs). A CFU is a quick “pulse check” done with students during a lesson to assess how well they understand the material at that given point in time. The purpose of a CFU is twofold: it a) allows students to monitor their own learning (“How well am I understanding this lesson so far?”) and b) gives professors important data on student understanding (“Well, since two-thirds of the class couldn’t name the right enzyme, it means I need to take two minutes to reteach this important point before we continue”). A CFU can take on many different forms -- the simplest of which is probably a quick quiz using an audience response system (or even just a show of hands) in the middle of class to gauge student understanding.
- Real-world connections. At some point, every student has asked, “When are we going to use this?” Even if the argument isn’t wholly realistic (sorry former Algebra 2 students – you probably won’t actually use quadratic functions to model the parabolic path of a basketball next time you’re playing a pickup game), making real-world connections to seemingly intangible scientific material is crucially important for long-term student engagement and understanding. In medical school, this is effectively equivalent to presenting clinical correlations. It seems unlikely that I’m going to remember anything about folic acid down the road if I’ve only been shown its structure and role in various biochemical pathways. If I can recall pictures of babies with spina bifida and pregnant women eating leafy greens, though, it’s a whole different ballgame.
Where Are We Now?
How do our professors actually do when it comes to writing learning objectives, using CFUs, and making real-world connections to material? To get a general sense of this, I looked at randomly selected lectures from twenty different microbiology and physiology instructors and counted how many of the twenty professors used each of the three techniques. Here’s what I found:
- Learning objectives: Nearly all professors had well-written learning objectives (as defined above) in their syllabus notes, but only 30% explicitly included them in their lecture slides as well. Learning objectives are highly effective as an active learning tool, and for this reason they should be woven into direct instruction as the professor delivers the lecture material (not just confined to the textbook!). Only 5% of professors purposefully revisited the lesson’s learning objectives during the lecture, giving students important “signposts” for where they were in the arc of the lesson.
- Checks for understanding: Just 25% of professors worked at least one explicit check for understanding into their lectures. This was, in my opinion, the most disappointing result: three out of every four lecturers, according to these data, are not giving students a structured opportunity to monitor their own learning during lecture. In almost every case, the lecturers who did use CFUs simply posed a quick set of quiz questions in the middle of a lecture that all students were asked to answer (which is easy, quick, and effective -- other professors, take note!).
- Real-world connections: Almost every professor (85%) included some sort of clinical correlation in their lecture, allowing students to draw concrete connections between difficult-to-grasp biochemical pathways and clinically recognizable diseases and disorders. As someone who thinks immediately of that one Netter pellagra drawing every time I hear the word “nicotinic acid,” I was thankful for this result, and can probably attribute a lot of my learning to the fact that professors are explicitly making these connections during lecture. But let’s bump this number up to 100%!
How many professors, out of the twenty, included all three hacks in their lectures? Only two (and yes, one of them was Dr. Ronner). While nearly every professor (19/20) I checked out was using at least one of the three techniques, 12 of those 19 were using only one of the three, meaning almost every professor studied could easily insert at least one more of these quick “hacks” into their lectures to markedly improve student understanding and engagement. From my perspective, this looks like great news -- it’s the equivalent of a professional basketball player being told she could improve her vertical by just lacing up her sneakers a bit differently. A quick, easy, and high-yield fix for all involved.
Our professors are all excellent scientists and accomplished researchers. But let’s push them to make their pedagogical practices as strong as possible, too.
Read more at www.physicianexecutiveleadership.com!
About The Author
Paul Leo is a medical student in the Sidney Kimmel Medical College Class of 2019.