Tag Archives: physics education research

Khan Academy and the Effectiveness of Science Videos

This must-watch video is from our friend Derek Muller, physics educator and science video blogger.

 

Derek writes:

It is a common view that “if only someone could break this down and explain it clearly enough, more students would understand.” Khan Academy is a great example of this approach with its clear, concise videos on science. However, it is debatable whether they really work. Research has shown that these types of videos may be positively received by students. They feel like they are learning and become more confident in their answers, but tests reveal they haven’t learned anything. [ed. note: textbook definition of pseudoteaching]

The apparent reason for the discrepancy is misconceptions. Students have existing ideas about scientific phenomena before viewing a video. If the video presents scientific concepts in a clear, well-illustrated way, students believe they are learning, but they do not engage with the media on a deep enough level to realize that what is presented differs from their prior knowledge.

There is hope, however. Presenting students’ common misconceptions in a video alongside the scientific concepts has been shown to increase learning by increasing the amount of mental effort students expend while watching it.

References

My Ph.D. thesis, which includes the content from the publications below, can be downloaded here: Designing Effective Multimedia for Physics Education

2008 Muller, D. A., Sharma, M. D. and Reimann, P.
Raising cognitive load with linear multimedia to promote conceptual change. Science Education, 92(2), 278-296.

2008 Muller, D. A., Bewes, J., Sharma, M. D. and Reimann, P.
Saying the wrong thing: Improving learning with multimedia by including misconceptions. Journal of Computer Assisted Learning, 24(2), 144-155.

2008 Muller, D. A., Lee, K. J. and Sharma, M. D.
Coherence or interest: Which is most important in online multimedia learning? Australasian Journal of Educational Technology, 24(2), 211-221.

2007 Muller, D. A., Sharma, M. D., Eklund, J. and Reimann, P.
Conceptual change through vicarious learning in an authentic physics setting. Instructional Science, 35(6), 519-533.

The implications of Derek’s research, both for online science videos and for in-the-classroom science lessons, are obvious. Derek discussed his PhD research in more detail in his previous post “What Puts the Pseudo in Pseudoteaching?” You can find more of Derek’s videos at Veritasium.com or on the Veritasium YouTube Channel. Follow him at @veritasium on Twitter.

What Puts the Pseudo in Pseudoteaching?

Today we have a guest post from Derek Muller, a physics educator who runs the science video blog Veritasium.  Derek is @veritasium on Twitter.

I have made some great pseudoteaching – but it was all in the name of research, let me assure you.

My interests in physics, education, and film converged in a doctoral dissertation at the University of Sydney starting in 2004. Since nearly all forms of education involve multimedia presentations in some form (e.g. a lecture with pictures, an illustrated text, an animation with narration, etc.), I proposed that, by studying this confined unit, we can learn some of the fundamental mechanics of teaching and learning which are at play in broader contexts. My central research question was:  how does one design effective multimedia to teach physics?

I made an eight-minute video on Newton’s First and Second Laws and it had all the hallmarks of outstanding pseudoteaching. Here’s a short excerpt from the video:

1. Looks like good teaching

  • The script was written as clearly and concisely as possible.
  • Ideas were demonstrated with concrete examples.
  • Animations were added to highlight the salient features of the examples.
  • Graphs for the motion were provided and explained with narration.
  • Research-based principles for multimedia design (developed by Richard Mayer and others) were adhered to.

2. Students feel like they are learning

  • Students were pre- and post-tested, and a small group was interviewed.
  • Students reported higher confidence in the correctness of their answers on the post-test.
  • To describe the video, students used phrases like ‘simple’, ‘clear and concise’, ‘easy to understand,’ and ‘a good review’.

3. Very little learning takes place

  • For students with no high school physics, the average pre-test score was 6.0 out of 26.
  • The average post-test score, administered immediately after the video, was 6.3 (on the same questions).
  • Some students told me that they saw their (alternative) conceptions presented in the video (e.g., “The force of her hand was greater than the force of friction, so the book could slide with constant velocity”).
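To put those score changes in context: physics education researchers commonly summarize pre/post results with Hake’s normalized gain, the fraction of the possible improvement actually achieved. A minimal sketch using the numbers from this post (the formula is the standard one; Derek’s thesis may report its statistics differently):

```python
def normalized_gain(pre: float, post: float, max_score: float) -> float:
    """Hake's normalized gain: improvement achieved divided by improvement possible."""
    return (post - pre) / (max_score - pre)

# Exposition group: 6.0 -> 6.3 out of 26
g_exposition = normalized_gain(6.0, 6.3, 26)  # 0.015 -- essentially no gain
# Dialogue group (reported later in this post): 6 -> 11 out of 26
g_dialogue = normalized_gain(6.0, 11.0, 26)   # 0.25 -- a real, measurable gain
```

A gain near zero, as in the Exposition case, is the quantitative signature of pseudoteaching: polished presentation, no movement on the test.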

Why I think lectures and videos so often amount to pseudoteaching

1. They are outside the zone of proximal development (ZPD)

  • Physics involves many interacting concepts. If students don’t have deep, well-defined ideas about these concepts, the lecture will be well beyond their ZPD (and that is before we consider mathematical ability, misconceptions, etc.)

2. Misconceptions cause mis-perception

  • For example, a misconception about acceleration – perhaps thinking of it as velocity – would mean a student is incapable of accurately perceiving what the lecturer is saying. Furthermore, if the lecturer is saying it in a clear, casual way, the student will think they understand it and that it corresponds to what they are thinking already.

3. Misconceptions cause proactive interference

  • Proactive interference is a construct from cognitive science. It is a term for when a previously learned idea/behavior interferes with a newly learned idea/behavior. I experienced this when I moved to Australia because here the light switches flick down for on and up for off.
  • Furthermore, this means that even if students have a ‘breakthrough’ they may revert to older ideas, days or weeks later. Just as I kept turning lights off, and turning on the windshield wipers (when trying to indicate) long after I knew what I really should be doing.

4. Lack of motivation and/or attention

  • Sometimes we all tune out. If the information does not pass our sensory buffer, it can have no effect on cognition.

5. No opportunity to ask questions

  • It is impossible to ask questions of a video and difficult to do in a lecture setting.

So what can be done to increase the effectiveness of multimedia presentations?

1. Make sure students are in the zone of proximal development

  • It is important that students have a strong understanding of the prerequisites.
  • It is also important that the educator knows the alternative conceptions prevalent in his/her audience. Having misconceptions puts students outside the ZPD even if their other prerequisites are strong.

2. Help the viewer correctly perceive the presentation by starting with the misconceptions

  • If the ideas students actually hold are presented first, they will perceive them correctly. This can then serve as a starting point for explaining how the scientific concept differs.

3.  Counter proactive interference by using previous conceptions as footholds

  • By tying into the students’ prior knowledge, the misconception acts as a conceptual peg on which the scientific knowledge is hung. According to studies on proactive interference (and science education research), the misconception is robust and likely to be recalled – so it is important that the scientific idea is closely tied to it.
  • The misconception should be discussed for its own merits – why is this idea so common? In what ways does it correctly reflect observations of the world? In what specific ways does it lead to inaccurate reasoning?

4. Make the presentation short and interesting. Use activities, questioning, reflection, etc. around the presentation.

  • This should help keep attention and motivation.
  • Much of the learning would take place during the reflection activities.

I called the multimedia on Newton’s First and Second Laws outlined above the Exposition. I made two additional films, each of which included common misconceptions. One, called the Dialogue, presented the misconceptions as the genuine beliefs of one of the actors. Through discussion with the tutor character, these misconceptions were resolved. The other, called the Refutation, consisted of the same material as the Exposition plus the misconceptions stated and refuted. Here is a short excerpt from the Dialogue:

After watching one of the misconception treatments, students’ confidence in the accuracy of their post-test answers improved about the same amount as after watching the Exposition. It seems watching any short instructional segment improves confidence. But in interviews, they were more likely to say the video was ‘confusing’ or ‘hard to understand’. So how much did they learn? Scores on the post-test were significantly higher than for the Exposition treatment. In fact, students with no high school physics who watched the Dialogue nearly doubled their average score, from 6 to 11 out of 26 (the Refutation was similar but not quite as impressive).

Even more interesting was how much mental effort students reported investing in watching the multimedia treatment. Students who watched the Exposition reported an average of about 5 out of 9 (‘neither low nor high mental effort’), whereas those who watched the Dialogue averaged 6 out of 9 (‘rather high mental effort’).  Depending on what is presented, students watch it in a different way (perhaps more actively), and that determines how much learning occurs.

How does this view help us understand teaching and learning more broadly?

For one thing, I think it shows that pseudoteaching is audience dependent.

In the discussion above I mainly used data from the Fundamentals stream – students with no high school physics background. Students in the Advanced stream (these are students who did well in high school physics) achieved the same gains across all multimedia treatments. Any ceiling effect would have been slight because their average post-test score was 85%.

Another pseudoteaching post mentioned how Feynman’s lectures became populated with graduate students and faculty. This is exactly the kind of audience for whom the lectures would not be pseudoteaching. These learners would:

  • Be in the Zone of Proximal Development.
  • Have few misconceptions (many fewer than undergraduates).
  • Have better formed schemas so proactive interference has less impact.
  • Be intrinsically motivated by physics and therefore very attentive to the presentation.

There is a remark often made at science education conferences, usually with a chuckle, “Can’t learn anything from these talks because you know we learn nothing from a lecture.” I hope everyone recognizes the problem with statements like these. We can learn from presentations. What and how much we learn comes down to the level of the presentation, our existing schemas and misconceptions, and our motivation and attention.

Full disclosure

I have the excellent fortune to rarely teach a class of more than 14 students. Most are very bright and keen and I have virtually no discipline issues. I know every student by name and one of my mottos is “never say anything a student could say for you.” My classes are much more a discussion than a lecture and I definitely feel like this is the best method for teaching and learning.

The point of this post is not to promote one-way presentations or video lectures. It is to raise the level of discussion about multimedia (and about teaching and learning more generally). I think the transmission/construction dichotomy is unproductive and misleading. It creates a very narrow view of education (like Animal Farm: “Four legs good, two legs bad,” “hands on good, hands off bad,” “doing good, listening bad,” “newfangled good, traditional bad,” etc.). Does constructivism really support hands-on, doing, not telling? I’m not sure it does. Constructivism says ‘learners construct their own understanding actively, by thinking,’ but it does not say how this can best be facilitated. Listeners and viewers are not necessarily passive. I argue that what is presented determines how the presentation is viewed, which in turn determines how much learning occurs.

References

My Ph.D. thesis, which includes the content from the publications below, can be downloaded here: Designing Effective Multimedia for Physics Education

2008 Muller, D. A., Sharma, M. D. and Reimann, P.
Raising cognitive load with linear multimedia to promote conceptual change. Science Education, 92(2), 278-296.

2008 Muller, D. A., Bewes, J., Sharma, M. D. and Reimann, P.
Saying the wrong thing: Improving learning with multimedia by including misconceptions. Journal of Computer Assisted Learning, 24(2), 144-155.

2008 Muller, D. A., Lee, K. J. and Sharma, M. D.
Coherence or interest: Which is most important in online multimedia learning? Australasian Journal of Educational Technology, 24(2), 211-221.

2007 Muller, D. A., Sharma, M. D., Eklund, J. and Reimann, P.
Conceptual change through vicarious learning in an authentic physics setting. Instructional Science, 35(6), 519-533.

Afterword

Derek’s Veritasium videos are crafted using the results from his research. Here’s a great example:

Be sure to check out the entire collection at Veritasium.com and at the Veritasium YouTube channel. I would like to add that Derek’s results are important and should inform our face-to-face class discussions as well.

Increasing Engagement in Science

As part of a session on innovative practices in science at TeachMeet New Jersey 2011, I gave a presentation entitled “Tips, Tools, and Techniques for Increasing Engagement in Science.”

I have posted that presentation, complete with speaker’s notes and plenty of links to further information, here: http://bit.ly/EngageSci

Any feedback you have would be greatly appreciated! (e.g., is there a bigger theme I am missing, etc.) Thanks!

Reassessment Experiment

CV.3 (A) I can solve problems involving average speed and average velocity.

That learning goal is the thorn in the sides of many of my students right now.

They took their midterm exam last week, and many missed the question associated with that goal. The (A) denotes that it is a core goal, which means that, under my grading scale, their quarter grade cannot go above 69 until all core goals are met.
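That cap acts like a gate on the quarter grade. A minimal sketch of the rule (the function name and 0-100 grade scale are my assumptions for illustration, not the actual gradebook logic):

```python
def quarter_grade(raw_grade: float, core_goals_met: bool) -> float:
    """Cap the quarter grade at 69 until every core (A) goal is met."""
    CORE_CAP = 69  # assumed ceiling from the grading scale described above
    if not core_goals_met:
        return min(raw_grade, CORE_CAP)
    return raw_grade

# A student with a raw 85 who still hasn't met CV.3 sits at 69
# until they demonstrate the core goal on a reassessment.
```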

I handed the exams back in class yesterday.  Naturally, many students wanted to reassess on the spot. Since I have an archive of quizzes from previous years, it was easy for me to print out a bunch and let them have at it.

And most of them missed it again on the reassessment. No surprise there, really. Without any remediation, it was just another shot in the dark.

So as an experiment, I posted the following to our class’s Edmodo page today:

Does CV.3 have you down? If so, do the following by Monday:

(1) Explain, in detail, the difference between average speed and average velocity. Simply writing the two equations won’t be sufficient.

(2) Describe in detail a situation where an object’s average speed and its average velocity have the same value.

(3) Describe in detail a situation where an object’s average speed and its average velocity have different values.

(4) Create your own physics problem involving average speed and average velocity that is NOT a simple “plug-and-chug” type problem. (For example, “A car travels 50 miles north in 2 hours. What is its average speed and velocity?” is NOT acceptable.) Write up both the problem and a complete solution. Feel free to use pictures, graphs (even video) as part of your problem. Check out this link for non-“plug-and-chug” problem types: http://tycphysics.org/TIPERs/tipersdefn.htm

(5) Cite all resources (classmates, parents, books, web pages, videos, etc.) you used. (It doesn’t have to be in proper MLA format. A simple list is fine.)

Submit your work HERE on Edmodo. You should upload a file (Word, PDF, etc.). The work must be YOUR OWN. I can tell when “collaboration” is really copying.
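For readers rusty on the distinction behind CV.3: average speed is total path length over time, while average velocity is net displacement over time. A quick one-dimensional numerical sketch (my own illustration, not part of the assignment):

```python
def average_speed(legs, total_time):
    """Average speed: total distance traveled (path length) divided by total time."""
    return sum(abs(d) for d in legs) / total_time

def average_velocity(legs, total_time):
    """Average velocity (1-D): net signed displacement divided by total time."""
    return sum(legs) / total_time

# A round trip: 100 m east in 10 s, then 100 m back west in 10 s.
legs = [100, -100]
print(average_speed(legs, 20))     # 10.0 m/s -- 200 m of path was covered
print(average_velocity(legs, 20))  # 0.0 m/s -- the object ended where it started
```

This is exactly the kind of situation item (3) of the assignment asks students to describe: the two quantities agree only when the motion never reverses direction.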

I hope this provides both the necessary remediation and a unique opportunity to reassess beyond simple quiz questions. I am really excited to see what kind of problems they write. I have done student problem writing in the past, but was never pleased with the results. Perhaps by requiring them to create a TIPER problem, we can push past equation memorization and towards understanding.

This scenario has also raised a few more unanswered questions: Why do I have this goal in my course in the first place? Why do my students keep missing it even though all quizzes (and the midterm) are open notebook? And if so many students are missing it, is it really a “core” goal?

Some Resources for New Physics Teachers

In a comment from an earlier post, Matt Wasilawski writes:

Thank you very much for these posts, I am looking forward to using them in physics. I have been teaching Earth Science and AP Environmental Science for the past 10 years. I was assigned to teach Physics this year. I was hoping that you could direct me to more specific modeling suggestions for topics in Physics. I do not have a strong background in Physics but have been working hard to develop my knowledge base.

Here are some of my resource recommendations to help new physics teachers with planning and instruction:

Get yourself a copy of Randy Knight’s Five Easy Lessons: Strategies for Successful Physics Teaching. He discusses the best in physics education research, describes several methods for interactive engagement, and goes through a typical physics course unit-by-unit with lesson plan ideas and places where students have misconceptions and stumbling blocks. Every physics teacher should have this book because we all should be incorporating more teaching strategies based on physics education research.

[Photo caption: Walking in front of motion detectors to kinesthetically match graphs of motion, a technique highly recommended by physics education research.]

The K-12 Physics standards by Heller and Stewart have lesson plan ideas and activities which are founded on physics education research.

The ASU Modeling Website has their Mechanics curriculum available for download, including teacher notes.

Mark Shober is a modeler who put all his materials on his class website. It’s tied to his class calendar, which makes it great for pacing.

And lastly, there is the Physics Classroom website. While it doesn’t mesh perfectly with modeling, it is much better than the most widely used physics textbook. The website has online readings and animations for you and your students, worksheets with links to the corresponding online readings, problem sets with audio solutions, labs, rubrics, and objectives. There are also Minds-On Physics modules, which are good for formative assessment.

I know there are many more, but these are the ones that stick out in my mind as being most helpful.

To my more experienced readers: Leave your favorite resources for new physics teachers in the comments!