Category Archives: Uncategorized

Toilet Paper Roll Drop 2012

The challenge:

Students had to calculate the ratio and then test their prediction by dropping the toilet paper rolls.

One group used rotational and translational dynamics.

Two other groups used energy conservation.

All 3 groups got the same ratio. But was it correct? Here’s the video from this year’s drop:

You can read about last year’s drop here.

Physics of Angry Birds Lesson on CUNY-TV

Many thanks to Ernabel Demillo and the crew of Science and U!

You can read more about how we use Angry Birds in class here:
Angry Birds in the Physics Classroom

A Mistake Made in Haste

In my previous post, I was so enamored with a group’s creation of the average velocity step graph that I neglected to check that group’s math.

Turns out they calculated the average velocity using the total distance rather than the interval distance. In other words, they simply took the distance column and divided by the time column. (Just goes to show how fragile students’ knowledge is, and how subtlety and nuance are difficult for students to grasp.)

So I graphed their data again, this time using steps that all start at t = 0 (in black below). The steps overlap, which made it hard for me to see the “best fit line” in this case (also in black), though this time there is no intercept.

In red, I calculated and graphed the interval average velocity that I thought they had done originally. Yikes! The average velocity is all over the place. Small timing errors seem to have a much bigger effect for the interval velocities.
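To see exactly how the two calculations diverge, here is a minimal Python sketch with made-up constant-acceleration data (the acceleration value is hypothetical, not the group’s actual measurements):

```python
# Hypothetical battery roll: starts from rest with constant acceleration a,
# position sampled once per second (x = 1/2 * a * t^2).
a = 0.4                                    # m/s^2 (made-up value)
times = [1.0, 2.0, 3.0, 4.0, 5.0]          # s
positions = [0.5 * a * t**2 for t in times]

# The group's mistake: total distance divided by total time.
v_cumulative = [x / t for x, t in zip(positions, times)]

# The intended calculation: interval distance over interval time.
v_interval = []
prev_x, prev_t = 0.0, 0.0
for x, t in zip(positions, times):
    v_interval.append((x - prev_x) / (t - prev_t))
    prev_x, prev_t = x, t

# v_cumulative grows with slope a/2 (0.2, 0.4, 0.6, ...): a straight line
# with no intercept, but NOT the instantaneous velocity.
# v_interval grows with slope a (0.2, 0.6, 1.0, ...) and equals the
# instantaneous velocity at each interval's midtime.
print(v_cumulative)
print(v_interval)
```

The mistaken column still looks like a perfect line, which is exactly why the error was easy to miss; it just has half the true slope. The interval column has the right slope but, with real stopwatch data, each point inherits the timing error of its short interval, which is why the red graph was all over the place.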

I still like the activity, but I don’t want to make it more complicated than necessary, especially with errors in the interval velocities.

Now what should I do next year? Have everyone do it their own way first and then repeat data collection having everyone use time as independent variable? Skip the velocity graphing altogether (my original intention, until I saw the students’ step graph)?

Thanks for your help!

A Graph to Visualize Average Velocity

Note: This is an expansion of today’s Noschese 180 post. I thought it was too good not to share here.

We started Constant Acceleration in college-prep today. Rather than dive right in with carts and motion detectors, I propped up one end of a lab table with textbooks (best use ever) and let a C-battery roll down. (Batteries accelerate more slowly than marbles and Hot Wheels cars. They also roll much straighter.)

“What do you observe?” I asked.

“It rolls down and gets faster,” they said.

“Prove it. You have 10 minutes,” I challenged them. I hate prescribing directions for activities like this. I want to see how my students approach these tasks.

They wanted stopwatches and metersticks. Some wanted tape.

One group wisely rolled the battery down a whiteboard and left marks at one-second intervals. They were done in 2 minutes.

The other groups marked out equal distance intervals to time with a stopwatch. Most groups made data tables to show that it takes less time to travel each successive distance interval, thereby showing the battery continuously increases in speed.

Many groups added a velocity column and calculated the “velocity” for each interval to show it changes. (But velocity when? Where? Average? I didn’t want to go down that road just yet. I just let it be.)

Some groups went further and also made distance-time graphs of their data to show the slope increases.

Two groups went even further and added an average-velocity step graph like this one:

It was beautiful. And something I had never considered doing.

***

You see, over the years, I’ve tried a variety of acceleration labs. Kids would collect position-time data and make position-time and velocity-time graphs. And getting the velocity-time graph was always laborious. Here are some methods I’ve tried in years past…

Method 1: Manually draw tangent lines on the position-time graphs. Calculate and graph the slopes of the tangent lines. (Tedious)

Method 2: Use the slope tool in Logger Pro to get the slope of the tangent at each data point. Graph the slopes of the tangent lines. (Computer issues)

Method 3: Kids calculate the average velocity for each distance/time interval. Tell them to graph it at the midpoint in time. This typically involved a lot of hand waving b/c kids didn’t quite understand why to plot at the midtime rather than the end time. And I’d still have groups that would incorrectly graph the average velocity at the end time. One time I made a data table worksheet to avoid this issue — but it was a scary table with rows in between rows for midtime data.

Method 4: Method 3, but using Excel (OMG, what was I thinking?)

***

The average velocity step-graph method is perfect. It doesn’t matter how the students took the data. They calculate the average velocity for each interval, then graph each average velocity as a step that is as long as the interval. No need to handwave about midtimes. No need to assume the acceleration is constant.

The board pictured above inspired me, so I had all groups make their own average velocity step graph as well, just to see if it would work.

“Is this how the velocity-time graph really looks?” I asked.

“No. There wouldn’t be any steps. It would be a line. Or a curve,” they said.

They made the leap on their own to draw a line through the steps. And, lo and behold, the “best fit line” cuts through the middle of each step — the midtime.

You can’t miss it. A great visualization.

Kids who took data at equal time intervals had equal-sized step-widths and step-heights. Kids who took data at equal distance intervals had unequal step-widths and step-heights (the steps got narrower and shorter over time — which in a data table looks like non-constant acceleration). But the line still cut through the midtime of each step. Now we can talk about why that happened and what that means AFTER, rather than all the handwaving and number crunching first.
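Why the best-fit line cuts each step at its midtime is a one-line calculation: for constant acceleration, the average velocity over any interval equals the instantaneous velocity at the interval’s temporal midpoint.

```latex
% With v(t) = v_0 + at and x(t) = x_0 + v_0 t + \frac{1}{2} a t^2:
\bar{v}_{[t_1,\,t_2]}
  = \frac{x(t_2) - x(t_1)}{t_2 - t_1}
  = v_0 + \frac{a\,(t_1 + t_2)}{2}
  = v\!\left(\frac{t_1 + t_2}{2}\right)
```

So every step, no matter its width, touches the true velocity-time line exactly at its midtime, which is what the students’ best-fit line makes visible.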

Several graphs also got a y-intercept, which we chalked up to reaction time error.

I love it when I learn from kids!

UPDATE: There’s a mistake in the step-graphs here. Read my follow-up post “A Mistake Made in Haste.” Sorry!

You Khan’t Ignore How Students Learn

From Harvard EdCast’s “The Celebrity Math Tutor” (transcript below)

Buffy Cushman-Patz: What efforts do you take to ensure that your pedagogy is consistent with what education research shows about how people learn, especially how people learn math and science?

It’s unfortunate that “The Teacher to the World” was only able to mention one study about how students learn. A study which he then dismisses. And since he doesn’t describe any other efforts to be consistent with pedagogy, his real answer to Buffy’s question is: “I don’t.”

Let’s look at Khan’s response in more detail:

“Now we are getting pretty deep on our own analytics on our website.”

I don’t see how statistics about how many times students have watched/rewound each video or how many times students miss a question in the exercises tells us anything about how effective his videos are. I don’t see how he could use that data to refine his future videos in the same way a teacher would reflect and refine lessons from year-to-year.

“…you can’t come up with these rules the way, all teaching has to be done like this.”

He’s right. There is no one rule, no one formula, for teaching. The Physics Education Research User Guide website contains 51 different research-based teaching methods. The website can filter these methods by type, instructional setting, course level, coverage, topic, instructor effort, etc. And while 51 different methods may seem overwhelming, they all have one important characteristic in common: interactive engagement (IE).

So what is interactive engagement? Hake defines IE as methods “designed at least in part to promote conceptual understanding through interactive engagement of students in heads-on (always) and hands-on (usually) activities which yield immediate feedback through discussion with peers and/or instructors.”

A video lecture is not interactive engagement.

“…maybe you explain once and you reemphasize that this goes against misconception A, B, C, or D.”

Khan (along with most of the general public, in my opinion) has this naive notion that teaching is really just explaining. And that the way to be a better teacher is to improve your explanations. Not so! Teaching is really about creating experiences that allow students to construct meaning.

“And I think frankly, the best way to do it is you put stuff out there and you see how people react to it…”

This is flawed. People’s reactions are not indicators of effectiveness. Pre/post testing is needed to indicate effectiveness. Ah, but perhaps there is a relationship between people’s reactions and effectiveness? The research indicates otherwise. In the very research study that Khan says is valid (and then dismisses), students actually did better after watching the videos they described as confusing, and made no gains after watching the videos they described as easy to understand. Additional research indicates that when an instructor switches over to IE methods, course evaluations from students tend to be more negative than the previous year, even though student learning gains go up. (Don’t worry, a few years after the switch to IE, the evaluations go back to pre-IE levels.)

“You see, the comments they put, they’ll ask questions based on… Every time I put a YouTube video up, I look at the comments — at least the first 20, 30, 40 comments that go up — and I can normally see a theme: that look, a lot of people kind of got the wrong idea here. Or maybe some people did, and then I’ll usually make another video saying ‘Hey, look, after the last video, I read some of the comments and a lot of y’all are saying this is not what we’re talking about, it’s completely different.’ So that means I am attacking the misconceptions.”

Again, it’s not about crafting better explanations. It’s about helping students wrestle with their conceptions and guiding them.

“But I think if you had a formula in place, and you do that every time, I think once again the learner will say, ‘This guy’s not thinking through it and he’s not teaching us his sensibilities, his thought processes. He’s just trying to meet some formula on what apparently is good video practice.’”

Another naive notion of teaching. The goal is not for the teacher to transmit his sensibilities and thought processes to the students. The goal is for students to use their own sensibilities and thought processes to reason through the concepts. Empower students to think for themselves rather than consume the teacher’s ideas.

“And I’ll go the other way: you can dot all the “i”s and cross all the “t”s on some research-based idea about how a video should be made, but if your voice is condescending, if you’re not thinking things through, if it’s a scripted lecture, I can guarantee you it’s not going to appeal with students.”

Yet there are plenty of people who prefer to watch Walter Lewin’s highly-scripted performance lectures to Khan’s off-the-cuff style lectures. (Though remember that preference has nothing to do with effectiveness. In fact, Lewin’s showstopping lectures were no more effective than the mundane professors before him.)

“…and I think in general, people would be doing a disservice if they trump what one research study does and there’s a million variables there: who was the instructor, what were they teaching, what was the form factor, how did they use to produce it? You’d be doing yourself a disservice if you just take the apparent conclusions from a research study and try to blanket them onto what is really more of an art. It’s like saying that there’s a research study on what makes a nice painting and always making your painting according to that research study that would obviously be a mistake.”

Here is the most damning piece of evidence, from Hake’s famous six thousand student study:

The six thousand students in Hake’s study were not in a single class. They were in 62 different courses, from high school to university, taught by a variety of instructors with different personalities and expertise. And yet ALL the IE courses made greater gains (the slope of the graph — between 0.34 and 0.69) than the traditionally taught courses (average 0.23). It should also be noted that the green IE courses above were NOT identical and did not follow some magic teaching formula. They only had to conform to Hake’s broad definition of IE given above. So you see, those “million variables” that Khan mentions don’t matter. METHOD trumps all those other variables.
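The “gain” Hake plots is his normalized gain, the fraction of the possible pre-to-post improvement a class actually achieves. A minimal sketch, using hypothetical class averages (not Hake’s actual data):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain <g>: the fraction of the possible improvement
    (from pre-test score up to 100%) that the class actually achieved.
    Scores are class-average percentages on a concept inventory."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages, for illustration only:
traditional = normalized_gain(30.0, 46.0)   # 16/70, roughly 0.23
interactive = normalized_gain(30.0, 65.0)   # 35/70, exactly 0.50
print(traditional, interactive)
```

Normalizing by the room left to improve is what lets Hake compare a high school class that starts at 30% with a university class that starts at 60% on the same axis.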

But surely teacher expertise matters, right?

Yes and no.

NO: As seen in Hake’s study above, when comparing IE teachers to traditional teachers, expertise doesn’t matter because IE always trumps traditional.

NO: Note the small spread of the red-colored traditional classes shown above, which hover around an average gain of 0.23. Traditional methods produce very similar results no matter the level of the course or instructor.

YES: When comparing IE teachers to other IE teachers, expertise does matter. IE gains ranged from 0.34 to 0.69. As instructors get more comfortable using IE methods, gain increases. See, for example, this graph about the effectiveness of modeling instruction:

Expert modelers had higher gains than novice modelers.

But surely there is a place for lectures, right?

Yes, BUT students must be “primed” for the lecture. According to the PER User’s Guide FAQ:

It is possible for students to learn from a lecture if they are prepared to engage with it.  For example, Schwartz et al. found that if students work to solve a problem on their own before hearing a lecture with the correct explanation, they learn more from the lecture.  (For a short summary of this article aimed at physics instructors, see these posts – part 1 and part 2 – on the sciencegeekgirl blog.) Schwartz and Bransford argue that lectures can be effective “when students enter a learning situation with a wealth of background knowledge and a clear sense of the problems for which they seek solutions.”

For more information about  how people learn, I highly recommend two great FREE online books from the National Academies Press:

If you are a physics teacher, be sure to get these discipline specific books about how students learn physics:

And just in case you think I’m an armchair critic with nothing to contribute, I want you to know I’ve opened up my classroom to the whole world on my Noschese 180 blog, where I’ve been sharing a picture and a reflection from each school day. It’s not quite the Noschese Academy, but I hope you find it worth reading and commenting, as we journey through teaching together.

A Demonstration of the Ineffectiveness of Traditional Instruction

First, answer this question:

A student in a lab holds a brick of weight W in her outstretched horizontal palm and lifts the brick vertically upward at a constant speed. The force of the student’s hand on the brick is:
A. constant in time and equal to zero.
B. constant in time, greater than zero, but less than W.
C. constant in time and equal to W.
D. constant in time and greater than W.
E. decreasing in time but always greater than W.

Now watch this video. Feel free to pause, rewind, and rewatch as needed.

Finally, answer this question again:

A student in a lab holds a brick of weight W in her outstretched horizontal palm and lifts the brick vertically upward at a constant speed. The force of the student’s hand on the brick is:
A. constant in time and equal to zero.
B. constant in time, greater than zero, but less than W.
C. constant in time and equal to W.
D. constant in time and greater than W.
E. decreasing in time but always greater than W.

Believe it or not, the concept needed to reach the correct answer is given in Khan’s video. Highlight below to reveal:
C. constant in time and equal to W. Why? Since the brick moves at a constant velocity, the net force is zero, so the forces on the brick (the hand’s upward push and gravity) must be balanced.
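In symbols, taking up as positive, Newton’s second law settles it in one line:

```latex
% Constant velocity means zero acceleration, so the forces must balance:
\sum F = F_{\text{hand}} - W = ma = 0
\quad\Longrightarrow\quad
F_{\text{hand}} = W
```

The persistent misconception is that upward motion requires a net upward force; it is the change in velocity, not the velocity itself, that needs one.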

Physics Teaching 2.Uh-Oh

My first talk! Given at the STANYS 2011 Physics Breakfast on November 8th, 2011 in Rochester, New York

Links to resources mentioned in the talk:

A huge thank you to Gene Gordon for inviting me to speak at the breakfast. It was great to share my passions and meet my virtual colleagues face-to-face!

I’d love any feedback you have, positive and negative. Thanks!

Khan’s School of the Future

From Hacked Education:

Khan Academy announced this morning that it has raised \$5 million from the O’Sullivan Foundation (a foundation created by Irish engineer and investor Sean O’Sullivan). The money is earmarked for several initiatives: expanding the Khan Academy faculty, creating a content management system so that others can use the program’s learning analytics system, and building an actual brick-and-mortar school, beginning with a summer camp program.

“Teachers don’t scale,” I remember Sal Khan saying to me when I interviewed him last year. What can scale, he argues, is the infrastructure for content delivery. And that means you just need a handful of good lecturers to record their lessons; the Internet will take care of the rest.

But online instruction clearly isn’t enough, and as “blended learning” becomes the latest buzzword — that is, a blend of offline and computer-mediated/online instruction — Khan Academy is now eyeing building its own school. The money from the O’Sullivan Foundation will go towards developing a “testbed for physical programs and K-12 curricula,” including an actual physical Khan Academy school.

What might Khan’s “school of the future” look like?

In his video interview with GOOD Magazine, Khan said:

As far as the future of learning is concerned, the school is going to be one or two really big classrooms, and because everyone can work at their own pace, we are going to see the best be a higher bar and you’re going to see everyone having access to that and they can move up with the best.

One or two large classrooms where everyone works at their own pace? That sounds a lot like Rick Ogston’s Carpe Diem school:

Matt Lander reports the following from his visit to Carpe Diem:

Carpe Diem is a hybrid model school, rotating kids between self-paced instruction on the computer and classroom instruction. Their building is laid out with one large computer lab, with classroom space in the back. They had 240 students working on computers when I walked in, and you could have heard a pin drop.

Carpe Diem has successfully substituted technology for labor. With seven grade levels and 240 students they have only 1 math teacher and one aide who focuses on math. [emphasis mine]

Carpe Diem also touts that it gets great results with less per-pupil spending. How? Well, as implied above, they have fewer teachers and staff. Also, take a look at the Carpe Diem Parent/Student handbook and you can see why: they have NO nurse and NO food service. Other ways I bet they save money, compared to a regular public school: there is very little equipment to buy (aside from computers and furniture) — no art supplies, no science labs, no physical education equipment, etc. It seems Carpe Diem also lacks special programs: no special education, no athletics, and no performing arts. We also have the problem of using standardized test scores to measure success. I think what is more important is: How many students are successful in college? How many stay on past freshman year?

Carpe Diem is really an online school that also has a few brick and mortar campuses. The curriculum they use for both their virtual and physical schools is called e2020. From e2020’s website:

e2020 then designs each lesson with student-centered objectives that maximize the use of Bloom’s Taxonomy of Learning Domains. Lessons are designed in order to provide the student with an optimal learning experience that is unique for each course.  Students progress through the lesson with a series of activities such as, direct instruction videos by certified teachers; vocabulary instruction: interactive lab simulations; journals and essay writing; 21st century skills; activities that include projects, design proposals, case studies, on-line content reading; and homework/practice before being formatively assessed with a quiz. Topic test and cumulative exam reviews are provided to reinforce mastery prior to students’ taking summative assessments.

So the kids work through the modules at their cubicles and can seek out extra help at “workshops.” You can test some of the modules if you register here. For the science modules I tested, there are periodic multiple-choice assessments. The in-module labs are all simulations — no manipulation of any physical equipment. It seems kids can pass the state exams based on their module work, but I wager they will be severely ill-equipped for college or the real world, especially in STEM fields.

My issues with this blended/hybrid model of school:

• The conception of learning seems to be isolated, rather than group.
• It appears to teach/assess mostly low-order practices.
• I can’t see how physics and chemistry could be done well, and thus contribute to developing the STEM workforce.
• How can ONE teacher be versed in pedagogically appropriate ways of helping students across SEVEN grade levels?

Blended learning schools like Carpe Diem pale in comparison to what schools like High Tech High are doing:

Where would you rather go to school?

Newton’s 3rd Law (or How to Make Effective Use of Video for Instruction)

Exhibit A:

Exhibit B:
Download the high-quality video clips for each collision.

Discuss.

Same Planet, Different Worlds

What is the future of learning?

Vision #1: Doing Old Things in New Ways

Vision #2: Doing New Things in New Ways

(Thanks to David Smith of the Da Vinci Discovery Center of Science & Technology for bringing the Cyberlearning video to my attention)