Quizzes vs. Projects (Mass & Weight Edition)

Tests are evil, let them do projects.

That type of rhetoric frequently appears in my Twitter stream. My gut reaction is hell yeah. But some recent quiz results have gotten me thinking ….

Take, for example, this learning objective:
The student understands the difference between mass and weight.

Here’s a student project (not mine) which clearly addresses the objective.

Here’s another project (also not mine). This one is very creative and totally adorable.

But those two projects are really just rehashes of the traditional explanation of the difference between mass and weight: “mass is the amount of stuff an object has and doesn’t change, while weight is the gravitational pull on an object and can change depending on location.” I wonder what would happen if those two students encountered quiz questions like the ones below. Would they make the same mistakes several of my students did? Even though my students can parrot back the difference between mass and weight (like in the videos above), I don’t think they really understand that difference if they miss these types of quiz questions:

[Three scanned quiz questions probing the mass/weight distinction]
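The distinction those questions probe can be made concrete with a quick calculation: mass is invariant, while weight W = mg depends on the local gravitational field strength. A minimal sketch (the g values are rough approximations):

```python
# Mass vs. weight: mass is an invariant amount of matter; weight is the
# gravitational force on that matter, W = m * g, so weight changes with location.
def weight(mass_kg, g):
    return mass_kg * g  # newtons

g_local = {"Earth": 9.8, "Moon": 1.6, "deep space": 0.0}  # m/s^2, approximate

mass = 10.0  # kg -- the same everywhere
for place, g in g_local.items():
    print(f"{place}: mass = {mass} kg, weight = {weight(mass, g):.0f} N")
```

The same 10 kg object weighs 98 N on Earth, 16 N on the Moon, and nothing in deep space.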

I did find one project where a student (again, not mine) gives a thorough explanation and uses several examples. I predict this student would be able to answer those quiz questions.

What I’m trying to say is that I feel that teacher-generated questions and experiences (quizzes, labs, whiteboard problems, etc.) are important because they challenge students to think and apply in ways they probably wouldn’t if we just left them to their own devices.

But I also get that projects let students be creative and allow them to demonstrate their understanding in ways that quizzes simply can’t.

Perhaps the answer is just “all things in moderation.” Or perhaps the project parameters need improvement so students aren’t simply reciting Wikipedia definitions from a PowerPoint? Or something else?

What are your thoughts?

Edtech PR Tips

I’m not a PR guy. I’m just a teacher. But they say that if you want to be a disruptor, the best experience is no experience. So here goes…

1. It’s not about the technology. It’s about what students are empowered to do because of your technology. Show us how you take students beyond what they could do previously. Show student work (“Hey, look what this kid can do!”). Stop focusing on checkmarks, badges, data, dashboards, and slick UI.

2. Learning is social. Show students interacting with each other, questioning, helping, constructing — all as a result of using your technology. Don’t show kids glued to screens, headphones on, working en masse and in isolation. It’s creepy.


The Learning Lab at a Rocketship school, where students spend 2 hours each day.

3. Don’t use phrases that signal you have simplistic views about teaching and learning. In particular: learning styles, digital natives, individualized instruction, and content delivery.

4. Practices are just as important as content. Show how you enable students to engage and grow in the core practices in math, science, and ELA.


Credit: Tina Cheuk, tcheuk@stanford.edu [PDF (scroll to bottom)]

5. Show how you implement/complement research-based practices about how students learn. Study up on these characteristics of effective teaching methods. Otherwise…

6. Run controlled, peer-reviewed experiments that use conceptual diagnostic tests to measure growth. We know most anything works better than (or as well as) passive lecture instruction. But how does implementation of your technology stack up to other evidence-based teaching methods? And be sure to use conceptual diagnostic tests, not final exams or standardized tests or failure rates. CDTs have been painstakingly researched and designed to measure true conceptual understanding rather than algorithm memorization. Without strong evidence, we’re just skeptical of your claims.


Hake’s analysis of 62 different physics courses as determined by gain on a physics conceptual diagnostic test.
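For context, the “gain” in Hake’s study is the normalized gain: the fraction of the possible improvement a class actually achieves between pre-test and post-test. A minimal sketch:

```python
# Hake's normalized gain: what fraction of the room left to improve
# did the class actually gain?  g = (post - pre) / (100 - pre),
# with pre and post as percentage scores on the diagnostic test.
def normalized_gain(pre_pct, post_pct):
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# e.g. a class averaging 30% before instruction and 65% after:
print(normalized_gain(30, 65))  # 0.5
```

Normalizing by the room left to improve is what lets courses with very different pre-test averages be compared on one axis.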

7. Don’t contradict yourself. Your words should match your actions.

8. Show feedback and testimonials from students. In particular, have students demonstrate their deeper understanding and expert thinking that has been enhanced by using your product. Or perhaps your technology has decreased student anxiety and contributed to a positive classroom climate. However, don’t have students talk about shallow things such as raising grades and doing well on tests.


Testimonials from Pearson/Knewton’s MyEconLab

9. There’s nothing revolutionary about old wine in new bottles. A digital textbook is still a textbook. A video lecture is still a lecture.

10. Read everything Audrey Watters writes. Everything.

Do you have any more edtech PR tips to share? Any more examples of bad PR? Any good examples? Thanks!

Convincing Reluctant Teachers

This question was posted to Twitter today:

Question: how do you convince teachers who are ADAMANT that they teach to the rigor required by CCSS that they really don’t?

(CCSS means Common Core State Standards)

This is a great question. I think it applies to a wide range of situations. You can replace “CCSS” with the Next Generation Science Standards, the new AP Physics 1 and 2 courses, or any curriculum du jour. It all boils down to showing these teachers that traditional teaching methods do not lead students to a deeper understanding of the concepts.

Some folks may suggest showing the reluctant teachers sample test questions from the new assessments. I say stay far away from that. These teachers will likely look for tricks to game the assessments so students can be successful without the in-depth understanding these teachers think they are teaching.

My suggestion is to have the reluctant teachers administer a basic conceptual diagnostic test to their students. The questions are so basic, so easy, the teachers will say “Of course my students can ace this!”

And then wait for the results to come in.

In all likelihood, the students (on average) will do poorly. Amazingly poorly. Even worse than if they had simply guessed randomly.

To which the reluctant teacher responds, “What happened? They should have known all this!”

Now’s your chance. I think now they’ll be more receptive to what you have to say about how students learn math and science and why interactive engagement techniques work.


Here’s Eric Mazur (Harvard physics professor) explaining what happened when he gave his students a conceptual diagnostic test:

(The video is an excerpt from Mazur’s longer “Confessions of a Converted Lecturer” talk.)


Extensive lists of concept inventories can be found at FLAG and NC State. Remember, many of these tests have been painstakingly developed and refined by researchers. Be sure to abide by the developers’ rules for administering the tests: do not post them to the internet or discuss the answers with students.

My Google Reader Alternatives

Google Reader will be ending on July 1st. After searching through several apps and services, I’ve finally settled on a few alternatives I like.

First, I imported my Google Reader feeds into Feedly. This can be done with just one click if you allow Feedly access to your Google Reader account.

However, I hardly ever read posts in Feedly because:

    • I can’t star/favorite posts. (However, I can “bookmark for later.”)
    • I was having issues with the Feedly Android and iOS apps: My feeds wouldn’t sync. The posts I had read would reappear as unread. The apps also lack the option to star/favorite posts.
    • The Feedly Android and iOS apps do not allow for offline reading.
    • Feedly apps didn’t show videos (not even thumbnails) embedded in posts. (Update: Feedly has corrected this problem. Thanks, @Alby!)

Thankfully, there are iOS and Android apps that sync with Feedly and solve all the problems above. Here are my two favorites:

iOS: Newsify (free, no ads)

[Newsify screenshot]

Newsify also shows embedded YouTube videos:

[Newsify screenshot showing an embedded YouTube video]

Android: gReader (free with ads, $4.99 no ads)

[gReader screenshots]

gReader also shows embedded YouTube videos:


Plus, gReader lets you easily subscribe to new blogs via the share option in the browser:


Most importantly, Newsify and gReader play well with each other and stay synced so I can easily read, star, and share posts from both my Android phone and iPad.

Be sure to move your feeds to Feedly before July 1st! Good luck!

The Spirit of SBG

You want to switch to standards-based grading, but, for whatever reason, you cannot. Do not worry. All of the strengths of SBG can be realized within a traditional grading system:
  1. Shift from tracking by chapter to tracking by concept.
  2. Allow opportunities for students to show growth.
  3. Don’t grade homework and practice.
  4. Provide timely and effective feedback.
  5. Spiral concepts throughout the curriculum and your assessments.
  6. Give shorter, more frequent quizzes.
  7. Assess what you value.
  8. Provide clear goals and expectations for performance.
  9. Encourage risk taking, failure, iteration, and experimentation.
  10. Do what works best for your students and your situation.

A traditional system done in the spirit of SBG is much, much better than an SBG system done poorly. (Trust me, I’m speaking from experience!)

Project Work: Group or Individual?

Thanks to Chija Bauer for prompting me to write this post:

For the last several years, I’ve allowed students to work together in groups on their end-of-year projects (a self-designed lab investigation). The rationale was that students would be able to do much more complicated experimental designs with two, three, or four people than with just one. But in the end, I was never satisfied with how it worked out. Often the experiments were simple enough that they could have easily been carried out solo. Or two students actually did the project and then added the name of a non-contributing friend (or two) to the report.

One solution I’ve tried is to require individual reports. This usually ends up with group members submitting identical “individual” reports. Which leads to phone calls, discipline, cries of “I didn’t know we couldn’t do that,” etc., etc. It’s a battle I don’t enjoy fighting, so I don’t find this solution successful for me (though your mileage may vary).

This year, each student must do their own unique investigation. All students are now fully immersed in the experimental design process. Sure, some of the experiments require an extra pair of hands, but students have been enthusiastically helping each other out. Jack might be the cameraman for Jill’s terminal velocity experiment. And then Jill might release the cart at the top of the ramp for Jack’s conservation of energy experiment.

Some students have stated that if they work together to collect data, then they should both be able to analyze that data for their projects. My response to this is that they must have unique data sets. Take Jill’s terminal velocity experiment. She’s looking at the effect of mass on terminal velocity by dropping nested coffee filters. Jack is using a camera to film the falling filters so Jill can analyze the videos in LoggerPro. Now Jack is not allowed to use Jill’s data, but Jack could investigate the effect of surface area on terminal velocity or simply repeat Jill’s experiment using jumbo coffee filters or cupcake wrappers instead. And in the end, Jill and Jack can compare conclusions and come up with a mega-conclusion that ties together both experiments.
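As an aside, Jill’s experiment has a clean quantitative target: if drag on a falling filter grows roughly as v², then at terminal velocity drag balances weight and v_t = √(mg/b), so doubling the nested filters should raise the terminal speed by about √2. A hypothetical sketch (the drag constant b here is made up for illustration):

```python
import math

# Terminal velocity under an assumed quadratic drag model: the filter stops
# accelerating when drag b*v^2 equals weight m*g, giving v_t = sqrt(m*g/b).
def terminal_speed(mass_kg, b=0.01, g=9.8):
    """b lumps together air density, filter area, and shape (illustrative value)."""
    return math.sqrt(mass_kg * g / b)

v1 = terminal_speed(0.001)  # one coffee filter (~1 g)
v2 = terminal_speed(0.002)  # two nested filters
print(v2 / v1)              # ratio is sqrt(2), about 1.41
```

That √2 ratio is exactly the kind of prediction Jill’s mass-vs-terminal-velocity data can confirm or refute.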


Sometimes, however, the project work must be done as a group because that’s the only feasible way. I had to do this in my Conceptual Physics class this year for our model defibrillator circuit project and our modified bike light generator project. I did not have enough equipment (or storage!) for each student to have their own circuit kit or bicycle.

Both of these projects came from the Physics That Works curriculum, and I used their solution to this problem of group project vs. individual work. The solution is that the project has two parts: a group component and an individual component. For example, for one project, each group had to modify a bike light generator so that the headlights would light even when the rider wasn’t pedaling, yet wouldn’t add more batteries to the landfill. For the group portion of the project, students worked in groups to design and build such a circuit for their group’s bicycle. And everyone in the group received the same grade for that part (25% of the overall project grade).


For the individual portion, each person had to submit an annotated circuit diagram (25% of the project grade) and give a mini-presentation to the class (50% of the project grade). I’ve posted my rubrics below:

Even the way the mini-presentations are handled by the authors of Physics That Works is genius. Students are given several choices of topics for their mini-presentation, with two caveats: within a group, no two students can do the same mini-presentation, and two of the group’s mini-presentations must come from the two required topics (the rest come from the elective topics). For example, for the bike light presentations, these are the options:


Ideally, the mini-presentations would be tied together in one large presentation for the whole group, but each student would only be graded on their contribution.


What are your solutions to the group project vs. individual work dilemma?

Labs, Notebooks, and Reports: For What Purpose?

Today was Senior Seminar: a day-long school event where seniors get breakfast, a BBQ lunch, and yearbooks, and attend workshops about upcoming college life. So none of my seniors were in class today, which gave me some time to reflect on how best to use lab notebooks and lab reports next year.

You see, this year in college-prep physics, students recorded lab work in spiral-bound graph-paper notebooks. They taped a rubric next to each lab. I collected their notebooks, lugged them around, marked their rubrics, and returned their notebooks. All 51 of them. For each lab. (I could have simply collected one notebook from each lab group, since the other notebooks in the group were usually identical — right down to the conclusion, awkward sentences and all.)


I’ve gone through various other incarnations of notebooks, reports, whiteboards, packets, etc. in my 15 years of teaching. My handwritten reflection on what to do next year is below. I think it captures the best of all those previous systems while still maintaining a reasonable workload.

[Photo of handwritten reflection]

  1. I stamp the lab notebooks during class as evidence that the student was present in lab and participating — brief design, measured data, calculations, and graphs. These are the things that will be identical from notebook to notebook anyway. I won’t be picky about proper format because I’d rather have them spend most of their time taking and analyzing data than worrying about the notebook looking picture-perfect. Also, students who are absent would be required to come during a free period or after school to perform the lab. (I’ve never done that before. It could be overwhelming. But I also think it sends the wrong message to students that they can just copy the data from a partner.)
  2. Students write a post-lab reflection. After we’ve had our post-lab class discussion to tease out the concepts, ideas, models, relationships, etc. from lab, I’d ask students to summarize what they’ve learned, what questions they had, and what they found to be (in)effective about the lab. I wouldn’t grade this either, but I think taking the time for solo sense-making and summarizing is important. This could be done on an exit ticket, in the notebook, or online.
  3. Students write a formal lab report. I think that effective communication of a scientific experiment is important. My failure this year was trying to do it simultaneously in the notebook. How to make a table and graph and put them into a report is an important skill. How to best represent the data is an important skill. How to make a scientific argument based on evidence is an important skill. But reading 50 lab reports about 6 times per quarter is awful. So I’m taking a cue from my freshman writing professor. He set up a rotating schedule in which just a few students submitted an essay each week, based upon one of the books we had read. I think doing it this way would mean fewer reports to look at each week, allowing me to give more effective feedback. Plus, I’d have fewer copied reports, since just one student from each group would turn in a report. So if there are 3 students in each lab group (A, B, and C), then all the As would turn in a report one week, all the Bs the following week, etc. Hopefully the schedule will allow for 2 write-ups per student each quarter in order to show growth.
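The rotating report schedule in item 3 amounts to a simple round-robin; here is one way it could be sketched (the student names and group structure are hypothetical):

```python
# Round-robin lab-report roster: in week w, member w % (group size) of each
# group submits the formal report, so each student writes every few weeks.
def report_roster(groups, week):
    """Return which member of each group submits the report in a given week."""
    return {name: members[week % len(members)] for name, members in groups.items()}

groups = {
    "Group 1": ["Ann", "Ben", "Cal"],
    "Group 2": ["Dee", "Eli", "Fay"],
}
print(report_roster(groups, 0))  # week 0: all the "A" students report
print(report_roster(groups, 1))  # week 1: all the "B" students report
```

With 3-person groups and weekly rotation, each student comes up roughly every third week, which fits the goal of 2 write-ups per student per quarter.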

What’s your system for lab work?

An “I-hit-publish-too-early” update: Of course, none of this directly addresses what I feel is the most important issue with lab work: how to assess the scientific inquiry process. I’m reminded of AAPT’s Goals of the Introductory Physics Laboratory and Eugenia Etkina’s Scientific Abilities.