Projectile Motion Assessment Task

You are a game designer for Rovio Entertainment, the company that makes Angry Birds.  The human resources department wants your input. They are hiring several programmers to build the physics engine for Rovio’s newest game. Here are the demo videos from the top four applicants. Which applicant(s) would you recommend for hire?

Applicant A

Applicant B

Applicant C

Applicant D

Download the original video files for analysis in Logger Pro or Tracker.
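As a reference point for the analysis, ideal projectile motion (no air resistance) means constant horizontal velocity and constant downward acceleration. Here's a minimal sketch in plain Python — the launch speed, angle, and time step are arbitrary example values, not anything from the videos:

```python
# Minimal projectile-motion update loop (ideal physics, no air resistance).
# Launch speed, angle, and time step are arbitrary example values.
import math

g = 9.8                              # m/s^2, gravitational acceleration
dt = 0.01                            # s, time step
v0, angle = 20.0, math.radians(45)   # example launch speed and angle

x, y = 0.0, 0.0
vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)

trajectory = [(x, y)]
while y >= 0.0:
    x += vx * dt       # horizontal velocity never changes
    vy -= g * dt       # vertical velocity changes uniformly
    y += vy * dt
    trajectory.append((x, y))

print(f"Range: {trajectory[-1][0]:.1f} m")
```

A physics engine whose trajectories deviate from this pattern (e.g., horizontal velocity that changes mid-flight) is the kind of flaw to look for in the applicants' demos.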

These videos were not created by me. I found them online several years ago, but I can’t remember where. If anyone knows, please tell me so I can give the creator proper credit. Thanks!

VPython Screencasts

This year I’ve decided to have my 15 AP Physics C students make screencasts explaining the workings of and reasoning behind their VPython programs. I got the idea from college physics professor Andy Rundquist, who has his students make similar screencasts for their Mathematica assignments. What I like about screencasting is that it gives added insight into which students understand the physics and the coding of their programs and which do not.

We’ll be using Screencast-o-matic because it’s easy to use and web-based (no software to download and install). Another reason is that Screencast-o-matic allows for “open submissions” — students can record and submit their screencasts directly to a designated channel without having to create an account or upload their video to YouTube. That’s great because all the screencasts will be in one place and I don’t have to worry about collecting and managing links from students.

To help students with screencasting, I’ve made a tutorial video, along with examples of good and bad screencasts.

Screencast-o-matic Tutorial

Low Quality Screencast

High Quality Screencast

Happy Screencasting!

Metacognition Curriculum (Lesson 1 of ?)

This year, I’m trying to formally introduce my students to various research relating to mindset, how people learn, and metacognition. Today’s lesson was the first. My goal for today was to introduce students to the scientific evidence that our brains can grow new neurons as adults, and that intellectually stimulating environments and exercise can grow our brains and make learning new things easier. I also worked in some of Dweck’s Mindset research, though in hindsight I think I should have made that a separate lesson. Here’s how today’s lesson unfolded…

Do Now: Complete this survey

(You can download a MS Word version here: MindsetSurvey2013. I stole this survey from this post by chemistry teacher Mr. Kilbane, which he stole from Bowman Dickson in this post. Thanks, guys!)

Lesson:

After completing the survey, we watched a short video segment called “Grow Your Brain” from the episode Changing Your Mind (jump to 13:20) in the Scientific American Frontiers series from PBS.

After the video, I asked groups to get a whiteboard and write down as a group:

  • One thing they learned
  • One thing they found surprising
  • One question they still had

Groups reported out and I collected responses on an overhead. Here are the results from one class:

Next, students received a packet which contained:

In a (sadly) mostly teacher-centered fashion, we read a few excerpts from the articles, pointed out the differences between the growth and fixed mindsets, and filled out the expert questionnaire.

As I said previously, I think next year I’ll cut out the Mindset research stuff (which is separate from brain research shown in the Scientific American video we watched), and turn it into a lesson of its own. Now I just need to find a short video about Dweck’s research that I can share with students for that separate lesson.

Possible Upcoming Metacognition Lessons…

Also, I need to give a shout out to John Burk, who inspired me when he started building a metacognition curriculum two years ago!

What principles/concepts/ideas/research would you include in a Metacognition Curriculum?

Keep It Simple Standards-Based Grading

Keep It Simple Standards-Based Grading (K.I.S.SBG.)

This post will probably raise the ire of SBG purists. If you are considering switching to SBG, I say go for it. Even if it means you keep it simple the first year, as you and your students figure it all out for the first time. Here’s my K.I.S.SBG. story…

Last spring, I taught a section of conceptual chemistry. Brand new subject for me. To make my life easier, I initially told the students that I would be using the same points-based grading system as their teacher from the fall semester.

And then I sat down to grade their first quiz.

How many points was each question worth? Should some questions be worth more than others? How many points in total? How should I give partial credit? And how is any of this providing helpful feedback to students?

All those questions made it clear: I couldn’t go back to a points-system. It just didn’t make sense to me anymore. So I decided to go SBG, but with a few caveats to keep everyone sane. This is how it ended up looking:

A set of ~5 standards per unit. WHY: This seems to be the right scope: not too granular, not too broad. Of course, some units had a few more standards, others a few fewer. Keep it simple.

Each standard was graded binary YES/NO. WHY: Prevents point-grubbing from students. No need to deal with questions like, “Why did she get a 3 on that standard while I only got 2?” Either the student met the standard or they didn’t. Keep it simple.

Standards that are YES cannot go back down. WHY: Prevents students from perceiving this new grading system as unfair. This can save you many headaches, frantic emails from students, and phone calls from parents. Keep it simple.

Term grade = 50 + 50*(#YES/#TOTAL). WHY: No need to worry about conjunctive grading systems, decaying averages, or tiered standards. Kids can quickly and easily calculate their grade. Keep it simple.
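The formula above is simple enough to sketch in a couple of lines of Python (the standard counts in the example are hypothetical):

```python
def term_grade(num_yes, num_total):
    """Term grade on a 0-100 scale: a 50-point baseline plus 50 points
    scaled by the fraction of standards met (marked YES)."""
    return 50 + 50 * (num_yes / num_total)

# Example: a hypothetical student who has met 12 of 15 standards
print(term_grade(12, 15))   # 90.0
```

Because the baseline is 50, a student who has met no standards yet still starts at a 50, and each standard met moves the grade the same amount. Kids can do this in their heads.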

No student-initiated reassessments. WHY: This actually wasn’t my rule, but I was lucky if these students showed up to class in the first place. No one came to extra help or during a free period to reassess. So I just put the most missed standards on subsequent quizzes. It worked out fine and I didn’t have kids hounding me for reassessments when the term ended. Keep it simple.

I didn’t write the standards on each quiz, but put them on a separate scoring sheet (see below). As I looked over the quiz, I marked “✔” or “X” for each standard.

When I finished marking all the quizzes, I used the score sheets to transfer the grades into ActiveGrade.

After all the scores were entered, I printed a current grade report for each student. I stapled together the quiz, the score sheet, and the grade report so each student would know where they stood when I returned the quizzes. That way, if the score sheet showed that student “went down” in a standard they previously had correct, they were reassured by the grade report that the YES grade from a previous quiz remained on record. No worrying about logging into ActiveGrade after school or during class. Keep it simple for the student.

At the end of each term was one final quiz for students to show understanding of any unattained standards.

One final bit of advice: If you still want to grade HW, binder organization, class participation, etc., go right ahead. The best part of SBG, in my opinion, is that it gives students multiple chances to be successful, gives better feedback about what students can and cannot do, and forces the teacher to spiral the curriculum to enable reassessment. I don’t want you to forgo all those SBG benefits because you still feel uneasy about giving up grading HW completion. Baby steps, baby steps.

Could my system have been better? Sure. But don’t let perfect be the enemy of good. You can tweak and modify next year. Keep it simple, and just do it.

Experienced Teaching Looks a Lot like Jazz

Recently, Michael Pershan (@mpershan) and I had a great conversation on Twitter about lessons and planning. I’ve copied it below. And if you aren’t already following Michael or reading his blog, you’re missing out.

FN: In the beginning, my planning took a content focus — WHAT do I want students to know. Now my planning is task focused — the HOW.

MP: Interesting. Do you rewrite everything each year? What’s your prep like?

FN: While lessons are similar each year, I don’t think I’ve done the exact same lesson twice. Probably not good for my sanity.

MP: But what keeps you from reusing old tasks? I mean, you do, so what’s the new planning? Selecting from set of tasks for kids?

FN: For example, take a Hooke’s Law task. Do we use scales, probes, or hanging weights? Do we use 1, 2, or 3 springs? Do we take multiple measurements per trial? Do I force kids to make stretch the independent variable, or do they choose? Do we graph by hand or on the computer? Do we use LoggerPro or Excel? Do we use ideal springs or ones that have a preload? Does each group use identical springs, or does each group get different ones? Students in groups of 2, 3, or 4? Do I give them a worksheet, or does it go in their lab notebooks? So many little decisions and permutations. All are important decisions that non-teachers don’t even realize we make.

MP: Are these decisions that, you feel, there’s an optimum solution to, or is it different with each batch of kids?

FN: Kids, time, what I want to emphasize, equipment … Lots of factors.

MP: I really appreciate everything you just tweeted. Thank you.

FN: You’re welcome. Not sure that was much help.

MP: Sorry to pry, just trying to get a look at what experienced teaching looks like.

FN: I’d say experienced teaching looks a lot like jazz.

My TEDxNYED Session: Learning Science by Doing Science

Many thanks to the TEDxNYED 2012 crew, especially True Life Media, Basil Kolani, Karen Blumberg, and Matthew Moran for an awesome event. Be sure to check out the rest of the TEDxNYED 2012 talks.

Learn more about Modeling Instruction in Science.

Great News: Tuition Scholarships for Modeling Instruction Workshops

On the heels of the TEDxNYED talk I gave yesterday about modeling instruction, I have some incredible news to share. Someone has taken note of my promotion of modeling, and as a result, is now offering 35 scholarships for physics teachers to attend! Here are the details:

NEW, AS OF APRIL 28: TUITION SCHOLARSHIPS for physics teachers nationwide who might not otherwise be able to attend a Modeling Workshop at Arizona State University.

Up to 30 (thirty) scholarships of $1,500 each to non-Arizona teachers, and up to 5 (five) scholarships of $3,000 each to non-Arizona teachers from Title 1 schools. These scholarships cannot be combined with other scholarships.

Applications must be submitted by May 11, 2012. Reply to jane.jackson@asu.edu for an application form. Feel free to call her if you have any questions: 480-314-1522

Scholarship recipients must:

  • be U.S. citizens
  • expect to be assigned at least one section of high school physics in the next school year
  • apply to ASU as a non-degree graduate student (May 12 deadline to avoid $50 late fee)
  • take the ASU Modeling Workshop in mechanics, physical science with math, or electricity & magnetism for credit.

All 3 workshop courses are offered June 11-29 at the ASU – Tempe campus. Low-cost housing can be arranged.

Details: http://modeling.asu.edu/MNS/MNS.html

Scholarships are provided by an individual who desires to expand Modeling Instruction in physics, which will increase the competitiveness of American workers in the long run.

It would be awesome if all 35 scholarships were awarded. Please help spread the word!

Who can you forward the announcement to, ASAP?

* Your colleagues at school?
* Your district science coordinator or staff developer? (to forward to physics teachers in your district)
* Your student teacher?
* Supportive faculty at the college where you graduated? (for student teachers)
* A physics teacher listserv? A chemistry teacher listserv?
* An officer of your AAPT section? (ask them to forward to members)
* An officer of your state science teachers association? (ask to forward to members)
* Your local physics alliance? or science alliance?

9th grade physics teachers are eligible, even if your school is just starting that course. They might like to take the physical science with math Modeling Workshop, which has 9 days on force and motion, and then 6 days on intro to chem.

Make a phone call to someone who might be interested; that is the most effective action.

You Are What You Assess


I found this near the copy machine yesterday while I was running off end-of-quarter reassessments. It was a wakeup call to the disconnect going on between what I value and what/how I assess in my own classroom.

Upcoming: TEDxNYED and EdCamp NYC


Just a quick note to let you all know I was invited to speak at this year’s TEDxNYED. The roster of speakers is impressive, and to say I’m nervous is an understatement. Most of what I’ll be talking about will likely be familiar to you all, but it’s a chance to spread these ideas to a larger audience. The event will be at the Museum of the Moving Image on Saturday, April 28, and applications to attend will be accepted until Friday, April 13, 2012.


Also I’ll be attending EdCamp NYC as part of the NYC Physics Teachers group. I’ll be leading a session on Standards-Based Grading along with fellow physics modelers and SBG-ers Seth Guiñals-Kupperman, Paul Bianchi, and Noam Pillischer. There will also be a “Model It! Science Inquiry Lesson Slam” session with prizes to be awarded. EdCamp NYC will be on Saturday May 5 at Francis Lewis High School in Queens. It is free to attend but advance registration is required.

Hope to see some of you there!

Disrupt This: My Challenge to Silicon Valley

Over the past few months, Audrey Watters, Dan Meyer, and Keith Devlin have been critical of Silicon Valley, edtech startups, and iPad textbooks that hope to “disrupt” education. In my opinion, the real stumbling block to meaningful change is students’ formal reasoning skills — analytical thinking that cannot be cultivated by pausing and rewinding video or playing Math Blaster.

Here are my 5 points:

  1. Many of our students are transitioning from concrete to formal reasoning.
  2. A significant barrier to learning for understanding is students’ own formal reasoning skills.
  3. Formal reasoning skills (and thus learning for understanding) can be developed when instruction is structured around the Learning Cycle.
  4. Silicon Valley and edtech startups have been focusing on (often inappropriately) just a small fraction of the learning cycle.
  5. My Challenge to Silicon Valley: Help students learn for understanding by innovating around the rest of the learning cycle.

1. Many of our students are transitioning from concrete to formal reasoning.

Below are 3 reasoning puzzles, each followed by a video of college students attempting to solve the puzzle while explaining and discussing their logic. It’s a highly illuminating look at students’ reasoning processes.

I. The Algae Puzzle (Combinatorial Reasoning)

II. The Frog Puzzle (Proportional Reasoning)

III. The Mealworm Puzzle (Scientific Reasoning)

2. A significant barrier to learning for understanding is students’ own formal reasoning skills.

You’re probably thinking, “So, what? Just because Johnny can’t figure out all the possible combinations of algae doesn’t mean he can’t learn physics.” But the research strongly suggests that it does, even in interactive engagement classes.

In a previous post, I presented this graph from Hake’s famous six thousand student study:

As you can see, interactive engagement (IE) courses outperformed traditional courses in learning gains as measured by the Force Concept Inventory (FCI). The FCI is the most widely used test of physics understanding. But why is there such a wide range of FCI gains among the IE courses and (not shown) among the individual students within a particular course? A study entitled “Why You Should Measure Your Students’ Reasoning Ability” (Coletta, Phillips, and Steiner) suggests reasoning ability is strongly correlated with physics success.

In the study, several different physics courses administered both the FCI (to measure physics gains) and the Lawson Test of Classroom Reasoning Skills (to measure formal reasoning ability). The Lawson test contains several items very similar to the three puzzles above. Here’s what they found:

The data were split into quartiles based on the Lawson scores. The light green bars represent the average Lawson test score for each quartile and the dark green bars represent the average FCI gain for each quartile. There is a clear correlation between reasoning ability and learning gains in physics. I’d wager this correlation extends to other subjects as well.
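The quartile split described above is easy to reproduce as a sketch with synthetic data. Everything below is invented for illustration — the scores and the rough linear relationship between reasoning and gains are stand-ins, not the study's actual data:

```python
# Sketch of the quartile analysis described above, using made-up data:
# each "student" pairs a Lawson score (reasoning) with an FCI gain (learning).
# The scores and the linear relationship are invented for illustration only.
import random

random.seed(0)
students = []
for _ in range(100):
    lawson = random.uniform(30, 100)                          # hypothetical Lawson score (%)
    gain = 0.2 + 0.006 * lawson + random.uniform(-0.1, 0.1)   # hypothetical FCI gain
    students.append((lawson, gain))

# Sort by reasoning score and split into four equal-sized quartiles
students.sort(key=lambda s: s[0])
size = len(students) // 4
quartile_means = []
for q in range(4):
    chunk = students[q * size:(q + 1) * size]
    mean_lawson = sum(s[0] for s in chunk) / size
    mean_gain = sum(s[1] for s in chunk) / size
    quartile_means.append((mean_lawson, mean_gain))
    print(f"Quartile {q + 1}: Lawson {mean_lawson:.1f}%, FCI gain {mean_gain:.2f}")
```

If reasoning ability and learning gains are correlated, the mean FCI gain climbs from the bottom quartile to the top — exactly the staircase pattern in the Coletta, Phillips, and Steiner figure.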

3. Formal reasoning skills (and thus learning for understanding) can be developed when instruction is structured around the Learning Cycle.

According to Piaget, intellectual growth happens through self-regulation — a process in which a person actively searches for relationships and patterns to resolve contradictions and to bring coherence to a new set of experiences.

In order to get students to experience self-regulation and further develop their reasoning skills, classroom experiences should be constructed around the Karplus learning cycle, which contains the stages of EXPLORATION, INVENTION, and APPLICATION. From Karplus’s workshop materials on the learning cycle:

EXPLORATION: The students learn through their own actions and reactions in a new situation. In this phase they explore new materials and new ideas with minimal guidance or expectation of specific accomplishments. The new experience should raise questions that they cannot answer with their accustomed patterns of reasoning. Having made an effort that was not completely successful, the students will be ready for self-regulation.

INVENTION: Starts with the invention of a new concept or principle that leads the students to apply new patterns of reasoning to their experiences. The concept can be invented in class discussion, based on the exploration activity and later re-emphasized by the teacher, the textbook, a film, or another medium. This step, which aids in self-regulation, should always follow EXPLORATION and relate to the EXPLORATION activities.  Students should be encouraged to develop as much of a new reasoning pattern as possible before it is explained to the class.

APPLICATION: The students apply the new concept and/or reasoning pattern to additional examples. The APPLICATION phase is necessary to extend the range of applicability of the new concept. APPLICATION provides additional time and experiences for self-regulation and stabilizing the new reasoning patterns. Without a number and variety of APPLICATIONs, the concept’s meaning will remain restricted to the examples used during its definition. Many students may fail to abstract it from its concrete examples or generalize it to other situations. In addition, APPLICATION activities aid students whose conceptual reorganization takes place more slowly than average, or who did not adequately relate the teacher’s original explanation to their experiences. Individual conferences with these students to help identify and resolve their difficulties are especially helpful.

4. Silicon Valley and edtech startups have been focusing on (often inappropriately) just a small fraction of the learning cycle.

Unfortunately, Silicon Valley has been dumping its disruptive dollars almost solely into the INVENTION phase and on the tail-end of the phase at that. It views education purely as a content consumption process and ignores the development of formal thinking and reasoning.

Remember, in the invention phase, “The concept can be invented in class discussion, based on the exploration activity and later re-emphasized by the teacher, the textbook, a film, or another medium.” That’s Khan Academy videos, flipclass videos, iBooks, and similar technologies designed to present content via direct instruction. However, “Students should be encouraged to develop as much of a new reasoning pattern as possible before it is explained to the class.” Which means that this type of direct instruction should be as minimal as possible, because it robs kids of the chance to reason and make meaning. In other words, Silicon Valley is putting its energy into the portion of the invention phase that should be as small as possible!

Now let’s look at the application phase. There has been some development here as well, most notably in apps and exercise software that seek to gamify the classroom. But the application phase isn’t about getting 10 right answers in a row or solving problems to shoot aliens. Remember, “Without a number and variety of APPLICATIONs, the concept’s meaning will remain restricted to the examples used during its definition.” Real learning with understanding means students can reason about the concepts well enough to use them in new and unique contexts (aka transfer). Applications should require students to examine their own thinking, make comparisons, and raise questions. Great application examples are open-ended problems, problems which present a paradox, and student reflection on both successful and unsuccessful problem-solving methods. Deep learning does not end when the APPLICATION phase begins.

5. My Challenge to Silicon Valley: Help students learn for understanding by innovating around the rest of the learning cycle.

Real disruption isn’t going to come from skill and drill apps, self-paced learning, badges, YouTube videos, socially-infused learning management systems, or electronic textbooks. Students must be continuously engaged in the learning cycle. We need to equip our students with the reasoning skills to learn how to learn anything. Focus on experiences in the exploration phase, meaningful sense making in the invention phase, and worthy problems in the application phase.

But, in reality, we only have ourselves to blame. It shouldn’t come as a surprise to us when students can’t think — the status-quo in education has been to spend most of our time on content delivery while robbing students of exploring and reasoning opportunities. And current edtech trends aren’t fixing this problem; rather, they are making it easier to make the problem worse.

To be fair, a few “good disruptions” have occurred in the other phases of the learning cycle. Motion detectors allow students to “walk a graph” so they can easily explore position-time and velocity-time graphs. GeoGebra allows students to explore and play with geometry and functions quickly and easily. PhET simulations allow students to conduct open-ended planetary orbit experiments that would be impossible in real life. And VPython programming gets students to apply what they learned to write their own simulations and visualizations.

So when presented with the next great edtech “disruption,” ask yourself: has this innovation actually changed how students think about math and science concepts? Or has it just allowed students to get a few more questions correct on the state exam?


For further reading:

The next two articles:

  • “Promoting Intellectual Development Through Science Teaching” (Renner and Lawson)
  • “Physics Problems and the Process of Self-Regulation” (Lawson and Wollman)

are found here: Module 11: Suggested Reading (Workshop Materials for Physics Teaching and the Development of Reasoning)