Category Archives: Uncategorized

Day 13: A New Approach to Colliding Buggies

Frank Noschese:

Today’s 180 blog post that Action-Reaction readers might be interested in….

Originally posted on Noschese 180:

College-Prep Physics: Modeling Instruction’s standard lab practicum for the constant velocity unit is colliding buggies. Lab groups take data to determine the speed of their buggy, then the buggies are quarantined and groups are paired up. Each group pair is then given an initial separation distance for their buggies and asked to predict the point where the buggies will collide. Once they calculate the answer, they are given their buggies back to test their prediction.

It’s fun, but there are some frustrations. Groups that have poor experimental design or data collection techniques won’t calculate the correct buggy speed, which means they won’t accurately predict the collision point. Also, since only the separation distance is given, there isn’t much focus on the position of the buggy and students are less likely to use a graphical method to find the collision point. They try all sorts of equations instead. In the end, one person in…


Flappy Bird Physics Is Real Life?

If you don’t already know, Flappy Bird is the hot new mobile game right now. The premise is simple: navigate the bird through the gaps between the green pipes. Tapping the screen gives a slight upward impulse to the bird. Stop tapping and the bird plummets to the ground. Timing and reflexes are the key to Flappy Bird success.

This game is HARD. It took me at least 10 minutes before I even made it past the first pair of pipes. And it’s not just me who finds the game difficult. Other folks have taken to Twitter to complain about Flappy Bird. They say the game is so difficult that the physics must be WRONG.

https://twitter.com/ThatPuckBeaut/status/428781313433149440

https://twitter.com/maaddawwg/status/427833802140815361

 

So, is the physics unrealistic in Flappy Bird?

Sounds like a job for Logger Pro video analysis! I used my phone to take a video of Flappy Bird on my iPad. To keep the phone steady, I placed it on top of a ring stand with the iPad underneath.

[Photo: the phone on top of a ring stand, with the iPad underneath]

(I’ve uploaded several of the videos here if you’d like to use them yourself or with students: Flappy Bird Videos.)

Then I imported the videos into Logger Pro and did a typical video analysis by tracking Flappy’s vertical position in the video. Sure enough, the upside-down parabolic curves indicate Flappy is undergoing downward acceleration.

[Graph: Flappy’s vertical position vs. time, tracing upside-down parabolas]

But do the numerical values represent normal Earth-like gravity or insanely hard Jupiter gravity? To find out, we need to (1) set a scale in the video so that Logger Pro knows how big each pixel is in real life, and (2) determine the slope of Flappy’s velocity-time graph while in free fall, which equals the gravitational acceleration.

The only thing we could realistically assume is the size of Flappy Bird. If we assume he’s as long as a robin (24 cm), then the slope of the velocity-time graph is 9.75 m/s/s, which is really close to Earth’s gravitational acceleration of 9.8 m/s/s. Flappy Bird is REAL LIFE.

[Graph: Flappy’s velocity vs. time during free fall, with a slope of 9.75 m/s/s]
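If you’d like to reproduce the fit without Logger Pro, here’s a minimal Python sketch of the same two steps. The tracked points are simulated stand-ins (real data would come from your own video export), and the 85-pixel bird length is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a Logger Pro export: simulate one free-fall arc in pixel
# coordinates (screen y grows downward), with a little click-placement noise.
g_true = 9.75                     # m/s², the slope reported above
px_per_meter = 85 / 0.24          # assumption: a 24 cm bird spanning 85 px
t = np.linspace(0.0, 0.35, 12)    # frame timestamps, in seconds
y_px = 300 + 0.5 * g_true * px_per_meter * t**2 + rng.normal(0, 1.0, t.size)

# Step 1: set the scale so we know how big each pixel is in real life.
y_m = y_px / px_per_meter

# Step 2: fit y(t) = y0 + v0*t + (a/2)*t^2; the leading coefficient is a/2.
a_over_2, v0, y0 = np.polyfit(t, y_m, 2)
print(f"g ≈ {2 * a_over_2:.2f} m/s²")  # recovers roughly 9.75 with this seed
```

Fitting the position parabola directly is equivalent to taking the slope of the velocity-time graph, just with less noise amplification from differentiating.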

So why is everyone complaining that the game is unrealistic when, in fact, it is very realistic? I blame Angry Birds and lots of other video games. Repeating the same video analysis on Angry Birds and assuming the red bird is the size of a robin (24 cm), we get a gravitational acceleration of 2.5 m/s/s, which is only about 25% of Earth’s gravitational pull.

[Graph: Angry Birds video analysis, scaled so the red bird is 24 cm]
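One caveat worth making explicit: both results hinge on the assumed bird size, since the inferred acceleration scales linearly with the length used to set the scale:

```latex
a_{\text{real}} = a_{\text{pixels}} \times \frac{L_{\text{real}}}{L_{\text{pixels}}}
```

So if Flappy were actually 12 cm long instead of robin-sized, the inferred acceleration would drop by half, to about 4.9 m/s/s.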

In order to make Angry Birds more fun to play, the programmers had to make the physics less realistic. People have gotten used to it, and when a game like Flappy Bird comes along with realistic physics, people exclaim that it must be wrong. As one of my students notes:

[Embedded tweet]

UPDATE 31 Jan 2014:
Inspired by a tweet from John Burk, we made a video showing Flappy Bird falling at the same rate as a basketball:

[Embedded video]

Here’s what I did: we determined from the analysis above that Flappy Bird is about 24 cm across. Conveniently, basketballs are also about 24 cm across. So I had my physics teacher colleague Dan Longhurst drop a basketball while I videoed it with my iPad. Dan just needed to stand the right distance from the camera so that the basketball appeared the same size on the iPad screen as Flappy Bird does (1.5 cm). Next, I played the basketball video and Flappy Bird on side-by-side iPads and recorded both with my phone’s camera. Once I got the timing right, I uploaded the video to YouTube, trimmed it, made a slow-motion version in the YouTube editor, then stitched the real-time and slow-motion videos together to create the final video you see above.

UPDATE 1 Feb 2014: While the gravitational acceleration in Flappy Bird is realistic, the impulses provided by the taps are NOT. Here’s a velocity-time graph spanning many taps. When a tap happens, the velocity graph jumps upward:

[Graph: velocity vs. time across many taps, showing the same post-tap velocity each time]

As you can see, no matter what the pre-tap velocity is (the velocity right before the graph jumps up), the post-tap velocity is always the same (a bit more than 2 m/s on this scale). This means the impulses are not constant. In real life, identical taps should deliver identical impulses, so the difference between pre- and post-tap velocities would be constant instead.
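To make the distinction concrete, here’s a toy sketch of the two tap rules (the 2.1 m/s values are illustrative, read off the graph above):

```python
# Two models of what a tap does to the bird's vertical velocity.
POST_TAP_V = 2.1   # m/s: the roughly constant post-tap velocity on the graph
DELTA_V = 2.1      # m/s: the fixed change a constant impulse would produce

def flappy_tap(v_pre):
    """Flappy Bird's rule: every tap resets velocity to the same value."""
    return POST_TAP_V

def realistic_tap(v_pre):
    """Real physics: a constant impulse J = m*Δv adds the same Δv every time."""
    return v_pre + DELTA_V

for v_pre in (-3.0, -1.0, 0.5):  # different pre-tap velocities
    print(f"pre-tap {v_pre:+.1f} m/s -> game: {flappy_tap(v_pre):+.1f} m/s, "
          f"real: {realistic_tap(v_pre):+.1f} m/s")
```

In the game column every post-tap velocity is identical; in the realistic column the change in velocity is identical instead.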

TL;DR: Is the physics in Flappy Bird realistic? Yes AND no.
YES: The gravitational pull is constant, producing a constant downward acceleration of 9.8 m/s/s (if we scale the bird to the size of a robin).
NO: The impulse provided by each tap is variable in order to produce the same post-tap velocity. In real life, the impulse from each tap would be constant and produce the same change in velocity.

UPDATE 1 Feb 2014 (2): Fellow physics teacher Jared Keester did his own independent analysis and shares his findings in this video:

[Embedded video]

What Happened When I Gave Them the Answers

Frank Noschese:

Reblogging today’s 180 blog post to Action-Reaction in order to try to get more feedback from folks. Click through to read more and leave comments over there. Thanks!

Originally posted on Noschese 180:

[TL;DR - Not as much as I had hoped.]

College-Prep Physics: Students came to class with the following question completed for homework:

You are on a sleigh ride in Central Park one brisk winter evening. The mass of the sleigh with everyone in it is 250 kg, and the horses are pulling the sled with a combined horizontal force of 500 N. The sled moves at a constant speed of 3.33 m/s.
(a) What is the force of kinetic friction on the sleigh?
(b) What is the coefficient of kinetic friction between the sleigh and the ground?

I asked everyone to whiteboard their answers. I heard some students say they didn’t get it. Several other students came up to me — worksheet in hand — to ask if their answer was right.

“I’m going to give you the answers,” I said. “Here they are.”

[Photo: the answers, written out]
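For reference, those answers can be reconstructed (assuming g = 9.8 m/s²): at constant speed the net force is zero, so friction exactly balances the horses’ pull.

```latex
f_k = F_{\text{pull}} = 500\ \text{N}
\qquad
\mu_k = \frac{f_k}{N} = \frac{f_k}{mg}
      = \frac{500\ \text{N}}{(250\ \text{kg})(9.8\ \text{m/s}^2)} \approx 0.20
```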

“Now on your whiteboards, I want…


Quizzes vs. Projects (Mass & Weight Edition)

“Tests are evil; let them do projects.”

That type of rhetoric frequently appears in my Twitter stream. My gut reaction is hell yeah. But some recent quiz results have gotten me thinking…

Take, for example, this learning objective:
The student understands the difference between mass and weight.

Here’s a student project (not mine) which clearly addresses the objective.

Here’s another project (also not mine). This one is very creative and totally adorable.

But those two projects are really just rehashes of the traditional explanation of the difference between mass and weight: “mass is the amount of stuff an object has and doesn’t change, while weight is the gravitational pull on an object and can change depending on location.” I wonder what would happen if those two students encountered quiz questions like the ones below. Would they make the same mistakes as several of my students did? I feel that even though my students can parrot back the difference between mass and weight (like in the videos above), they don’t really understand that difference if they miss these types of quiz questions:

[Scanned quiz questions on mass and weight]

I did find one project where a student (again, not mine) gives a thorough explanation and uses several examples. I predict that this student should be able to answer those quiz questions.

What I’m trying to say is that I feel that teacher-generated questions and experiences (quizzes, labs, whiteboard problems, etc.) are important because they challenge students to think and apply in ways they probably wouldn’t if we just left them to their own devices.

But I also get that projects let students be creative and allow them to demonstrate their understanding in ways that quizzes simply can’t.

Perhaps the answer is just “all things in moderation.” Or perhaps the project parameters need improvement, so students aren’t simply reciting Wikipedia definitions from a PowerPoint? Or something else?

What are your thoughts?

Edtech PR Tips

I’m not a PR guy. I’m just a teacher. But they say that if you want to be a disruptor, the best experience is no experience. So here goes…

1. It’s not about the technology. It’s about what students are empowered to do because of your technology. Show us how you take students beyond what they could do previously. Show student work (“Hey, look what this kid can do!”). Stop focusing on checkmarks, badges, data, dashboards, and slick UI.


2. Learning is social. Show students interacting with each other, questioning, helping, constructing — all as a result of using your technology. Don’t show kids glued to screens, headphones on, working en masse and in isolation. It’s creepy.

[Photo: The Learning Lab at a Rocketship school, where students spend 2 hours each day]

3. Don’t use phrases that signal you have simplistic views about teaching and learning. In particular: “learning styles,” “digital natives,” “individualized instruction,” and “content delivery.”


4. Practices are just as important as content. Show how you enable students to engage and grow in the core practices in math, science, and ELA.

[Venn diagram: overlapping practices in math, science, and ELA]

Credit: Tina Cheuk, tcheuk@stanford.edu [PDF (scroll to bottom)]

5. Show how you implement/complement research-based practices about how students learn. Study up on these characteristics of effective teaching methods. Otherwise…


6. Run controlled, peer-reviewed experiments that use conceptual diagnostic tests to measure growth. We know that almost anything works better than (or at least as well as) passive lecture instruction. But how does your technology stack up against other evidence-based teaching methods? And be sure to use conceptual diagnostic tests, not final exams, standardized tests, or failure rates. CDTs have been painstakingly researched and designed to measure true conceptual understanding rather than algorithm memorization. Without strong evidence, we’re left skeptical of your claims.

[Chart: Hake’s analysis of 62 different physics courses, as measured by gain on a physics conceptual diagnostic test]
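For context, the “gain” in Hake-style comparisons is the average normalized gain, the fraction of the possible improvement a class actually achieves between pre- and post-test:

```latex
\langle g \rangle = \frac{\langle \text{post} \rangle - \langle \text{pre} \rangle}{100\% - \langle \text{pre} \rangle}
```

Because it is normalized by the room left to improve, it lets courses with very different pretest scores be compared on one chart.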

7. Don’t contradict yourself. Your words should match your actions.


8. Show feedback and testimonials from students. In particular, have students demonstrate their deeper understanding and expert thinking that has been enhanced by using your product. Or perhaps your technology has decreased student anxiety and contributed to a positive classroom climate. However, don’t have students talk about shallow things such as raising grades and doing well on tests.

[Screenshot: student testimonials from Pearson/Knewton’s MyEconLab]

9. There’s nothing revolutionary about old wine in new bottles. A digital textbook is still a textbook. A video lecture is still a lecture.


10. Read everything Audrey Watters writes. Everything.

Do you have any more edtech PR tips to share? Any more examples of bad PR? Any good examples? Thanks!

Convincing Reluctant Teachers

This question was posted to Twitter today:

Question: how do you convince teachers who are ADAMANT that they teach to the rigor required by CCSS that they really don’t?

(CCSS means Common Core State Standards)

This is a great question. I think it applies to a wide range of situations. You can replace “CCSS” with the Next Generation Science Standards, the new AP Physics 1 and 2 courses, or any curriculum du jour. It all boils down to showing these teachers that traditional teaching methods do not lead students to a deep understanding of the concepts.

Some folks may suggest showing the reluctant teachers sample test questions from the new assessments. I say stay far away from that. These teachers will likely look for tricks to game the assessments so students can succeed without the in-depth understanding these teachers think they are teaching.

My suggestion is to have the reluctant teachers administer a basic conceptual diagnostic test to their students. The questions are so basic, so easy, the teachers will say “Of course my students can ace this!”

And then wait for the results to come in.

In all likelihood, the students (on average) will do poorly. Amazingly poorly. Often even worse than if they had simply guessed randomly, because the wrong answers on these tests are built around common misconceptions.

To which the reluctant teacher responds, “What happened? They should have known all this!”

Now’s your chance: they’ll be more receptive to what you have to say about how students learn math and science and why interactive engagement techniques work.

***

Here’s Eric Mazur (Harvard physics professor) explaining what happened when he gave his students a conceptual diagnostic test:

(The video is an excerpt from Mazur’s longer “Confessions of a Converted Lecturer” talk.)

***

Extensive lists of concept inventories can be found at FLAG and NC State. Remember, many of these tests have been painstakingly developed and refined by researchers. Be sure to abide by the developers’ rules when administering the tests to students. You should not post them to the internet or discuss the answers with students.

My Google Reader Alternatives

Google Reader will be ending on July 1st. After searching through several apps and services, I’ve finally settled on a few alternatives I like.

First, I imported my Google Reader feeds into Feedly. This can be done in just one click if you allow Feedly access to your Google Reader account.

However, I hardly ever read posts in Feedly because:

    • I can’t star/favorite posts. (However, I can “bookmark for later.”)
    • I was having issues with the Feedly Android and iOS apps: my feeds wouldn’t sync, and posts I had read would reappear as unread. The apps also lack the option to star/favorite posts.
    • The Feedly Android and iOS apps do not allow for offline reading.
    • Feedly apps didn’t show videos (not even thumbnails) embedded in posts. (Update: Feedly has since corrected this problem. Thanks, @Alby!)

Thankfully, there are iOS and Android apps that sync with Feedly and solve all the problems above. Here are my two favorites:

iOS: Newsify (free, no ads)

[Screenshot: Newsify on iOS]

Newsify also shows embedded YouTube videos:

[Screenshot: an embedded YouTube video in Newsify]

Android: gReader (free with ads, $4.99 no ads)

[Screenshots: gReader on Android]

gReader also shows embedded YouTube videos:

[Screenshot: an embedded YouTube video in gReader]

Plus, gReader lets you easily subscribe to new blogs via the share option in the browser:

[Screenshot: gReader’s subscribe option in the browser share menu]

Most importantly, Newsify and gReader play well with each other and stay synced so I can easily read, star, and share posts from both my Android phone and iPad.

Be sure to move your feeds to Feedly before July 1st! Good luck!