Must’ve had squares on the brain (thanks Dan and Anna!) because as soon as I sat down with the receipt I folded (and unfolded) it like this:
And then I asked Ainsley (4): “How many squares do you see?”
She counted the top row: “1… 2… 3… 4…” Then a long pause, followed by (pointing along the bottom row): “5… 6… 7… 8…”
We both stared at the receipt for a little while longer. Neither of us spoke for a bit. Then—for better or for worse—I broke the silence, tracing the perimeter of a larger square consisting of a 2-by-2 array of smaller squares. She helped me count this 9th square plus two more just like it, and we landed on a total of 11.
It’s hard to tell if those last three squares were lost on her, even after we traced them together. In all likelihood, they were. Anyway, that’s not the point of this post. (Nor is her original count of eight, two rows of four at a time, though I’m confident there’s more than enough material for a blog post in the “how do you see it?” conversation.)
Here’s where I’m headed with this.
I found myself recalling the classic problem “how many squares on a chessboard?” And then I wondered, “how many squares on an n-by-n board?” I’ve explored both of these questions before, and while I cannot recall the generalization off the top of my head, I’m confident I could find an explicit formula if I tried.
That sequence of thought led to a new question, or rather, a question that’s likely been posed many times by others but is brand new to me:
How many squares on an m-by-n board?
I’m sharing it here sans answer because I don’t yet have an answer (and it’d be no fun to spoil your fun in tracking one down). But I’m excited to start exploring. It may not be the sexiest or most challenging problem in the world, but it’s grabbed my attention nonetheless.
And if I’ve piqued your interest, I’d love it if you gave it a try as well.
In which case, drop a line in the comments describing your approach, your answer, and (bonus!) another question about rectangles/squares/boards that interests you.
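(If you’d rather not spoil your own search for a closed form, look away now. For everyone else: here’s a minimal brute-force sketch, in Python, for checking an eventual formula against. The function name `count_squares` is my own; it simply enumerates every placement of every possible square, with no closed form in sight.)

```python
def count_squares(m, n):
    """Count all axis-aligned squares on an m-by-n grid of unit cells."""
    count = 0
    # A square of side k fits only if k <= min(m, n).
    for k in range(1, min(m, n) + 1):
        # A k-by-k square can start at (m - k + 1) rows and (n - k + 1) columns.
        for row in range(m - k + 1):
            for col in range(n - k + 1):
                count += 1
    return count
```

Reassuringly, it agrees with the receipt: a 4-by-2 board gives 11 (eight unit squares plus three 2-by-2s), and an 8-by-8 chessboard gives the classic 204.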
It only took me a few years to realize that this philosophy of personal development was woefully incomplete. (Quick learner, right?!) So I turned my attention to a more noble pursuit: engaging my students.
In version 1 of this approach, I tried to find ways to get my students to pay more attention to my better-than-they-used-to-be explanations. The net result? Nothing much changed.
In version 2, I put my energy into engaging students not with explanations, but with mathematics. I tried—and still try—to create opportunities for students to engage productively with a problem or a concept. As with everything else in my life, it’s a work in progress. But I’ve seen some promising results.
Last week at NCTM, these percolating thoughts combined with several tweets, sessions, and conversations to produce this thought:
The best way to grow as a teacher is to develop my capacity to listen, to hear, to understand.
(Quick aside: I suppose I might replace the word teacher with husband, father, neighbor, colleague, or stranger, and the statement would still hold.)
This doesn’t mean that I’ll stop working on those other skills. But it does mean I have a new passion for learning about listening—really listening—to students and their thinking.
If you know of any books, articles, or blog posts that might help me along, please share them in the comments. Or maybe you disagree with my thoughts above as some combination of wrong or incomplete? I’d love to hear your pushback in the comments as well. Thanks in advance!
Several commenters reframed the question as pedagogy vs content knowledge. I found that shift helpful, so let’s run with that.
My original plan was to highlight a few comments here, and then offer a twist (or two) on the original question. But the post received way more than a few comments, and all of them are super thoughtful. Instead of including a few excerpts here, I’m going to push against conventional Internet wisdom and encourage you to go read all the comments. (Seriously. It’ll only take a few minutes, and it’ll be worth every second.)
Alright, welcome back!
This discussion has raised a host of new questions for me. I’ll rattle off three of them here (and possibly more in a future post). Feel free to chime in on one or all. Or just lurk. That’s cool too.
At the risk of damaging the discussion by asking too many questions at once, here’s one more:
At one point, David asked an intriguing question that sparked a lengthy discussion. I’ll share the question here in the selfish hope that I’ll be able to hear a few more opinions, and continue my own pondering (and learning) in the process.
Here’s the question:
Suppose a teacher gets to divide 100% between two categories: teaching ability and content knowledge. What’s the ideal breakdown?
Update: Several commenters reframed the question as pedagogy vs content knowledge. I find that shift in language helpful. If you prefer that formulation—or if you’d like to push back and offer your own related question—go for it.
Update: I want to learn more about this, shared by elsdunbar: “Deborah Ball describes a horizontal content knowledge as ‘[A]n awareness of how mathematical topics are related over the span of mathematics included in the curriculum.’”
Related post: These thought-provoking comments have sparked a new set of questions for me.
Update: Mark Chubb addresses this question—and a great deal more—in his recent post Professional Development: What Should It Look Like? It’s fantastic. Go check it out.
Describing the world with mathematics, in order to make reasoned predictions and decisions.
That I borrow heavily from Dan Meyer is readily apparent, especially when considering some of the activities I’ve created over the last couple of years (e.g., Charge!, LEGO Prices, Predicting Movie Ticket Prices, Mocha Modeling).
In each of those activities, students build a model in order to make a prediction, ideally a more precise one than the wild estimate I typically call for at the beginning.
A couple weeks ago, a colleague of mine (Jason Merrill) invited me to expand my definition a bit by considering how modeling often plays out in physics. Rather than a method for making precise predictions, modeling in the physics classroom (or laboratory) may sometimes offer a process for inferring material properties and physical constants.
Do Jason’s comments resonate with your experience? If so, can you share any exemplar activities in that inferring-properties-and-constants vein?
How often does your modeling work serve as a means for making predictions? How often does it serve as a means for something else? How would you expand (or revise) the definition of modeling I’ve offered above?
Several Mondays ago, I shared the Desmos Activity Builder version of Charge!, a linear modeling task where students predict how long it will take for a phone to become fully charged. Here’s the diagnosis.
In that post I suggested that the Activity Builder version of this activity was inferior to its original slide-deck-driven version, in part because it struggles with these principles from the Desmos activity building code:
Dave Johnston: “Can we give students more chunks of the data and give them an opportunity to revise their model? What if different groups of students had different data points along the way & they discussed the models they came up with?”
Nathan Kraft: “I’m trying to figure out what Desmos adds to the activity… This activity pigeonholes students into one way of doing it, and even for students in higher grades, I’d like to give that option to explore it differently… In the end, maybe this doesn’t work in activity builder.”
Elizabeth Raskin: “One of the beautiful things about 3 acts is allowing students to determine what information is important (to an extent) and what to do with it.”
Mark Kreie: “Utilizing the Classroom Conversation Toolkit w/ teacher pacing and pausing might be a benefit to this activity.”
Several commenters suggested that maybe the best version of this task doesn’t live in Activity Builder, and I find myself wondering the same. I’m definitely open to that possibility.
That being said, I still wanted to see how far forward I could push the AB version.
I recommend opening up Charge! v2 while you read through the rest of the post. Here is what’s new:
(Pssst. Did you see that video reveal?)
Is v2 better than v1? In my opinion, absolutely. Is v2 perfect? Not even close.
I think this new version better addresses Principle #8 (“Create objects that promote conversations…”). Fewer screens. More room for conversation.
But I don’t think I’ve addressed the more difficult Principle #5: “Give students opportunities to be right and wrong in different, interesting ways.”
There’s still just one way to move through this activity. It still feels too scripted. Maybe it does live best outside of Activity Builder. Or maybe not. I’ll continue thinking and tinkering. Hopefully most of my steps are forward.
What do you think of the upgrade? Which of the featured comments resonated with you? What could be done to better address Principle #5?
Let me know in the comments, or drop me a line on Twitter.
Cheers!
These two developments play off one another, and we find ourselves with more and more opportunities to build increasingly interesting things.
One way we push our pedagogy forward is by holding up two activities and asking questions like:
Earlier this week, we took a close look at two linear modeling activities: LEGO Prices and Are People Waiting to Get Married?
If you’re up for it, I’d love to hear your analysis (similarities/differences, strengths/weakness, relationship to the building code) in the comments.
Each activity has its own strengths, and each has the potential to generate interesting class discussion. However, I think LEGO Prices does a much better job of creating problematic activities (Principle #4, if you’re counting).
Are People Waiting to Get Married? nibbles at the edges of a context in a largely disconnected way.
LEGO Prices hits you right out of the gate with a single, overarching question that will drive the rest of the activity: How much did that LEGO set cost?!
The key phrase for me here is in order to. That’s the difference. As much as Are People Waiting… has going for it (and, by the way, I rather love Screen 8), there’s no in order to attached to the tasks we’ve given students. Instead, it’s piecemeal. Screen-by-screen. Asking because we can, not because we must. Questions that serve themselves, rather than a single, coherent pursuit.
LEGO Prices suffers in other ways. For example, I think it does a fairly awful job with Principle #5: “Give students opportunities to be right and wrong in different, interesting ways.” It’s narrow-minded in that sense. There’s really just one way to move through the activity. I’m discouraged by that, and hope to discover creative solutions around that weakness in future activities.
And yet while it struggles with that, it doesn’t struggle with this: clarity of purpose. With the exception of Screen 5 (interpreting parameters), everything students do is done in the service of making the most insightful and accurate price prediction they can.
And that’s something I’ll be trying to infuse in similar tasks in the future.
What did you see in the activities? What did you think of my analysis? Where did it resonate? Where do you think it’s off base? I’d love to hear your thoughts in the comments.
Cheers!
Exhibit A: Last weekend I spent an hour with a group of math teachers who woke up early on a Saturday morning to hone their craft. We were just steps away from the beach in Monterey, and yet they were willing—hungry, even—to gather in a room to reflect on their own learning and teaching, even at 8 am. That’s just awesome.
If you weren’t able to join us, here’s a quick snapshot of the session:
Here’s a list of the activities we looked at during the session, along with the principles I had in mind for each one:
My hope is that these activities serve as exemplars for the principles, and that folks in the session walked away with a sense of how the principles might apply in other situations (whether Desmos or non-Desmos, digital or non-digital). We ended with a brief quiz to encourage folks to wrestle with these principles in new contexts.
Here’s the quiz. Take a look at each task, and let me know in the comments which principle(s) you think each activity exemplifies.
(To check your answers, turn the cereal box upside down. Wait, no. That’s something else. Never mind.)
The original presentation included quite a lot of video (to quickly show off the activities linked above), as well as a presenter (hi there!). So this version of the slide deck (which includes neither) may not be very helpful. But, in case it is, you can download it here.
Some months back, I wrote an activity called Parallel Lines. Here’s the description:
“In this activity, students explore connections between the graphs and equations of parallel lines.”
It’s not awful. But it’s far from great. It really struggles with two principles of our building code in particular:
#4 – Create problematic activities. It’s not clear to students what they’re doing—or why they’re doing it—until the end of the activity (if at all).
#5 – Give students opportunities to be right and wrong in different, interesting ways. There’s really just one correct path through this thing. And I don’t believe there are interesting ways to be wrong here, either. Bottom line: expect a lot of similar, uninteresting student responses. I’m not sure that’s the best fodder for rich classroom discussion.
Drop a line (or two) in the comments, or let me know what you think on Twitter (@mjfenton).
I’ll be back Friday with a new treatment of my own.
Cheers!
For the “building” piece, I shared five principles from the Desmos Activity Building Code. For each principle, I offered an exemplar activity.
In one case, I even offered an anti-exemplar. I find these contrasts between what works and what doesn’t particularly helpful for developing my own capacity to build more of what works.
With that in mind, I’m kicking off a new series here.
Every Monday, I’ll post an activity that fails to live up to the building code in some way. I’ll share one or two shortcomings that I see. In short, I’ll offer my diagnosis.
Every Friday, I’ll post an upgraded version of the activity, with some commentary on the changes I made. In short, I’ll share my treatment.
And that cycle will be helpful, at least for me. The process of select/reflect/revise/explain will gradually boost my activity building skills.
But what happens between Monday and Friday? That’s where this could get really interesting.
Whatever level of participation you choose, I do hope you’ll join in the fun.
Keep your eyes on this space. The first activity + diagnosis is coming soon.
Cheers!