At any rate, to get that bad taste out of my mouth and set the stage for greater success on the next Des-man go around, I created the Dot Capture Game. Here’s what you need:
And of course, the handout:
Give a brief intro—or none at all—and turn ’em loose. If your experience is anything like mine, you’ll find yourself weaving in and out of some great (albeit trivially inspired) conversations about slope, intercepts, point-slope form, domain, range, inequalities and shading, vertices, direction of opening, etc.
This is definitely not high-quality modeling stuff (it’s not even low-quality modeling stuff), but it proved a great way to engage students with meaningful (read: productive) practice on a variety of topics related to graphing.
Oh, and the winner in my class? Here you go:
After trying this out in Algebra 1, I thought I’d throw it at my Algebra 2 and Precalculus students to see what they would do with it. It turned out to be good practice in those settings as well. Before sharing with these follow-up classes, a quick tweak to the handout was in order. In my first class, several students lost their graphs and expressions after hitting a deadly combination of keys on their device, and only one or two had been keeping a shiny written record. So to protect against future heartache, I added a second page to the handout. Here’s what one of them looked like at the end of class:
Here’s a sweet suggestion from Desmos:
@mjfenton What if Ss rolled two dice to determine which curves they had to use and the numbers also represented the coordinate to capture?
— Desmos.com (@Desmos) May 22, 2014
…and then launched into an algebraic confirmation of that solution.
Now on the one hand, throwing a Desmos-generated graph into a “detailed solutions” handout is a great move because, well, just look at it. It’s beautiful. And hey! Multiple representations! Plus it took about 30 seconds from start to finish. No brainer, right?
Well, on the other hand, including something like that is dangerous, because when you find yourself writing the solutions to questions 6 and 7 (as I did just a few moments later), and these questions ask for a graphical display of the solution to a one-variable linear inequality… Well now you’ve tasted greatness, and you won’t settle for anything else.
There’s just one problem: Desmos doesn’t do linear inequalities in one variable.
Okay, that last sentence is actually not true. Desmos will graph linear inequalities in one variable. You just have to ask nicely. Check it out:
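The gist, in case the images don’t tell the whole story (the input below is my own illustrative version, not necessarily the exact one pictured): restrict a graph along the x-axis using Desmos’s curly-brace domain restrictions.

```
y = 0 {x > 2}
```

That draws the ray along the x-axis representing x > 2. Adding the point (2, 0) and tweaking its style gives you an endpoint marker, and the same restriction trick handles compound inequalities, e.g. y = 0 {-1 <= x <= 3} for a segment.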
I imagine I’m not the only one to do this (and it would still be pretty cool if Desmos would add one-variable number line graphing functionality… Pretty please?), but I thought I’d share how to do it anyway, just in case anyone is curious (and wants to give one-variable graphing a little Desmos-love).
The best way to explain is to throw a few images in here and let them do the talking. Drop me a line on Twitter (@mjfenton) or in the comments if you have any questions (or tips for how to make this even easier or more awesome). Or if your name is Eli and you have a new feature to announce.
I’ve been meaning to blog about the “assessment workflow” in my classroom, but I’ve been putting it off because (a) time is limited, especially at the end of the school year, and (b) I wanted to be mostly satisfied with my workflow before I shared anything (and I’m not there yet).
I’ll write up the full details of how assessment happens in my classroom (it’s been a major work-in-progress this year), but for now I want to share a tiny bit of background and then cut to today’s breakthrough.
Last Sunday I aired some of my thoughts and questions on this topic to @Mythagon. A few other thoughtful folks dropped by to share their own ideas and pose a few new questions for me to chew on. It left me with a clear sense (as have other conversations) that my assessment routine fails students in the category of self-feedback. I’ve been trying to foster more (and better) student reflection in our assessment routine for several months now, and those efforts are the reason I’ve pasted this quick reflection form…
…at the bottom of every new assessment I write. However, I was looking for a way to incorporate something that would require students to be more thoughtful (just shading in a couple of boxes doesn’t necessarily demand any careful consideration) and at the same time foster a growth mindset among my students.
At the end of today’s assessment (after grading them; more on that in the next post), just before collecting everything, I gave students the following directions:
Two minutes later, I collected the papers and we moved on to something else. Later in the day I went through the papers to confirm the results and scores, to get a sense of common mistakes (again, more on this workflow later), and (this part was new today!) to read the SP and STI comments.
It’s early, but I’m sensing that this could be one of the most important features of my classroom in terms of developing a growth mindset among my students. I love the blend of looking back to celebrate something and looking forward to something to improve (and how to improve it).
I’m wondering now about the best way to incorporate this SP/STI reflection into the “aftermath” of all my assessments. The comments (see below for some samples) were physically all over the place, with some easier to read than others. It might be worth the time (and “lost” space on the page) to add a little box near the top of the assessment with room carved out for the SP and STI comments. I’ll tinker with the layout and post an update if I come up with anything promising.
Here are the SP/STI reflections from the first eight papers in the stack today. Some comments are decidedly un-profound, but others are exactly what I was hoping for right out of the gate. I’m hopeful that my classroom will become a more thoughtful and reflective place through this routine. We’ll see how it goes next time.
@mjfenton Trying the workflow today. Will be grading today. Instead of STI and SP, I used "Praise" & "Polish" (something we used earlier)
— Jedidiah Butler (@MathButler) May 9, 2014
The basic flow for each scenario:
One student was struggling with the two-solution result to bullet #3 above. “How could there be two answers?”
I’ve been asked this question before, and feel like I’ve been able to help students reasonably well with a combination of questioning, hand-waving, sketching, etc.
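For the sake of illustration (a stand-in equation, not the actual one from the lesson), the algebraic side of that conversation runs something like this:

```
|x - 3| = 5        "the distance between x and 3 is 5"

x - 3 = 5    or    x - 3 = -5
x = 8        or    x = -2

Check:  |8 - 3| = |5| = 5  ✓        |-2 - 3| = |-5| = 5  ✓
```

Two numbers sit at distance 5 from 3 on the number line, one on each side—hence two answers.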
But today? I reached into my pocket and added one more element to the conversation:
Now, I love using Desmos on a laptop. Nothing beats that graphing experience in my mind. But to have a functional version of Desmos sitting in my pocket, ready to bring into a conversation at a moment’s notice… That’s cool.
If you’re just tuning in, check out the first post in the series. Or Topic 5, if you fancy. (Topic 6 doesn’t exist. It’s a long story.)
There’s a point in the school year—toward the end of the first semester, usually—when my brain and my body get together to discuss whether there’s enough left in the tank to write something decent (be it a lesson, assignment, assessment, or whatever). Apparently, late in November 2011, the exchange must have been something like this:
Brain: “Okay, body, whaddya say? Let’s write a quality assessment for Topic 7, shall we?”
Body: “Must… sleep… so… tired…”
Brain: “What’s that? You think all we can manage is a pile of whatsit?”
Body: “Erghhhh… Where are we?”
Brain: “Okay, then, that’s the plan! Mediocrity, coming right up!”
At any rate, the result of my/our/their efforts was nothing to write home about (except maybe to lodge a complaint). I hereby present to you two questions worth their weight in zero g:
The worst part of it? I based these questions off two I found on the CST, a multiple-choice test the quality and usefulness of which I regularly sneer at. A strange thing it is to despise one’s assessment muse.
Now then… On to happier times!
The first thing that had to die: The multiple-choice-ness of the problems. I’m not opposed to all multiple-choice problems in the world, just most. There are some decent questions here and there. In fact, quite a number of the ones I see in preparing students for the AP Calculus exam strike me as worth their weight in… I don’t know, maybe salt. (Modern day market value, of course.)
But multiple-choice on a graphing linear equations assessment? Not a good fit, in my estimation. For starters, students with no idea of what they’re doing could luck their way into a perfect assessment score, especially when there are only two questions. Next, the format invites students to select an answer without showing much of their thinking. And beyond that, I left myself no room for questions that demand any measure of critical thought. (More on that in a moment.)
With those concerns at least partially in mind, I wrote a new assessment with four questions:
Here’s what I like: Goodbye multiple-choice format, hello (potential for) students showing a record of their thinking.
The questions aren’t amazing, and they’re quite limited in scope as they’re all really just begging for an equation in slope-intercept form (my beloved point-slope form comes up in Topic 8). However, I think they’re a dramatic improvement over the original.
Here’s what I still can’t stand: The assessment is still overwhelmingly focused on procedural understanding.
I suppose this might not change until I revisit/rewrite my skills and concepts list for Algebra 1 to include a richer approach to graphing lines, but it’s still disappointing to look at a mid-November assessment and see a total lack of “explain-your-reasoning-this” or “explain-your-reasoning-that.”
A quick word about #3 before moving on… I chose to display the ordered pairs so that students wouldn’t struggle with miscounting too-little-toner tick marks. When Desmos adds a “grid density” feature for improved (read: bolder) printing, I’ll consider removing those labels. In the meantime, they’re staying.
More than with any other assessment in this series so far, I’m super-excited to hear suggestions on how to address the weaknesses of the updated assessment. Have an idea? Please share!
Great idea from @BridgetDunbar:
@mjfenton rough draft-what abt some sort of matching like this: w/o coordinates/labeled axes-then defend answers pic.twitter.com/GeQsk9TZwb
— Bridget Dunbar (@BridgetDunbar) February 8, 2014
It all started with this post, and the most recent assessment adventure is here.
When I first made the shift to standards-based grading, I threw together a list of skills as best as I could. When I considered equations involving absolute value, I figured, “What else would I want them to be able to do beyond solving a simple equation and solving a more advanced equation?” Fool. Anyway, here’s the pair of questions on the original Topic 5 assessment:
I was actually half-proud of the sequence of lessons I taught leading up to that original assessment. We built (or attempted to build) an understanding of equations involving absolute value by moving from a verbal approach with heaps of number line action, to a graphical approach with fantastic lines of intersection (and this was pre-Desmos!), before finally exploring an algebraic approach (the one and only representation I remember seeing as a student).
At any rate, a mediocre-to-decent set of lessons followed by a decidedly weak assessment left much to be desired.
One issue I found in the original assessment that I tried to address in the update is that I had very little sense of who was struggling with the algebra and who was struggling numerically with the concept of absolute value. In other words, pick a kid who failed the assessment. Was his weakness only in solving equations, or was he unable to even evaluate expressions involving absolute value?
With that question in mind, I added three questions to the front end of the assessment:
#1-2 are nothing special, but they did give me a better sense of how deep a particular student’s struggles went. And #3 is standard assessment rewrite fodder for me, as you’ve no doubt seen in previous one-minute makeovers. Nothing like a little “explain the error” and “redo it correctly” to see what’s going on inside a student’s head.
The last two questions on the updated assessment are literally cut-and-pasted from the original assessment:
I decided the main point of this assessment was still, “Can you solve an equation involving absolute value or not?” (Is that a worthwhile goal? I’d love to hear your thoughts in the comments.)
I’m still largely dissatisfied with this assessment, though I don’t really know what to do with it. I feel like our in-class approach to this topic—warts and all—was richer than the assessment would suggest. In the next one-minute makeover I’ll explore some options for incorporating questions that dig into students’ understanding of this topic from a verbal and a graphical approach. And while I’m at it, the quick nod at a numerical approach could be strengthened, possibly by shifting evaluation from out-of-context “Hey, what’s the value of this?” questions to determining whether particular x-values are solutions of a given equation (via substitution).
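Something in that direction might look like this (a hypothetical question of my own, not one from the assessment):

```
Is x = -6 a solution of |x + 2| = 4?     |(-6) + 2| = |-4| = 4      ✓  yes
Is x =  1 a solution of |x + 2| = 4?     |1 + 2|   = |3|  = 3 ≠ 4   ✗  no
```

Same substitution skill, but now the arithmetic serves a claim about solutions rather than floating free of context.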
Interested in the lessons and assignments that preceded this assessment? The brave shall enter through this door.
Anyway, more reflecting and tinkering is on the way. In the meantime, drop your own thoughts and questions in the comments below.
If you’re just tuning in, consider checking out the first post in the series, or the most recent post.
We’re drifting into a little section of my Algebra 1 curriculum that I’m at least a little bit ashamed of. I have no one to blame but myself since I put the textbook on the shelf and created lessons, practice, and assessments from scratch. Big plans for improvement in the months ahead, but for now, warts and all…
Here are the two-and-only questions from Form A of my old Topic 4 assessment:
Oh, the shame! I’ll talk more about the gap between this assessment and all-that-is-decent-in-this-world in a moment. For now I’ll just remark that what these questions actually demand of students is so far below what I originally intended that they are essentially useless as an assessment tool in Algebra 1.
The first major flaw in the original assessment is that the questions appear out of nowhere and drift away from our attention just as suddenly. So I replaced these two unrelated questions with a sequence of five questions related to a single scenario. Here are the questions from the new Form A:
My initial intention was for students to use expressions and/or equations to answer the questions on the assessment. On the original version, almost none of my students approached the problems algebraically. Many were able to answer the questions (and many were not), but nearly everyone who answered correctly did so with nothing more than some numerical tinkering.
While I’m not opposed to numerical tinkering (quite the contrary; I think it’s a fantastic practice for students), in this class and on this assessment I was hoping to see whether they could write an expression to model a situation and use the expression to answer another question or two in an efficient manner.
With the original assessment, this was a lost cause. With the updated version (particularly #3-5) I was able to measure at least part of what I set out to measure.
While my current assessment is an improvement over the first version, it still strikes me as terribly inadequate. Here we are, at the end of a unit on linear modeling, and there’s a massive void when it comes to two hugely important things: (1) At no point is any connection made between the verbal/numerical/algebraic representations and a graphical one (and it would be so easy to fix this!), and (2) The scenario is decidedly boring and contrived.
I have ideas for how to address #1, but am at a loss for how to remedy #2 in the space of a single-page assessment. More to think about for the next round of revisions.
One additional minor/medium flaw I see in the updated version is this: At no point do I ask students to explain their reasoning, justify their thinking, etc. (And word on the street is that those are cool things to do.)
Until next time…
The first post in the series is here. The previous post (Topic 2, Part 2) is here.
When I first drew up this assessment, my goals were to evaluate students’ ability at simplifying linear expressions and solving linear equations. Here’s what the two questions of Form A looked like:
I had the same all-my-eggs-in-one-basket problem with this original Topic 3 assessment as I did in an earlier assessment. If students aced these two questions, I knew they were capable of what they ought to be able to do. However, if they missed one or both, I was stuck without much information. There was no gradation in the all-or-nothing results.
Another issue: the assessment focused entirely on procedural skills and demanded nothing from students in terms of demonstrating deeper conceptual understanding.
As was the case with Topic 2 (detailed in posts here and here), I addressed the above concerns by lengthening the assessment quite a bit. The revised Topic 3 assessment weighs in at two pages and a total of ten questions.
In the first three questions, I try to get a read on whether students understand conceptually what a solution of an equation is. (For the record, what I’m looking for is something along the lines of “this value does/does not satisfy the equation,” along with numerical support—via substitution—of that claim.)
After that, students move through a series of four increasingly difficult linear equations, giving me the leveled progression the original assessment lacked and helping me distinguish the “almost there” from the “completely lost.”
Next up, an error-analysis/explain-your-reasoning style question:
And to close, two more “solve” questions (including one at the same level of difficulty as the original Topic 3 assessment):
The net result of these changes is a much stronger assessment, with improvements in at least two categories. The new assessment (1) provides me with more specific insight about student strengths and weaknesses, and (2) demands more of students in the way of critical thinking and clear communication.
I fully expect that this new assessment could be improved in half a dozen ways. Part of the beauty of teaching (and writing many of my own lessons and all of my own assessments) is the opportunity for continual improvement over the years. This job will never leave me bored!
Is there anything in particular you liked about the improvements I already made to my Topic 3 assessment? Do you have a few more ideas for making it even better? Share away!
It all started here. In the last post, I looked at additive and multiplicative inverses. Onward!
The second half of my original Topic 2 assessment assessed whether students were able to evaluate expressions involving integers and various operations (including radicals, rational exponents, and a few other things). My original approach included a single question, with everything all smashed together:
For those who were able to evaluate the expression correctly, I got precisely the information I needed (“Johnny can do this, that, and the other thing.”). But for those who answered the question incorrectly… Was it because they were lost on everything? Or because they struggled with one skill in particular? While a close look at their work would often reveal the answer to that latter question, I find that I’ve stripped one of the benefits of SBG (specific insight into specific strengths and weaknesses) right out of the question.
To address that weakness, I bumped this section of the assessment up from a single question to several (three, in fact):
I lose a minute or two more of class time to administer the assessment, though I gain a quick and clear sense of who’s struggling with exponentiation, rational exponents, and simplifying expressions involving multiple radicals. Note that while grouping symbols are entirely absent from #9 above, they make an appearance in some of the other assessment forms, including this one:
Even with this more discrete-ified set of questions—which I view as an improvement over the original—I still feel like this assessment is short on critical thinking and “explaining your reasoning.” A nice quick addition might be to present students with an expression (similar to #9 above) with two (or three) incorrect step-by-step approaches (each of which has exactly one error). Ask the students to identify the error in each approach and then show their own (100% correct) step-by-step solution. Here’s a quick mockup:
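In that spirit, here’s a rough sketch of what such a question might look like (my own stand-in expression and planted errors, not the mockup from the handout):

```
Evaluate: (-2)^3 + 25^(1/2)

Student A:  (-2)^3 = -6   (error: multiplied -2 by 3),                    so  -6 + 5 = -1
Student B:  25^(1/2) = 12.5   (error: halved 25 instead of square-rooting), so  -8 + 12.5 = 4.5

Correct:    (-2)^3 = -8  and  25^(1/2) = 5,  so  -8 + 5 = -3
```

Each bogus approach isolates exactly one misconception, which makes “find the error, then do it right” a quick window into which skill is shaky.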
I’ve now written four of these “better-assessments-in-sixty-seconds” posts. Since I’ve taken two posts to address each topic (the content fell rather naturally into four categories, rather than only two), I might want to consider breaking these apart for the purpose of grade book entries. I might even leave the assessment handout itself unchanged, but the idea of more refined grade book categories for tracking student mastery certainly has its appeal.
Thoughts on that last thought? Comments on something else? You know what to do.
Cheers!
I’m terrible at coming up with imaginary student names for my handouts. So I often use my students’ names or my kids’ names (I have lots to choose from in this second category, now!). Today I borrowed some names from a list of fictional butlers. Oh, I also have a preference for names to follow an A, B, C, etc., pattern.
Last time in this quick-look-at-improving-assessment series (which began here) I shared my attempt at improving the questions related to distribution on an Algebra 1 assessment. As always, you can check out the topic list here (or here, if you want “I can…” statements as well).
This time we’ll take a look at a series of questions related to operations on numbers. Here’s the rubbish version (from the original Form A):
I was trying to get a read on whether students understood what additive and multiplicative inverses are. For reasons similar to those shared in the first post in the series, this question type wasn’t particularly effective. Also, there’s the issue of “What am I actually trying to accomplish with these questions?” I don’t think I had that settled in my mind when I wrote the original assessment, and that led to the lackluster questions shown above.
If this assessment was going to improve at all, I first needed to nail down what I wanted to accomplish. Then I needed to work on better ways to ask questions (even just spicing up the originals with “explain your reasoning” or “defend your answer” would have been a nice start).
At any rate, I decided on three goals, so I wrote three mini-sections of the assessment. Here they are:
And here’s how I attempt to measure that on the new-and-hopefully-improved assessment:
Simple, but to the point. On to the next one…
Here’s how I tried to assess that skill:
I decided that this was actually the main reason we were exploring additive and multiplicative inverses in the first place, so a rather direct assessment question seemed appropriate. On to the third goal related to inverses…
The content isn’t profound or complex, so I thought it might provide a nice opportunity for students to create their first “mathematical” argument, one with complete sentences and mathematical “evidence.” With these two questions, I’m really trying to pave the way for more complex arguments students will make in Geometry, Algebra 2, Precalculus, and Calculus.
Now that I’ve written three of these posts, I’m wondering if I should add student work. I don’t have anything for the original versions, but for some of the revamped assessments I took pictures of strong and weak responses in order to facilitate in-class discussions the following day. If I can dig those images up, would they be worth posting? Share your thoughts (on this last question, or in general) below.
Cheers!