
Standard algorithms unteach place value

I found a page full of computations sitting around the house this evening. Naturally, I picked it up and gave it a look.

Griffin (10 years old, 5th grade) had been doing some multiplication in class today. Somehow his scratch paper ended up on our couch.

Here is one thing I saw.

37 times 22 with the standard algorithm. Wrong answer: 202.

Naturally I wanted to ask the boy about it. He consented.

Me: I see you were multiplying 37 by 22 here.

Griffin (10 years old): Yeah. But I got it wrong so I did it again with the lattice.

Me: How did you know you got it wrong?

G: I put it in the answer box and it was wrong.

It turns out they were doing some online exercises. There is an electronic scratchpad, which he found awkward to use with a mouse (duh). Plus, his teacher wanted to be able to see their work, and so was encouraging paper-and-pencil work anyway.

I was really hoping he would say that 37 times 22 has to be a lot bigger than 202. Alas he did not.

Anyway, back to the conversation.

Me: OK. Now 37 times 2 isn’t 101. But let’s imagine that’s right for now. We’ll come back to that.

G: Wait. That’s supposed to be 37 times 2? I thought you just multiplied that by that, and that by that.

He indicated 7 times 2, and then 3 times the same 2 as he spoke.

Me: Yes. But when you do that, you’ll get the same thing as 37 times 2.

A brief moment of silence hung between us.

Me: What is 37 times 2?

G: Well….74.

Let us pause to reflect here.

This boy can think about numbers. He got 37 times 2 faster in his head than I would have with pencil and paper. But when he uses the standard algorithm, all of that goes out the window in favor of the steps.

THE STEPS WIN, PEOPLE!

The steps trump thinking. The steps trump number sense.

The steps triumph over all.

Back to the conversation.

Me: Yes. 74. Good. I like that you thought that out. Let’s go back to imagining that 101 is right for a moment. Then the next thing you did was multiply 37 by this 2, right?

I gestured to the 2 in the tens place.

G: Yes.

Me: But that’s not really a 2.

G: Oh. Yeah.

Me: That’s a 20. Two tens.

G: Yeah.

Me: So it would be 101 tens.

G: Yeah.

I know this reads like I was dragging him through the line of reasoning, but I assure you that this is ground he knows well. I was leading him along a well-known path that he didn’t realize he was on, not dragging him behind me through new territory. We had other things to discuss. Bedtime was approaching. We needed to move on.

Me: Now. We both know that 37 times 2 isn’t 101. Let’s look at how that goes. You multiplied 7 by 2, right?

G: Yup. That’s 14.

Me: So you write the 4 and carry the 1.

G: That’s what I did.

Me: mmmm?

G: Oh. I wrote the one…

Me: and carried the 4. Yeah. If you had done it the other way around, you’d have the 4 there [indicating the units place], and then 3 times 2 plus 1.

G: Seven.

Me: Yeah. So there’s your 74.

This place value error was consistent in his work on this page.
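
If you want to see just how mechanical the error is, here is a quick sketch in Python (my own illustration, not anything from Griffin’s classroom) of the single-digit pass of the standard algorithm. Swapping the roles of the written digit and the carried digit reproduces his 101 exactly.

```python
# One right-to-left pass of the standard algorithm: multiply n by a single
# digit d, writing one digit per column and carrying the rest.
# swap_write_and_carry=True reproduces the error on Griffin's page.

def multiply_by_digit(n, d, swap_write_and_carry=False):
    digits = [int(c) for c in str(n)][::-1]   # ones place first
    written = []                              # digits as they get written down
    carry = 0
    for i, digit in enumerate(digits):
        product = digit * d + carry
        if i == len(digits) - 1:              # last column: write everything
            written.append(str(product))
            break
        write, carry = product % 10, product // 10
        if swap_write_and_carry:
            write, carry = carry, write       # write the tens, carry the ones
        written.append(str(write))
    return int("".join(reversed(written)))

print(multiply_by_digit(37, 2))                             # 74
print(multiply_by_digit(37, 2, swap_write_and_carry=True))  # 101
```

Nothing about the numbers changes between the two calls; only the bookkeeping does.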

Let me be clear: this error will be easy to fix. I have no fears that my boy will be unable to multiply in his adolescence or adult life. Indeed, once he knew that he had wrong answers (because the computer told him so), he went back to his favorite algorithm—the lattice—and got correct answers.

I am not worried about this boy. He is and he will be fine.

But I want to point out…I need to point out that this is exactly the outcome you should expect when you go about teaching standard algorithms.

If you wonder why your kids (whether your offspring, your students, or both) are not thinking about the math they are doing, it is because the algorithms we (you) teach them are designed so that people do not have to think. That is why they are efficient.

If you want kids who get right answers without thinking, then go ahead and keep focusing on those steps. Griffin gets right answers with the lattice algorithm, and I have every confidence that I can train him to get right answers with the standard algorithm too.

But we should not kid ourselves that we are teaching mathematical thinking along the way. Griffin turned off part of his brain (the part that gets 37 times 2 quickly) in order to follow a set of steps that didn’t make sense to him.

And we shouldn’t kid ourselves that this is only an issue in the elementary grades when kids are learning arithmetic.

Algebra. The quadratic formula is an algorithm. Every algebra student memorizes it. How it relates to inverses, though? Totally obfuscated. See, we don’t have kids find inverses of quadratics because those inverses are not functions; they are relations. If we did have kids find inverses of quadratics, they could think about the relationship between the quadratic formula:

x=\frac{-b \pm \sqrt{b^2-4ac}}{2a}

and the formula for the inverse relation of the general form of a quadratic:

y=\frac{-b \pm \sqrt{b^2-4ac+4ax}}{2a}
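
One way to make that relationship concrete (my gloss here, not part of the original exercise): solve y=ax^2+bx+c for x to get

x=\frac{-b \pm \sqrt{b^2-4ac+4ay}}{2a}

and then swap x and y to write the inverse relation above. Evaluating that inverse relation at x=0 hands back the quadratic formula; the roots of a quadratic are exactly the values its inverse relation pairs with zero.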

Calculus. So many formulas (algorithms) that force students not to think about the underlying relationships. If we wanted students to really think about rates of change (which are what Calculus is really about), we might have them develop a theory of secant lines and finite differences before we do limits and tangent lines. We might have Calculus students do tasks such as Sweet Tooth from Mathalicious (free throughout October!). There, students think about marginal enjoyment and total enjoyment.
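
To make the secant-line idea concrete, here is a minimal sketch (my example, not the Mathalicious task) of the kind of finite-difference computation students could do long before anyone says the word limit.

```python
# Average rate of change of f over [x, x + h]: the slope of a secant line.
def secant_slope(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2   # any familiar function will do

# Shrink the interval and watch the secant slopes settle in near 6,
# the instantaneous rate of change of x**2 at x = 3.
for h in (1, 0.5, 0.1, 0.01, 0.001):
    print(h, secant_slope(f, 3, h))
```

A table like that is a theory of rates of change a student can own; the limit only formalizes what the table already shows.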

On and on.

This is pervasive in mathematics teaching.

The results are mistaken for the content.

So we teach kids to get results. And we inadvertently teach them not to use what they know about the content—not to look for new things to know. Not to question or wonder or connect.

I’m telling you, though, that it doesn’t have to be this way.

Consider the case of Talking Math with Your Kids. There we have reports from around the country of parents and children talking about the ideas of mathematics, not the procedures.

Consider the case of Kristin (@MathMinds on Twitter), a fifth grade teacher, and her student “Billy”. Billy made an unusual claim about even and odd numbers. She followed up, she shared, we discussed on Twitter. Pretty soon, teachers around the country were engaged in thinking about whether Billy would call 3.0 even or odd.

But standard algorithms don’t teach any of that. They teach children to get answers. They teach children not to think.

I have read about it. I have thought about it. And tonight I saw it in my very own home.


Diagrams, week 10 (bonus)


A former calculus student who is tutoring in a local elementary school stopped by to ask about how and why decimal points work in the lattice algorithm for multi-digit multiplication. Here’s the residue of our conversation.

This is entirely predictable (Algorithms edition)

Remember that Kamii quote from a few weeks back?

These errors are usually considered careless errors. They are not careless errors; they come from an inability to think.

Oh, and that other one?

Algorithms unteach place value

Yeah. They’re strong claims. Overly strong. “An inability to think” isn’t really right. “Not thinking in this circumstance” is closer to the truth. Kamii is not a woman of nuance. That’s part of what I enjoy about her work. But it does require interpretation.

All of which builds up to an entirely predictable scenario yesterday morning. Griffin (who is seven years old, recall, and whose ability to think has been well documented here) has been well schooled in traditional addition and subtraction algorithms this year, and has learned the lattice algorithm. He loves the lattice, although he struggles to make a lattice neatly enough to do the algorithm (for reasons that will become apparent when you see his handwriting below).

All of these are digit-by-digit, ignore-place-value-as-you-work sorts of things.

Griffin asked at breakfast, “Is 86 divided by 22, 43?” and wrote the following:

Kamii saw it coming…

Showing v. Telling: Lattice Continued

That last post wasn’t really about the lattice algorithm. This one isn’t either. It’s about supporting claims with evidence. (The last one, in case you’re keeping score, was about crafting tasks that show instead of spiels that tell).

Another claim my future elementary teachers like to make is that the lattice is inefficient for problems with large numbers of digits. The idea is that you have to spend a lot of time making that lattice before you can multiply. This stands in contrast to the standard algorithm, which you can just get started with straight away.

They make the claim, but they don’t back it up with evidence. (This, after all, is part of what a college education is supposed to teach: how to build arguments. So I’m not complaining that they don’t; I understand that it is my job to teach them to do so.)

So I began to wonder whether the lattice-drawing really did set one back.

So I put it to the test. Ten digits by ten digits. On your mark, get set, GO!

By the way, I was going to do three algorithms head-to-head. I was going to do partial products in the middle of the board. But about a third of the way through, I got fed up and quit. That one really is inefficient for large numbers of digits.

Place value and the Lattice Algorithm (or A Simple Task, Years in the Making)

I’m going to assume you know the lattice algorithm for multidigit multiplication. If you do not, and if you would like a primer, here is one.

This post isn’t really about the lattice algorithm, but it’s the context for what I’m really trying to say, which is this: It is worth the time to craft classroom tasks carefully.

I have used the lattice algorithm for years with my future elementary teachers. We learn the steps in class; they go off and practice it. And then they write about it, using the ideas of the course to analyze the algorithm.

After a number of semesters of this, I became tired of reading in their work some variant of the following claim:

The lattice algorithm is very good for teaching place value because you have to pay attention to the places as you work with it.

I could not disagree with this claim more strongly. As I work the lattice, I am going digit-by-digit. I am absolutely NOT thinking about the values of those digits. And I suspect most children are not either. This makes it an efficient algorithm.
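
If you have not worked the lattice in a while, here is a rough sketch in Python (my own rendering, not course material) of what the algorithm does. Notice that the double loop never consults the place value of any digit; the places only sneak back in when the diagonals are summed.

```python
# The lattice: every cell is a one-digit-by-one-digit product split into a
# tens half and a ones half, and each half lands on a diagonal. The diagonals
# are then summed from the bottom-right, carrying as you go.

def lattice_multiply(a, b):
    a_digits = [int(c) for c in str(a)]   # written across the top of the lattice
    b_digits = [int(c) for c in str(b)]   # written down the side
    n, m = len(a_digits), len(b_digits)

    # place[k] collects everything sitting on the diagonal worth 10**k.
    place = [0] * (n + m)
    for r, y in enumerate(b_digits):
        for c, x in enumerate(a_digits):
            tens, ones = divmod(x * y, 10)   # the two halves of a cell
            k = (n - 1 - c) + (m - 1 - r)    # diagonal holding the ones half
            place[k] += ones
            place[k + 1] += tens             # tens half: one diagonal up

    # Sum each diagonal, carrying into the next, least significant first.
    result, carry = [], 0
    for k in range(n + m):
        carry, digit = divmod(place[k] + carry, 10)
        result.append(str(digit))
    return int("".join(reversed(result)))

print(lattice_multiply(37, 22))       # 814
print(lattice_multiply(7343, 1568))   # 11513824
```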

Last semester I decided to put that claim to the test. If these future teachers thought the lattice algorithm exposes important ideas of place value, then what task could I give them to demonstrate that it does not?

Well, they have been analyzing the algorithm; they have written papers about it. So if it teaches place value, they should be able to ace any place value task involving the lattice, right?

So here’s the task: Perform the lattice algorithm to multiply 7,343 by 1,568. When you are done, use a marker to highlight each and every tens digit in the lattice.

No follow-up or clarification questions allowed. If the premise is that the lattice helps us to learn place value, then we should know enough about place value to make a commitment to the meaning of a tens digit.

Can you guess which of the answers below is the more popular in my classroom?

When both are presented, a really useful discussion of the algorithm and its position with respect to place value ensues. And that discussion helps to explain the really clever “slide trick” for placing the decimal point (as seen about 2:30 into this video).

But back to my point. I can tell my students that the lattice doesn’t bring place value understanding along for free. Or I can show them. Showing requires carefully crafted tasks. But I find it’s worth the time.

When I have the choice between telling and showing, I nearly always choose to show.

Which is why I’m always running behind on content coverage.

I made my peace with that years ago.