Week 2 of the Decimal Institute begins with a claim that many experienced teachers will find obvious.

Namely: *Decimals are difficult*.

When students struggle with difficult things, it is the teacher’s instinct—indeed the teacher’s job—to help.

When students struggle with decimals, we frequently refer to money. The idea is this: Kids understand money. They are familiar with the notation of money, which is based on decimal notation. They are familiar with the language of money: quarters, nickels, pennies, et cetera. (Bear with us, you folks from non-dollar nations, and do your best to follow the argument; we’ll wait for you if you need to Google something—I would need to do the same for shillings.) Students can bring their knowledge of money to bear on understanding decimals.

While I have no doubt that money has been helpful for students in getting correct answers to particular problems, or even that money can be the basis for students to build particular ideas about decimals (e.g. that one cent is 1/100 of a dollar), I do have some critical questions about whether money is a strong foundation for building generalized decimal concepts.

Among these questions are the following.

**1.** If money is such a strong basis for decimal concepts, why do we so often see decimal errors with money?

**The Gallery of Misplaced Decimals**

*(a gallery of images of real-world misplaced decimals, not reproduced here)*

**2.** Is it possible, as Max Ray suggests below, that the conception people tend to carry in their minds is of dollars and cents as separate units, as they do feet and inches?

> @Trianglemancsd though I am curious what students use to make sense of cents prior to fractions. A dollars unit and a pennies unit, right?
>
> — Max Ray (@maxmathforum) September 27, 2013

I report my height as 6 feet 1 inch. I do not report it as 6 1/12 feet, although I know that I could. Likewise I don’t think of 1 hour and 5 minutes as 1 1/12 hours, although I know this to be correct.

Is it possible that many people think of $1.25 as 1 dollar and 25 cents, rather than as 1 1/4 dollars?

Maybe students are thinking of dollars and cents as *different units* that have a nice conversion rate, rather than of dollars as the natural unit and cents as a partitioning of that unit.
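Spelled out as worked conversions (my arithmetic, not from the original post), the mixed-unit and single-unit readings look like this:

```latex
6\text{ ft } 1\text{ in} = \left(6 + \tfrac{1}{12}\right)\text{ ft}, \qquad
1\text{ h } 5\text{ min} = \left(1 + \tfrac{5}{60}\right)\text{ h} = 1\tfrac{1}{12}\text{ h}, \qquad
\$1.25 = \left(1 + \tfrac{25}{100}\right)\text{ dollars}.
```

In each case, the mixed-unit form on the left is the one people actually say; the single-unit form on the right is the one that decimal notation presumes.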

Follow-up questions: (a) Might Max’s insight help to explain the errors in the gallery of misplaced decimals? (b) What are the implications of this for using money to teach decimals?

**3.** Related to the foregoing: even when students do think of dollars and cents as more than just related units, is it possible that students are thinking of cents as the natural unit, and that dollars are built out of them? This would contrast with viewing dollars as the natural unit from which cents are partitioned.
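The two directions can be written side by side (my formulation of the contrast, not the original post's):

```latex
\underbrace{\$1.25 = 125 \text{ cents}}_{\text{cents as the unit; a dollar is } 100 \text{ of them}}
\qquad \text{vs.} \qquad
\underbrace{\$1.25 = 1\tfrac{25}{100} \text{ dollars}}_{\text{the dollar as the unit, partitioned into hundredths}}
```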

I asked this question on the blog back in January, and readers answered it differently from the class of future elementary teachers I posed it to at the same time. What can we learn from that difference?

*(The table referenced here is not reproduced.)* Is it just a coincidence that this table includes no fractions?

**4.** Even if we do think of 1 cent as 1/100 of a dollar, does money support the repeated repartitioning that is essential to decimals? E.g. *Find a number between 0.04 and 0.05.* Does thinking about money support a student in getting to 0.041?
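As a sketch of the repartitioning at work (my example, using Python’s `decimal` module to avoid floating-point artifacts): appending one more decimal place splits each hundredth into ten thousandths, which opens up room between 0.04 and 0.05.

```python
from decimal import Decimal

low, high = Decimal("0.04"), Decimal("0.05")

# Re-partition: each hundredth splits into ten thousandths, so
# 0.041, 0.042, ..., 0.049 all land strictly between low and high.
between = [low + Decimal("0.001") * k for k in range(1, 10)]

assert all(low < x < high for x in between)
assert Decimal("0.041") in between

# In money terms, 0.041 dollars is 4.1 cents: no coin or bill
# represents a tenth of a cent, which is where the analogy strains.
```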

**5.** Finally, ask 100 sixth-graders how much money $0.1 is. I bet at least 30 of them say “1 cent”. Again, money seems to support particular decimal special cases, but does money help students generalize beyond those special cases to the important and challenging ideas underlying decimals?
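To make the special case concrete (a minimal sketch, my naming), a dollars-to-cents conversion shows why $0.1 trips students up: the value is ten cents, even though the written form suggests “1 cent” if the digits are read as labels rather than as powers of ten.

```python
from decimal import Decimal

def cents(dollars: str) -> int:
    """Interpret a decimal dollar amount as a whole number of cents."""
    return int(Decimal(dollars) * 100)

assert cents("0.1") == 10             # $0.1 is ten cents, not one
assert cents("0.01") == 1             # one cent is one hundredth of a dollar
assert cents("0.10") == cents("0.1")  # the trailing zero changes nothing
```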

—

Comments closed here. Let’s talk in the course and on Twitter under #decimalchat.

—

Follow this link to the announcement of this course for more information.

Instructions for joining the course:

This course has open enrollment enabled. Students can self-enroll in the course once you share this URL with them:

https://canvas.instructure.com/enroll/MY4YM3

Alternatively, they can sign up at https://canvas.instructure.com/register and use the following join code: MY4YM3