At breakfast one morning, Griffin (7 years old) asked me,
Griffin: What century do you think this is? The 20th or the 21st?
Me: It’s the 21st century.
G: Oh darn. I was hoping you wouldn’t say that. I’ll have to ask Tabitha and Mommy and a bunch of other people this question.
At this point, I was thinking that he was feeling pretty smart for knowing that the year 2012 is the 21st century, even though there are only 20 sets of 100 in 2012. I was wrong. He was making the opposite argument: we should call the present century the 20th.
G: What about when you were born, in 1970? Was that the 20th century or the 19th century?
Me: It was the 20th century.
G: See, I don’t think that’s right. It should be the 19th century.
Me: Because it starts with a 19?
Me: I see. Well, what about the year 50? Not 1950, just the year 50? What century do you think that was?
Me: Right. Well, we agreed that we would start counting with the first century instead of the zeroth century. So the year 50 was in the first century, and the year 150 would have been the second century.
G: Well, it shouldn’t be that way. I want to start counting at zero. So I’ll keep asking people and find people who agree with me.
A couple of minutes later:
G: So, have there only been people for 2000 years?
Griffin is an independent, contrarian thinker. If there is a way to think about something differently, or even a way to perform some physical deed differently, he’s all in. A critical thinker in the extreme, this boy never accepts “Because I said so” as an answer. This will serve him well in some areas of life and poorly in others. It makes parenting him a unique challenge.
From a mathematical perspective, it doesn’t matter at all whether Griffin thinks of this as the 20th or 21st century. Sooner or later he’ll give in to convention so that he can communicate with the rest of the population of the first world. The important mathematical thing is to explore the basis and the consequences of the argument he is making.
Is the basis of the argument purely the fact that 2012 begins with 20? Or is there an attention to place value? In other words, is he thinking about 2000 as 20 groups of 100, or just as beginning with 20? My question about the year 50 was intended to get at that. There is no "beginning with" in this case. He had no trouble, which suggests that he is thinking about groups of hundreds: there are no full groups of 100 in 50, so it should be the zeroth century according to his rule.
This is the consequence of his argument. If you’re going to argue that 2012 is part of the 20th century, you need to be ready to accept the idea of a 0th century.
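Griffin's rule has a tidy arithmetic form: the century is just the number of completed groups of 100, which is integer division by 100. A minimal Python sketch (the function name is mine, not Griffin's):

```python
def griffin_century(year):
    """Griffin's rule: the century is the number of
    completed groups of 100 years -- so it starts at 0."""
    return year // 100

print(griffin_century(50))    # 0 -- the "zeroth century"
print(griffin_century(1970))  # 19
print(griffin_century(2012))  # 20
```

Notice how the rule and its consequence come as a package: the same division that makes 2012 the 20th century makes the year 50 the 0th.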
Here’s another centurial weirdness for Griffin. Let’s say, for a minute, that we accept the idea of a Zeroth Century—which, by the way, is totally reasonable. Then what century does the year 100 belong to?
According to the Danielsonian Place Value Calendar, we should enter the 1st Century on January 1st, 100. But, on that date, we’ve still only experienced 0 whole groups of 100 years. There was no Year 0. Even if we start numbering the centuries at zero, we started numbering the years at 1, which is problematic. Remember that small but vociferous group of people who wanted to ignore New Year’s Day in 2000? This was essentially their argument, that the new millennium shouldn’t really start until 2001.
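The off-by-one in the conventional scheme can be made explicit too. Since the years start at 1 with no year 0, the standard century needs a shift before dividing; a sketch (again, the function name is mine):

```python
def conventional_century(year):
    """Standard numbering: years 1-100 are the 1st century,
    101-200 the 2nd, and so on (there is no year 0)."""
    return (year - 1) // 100 + 1

print(conventional_century(100))   # 1
print(conventional_century(2000))  # 20 -- the purists' point
print(conventional_century(2001))  # 21 -- their new millennium
```

The `- 1` before dividing is exactly the vociferous minority's argument in arithmetic form: because counting started at 1, the year 2000 still belongs to the old century.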
There is more to be said here about counting vs. measuring.
You could encourage him in this thinking by pointing out that many computer scientists and computer programming languages start counting with 0. It actually makes for much cleaner mental models.
I wouldn’t have felt convinced he understood place value…
This issue reminds me of how the Chinese count their age. Instead of saying “I’m 10”, they’d say something that means “I’m in my 11th year”. (If I understand correctly, it’s further complicated by using calendar years instead of counting your own years. So if you were born in December, you’d be in your second year after just a month.)
Sue, that’s an interesting point. For the west, age is measurable; for the Chinese, age is fundamentally countable. Hence the apparent weirdness of being in your second year of life after only, potentially, two days of breathing. It’s just two ways to approach the same problem. Whenever there is dissonance between counting and measuring, we run into these little quirks.
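The age-reckoning rule Sue describes can be sketched with calendar years alone (this is a simplified model of the tradition as described above, not a complete treatment):

```python
def east_asian_age(birth_year, current_year):
    """Counting-based age: you are in your 1st year at birth,
    and the count goes up every calendar New Year -- so a baby
    born in December is in its 2nd year by January."""
    return current_year - birth_year + 1

print(east_asian_age(2011, 2012))  # 2 -- after possibly only days of life
print(east_asian_age(2002, 2012))  # 11 -- "in my 11th year" at Western age 10
```

The `+ 1` is the whole difference between the two systems: counting starts at one, measuring starts at zero.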
That’s why, Mr. Gasstation…I actually think starting counts at 0 is problematic. Usually, when we’re in measurement mode, we start at 0 and model with real numbers. When we’re in counting mode, we start at one and model with cardinal numbers. There might be great computer science reasons that, e.g., list[0] in Python returns the 1st element of the list, but that’s certainly mathematically confusing, because lists aren’t measurable. I’m not sure how it’s any “cleaner” mentally.
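One way to see why programmers find this clean, in the counting-vs.-measuring terms used above: an index behaves like a measurement (an offset from the start), while a length behaves like a count. A tiny Python illustration (the example list is mine):

```python
fruits = ["apple", "banana", "cherry"]

# An index is a measurement: the offset from the start of the list.
# The 1st element sits at offset 0, so fruits[0] returns it.
print(fruits[0])        # apple

# A length is a count, and counting starts at one as usual:
print(len(fruits))      # 3

# The payoff: the valid offsets are 0, 1, ..., len - 1,
# so the last element is at offset len(fruits) - 1.
print(fruits[len(fruits) - 1])  # cherry
```

On this reading, Python's lists are measurable after all: the index measures distance from the front, and the count of elements is a separate, 1-starting quantity.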
Danielson: post forthcoming addressing your whining re my last one.
I’ve not tried to articulate before why 0-based counting is cleaner. Perhaps it is because it makes measuring and counting more comparable.
Dijkstra gave one line of reasoning at http://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EWD831.html
Wikipedia has a better list of advantages: http://en.wikipedia.org/wiki/Zero-based_numbering
I get very annoyed with biologists whose aversion to 0 is so strong that positions in a gene are numbered 1, 2, 3, 4, … and -1, -2, -3, -4 (with no 0 position). It is like the B.C. and A.D. year numbering: the lack of a zero makes computation much messier.
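The B.C./A.D. mess has a standard fix: astronomical year numbering, which inserts a year 0 (equal to 1 B.C.) so that plain subtraction gives elapsed years. A sketch of the conversion (the function name is mine):

```python
def astronomical_year(year, bc=False):
    """Astronomical numbering: year 0 is 1 B.C., year -1 is
    2 B.C., etc., so date arithmetic is plain subtraction."""
    return 1 - year if bc else year

# Years elapsed from 44 B.C. to A.D. 14:
span = astronomical_year(14) - astronomical_year(44, bc=True)
print(span)  # 57 -- naive subtraction 14 + 44 = 58 is off by one
```

The gene-numbering complaint is the same off-by-one in disguise: with positions 1, 2, 3, … and -1, -2, -3, … and no 0, the distance between positions can no longer be computed by simple subtraction.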