**Tom Tucker - Colgate University**

What is the population of the United States? How far is it from my house to my office? How much is the national debt? All of these questions have numerical answers, sort of, but we all know it would be absurd to answer 307,285,671 people or 1135 feet, 11 inches or $11,245,734,298,635. First, none of these questions is well-defined. For the population of the US, at what instant are we talking about? Are we including US citizens who are living abroad? For the distance from my house to my office, where in my house? Where in my office? As the crow flies (an expression that should tell you we are in trouble here)? For the national debt, again at what instant? Second, even if the quantities were well-defined, they are all measurements subject to error. For most measurements, you’re lucky if you can get an error as small as 1 part in 100.

A few years ago, I served on a committee to revise the K-12 Mathematical Standards for New York State. I had followed the NCTM Standards since the late 1980s and there was one “strand” that always bothered me: Measurement. In the early grades, there were recommended activities with rulers, such as measuring the height of a desk. By grades 6-8, measurement consisted mostly of formulas for areas and volumes. By grades 9-12, the strand petered out. It was always the thinnest of the five strands. As I sat through the committee meetings, it occurred to me that it should be the thickest. My rule is this: *If it has units and an error, then it’s a measurement*. By that rule, every number of importance in our lives is a measurement.

Mathematics education in the US has done pretty badly on the matter of units, which is mostly ceded to science education. What a shame, since units are enormously helpful in understanding equations, notation, and terminology. Although it now seems generally accepted that mathematics should be taught from the algebraic, graphical, and numerical viewpoints, there is a fourth medium for presenting mathematics: words. And units are words, which reach students in ways that algebra, graphs, and numbers cannot.

The real failure of mathematics education, however, has been its treatment of error, especially relative error. Is an error of a foot a big error? Yes, if you are measuring my height, no if you are measuring the elevation of Mount Everest. Mathematicians know that only relative error makes sense. But relative error is almost nowhere to be found in the mathematics curriculum.
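The foot-versus-Everest point can be sketched in a few lines of code. The figures below are illustrative assumptions (a six-foot person, Everest at its commonly cited elevation of 29,032 feet):

```python
def relative_error(error, true_value):
    """Ratio of the absolute error to the quantity being measured."""
    return abs(error) / abs(true_value)

person_height_ft = 6.0    # assumed height of a person
everest_ft = 29_032.0     # commonly cited elevation of Mount Everest

# The same one-foot error, compared to what is being measured:
print(relative_error(1.0, person_height_ft))  # about 0.17, i.e. 17% -- huge
print(relative_error(1.0, everest_ft))        # about 0.00003 -- negligible
```

The absolute error is identical in both cases; only the relative error tells you whether it matters.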

I was on the Mathematics AP Committee for the College Board when calculators were first allowed on the exam. We had to decide how much accuracy we required. Of course, any scientist would give an answer in terms of significant digits. Because we knew, however, that the concept of significant digits was not a standard part of the K-12 mathematics curriculum, we said instead “three digits to the right of the decimal point” (no mention of floating point). A year later, we wrote an exam problem on US soda consumption that entailed numbers in the billions of gallons. If a student chose gallons as their unit, rather than billions of gallons, the required answer had 14 digits, more than a calculator could handle at the time. I always thought a clever student should have chosen quadrillions of gallons as their unit and given the correct answer of 0.000. The AP Committee had been painted into a ridiculous corner by the failure of the US mathematics curriculum to deal with relative error.
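The corner the committee was painted into is easy to reproduce. Using a made-up consumption figure (the exact exam number is not given here), the same quantity rendered to "three digits right of the decimal point" looks wildly different depending on the unit chosen:

```python
# Hypothetical annual soda consumption, on the order of billions of gallons.
consumption_gallons = 12_345_678_901.234

# "Three digits to the right of the decimal point" in three different units:
in_gallons      = f"{consumption_gallons:.3f}"         # 14 digits total
in_billions     = f"{consumption_gallons / 1e9:.3f}"   # "12.346" -- sensible
in_quadrillions = f"{consumption_gallons / 1e15:.3f}"  # "0.000" -- absurd

print(in_gallons, in_billions, in_quadrillions)
```

A significant-digits rule would have demanded the same relative accuracy in every unit; the fixed-decimal rule demands absurdly different absolute accuracy.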

Estimation skills may sometimes be viewed as just another form of “fuzzy math,” even though when we grow up, we find in our everyday lives that there are two kinds of arithmetic: what we do in our head and what we do with a calculator or computer (everybody who does their tax forms with pencil and paper only, please raise your hands). But I am talking about the role of estimation in measurement, not arithmetic. A proper approach to measurement would also involve estimation. Physicists have long played the Fermi game of trying to estimate some strange quantity—the real estate value of Hamilton, NY, the number of people who have ever played Major League baseball, the number of piano tuners in Chicago. All involve seat-of-the-pants estimates based on a few facts, like the population of Chicago. But why aren’t math students playing this game? Why aren’t 6th graders being asked how many ice cream cones their school eats in a year? Why aren’t 9th graders being asked by their math teacher to go home and come back tomorrow with their estimate of the number of gallons of gas consumed annually by cars waiting at red lights, with a full explanation of their answer?
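The piano-tuner estimate above can be sketched as a short chain of arithmetic. Every number here is an explicit assumption; the point of the Fermi game is the method, not the final figure:

```python
# A classic Fermi estimate: piano tuners in Chicago.
chicago_population   = 3_000_000   # rough population of Chicago
people_per_household = 2.5         # assumed average household size
pianos_per_household = 1 / 20      # assume 1 in 20 households owns a piano
tunings_per_year     = 1           # assume each piano is tuned once a year
tunings_per_tuner    = 4 * 5 * 50  # 4 tunings/day, 5 days/wk, 50 wk/yr

households = chicago_population / people_per_household
pianos     = households * pianos_per_household
tuners     = pianos * tunings_per_year / tunings_per_tuner
print(round(tuners))  # -> 60
```

Getting within a factor of 2 or 3 of the true number counts as a win, which is exactly the relative-error mindset the essay is arguing for.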

I understand that the other strands—number and operations, algebra, geometry, data analysis and probability—are important, especially for the scientific infrastructure of the US—but I would feel a lot more comfortable about the public understanding of the costs and benefits of our country’s policies if I knew we could get the measurement strand right.

I think the points raised here about relative error and students' ability to estimate are good ones. It also makes me think that some aspects of number sense could be developed in the process of "thickening" the measurement strand. For instance, students could use estimation and measurement tasks to develop a better feeling for the actual sizes of numbers (e.g., the population, distance, or debt mentioned at the top). I have found that many students (and adults) have an inadequate sense of magnitudes, and working more thoroughly with measurement might help.

Another somewhat related point to get across with various real-world demonstrations is that we inevitably overestimate our own personal ability to estimate things; that is, we repeatedly underestimate our own personal margin of error (often by a factor of 2-3 or more) when we make estimates in everyday life...

While on the verge of sounding like your run-of-the-mill social science fluff, this fact is easily demonstrated in a few minutes in any math classroom, without resorting to trick questions or any other such artificial methods.

Since you cannot fully talk about measurements and sampling without talking about margins of error and confidence intervals, I think this makes for a useful math class demonstration that says something useful to the students about the "real world" as well.
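The overconfidence effect described above can be illustrated with a quick simulation. This is a hypothetical sketch, not the classroom demonstration itself: it assumes a normally distributed quantity and asks how often a nominal 90% confidence interval actually contains the truth when people's intervals are too narrow:

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

def hit_rate(width_factor, trials=100_000):
    """Fraction of trials where the (possibly shrunken) interval hits.

    The true value is drawn from N(0, 1); a calibrated 90% interval
    for it is +/- 1.645. width_factor < 1 models overconfidence:
    people stating intervals narrower than they should.
    """
    half_width = 1.645 * width_factor
    hits = sum(
        1 for _ in range(trials)
        if -half_width <= random.gauss(0, 1) <= half_width
    )
    return hits / trials

print(hit_rate(1.0))  # ~0.90: calibrated intervals hit 90% of the time
print(hit_rate(0.5))  # ~0.59: intervals half as wide miss far more often
```

Intervals only a factor of 2 too narrow turn a "90% confident" claim into something closer to a coin flip, which is the classroom point in miniature.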

PS:

I've always been annoyed by media (and other) reports extolling incomplete statistical results as if they were certain "facts," without stating sample sizes, methodologies, standard deviations, or any kind of confidence level whatsoever.