An *exercise* is a question that tests the student’s mastery of a narrowly focused technique, usually one that was recently ‘covered’. Exercises may be hard or easy but they are never puzzling... the path toward the solution is always apparent. In contrast, a *problem* is a question that cannot be answered immediately. Problems are often open-ended, paradoxical, and sometimes unsolvable, and require investigation before one can come close to a solution. Problems and problem solving are at the heart of mathematics. Research mathematicians do nothing but open-ended problem solving.

– Paul Zeitz, *The Art and Craft of Problem Solving*, p. ix

To put it simply, you have a problem when you are required to act but don’t know what to do.

– Ian Robertson, *Problem Solving*

**Introduction**

Almost every formal goal statement for mathematics programs and courses says that “problem solving” is a key goal. Here is a recent example from the Common Core Standards in Mathematics:

Students are expected to understand the knowledge described in the Core Concepts and in the Coherent Understandings at a depth that enables them to reason with that knowledge—to analyze, interpret and evaluate mathematical problems, make deductions, and justify results. The Core Skills are meant to be used strategically and adaptively to solve problems. (Available at http://www.corestandards.org/)

Oddly enough, we often get no clear definition of what a real problem is and isn’t. The authors presumably assume that it is obvious what a “problem” is. Alas, as any thorough inspection of middle and high school mathematics classes and tests shows, that presumption is problematic. Most math assessments and assignments involve relatively simple exercises and always have. (See, for example, Archbald & Grant 1999, “What’s on the Test? An Analytical Framework and Findings from an Examination of Teachers’ Math Tests,” in *Educational Assessment*, Vol. 6, No. 4.) Why is that? Whatever the answer, the problem is not new.
It long pre-dates the standards and testing movement. Whitehead bemoaned inert, isolated math teaching 100 years ago. Dewey famously contrasted real problems with pseudo-problems in *Democracy and Education*. NAEP and TIMSS results have indicated for years that American students do poorly at anything other than basic math (and it was clear to researchers studying the results and videotapes that high-performing Japanese students see more real problems than American students do). John Goodlad puzzled over this same question more than 25 years ago in his landmark study *A Place Called School*:

The impression I get from the topics, materials, and tests of the curriculum is of mathematics as a body of fixed facts and skills to be acquired, not as a tool for developing a particular kind of intellectual power in the student.... Interestingly, mathematics teachers somewhat more than teachers in other academic subjects perceived themselves as seeking in their students processes related more to learning how to learn than to merely acquiring mechanics. Many wanted their students to be logical thinkers, to learn how to attack problems and to think for themselves. Why, then, did so few mathematics teachers in our sample appear to get much beyond a relatively rote kind of teaching and textbook dependency? (*A Place Called School*, pp. 209-210.)

This absence, therefore, cannot just be due to recent reactions to external testing. This is a problem of long standing, occurring in private as well as public education, from grade school through high school and into college. *If problem solving really is the goal, why aren’t we doing more to achieve it?* And why do teachers *believe* – wrongly – that external tests reward only low-level practice in exercises, when released tests show this not to be true?

**What *is* a problem? A closer look**
Let’s reconsider the core question more carefully, based on Zeitz’ distinction: what are real problems for math courses, and how do they differ from helpful though simple exercises in support of that goal? If a real problem is often “open-ended, paradoxical, and requires investigation,” as Zeitz argues and common sense suggests, then what would such problems be in algebra, geometry, and other courses? Let’s make some common-sense distinctions, using the Zeitz criteria:

| Problem | Exercise |
| --- | --- |
| A puzzle: not all needed information is provided explicitly. Some “investigation”, inference, logic, and filling in of what is implicit is required. Lots of prior knowledge is tapped and tested. e.g. What is the relationship of perimeter to area in any square? | May be a “hard” challenge to memory, but “never puzzling”: once the right algorithm is found in memory, we should not feel stopped in our tracks by the demand. Having identified the challenge type and recalled the algorithm, there is no uncertainty; we just need to “plug and chug.” e.g. What is the area of a square whose perimeter is 16? |
| Paradoxical: on the surface, it seems unsolvable or self-contradictory, i.e. it involves seemingly impossible conclusions, logic, or assumptions. More broadly, it must be a “non-routine” problem. e.g. “Take the number 4, and using it five times, write a mathematical expression that will give you the number 55.” | Straightforward request: once we understand the prompt, we should not be confused or puzzled by the demands; it looks routine. e.g. What number, when added to itself 4 times, equals 55? |
| Solution path not apparent: even when I understand the problem statement and the givens have been explicated, I still may not know exactly how to proceed; there are varied plausible approaches; multiple approaches may work; and different approaches may have different strengths and weaknesses. | Solution path is apparent: once recall is tapped properly, the method for finding the answer is known or quickly knowable. e.g. x + x + x + x + x = 55 |
| Open-ended: once into it, the work may require us to re-frame the problem statement, consider what counts as an answer, and consider that the answer may end up being “well, it depends...” e.g. “How much does it cost to take a shower?” [What counts as “cost”?] | Convergent: there is a single and unequivocal right answer. e.g. “Which uses more water: the average bath or the average shower?” |
| Have to do some investigation: until we mess around with sample numbers or figures, we don’t have any clear sense of how to proceed or perhaps even what the problem is really asking. e.g. How big a warehouse is needed to store a week’s worth of newsprint for use in printing presses by a major daily newspaper in a big city? | Little or no investigation: once the problem is cast in complete mathematical terms, we know how to proceed. e.g. Which has the bigger area: a 3-foot square or an equilateral triangle in which each side is 4 feet long? |

**Real pure problems – not just applied math**

A cautionary note, to forestall a possible misconception: I am *not* suggesting that only “real-world” problems that are immediately relevant to kids’ experiences count as “real problems” in mathematics. There is a limitless number of purely theoretical problems that mathematics students should encounter as part of a good K-12 education (e.g. the kinds of problems often found in math competitions). I am arguing that students rarely see real problems of *any* kind, applied or pure. They mostly confront simple math exercises, whether the content is “pure” or “applied.” In my chart on problems, for example, I used the NPR radio show *Car Talk* as a resource. One of Car Talk’s notable features, beyond the interchange with listeners about car woes, is the weekly Puzzler.
Many of those puzzlers are pure-math-related (as befits two alumni of MIT). (See, for example: http://www.cartalk.com/content/puzzler/transcripts/200945/index.html)

In short, we will get more problem solvers the more secondary math courses are framed around real problems, pure and applied, in which it is clear to students and parents, as well as teachers and supervisors, that this is the aim of the courses. And sources for such problems exist everywhere – even on the radio.

**The ironic excuse of standardized tests, if problem solving is our goal**

As noted above, we can expect to hear: “Grant, this is all well and good, but the state/provincial/national tests demand that we focus on algorithms and facts. There is no time for real problem solving and it won’t pay off on test day.” This plausible claim turns out to be incorrect, despite conventional wisdom, as I have argued elsewhere (“Time to Stop Bashing the Tests,” in the March 2010 issue of *Educational Leadership*). There are __numerous__ higher-order questions on all tests; disappointingly low results on such questions are the __norm__.

What readers may not have fully appreciated is that the *context* of the testing situation can turn seeming exercises into problems. Consider, first, the conditions under which standardized tests are given (compared to those in which teachers give quizzes and tests in school). In a standardized test, the students lack *all* the usual context clues. They don’t know immediately which question came from which chapter, and no hints are available from the teacher or from the recent history of textbook lessons. On standardized tests, many items have also been deliberately designed to look somewhat unfamiliar, and one or more of the distracters has been carefully chosen to be highly plausible to students. The test – regardless of content – is a problem situation! In other words, *the testing situation is a transfer situation, not a recall situation*.
Failure to note this is a key error in all simplistic “test prep” approaches.

**What is a math problem and where can we find many more? Some examples**

So, what must we see more of in American classrooms? Let’s look at some real problems. The first three problems sketched below involve applied mathematics and the remainder involve pure or mixed mathematics:

1. What’s the price point for maximal sales and profit of home-made sugar cookies at a Varsity basketball game?

2. How much available landfill volume is needed to handle the waste generated each year by our school? How much needlessly clean water is used for flushing toilets in our school – and what might be other viable solutions for using “gray” water?

3. What’s the fairest way to rank-order teams when many don’t directly play one another (e.g. national college basketball during the season)?

4. Among grandfather’s papers a bill was found: 72 turkeys $_67.9_. The first and last digits of the number that obviously represented the total price of those fowls are replaced here by blanks, for they are faded and are now illegible. What are the two faded digits, and what was the price of one turkey?

5. The length of the perimeter of a right triangle is 60 inches and the length of the altitude perpendicular to the hypotenuse is 12 inches. Find the sides of the triangle.

6. A train is leaving in 11 minutes and you are one mile from the station. Assuming you can walk at 4 mph and run at 8 mph, how much time can you afford to walk before you must begin to run in order to catch the train?

7. Pick any number. Add 4 to it and then double your answer. Now subtract 6 from that result and divide your new answer by 2. Write down your answer. Repeat these steps with another number. Continue with a few more numbers, comparing your final answer with your original number. Is there a pattern to your answers? Can you prove it?

You might not love all seven of these, but they fit the criteria reasonably well.
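(As an aside for teachers, and emphatically not part of the original problem sets: the “mucking around” that problems 6 and 7 invite can be mimicked with a few lines of code. This is a minimal sketch under my own assumptions; the function names and sample inputs are mine.)

```python
from fractions import Fraction

def trick(x):
    """Problem 7's steps: add 4, double, subtract 6, halve."""
    return ((x + 4) * 2 - 6) / 2

# Trying a few starting numbers suggests the pattern: the result is always
# one more than the original, since ((x + 4)*2 - 6)/2 = (2x + 2)/2 = x + 1.
for x in [0, 5, -3, 17]:
    print(x, "->", trick(x))

def max_walk_minutes(miles=1, walk_mph=4, run_mph=8, total_min=11):
    """Problem 6: walk for w hours, then run the remaining distance,
    arriving exactly on time: w + (miles - walk_mph*w)/run_mph = total."""
    total_h = Fraction(total_min, 60)
    w = (total_h - Fraction(miles, run_mph)) / (1 - Fraction(walk_mph, run_mph))
    return w * 60  # minutes of walking

print(max_walk_minutes())  # with these givens: 7 (minutes of walking)
```

The point of such a sketch is not the answer but the investigation: a student who tries a few numbers in problem 7 discovers the pattern empirically and is then in a position to prove it with algebra.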
The solution path is neither stated nor painfully obvious; there is a bit of a puzzle in each one (usually in terms of implicit assumptions and unobvious solution paths); some seem unsolvable at first glance; and the solution will depend upon some mucking around as well as the development and careful testing of a strategy.

The middle two examples come from the Stanford University Competitive Mathematics Examination for high school students, developed by the great heuristics mentor – the author of *How to Solve It* – George Polya. (*The Stanford Mathematics Problem Book*, Polya & Kilpatrick, Dover, 1974.)

The last three problems are noteworthy for a different reason. They are excerpted from the published problem sets given to __all__ 9th-grade math students at Phillips Exeter Academy. Math class at Exeter is entirely problem-based. Students are given these problem sets each week, and homework consists of being prepared to offer one’s approach and solutions (or difficulties) in class the next day. In short, Exeter (arguably one of the best schools in the United States) takes it as a given that the point of math class is to learn to solve problems. Content lessons often *follow* the attempts to solve problems rather than always preceding them. The departmental mission statement makes their aim and methods clear:

We believe that problem solving (investigating, conjecturing, predicting, analyzing, and verifying), followed by a well-reasoned presentation of results, is central to the process of learning mathematics, and that this learning happens most effectively in a cooperative, student-centered classroom... To implement this educational philosophy, members of the PEA Mathematics Department have composed problems for nearly every course that we offer. The problems require that students read carefully, as all pertinent information is contained within the text of the problems themselves—there is no external annotation.
The resulting curriculum is problem-centered rather than topic-centered. The purpose of this format is to have students continually encounter mathematics set in meaningful contexts, enabling them to draw, and then verify, their own conclusions... The goal is that the students, not the teacher or a textbook, be the source of mathematical knowledge. (http://www.exeter.edu/academics/84_801.aspx)

Why is it so rare for an approach like this to happen in typical classrooms? Once you see what Exeter is doing, you cannot help but wonder about the problem of non-problems – especially since this kind of work is what a student can expect in every college math and science course and in any workplace use of mathematics.

**Conclusion**

Until and unless mathematics assessments are built backwards from problems, and until such problems are constructed __before__ the instructional frameworks, we should not be surprised by frustratingly inadequate student achievement on challenging tests. The bad news is that mathematics education is and has been perpetually held back by weak assessments at the local level; those assessments, of course, derive from impoverished syllabi and from a failure to link bottom-line course goals with daily practices. The problem of non-problems is thus real, and an impediment to all the good things advocated now and over recent decades at the national level for the reform of mathematics. So long as assessments and course frameworks are designed around mere recall of algorithms rather than genuine problems that create the need for real problem solving, we can expect mathematics performance levels to remain static and too low.

The good news, of course, is that local assessment is in our complete control as educators. Our problem, therefore – like so many human problems – is a problem of our own unwitting making. We need not invent esoteric new math programs nor lament our fate as unchangeable.
All we need do is take a close look at the alignment of our stated goals and our own assessments, and design or find good problems to anchor our local tests, and the problem of non-problems in math education will be on the way to being solved.