In Delaware, Jack Markell, Mike Murphy, Dave Sokola, Rodel, and Hefferman all insist that teachers should be fired when they do not show improvement on test scores. Although there is little controversy over firing teachers who are universally deemed incompetent, there is a serious problem embedded in using test scores alone to make that personnel decision…

It appears that good people are getting fired over something they have absolutely no control over… Deep in the bowels of the original Common Core research, now receiving national attention, was a study of the mathematical accuracy of textbooks, particularly Algebra I and Algebra II… The textbooks used for teaching were so error-filled that it was a miracle teachers were able to compensate for them at all… From the report… Trust me, this is scary…

The National Mathematics Advisory Panel commissioned a mathematician to look systematically for mathematical errors in

A. two widely used algebra textbooks, one Algebra I and one Algebra II, and

B. a chapter on linear equations in each of three other popular Algebra I textbooks.

A summary of the results is provided below.

(A) The error density of a textbook is defined to be the following quotient, expressed as a percent:

error density = (total number of errors / total number of pages in the book) × 100%

It was found that for the review noted above:

• the Algebra I book has an error density of 50.2%, and

• the Algebra II book has an error density of 41%.

This means that, for the Algebra I book, there is on average at least one error every two pages. The Algebra II book is slightly better in this regard, with about four errors in every 10 pages on average.
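
To make the density arithmetic concrete, here is a minimal sketch of the calculation in Python. The page and error counts are hypothetical, chosen only to reproduce a density close to the Algebra I figure; the report itself does not publish the underlying counts.

    # Hypothetical numbers for illustration; the report gives only the
    # resulting densities, not the underlying page or error counts.
    def error_density(total_errors, total_pages):
        """Errors per page, expressed as a percent."""
        return 100.0 * total_errors / total_pages

    pages = 300    # assumed page count
    errors = 151   # assumed error count
    density = error_density(errors, pages)
    print(f"density = {density:.1f}%")                        # ~50.3%
    print(f"about 1 error every {pages / errors:.1f} pages")  # ~2.0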

The analysis also provides additional information regarding the errors found within the Algebra I and Algebra II books. There are three types:

Type I: a lack of clarity, a minor error, or a misprint.

Type II: an error that falls between Type I and Type III in severity.

Type III: a gap in a logical argument or an error on a conceptual level.

The following table summarizes the error densities of these errors in both books:

Table B-1: Error Densities of Errors in Algebra I and Algebra II Textbooks

Type         Algebra I     Algebra II
Type I        20.4%          12.1%
Type II       19.5%          19.4%
Type III      10.3%           9.6%

(Source: Task Group Reports of the National Mathematics Advisory Panel)

An example of a Type I error is the statement that all lines with the same slope are parallel; the correct statement should be: two distinct lines with the same slope are parallel.

Two examples of a Type II error are: pointing out that the method of solving a radical equation leads to an extraneous solution without explaining exactly how or why, and stating that two functions are inverse functions of each other (e.g., exp and log) without giving their precise domains of definition.
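
For readers who want to see the extraneous-solution phenomenon the books gloss over, here is a small sketch. The equation is my own illustration, not one taken from the reviewed textbooks: squaring both sides of sqrt(x + 2) = x yields x^2 - x - 2 = 0, which factors as (x - 2)(x + 1) = 0 and so produces two candidates, only one of which solves the original equation.

    import math

    # Illustrative equation (not from the reviewed books): sqrt(x + 2) = x.
    # Squaring both sides gives x^2 - x - 2 = 0, i.e. (x - 2)(x + 1) = 0,
    # so the algebra yields two candidate solutions: 2 and -1.
    for x in (2, -1):
        lhs = math.sqrt(x + 2)
        status = "valid" if math.isclose(lhs, x) else "extraneous"
        print(f"x = {x}: sqrt(x + 2) = {lhs}, {status}")

    # x = 2 checks out; x = -1 gives sqrt(1) = 1, not -1, so it is
    # extraneous -- an artifact of squaring, the very step the books
    # fail to explain.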

Several examples of Type III errors are provided here; these are more serious errors:

• Graphing a function with a discrete domain of definition (e.g., the price of n articles) as a (continuous) straight line;

• Interpreting an event with a probability of 0 as an impossible event, and an event with a probability 1 as one that will definitely occur without specifying that this holds only for a finite sample space;

• Giving the first few terms of a pattern and extending it to the n-th term as if the extension were unique (see the sketch after this list);

• Using technical terms (e.g., linear regression) in a problem without giving their definitions;

• Conflating the definition of the negative powers and rational powers of a number with a theorem;

• Defining the slope of a line using two points on the line without pointing out the independence of the choice of the two points used, and, later on, pointing out such an independence without indicating that there is an explanation;

• Proving a general theorem (e.g., a law of exponents) by use of only two or three examples; and

• Giving the procedure of the long division of polynomials without explaining what it is about, i.e., never defining division with a remainder.
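
The pattern-extension pitfall in particular is easy to demonstrate. A classic example (my choice, not one cited in the report): points on a circle joined by chords divide the disk into 1, 2, 4, 8, 16 regions, which looks like doubling, yet the sixth term is 31, not 32. The sketch below compares the "obvious" rule with the actual one:

    from math import comb

    # Two rules that agree on the first five terms, then diverge.
    def doubling(n):
        # The "obvious" extension: 2^(n-1).
        return 2 ** (n - 1)

    def circle_regions(n):
        # Regions formed by chords joining n points on a circle
        # (Moser's circle problem): sum of C(n-1, k) for k = 0..4.
        return sum(comb(n - 1, k) for k in range(5))

    for n in range(1, 8):
        print(n, doubling(n), circle_regions(n))

    # n = 1..5 agree (1, 2, 4, 8, 16); at n = 6 they split: 32 vs. 31.
    # "Extend the pattern" has no unique answer unless the rule is stated.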

Readers should keep in mind that the error density of Type III errors is about 10% in these two Algebra books, i.e., students are going to find one such error every ten pages on average. This is definitely a cause for concern for both students and teachers.

In the second portion of the analysis, one chapter on linear equations in each of three other Algebra I textbooks is analyzed. These three books are referred to as b1, b2, and b3.  Because the corresponding chapter in the Algebra I textbook in (A) is also reviewed, this book is referred to as a. Here are the findings of the error densities in the chapter on linear equations in these four books:

Table B-2: Error Densities in Chapters on Linear Equations

Book      Type I      Type II      Type III      Overall
a         21.7%       16.7%        6.7%          45%
b1        21.2%       6%           6%            33.3%
b2        14.9%       9.2%         3.5%          27.6%
b3        2.9%        2.9%         4.4%          10.3%

Two errors of Type III in the book b3 were inadvertently left out by the contractor, but the error-density computations above did take these overlooked errors into account. The two Type III errors are these: one is the failure to mention that the definition of the slope of a line is independent of the choice of the two points used, and the other is the failure to give an explanation when this independence is mentioned in an example.

This table leaves open the question of whether the book b3 is, in fact, significantly better than the rest of the available texts where errors are concerned. An independent careful reading of this book suggests that, like the others, it raises concern about error frequency. This analysis should again concern teachers, students, and all others using textbooks. It is imperative that authors, editors, and publishers produce mathematically accurate textbooks.

That was the report embedded in the original task force research into the foundation of Common Core, recently brought to the general public's attention by Sandra Stotsky.

As anyone mathematically inclined would know, the omissions in these problems would send the test taker in a completely different direction from the intended answer. Now, considering the threats and bluster recently coming from those pushing Common Core, particularly their eagerness to use the tests to show how stupid America's children are… it is not beyond credible reason to assume these mistakes may have been made deliberately to throw the test results…

Whether they were or were not would be impossible to prove without both Sodium Pentothal and a polygraph. However, it would not be impossible to prove that, with these errors, these tests should not be used to rate any aspect of our current educational process…
