Principles of Data Interpretations - Book Report/Review Example

Summary
The paper "Principles of Data Interpretations" tells that these principles range from items such as, “Make certain you know what statistic is being used when someone is talking about the average…”, to “Do not confuse statistical significance with practical significance"…

Reading Educational Research: How to Avoid Getting Statistically Snookered

Gerald W. Bracey's stature in the academic world is familiar to many scholars and readers, especially those of the Kappan magazine, where his monthly column is a must-read for those in the fields of education and data analysis. His assessment of high-profile investigative reports, and of the various interpretations offered by critics of the public education sector and by the media, is as much a defence of appropriate methodology as it is of schools, especially public ones. His book offers pointers and guidelines for quality educational research, presenting 32 'Principles of Data Interpretations' as advice to both producers and users of research. These principles range from "Make certain you know what statistic is being used when someone is talking about the average…" (p. 45) to "Do not confuse statistical significance with practical significance…" (p. 71). Throughout the book he interweaves these and other principles across four chapters, offering insight and practical advice to anyone concerned with research and data analysis. He explores the ways in which statistics have been deliberately exploited, and inadvertently misused, to advance a particular line of thinking. In the introduction he explains that the origin of statistics is rooted in the term "political arithmetic", setting the stage for readers to understand that statistics, like politics, are imperfect and prone to bias. These days the public education sector is driven by numbers in almost everything, and this can mislead both teachers and parents, two groups fundamentally intertwined with the education sector.
Schools are viewed by many as preparation platforms for the next level of a child's life, with each segment preparing for the next segment in school. As an article on Texas high schools (Hacker, 2009) revealed, high schools are full of expectations about preparing students for college, as if 'college ability' and 'high school ability' were easily identifiable commodities. Two failures follow from equating school with planning for the next stage of school and life: what we understand about brain development, and the arbitrary equation of knowledge with learning (Bracey, 2006). Our knowledge of brain development has evolved tremendously over the years, yet our views of learning at school have changed little or not at all. We acknowledge that cognitive development follows a sequence (abstract thinking, for instance, develops after the more familiar literal thinking), but we are uncertain about when each stage occurs for a given child. The current system of grade levels is tied to chronological age and maturity but does not factor in the development of cognitive abilities in each child. The other failure is the arbitrary association of knowledge with learning. A given body of knowledge is classified as a certain grade's material, with little consideration of whether children at that level need to know that material, let alone whether they are even able to learn it. One example is the moving of curriculum, such as algebra, down to earlier grades so as to present the standards as rigorous; the problem is that algebra requires a level of abstract thinking tied to the growth of the brain (and hence, roughly, to age and maturity).
Thus, according to Ganem (2009), asking a student to do something earlier is not necessarily academically challenging; in fact, asking children to do something they cannot yet do (because of brain development) is counter-educational (Bracey, 2006). In school, children quickly learn that all they need to do is score at the expected levels (their own and/or their parents') on tests and all is well; this soon leads to the notion that learning is simply passing exams with the appropriate marks. Broadly put, schools have been reduced to gateway tests: the assessments approved under the "No Child Left Behind" (NCLB) policy, college entrance exams (the ACT or SAT), and Advanced Placement tests often become the de facto measures used to gauge our entire school system's effectiveness. Educational and assessment experts such as Bracey (2006) and Popham (2003) continue to warn against linking single tests with learning, despite the increasing focus on testing as a key aspect of school and the pervasive power of tests. It is in Bracey's book "Reading Educational Research: How to Avoid Getting Statistically Snookered" that he explains the myriad ways in which statistics have been deliberately and inadvertently exploited to advance a particular line of thinking. Throughout the book he uses the 32 principles, or maxims, to share relevant examples and to explain some fundamentals of data collection and interpretation. He introduces primary concepts such as the correlation coefficient and the standard deviation in order to give readers the basic tools needed to grasp educational research (Bracey, 2006). The book, divided into four chapters, focuses on themes that give readers a well-rounded appreciation and understanding of statistical skepticism, with each chapter devoted to a major theme.
The first chapter, "Data, Their Uses, and Their Abuses", opens with a pop quiz of five real-life examples in which critics have used data to support a particular opinion on education and education-related issues. Readers are encouraged to find the loopholes in the research, and the responses serve as a springboard for further illustrations and discussion throughout the rest of the chapter. Through these examples he shows how people arrive at faulty or wrong conclusions about data by relying on faulty assumptions (Bracey, 2006). Chapters 2 and 3 build on the fundamentals of data collection, use, and interpretation presented in the first chapter. Chapter 2, "The Nature of Variables", deals with pinpointing the topic of an educational study and matching it with the most appropriate measurement methods; it discusses the strengths and weaknesses of the mean, mode, and median, along with their definitions and their place in a report. Chapter 3 is aptly titled "Making Inferences, Finding Relationships, Statistical Significance and Correlation Coefficients." Here Bracey stresses the main theme by highlighting one of his principles of data interpretation: "make no causal inferences from correlation coefficients." The reader is warned against creating meaning between variables where none necessarily exists. He provides a telling example involving states' education spending and their average scores on the Scholastic Aptitude Test (SAT). Conservative critics have argued that increases in budget allocation did not positively correlate with improvement in academic performance: in 1993, the state of New Jersey had the largest education budget, yet its students ranked thirty-ninth overall in average SAT score.
This was in contrast to notoriously low-budget states such as Mississippi and Alabama, whose students scored substantially higher on average SAT results (Bracey, 2006). What most Americans did not know, however, was that a mere 4% of seniors in Mississippi and Alabama took the SAT, compared with over 75% of New Jersey seniors. This led Bracey to explain that what a pundit chooses to exclude matters far more than what he or she includes, since the excluded data significantly affect the apparent outcome, especially from a layman's point of view. Chapter 4 is the most persuasive and thorough of them all. Its title, "Testing, A Major Source of Data - and Maybe Child Abuse", candidly shows that Bracey is no fan of standardized testing (Bracey, 2006). Here he goes to great lengths on the overall history of testing, its extensive limitations, and its popularity with some. He highlights the multitude of intrinsic and extrinsic variables that may interfere with a test's ability to truly gauge a student's level of achievement, and raises the question of whether the character traits commonly associated with achievement (creativity, resourcefulness, and critical thinking) could ever be measured adequately by a multiple-choice test. He concludes with a discussion of common testing violations, such as narrowing of the curriculum, cheating, and the exclusion of students, with his view being that any substantial increase in the importance of standardized tests will bring with it substantial increases in test abuse (Bracey, 2006). As mentioned, the author uses a total of 32 principles to wise up the public, and eventually the press, about how easy it is to be deceived by what seems like clear statistical proof.
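Bracey's New Jersey versus Mississippi SAT comparison can be sketched numerically. The 4% and 75% participation figures come from the text above; everything else (senior counts, ability distributions, the assumption that only the strongest students take the test in a low-participation state) is invented for illustration:

```python
# Hypothetical sketch of the selectivity effect Bracey describes: a state where
# only the strongest few percent of seniors take the SAT can post a higher
# average than a state where most seniors take it, even if the second state's
# seniors are at least as able overall.
import random

random.seed(0)

def state_avg(n_seniors, participation, ability_mean):
    """Average test score of the top `participation` share of seniors."""
    # abilities ~ Normal(ability_mean, 100); only the top slice takes the test
    abilities = sorted(random.gauss(ability_mean, 100) for _ in range(n_seniors))
    takers = abilities[-max(1, int(n_seniors * participation)):]
    return sum(takers) / len(takers)

low_participation = state_avg(10_000, 0.04, ability_mean=500)   # "Mississippi-like"
high_participation = state_avg(10_000, 0.75, ability_mean=520)  # "New Jersey-like"

# The state with weaker seniors overall posts the higher average,
# purely because far fewer (and far stronger) students took the test.
print(low_participation > high_participation)  # True
```

The point matches Bracey's third principle exactly: who is left out of the data determines what the averages appear to say.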
Throughout, he explains, in different ways and with examples, confounding concepts such as Simpson's paradox and the fact that correlation does not at all equate to causation. Readers learn that things are never simply what they seem, and that the data may not be proving anything at all relative to their intended purpose in the research (Bracey, 2006). The first principle is "Do the arithmetic": readers, scholars, and the public in general can often sift through the data themselves and reach a correct analysis. The second, "show me the data", and the third, "look for and beware of selectivity in the data", stress that the presence or absence of data is critical to the overall outcome of research and is often used to mislead and thereby shape public opinion on a number of issues. The fourth, "when comparing groups, make sure the groups are comparable", advises on the need for consistent measurement techniques and data groupings. The fifth, "be sure the rhetoric and the numbers match", is used to verify that the given information is truly credible and not merely a person's own views (Bracey, 2006). The sixth, "beware of convenient claims that whatever the calamity, public schools are to blame", points to the general tendency to blame the public school sector whenever problems arise. The seventh, "beware of simple explanations for complex phenomena", and the eighth, "make sure you understand the statistic used when a person talks about the 'average'", exemplify the need to check and counter-check the available data and the explanations offered for complex issues or results.
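The eighth principle, knowing which statistic an "average" refers to, can be made concrete with a small sketch (not from the book; the income figures are invented). On skewed data the mean, median, and mode tell quite different stories:

```python
# Why "the average" is ambiguous: on skewed data, one outlier drags the mean
# upward while the median and mode stay put. Figures are invented.
from statistics import mean, median, mode

incomes = [28, 30, 30, 32, 35, 38, 40, 45, 250]  # thousands; one outlier

print(mean(incomes))    # pulled far above the typical value by the outlier
print(median(incomes))  # 35: the middle value, robust to the outlier
print(mode(incomes))    # 30: the most frequent value
```

A pundit quoting "the average income" could honestly report any of the three, which is exactly why Bracey insists the reader ask which one is being used.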
The ninth, "be aware of whether you are dealing with rates or numbers, or rates with scores", and the tenth, "when evaluating rates or scores over a period of time, ensure that the set remains comparable as time goes by", likewise point to the need to verify what kind of data are being used and how they interrelate in the conduct and outcomes of the research; variables need to remain comparable over the period studied. The eleventh, "be aware of whether you are dealing with ranks or scores", again attests to the need for correct data grouping in order to extract factually relevant information. The twelfth, "watch out for Simpson's paradox", refers to the Yule-Simpson effect, in which a correlation that holds within each of several groups is paradoxically reversed when the groups are pooled; it is often encountered in medical and social-science statistics, particularly when frequency data are unduly given causal interpretations. The thirteenth, "do not confuse statistical significance and practical significance", the fourteenth, "make no causal inferences from correlation coefficients", and the fifteenth, that any two variables can be correlated without the resulting correlation coefficient having any meaning, all urge the reader to be vigilant about the data he or she analyses and the information presented from them (Bracey, 2006).
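The twelfth principle can be shown with a short numeric sketch of Simpson's paradox, using the well-known kidney-stone treatment counts often cited in statistics texts (not an example from Bracey's book). Treatment A has the higher success rate within each subgroup, yet the lower rate once the subgroups are pooled:

```python
# Simpson's paradox (the Yule-Simpson effect): the within-group winner
# loses in the aggregate, because group sizes are badly unbalanced.

groups = {
    # group: (A successes, A trials, B successes, B trials)
    "mild cases":   (81, 87, 234, 270),
    "severe cases": (192, 263, 55, 80),
}

for name, (a_s, a_n, b_s, b_n) in groups.items():
    print(f"{name}: A wins? {a_s / a_n > b_s / b_n}")  # True for both groups

# Pool the subgroups and the ranking reverses.
a_s = sum(v[0] for v in groups.values())  # 273
a_n = sum(v[1] for v in groups.values())  # 350
b_s = sum(v[2] for v in groups.values())  # 289
b_n = sum(v[3] for v in groups.values())  # 350
print(f"pooled: A wins? {a_s / a_n > b_s / b_n}")  # False: the paradox
```

The reversal happens because A was mostly tried on the hard (severe) cases and B on the easy ones, which is precisely why Bracey warns against reading causation, or even a stable ranking, off pooled frequency data.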
The sixteenth, "learn to 'see through' graphs to determine what information they actually contain"; the seventeenth, that any test aligned with a standard should broadly examine the substance required by that standard; the eighteenth, that on a norm-referenced test, by definition, 50 percent of examinees are below average; the nineteenth, that a norm-referenced achievement test must examine material that all students have had a chance to learn; the twentieth, that standardized norm-referenced exams will ignore or obscure anything that is unique about a school; the twenty-first, that scores from such examinations are meaningful only insofar as all examinees have had a chance to learn what the test examines; and the twenty-second, that any effort to set a passing or cut score on a test will be subjective, all pertain to the different values that should be considered in data representing the various education entities.
The twenty-third, "if a situation really is as alleged, ask, 'so what?'"; the twenty-fourth, that achievement and aptitude tests differ mainly in what the test-takers have had the chance to learn of the tested skills; the twenty-fifth, "rising test scores do not necessarily mean rising achievement"; the twenty-sixth, "the law of WYTIWYG applies: what you test is what you get"; the twenty-seventh, that any test used should have adequate confirmation of both reliability and validity; the twenty-eighth, that interpretations of results should not include statements inappropriate to the type of scale in use; the twenty-ninth, "do not use a test for a purpose other than the one it was designed for"; the thirtieth, "do not make important decisions about individuals or groups on the basis of a single test"; the thirty-first, "in examining test results, ensure that no students were unfairly excluded from the testing"; and the thirty-second, "in evaluating a testing program, look for positive or negative outcomes that are not part of the program" all go towards enhancing the credibility and overall reliability of the data used in educational research (Bracey, 2006). In conclusion, the book, written without technical jargon, would be a welcome addition to the personal library of readers and scholars, especially classroom teachers and graduate education students. Its only flaw is that it does not tackle the question of "now what?": Bracey questions, probes, and pokes at the flaws in educational research but does not himself provide tangible solutions. Even so, the book is a must-read for every school board member, as it shows the many scenarios in which policy makers and politicians use misleading, distorted, and often erroneous data when describing the state of affairs of the public education sector.
Readers armed with this knowledge can begin questioning the poor data present in much of the research on the public education sector, instead of simply accepting reports as they are.

Reference

Bracey, G. W. (2006). Reading Educational Research: How to Avoid Getting Statistically Snookered. Portsmouth, NH: Heinemann.
("Reading Educational Research: How to Avoid Getting Statistically Book Report/Review", n.d.) Retrieved from https://studentshare.org/psychology/1447084-book-review-for-bracey-gerald-w
