Linking Competence to Opportunities to Learn
INNOVATIONS IN SCIENCE EDUCATION AND TECHNOLOGY
Volume 17
Series Editor: Cohen, Karen C., Weston, MA, USA
About this Series
As technology rapidly matures and impacts on our ability to understand science as well as on the process of science education, this series focuses on in-depth treatment of topics related to our common goal: global improvement in science education. Each research-based book is written by and for researchers, faculty, teachers, students, and educational technologists. Diverse in content and scope, they reflect the increasingly interdisciplinary and multidisciplinary approaches required to effect change and improvement in teaching, policy, and practice and provide an understanding of the use and role of the technologies in bringing benefit globally to all.
For other titles published in this series, go to www.springer.com/series/6150
Xiufeng Liu
Linking Competence to Opportunities to Learn: Models of Competence and Data Mining

Xiufeng Liu
Graduate School of Education
State University of New York at Buffalo
Buffalo, NY 14260-1000
USA

ISBN 978-1-4020-9910-6    e-ISBN 978-1-4020-9911-3
DOI 10.1007/978-1-4020-9911-3
Library of Congress Control Number: 2009926489

© Springer Science + Business Media B.V. 2009

No part of this work may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, microfilming, recording or otherwise, without written permission from the Publisher, with the exception of any material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work.

Printed on acid-free paper

springer.com
Preface
For many people, a high standard for student learning is desirable. This is what underlies current standard-based science education reforms around the world. As someone who was born and brought up in a less-privileged home and educated in a resource-limited school environment in a developing country, I always had to study hard to meet various standards from elementary school to high school to university. My first book in English, published over 10 years ago (Liu, X. [1996]. Mathematics and Science Curriculum Change in the People's Republic of China. Lewiston, NY: The Edwin Mellen Press), provided me with an opportunity to examine standards (i.e., Chinese national science teaching syllabi) from a historical and political point of view. I argued that standards are developed for particular political agendas in order to maintain the privileged position of certain groups (i.e., urban residents) in a society at the expense of others (i.e., rural residents). Thus, underneath standards lie systematic discrimination and injustice. Since then, I have had opportunities to study the issue of standards in much more breadth and depth.

This book, Linking Competence to Opportunities to Learn: Models of Competence and Data Mining, provides me with an opportunity to examine standards from a different perspective: opportunity to learn. Opportunity to learn (OTL) refers to the entitlement of every student to receive the classroom, school, and family resources and practices necessary to reach the required learning standard or competence. Although the concept of OTL has been around for over three decades, how specific OTL variables pertaining to science teachers' teaching practices in the classroom, student family background and home environment, and school contexts may predict students' competence status is still not well known. This book aims at filling this gap in the literature. It has two objectives: (a) developing models of competence in terms of opportunity to learn, and (b) introducing a new approach called data mining for developing models of competence. Each model of competence presents a theory on how specific OTL variables and their interactions are associated with the status of successfully or unsuccessfully reaching competence.

Underlying this book is my continuing belief that learning standards are inherently unfair and that high learning standards should be based on equal opportunities for all to learn. It is only fair for a just society to expect this! It is my hope that this book will contribute to theories related to equity in science education. It is also my hope that this book will inform science teaching in the classroom and policy-making at the state and national levels related to standard development and resource allocation.

This book is primarily for science education researchers, including graduate students, who are interested in science curriculum and instructional reforms. For example, it may be used as a main textbook for a graduate (i.e., master's and doctoral) level course in science education related to science curriculum. Such a course may carry titles such as Seminar on Science Curriculum, Science Education Reform, Research in Science Curriculum, Science Curriculum Theory and Practice, and Current Approaches to Science Curriculum, to name a few. This book may also be used as a reference by national and state education agencies for making decisions related to science curriculum standards and resource allocation, and by school district science curriculum, instruction, and assessment specialists to conduct teacher professional development.

This book would not have come into being without support from many people. First, I thank my family (wife Lily and children Iris and Murton) for their never-fading love and support. I thank Dr. Miguel Ruiz, formerly of the University at Buffalo and currently of the University of North Texas, for introducing me to data mining. I thank Dr. Karen Cohen, editor of the Springer book series Innovations in Science Education and Technology, for inviting me to develop a book proposal and for her ongoing support during the development of this book. I thank Mr. Harmen van Paradijs, acquisitions editor at Springer, for coordinating the review process for this book and recommending it to the Springer board for publication.

State University of New York at Buffalo
September 2008
Xiufeng Liu, Ph.D.
Contents

Introduction: Equity and Excellence in Standard-Based Education

1 Competence and Opportunity to Learn
   Measurement
   Student Population
   Content
   Judgment

2 Models of Competence and Data Mining

3 Models of Competence and Opportunities to Learn in the Classroom
   Grade 4 Competence Model
   Grade 8 Competence Model

4 Models of Competence and Opportunities to Learn at Home
   Grade 4 Competence Model
   Grade 8 Competence Model

5 Models of Competence and Opportunities to Learn in Schools
   Grade 4 Competence Model
   Grade 8 Competence Model

6 Pedagogical and Policy Implications
   Pedagogical Implications
   Policy Implications
   Conclusion

References

Appendix A  Variables Related to Teaching Practices Measured in 1996 for Grades 4 and 8 NAEP Science
Appendix B  Variables Related to Family Background and Home Environment Measured in 1996 for Grades 4 and 8 NAEP Science
Appendix C  Variables Related to School Context Measured in 1996 for Grades 4 and 8 NAEP Science
Appendix D  Accuracy Measures of Competence Models
Appendix E  Tutorial on the Weka Machine Learning Workbench
Appendix F  Machine Learning Algorithms Implemented in Weka

Author Index
Subject Index
Introduction
Equity and Excellence in Standard-Based Education
Imaginary Student A: Developing Country

Born in a remote village in a developing country, she was considered by her classmates to be "smart." She always did well on tests in all subjects, particularly math and science. She studied hard, and her parents always supported her by providing her with necessary school supplies. However, most of her secondary school teachers did not have university degrees; some of them were high school graduates themselves. She never had any hands-on experiences in her science class, not even a teacher demonstration, because there was no science laboratory; nor were there any science supplies in the school. At the end of high school, she had to compete with millions of her fellow high school graduates all over the country, including those in big cities where teacher quality and school resources were more than adequate. She ended up scoring low on the national unified university entrance examination, but nonetheless passed the minimum acceptance score for a third-tier college, where she majored in agricultural science, a subject she was never interested in.
Imaginary Student B: United States

Born to a poor family in a large city in the United States, he lived with his mother because his parents divorced when he was just starting kindergarten. Although his mother did not have a university degree, she always valued education and would do anything to enable her children to pursue university education. He was a good student in high school based on the grades on his report cards. Unfortunately, many of his classmates and their parents did not care about education. As a result, his study was constantly interrupted by violence in the school and community. Not all his teachers, particularly math and science teachers, were certified, because certified teachers constantly left for teaching positions in suburban schools and filling the teaching vacancies proved difficult. During his high school years, he had to pass the state mandatory graduation exams. Although he passed those graduation exams, his scores were not that high. He was not able to take any Advanced Placement (AP) courses because advanced courses were not offered in his school. As a result, he did not enter his preferred university and settled for a local community college.

Are there things in common between Student A and Student B? If yes, do the commonalities matter, and in what ways? These are the questions this book intends to answer. It focuses on factors common among individual students. It intends to understand how variables beyond student control in the classroom, school, and home may impact student science achievement.

At first glance, Student A and Student B are not comparable, as one is in a developing country and the other in a developed country. Also, Student A is from a rural area, and Student B from an urban area. We can easily identify many other aspects that are not comparable. Besides apparent differences between Student A and Student B, however, there are many important commonalities. First, both students are "good" students; they are motivated and want to learn. Second, their families fully support them. Third, both students have to pass standardized tests that maintain a common expectation for all students. Fourth and most importantly, both students' science achievements are compromised by factors beyond their own control.

Do these commonalities matter? Answering the above question requires an understanding of the context the two students live in: the current worldwide movement toward standards, which is often accompanied by standardized testing. The essence of the standard-based movement is the same expectation, usually phrased as a high expectation, for ALL students no matter what personal, family, and school characteristics they may have. No one would deny the desirability of standards, particularly high standards. However, we all know that besides differences among students in their personal characteristics such as intelligence, motivation, and metacognition, there are also tremendous differences among students in their science classrooms, schools, and homes. Are there correlations between student achievements and their classroom, school, and family resources and practices? If yes, how exactly do those resources and practices impact student achievement? These are the central questions this book seeks to answer. Basic premises underlying this book are that there are differences among students beyond their control in terms of classroom, school, and family resources and practices, and that when the same standard is applied to all students, it is important to identify those specific resources and practices that significantly affect student achievement, so that adequate opportunities can be made available for all students to achieve the expected standards.

The imaginary Students A and B described above can be any students in any country. Research has shown that student science achievement gaps are due to differences in how science is taught (Lee & Luykx, 2007), and ethnic minority and low-income students often have less access to science materials and are exposed to less engaging science learning activities than their higher-income, White counterparts (Ladson-Billings, 2007). Learning resources and practices do matter!

The current standard-based education (SBE) movement is not a new phenomenon; its origin was the early competence-based education (CBE). Amid public outcry over the perceived inability of public school education to graduate students who could survive socially and economically as independent young adults,
the Oregon State Board of Education, in September 1972, passed new minimum graduation requirements and set standards that involved the introduction of three domains of "survival level" competencies as minimum conditions for high school graduation (Spady, 1977). The Oregon action triggered a nationwide movement toward competence-based education in the late 1970s. Blank (1982) identified the following principles for CBE: (a) all students can master any task if high-quality instruction and sufficient time are provided; (b) a student's ability need not predict how well the student will learn the task; (c) individual differences in levels of mastery are primarily caused by environmental factors, not by individual characteristics; (d) most students become similar to each other in terms of learning ability, rate of learning, and motivation for further learning when provided with favorable learning conditions; (e) we should focus on differences in learning, instead of differences in learners; (f) it is the teacher's responsibility to ensure that everyone succeeds; and (g) the most important element in teaching is the kind and quality of learning experiences offered to students (Blank, 1982, pp. 39–43).

Exactly how each of the above principles is applicable nowadays is debatable, because we know much more about how students learn than before (e.g., Bransford et al., 2000). However, one basic premise should hold true for both SBE and CBE: equal resources must be made available to all students in order for them to meet the common standards, which is an issue of equality.

Equality and equity are closely related. Diversity in students and its impact on student learning, particularly in terms of culture, language, ethnicity, and socioeconomic status, is a well-researched field in science education (e.g., Atwater, 1994; Lee & Luykx, 2007; Lynch, 2000). For example, Lynch (2000) described the great inequity that exists in science achievement; in teaching and learning resources; and in practices in the classroom, school, and family among various distinct groups. However, diversity within any grouping, such as a culture (e.g., African-American), can be equally great (Atwater, 1994). No two students, regardless of culture, language, and socioeconomic status, are exactly alike. The achievement differences among students are forever an intriguing educational phenomenon.

If we start with the assumption that high academic achievement is potentially attainable by most children, then achievement gaps are a product of the learning opportunities available to different groups of students and the degree to which circumstances permit them to take advantage of those opportunities. (Lee & Luykx, 2007, p. 172)
This book is about students as individuals; it is concerned with the learning opportunities available to them. While we have long known that learning opportunities matter, little is known about exactly which learning opportunities impact an individual student's science achievement, and how. If we can identify those learning opportunities that really matter in terms of student achievement, then all stakeholders must collaborate to make these opportunities available in order to develop an environment in which all students can be successful in learning science.

There have been renewed calls for enhancing US public schools in the climate of the current SBE movement in order to promote a healthy democratic society (Comer, 2004; Fuhrman & Lazerson, 2005), because the basic premise of public
schooling is equal opportunities for all to learn, to develop, and to become contributing citizens. Ensuring equity, i.e., no discrimination based on group characteristics, is not enough; high achievements or standards for all students further require equality of learning resources and practices. Equity is necessary but not sufficient for equality. The focus of this book on the equality of individuals is a step forward from the current literature on equity. While equity is about justice, the opposite of discrimination based on group characteristics, equality is about fairness, the opposite of disparity and unequal opportunities among individuals. Because education is a necessity of life for an individual as well as for a democratic society (Dewey, 1916), it is important that we identify the critical resources that would enable individual students, regardless of race, culture, or socioeconomic status, to meaningfully experience and participate in learning activities. Meaningful learning takes place only when what is to be learned and how it is to be learned are connected to individual students' environments. "A society which makes provision for participation in its good of all its members on equal terms and which secures flexible readjustment of its institutions through interaction of the different forms of associated life is in so far democratic" (Dewey, 1916, p. 99).

Associating equality in learning resources and practices with learning standards is of significance for policy and new knowledge. The 2001 US No Child Left Behind Act, Title 1 – Improving the Academic Achievement of the Disadvantaged, states that "the purpose of this title is to ensure that all children have a fair, equal, and significant opportunity to obtain a high-quality education and reach, at a minimum, proficiency on challenging state academic achievement standards and state academic assessments." It has also been argued in the literature that any achievement standard must be accompanied by an opportunity-to-learn (OTL) standard (Ravitch, 1995). However, we do not know exactly what resources and practices are critical for achieving competence or learning standards, and how they impact students' learning. These questions are particularly important, given that learning resources are always limited and prioritization of learning resources is often necessary. This book focuses on learning resources and practices related to the teacher, school, and family. Because standards are at the population level – applicable to an entire state or country – it is important to accurately identify, at the population level, which teacher, school, and family resources and practices are more likely to help students achieve the learning standards.

Chapter 1 will define the two important constructs this book deals with: competence and opportunity to learn. Chapter 2 will discuss the notion of models of competence and briefly describe the research methodology, i.e., data mining, used to develop competence models. Chapters 3–5 will present specific competence models related to teachers (Chapter 3), families (Chapter 4), and schools (Chapter 5). In addition to presenting competence models, I will also review relevant literature to provide a research context for the models presented in Chapters 3–5. The final chapter, Chapter 6, will discuss policy as well as pedagogical implications of the competence models developed in Chapters 3–5.
Chapter 1
Competence and Opportunity to Learn
Competence and standards are closely related. Competence is "a pattern of effective performance in the environment, evaluated from the perspective of salient developmental tasks" (Masten et al., 1995, p. 1636). Masten et al.'s definition emphasizes that competence is a generalization about a person's adaptation based on performances. In science education, competence has been defined in terms of levels of student achievement. For example, the 1996 and 2000 National Assessment of Educational Progress (NAEP) defines student science achievement at three competence levels: basic, proficient, and advanced. Figure 1.1 presents the definition of the NAEP science competence levels for grade 4. As can be seen, each competence level is associated with specific performances, and there is a progression from a lower competence level to a higher one. The NAEP definition of competence suggests that there are four essential aspects of competence: (a) measurement; (b) student population, i.e., target students; (c) content, i.e., objectives; and (d) judgment, i.e., a cutoff score.
Measurement

Competence is a quantitative term. Although being competent or incompetent is categorical, and thus qualitative, underlying this categorization is a valid and reliable scale that allows the categories to be made. For example, in the 1996 and 2000 NAEP, the scale is conveniently defined to be a continuum from 0 to 300: the Basic competence level is defined as achieving any score between 138 and 170, the Proficient level as achieving any score from 170 through 205, and the Advanced level as achieving any score between 205 and 300. Competence measures are domain-specific; they correspond with a learning outcome space in which learners can be differentiated and item responses can be designed. Competence measures are also abstract, because they are not direct observations based on raw scores; rather, they are derived from raw scores by applying a measurement model such as the Rasch model (Liu & Boone, 2006). Finally, competence measures are unidimensional – increasing or decreasing competence is represented by increasing or decreasing scores of the measures.
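As a minimal illustration of how such a cutoff-based categorization works (this sketch is not from the NAEP materials or this book; the class and method names are hypothetical, and assigning scores that fall exactly on a cutoff to the higher level is an assumed boundary convention), the grade 4 cutoffs just described can be written as a simple function:

public class NaepCompetenceLevel {

    // Maps a 1996/2000 NAEP science scale score (0-300 continuum) to a
    // competence level using the grade 4 cutoffs described in the text:
    // Basic 138-170, Proficient 170-205, Advanced 205-300.
    public static String levelFor(double scaleScore) {
        if (scaleScore >= 205) {
            return "Advanced";
        } else if (scaleScore >= 170) {
            return "Proficient";
        } else if (scaleScore >= 138) {
            return "Basic";
        } else {
            return "Below Basic";
        }
    }

    public static void main(String[] args) {
        System.out.println(levelFor(172.5)); // prints "Proficient"
    }
}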
Fig. 1.1 NAEP science achievement competence levels – grade 4
Developing a competence scale can be a complex process. Wilson (2005) articulates an explicit approach to developing a scale that consists of four stages: (a) defining the construct, (b) designing measurement items, (c) defining the outcome space, and (d) applying a measurement model to obtain measures. The Wilson approach to developing scales shows how the process of developing competence measures is purposeful, systematic, and data-driven. Only through such a process can we develop a scale that is both valid and reliable.
Student Population

Closely tied to measurement is the student population to whom the competence measures apply. A student population can be as small as a class or as large as all students in a country. If the student population is a class, then the competence measures apply only to that class. If the student population is all students in a country, as is the case for the NAEP competence measures, then the competence measures apply to all students within the country, regardless of class, district, or state. Associating a student population explicitly with competence implies that different student populations may have different competence measures. This should come as no surprise. For example, what we mean by "competent" in understanding a science concept should be different for elementary grade students than for high school students, because what we expect them to know and to do, and the way we assess them, are different for elementary grades and high school grades. This does not mean that elementary graders and high schoolers cannot be meaningfully compared. A common measure can be used for two different student populations for comparisons, but the criteria for judging competence must be different – an aspect to be discussed later.

One implicit assumption of a student population is variation among students. Conventional wisdom tells us that some students are always more advanced than others, and thus more competent than others. However, variation exists not only between characteristic groups but also among individual students. Statistically, this is the phenomenon of variation or variance. Without variance, measures are meaningless. Imagine all students in a defined population performing the same on a competence measure. In this situation, a differentiation between competence and less-than-competence is not possible, and competence becomes meaningless. One important intention of competence measures is to differentiate competence from incompetence so that appropriate actions, such as remedial instruction and reward, may be provided.
Content The content aspect of competence refers to the domain in which competence applies. Although some may be competent in all domains, it is more common for some to be competent in one domain, but not as competent in another. For example, a student may be competent in biology but not in physics. Competence measures must be explicitly tied to a content domain. This requirement is actually implied in the measurement aspect of competence, because in order to develop a valid and reliable measure, a domain or construct must be clearly defined.
In today's SBE movement, the content domain of competence is usually the content standard. A content standard defines what students should learn and do. Wilson (NRC, 2006) identified the following features of high-quality content standards: (a) be clear, detailed, and complete; (b) be reasonable in scope; (c) be rigorously and scientifically correct; (d) have a clear conceptual framework; (e) be based on sound models of student learning; and (f) describe performance expectations and identify proficiency levels. Features (a) through (e) are typically found in content standards by states or countries, but feature (f) is usually missing. Ideally, performance expectations should consist of two dimensions: the topics and the cognitive levels. This is because the same topic may be learned with different cognitive demands, such as remembering, understanding, applying, analyzing, evaluating, and creating (Anderson & Krathwohl, 2001). Proficiency levels are the degrees to which students master a learning standard. Proficiency levels are particularly important in science content standards because a common way to organize these standards is through unified themes or concepts (Bybee, 1998, 2003; Bybee & Ben-Zvi, 1998). Organizing content standards by unified themes means that the same theme or concept may be learned again and again at different grades with increasing complexity and thus increasingly higher expectations of proficiency. Unfortunately, few current content standards are explicit in stating proficiency levels (NRC, 2006).
Judgment In order to explicate competence, a judgment is necessary to explicitly define the connections among measurement, student population, and content. This aspect is commonly referred to as standard setting (Cizek & Bunch, 2007). Essentially, a standard-setting process is to decide on a cutoff point where two levels of competence of mastering a given content standard, such as Proficient and Below Proficient, can be differentiated for a given population of students. The cutoff point of competence reflects what content standards expect in terms of what students should know and how well, the errors in measures, and the variation of target student population. Standard setting is both science and politics. The scientific aspect of standard setting lies in the measurement models and statistics employed in the process, while the politics lies in the consensus-building process among standard-setting panel members who represent different stakeholders with different expectations of students. Given the above, the judgment aspect of competence is both a state and a process. The state of judgment refers to the final cutoff value of measures by which competence and incompetence are differentiated. The process refers to the steps taken to decide on the cutoff value. Judgment reflects values of various stakeholders in the society, and in turn impacts the society, i.e., some stand to gain and some stand to lose by receiving rewards or gaining opportunities. In addition to competence, another key construct this book deals with is “opportunity to learn”. “Opportunity to Learn” (OTL) is not a recent invention (McDonnell, 1995). Carroll (1963) considered instructional time devoted to quality
Judgment
9
instruction, i.e., opportunity to learn, to be one of the important factors affecting student learning. Independently, the International Association for the Evaluation of Educational Achievement’s (IEA) first and second math studies during the 1960s and 1970s (Husén, 1967) conceived OTL to be a necessary condition for assessment to be validly interpreted, i.e., what students were tested on must be based on what students had learned. Thus, the IEA’s focus of OTL was simply the match between the assessment domains and what students actually learned. The overwhelming evidence on the significant correlation between students’ achievement and what students had learned prompted US national indicators’ program, i.e., NAEP, to expand its school and teacher indicator coverage to include items not only related to whether certain courses or content had been offered to students, but also how they were offered – teaching practices (NCES, 1992). Since then, OTL has become not only what schools and teachers offered but also how schools and teachers conducted instruction. OTL first made its way into policy in the report by the National Council on Education Standards and Testing (NCEST, 1992). Later, as part of President Clinton’s Goals 2000 legislation, the US Congress called for OTL standards (Porter, 1993). Eventually, it became a component of NCLB. In NCLB, OTL standards broadly include “the criteria for, and the basis of, assessing the sufficiency or quality of the resources, practices, and conditions necessary at each level of the education system (schools, local educational agencies, and States) to provide all students with an opportunity to learn the material in voluntary national content standards or State content standards” (Pub. L. No. 103-227, § 3 [7]). The conceptual base of OTL is equality. Guiton and Oakes (1995) discussed three possible conceptions of equality for OTL. The Libertarian conception assumes that equitable distribution of human and material resources should be based on students’ abilities. Given that variation in ability is inevitable among students, disproportional distribution of learning resources is warranted. While accepting the basic premise of merit-based distribution of resources, the Liberal conception of equality places a condition on the distribution of resources not to result in, or be based on, irrelevant group characteristics such as Social Economic Status (SES), gender, and religious affiliation. A third conception, the Democratic conception of equality, makes a direct connection between OTL and learning standards. Because standards are considered minimal competence every citizen is required to meet, distribution of resources needs to ensure that all students meet the learning standards. I believe that all three conceptions of equality are sound and they are complementary to each other. However, the Democratic conception of equality should take precedence over the other two, because education is a basic right of every citizen (Comer, 2004; Fuhrman & Lazerson, 2005), and a necessity for democracy (Dewey, 1916). Making school attendance by all students compulsory but not providing them with equal opportunities to learn is unethical. 
Given that unqualified teachers, inadequate laboratories, and low-quality teaching are important indicators of OTL and are prevalent in urban schools in which African-American and ESL (English as a second language) immigrant children are in high concentration, Tate (2001) argues that OTL is a civil right, parallel to the historical struggle in urban schools against racial segregation in schooling. The historic US Supreme Court ruling in Brown v. Board of Education (1954) was only a beginning toward equity by addressing segregation in schooling. Unequal opportunities to learn available to different groups of students have been called the second generation of segregation, or resegregation (Petrovich & Wells, 2005). It is now time to address this new generation of segregation by providing equal opportunities to learn for all students.

Central to OTL is the assumption that there is a relationship between what happens in the classroom, home, and school and what students achieve. Typically, OTL focuses on course content, instructional strategies, teacher background, class size, student readiness (i.e., initial achievement levels), and the availability of physical resources (such as books and equipment). Smithson et al. (1995) proposed that an OTL indicator measure should include questions related to the content and skills in the science assessment exercises students have just completed, the frequency of student experience with a range of science classroom activities and instructional practices, and class time spent on science (elementary) and science courses taken (middle and high school). The most commonly used OTL variables are: (a) content coverage – whether or not there is a match between the curriculum taught and the content tested; (b) content exposure – whether or not there is enough time spent on the content tested; (c) content emphasis – whether the teacher provides sufficient emphasis on the content tested; and (d) quality of instructional delivery – whether the teacher has taught the content adequately (Stevens, 1997). How science is learned in formal and informal settings from a sociocultural perspective has become a key aspect of current notions of OTL (Moss et al., 2008). OTL is not a unitary construct; it is multidimensional, and different dimensions of OTL may explain different types of competence (Wang, 1998).

One challenge in identifying indicators of OTL is to differentiate direct from indirect causes of student achievement. As an example, there was clear inconsistency in the literature on the effect of school finance on student achievement. This inconsistency was resolved when Elliott (1998) found that, first of all, school finance, such as per pupil expenditure, impacts student achievement through the mediating effects of teachers and schools. That is, more per pupil expenditure could be translated into better qualified teachers who used more effective teaching methods such as inquiry. More per pupil expenditure could also be translated into smaller class sizes, better laboratories, and better laboratory supplies. All these mediating factors were directly and significantly correlated with students' achievement. Second, school finance indirectly affected students' achievement differently in different subjects. For math, the mediating effect was not as clear as that in science. Elliott's study shows that studying OTL requires differentiating direct, indirect, and irrelevant variables of student achievement. OTL studies also need to be domain-specific, that is, specific about the types of student achievement. For example, Wang (1998) found that content exposure was the most significant predictor of students' written science test scores, while quality of instructional delivery was the most significant predictor of students' science performance test scores. Figure 1.2 presents a possible conceptualization of OTL.
Fig. 1.2 Conceptions of opportunity to learn. The figure relates OTL indicators at three levels to student outcomes:

OTL level and indicators
• School: equipment, labs, technology, field trips, etc.
• Teacher: qualification (knowledge, certification, experience), teaching (content coverage, exposure, emphasis, and delivery), etc.
• Family: parent education level, books, computer, internet connection, etc.

Student outcome
• Subject: math, science (biology, chemistry, earth science, physics), ELA, etc.
• Domain: conceptual understanding, hands-on skills, critical thinking, etc.
Figure 1.2 also shows that the relationship between OTL and student outcome is complex. Different OTL indicators of school, teacher, and family may have different effects on student outcome depending on the subject and domain. It should be noted that Fig. 1.2 does not have a student level. Although student variables such as aptitude, previous achievement, and motivation could also significantly predict students’ achievement as found in Wang (1998), it is important to maintain that OTL is about factors beyond student control. OTL should focus on what students are entitled to receive, instead of what they may contribute. OTL is ultimately about a social contract and partnership among the school, the teacher, and the parent in student learning.
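As an illustrative aside (not from the book; all type and field names here are hypothetical), the Fig. 1.2 conceptualization can be read as a simple data structure: indicators typed by level, deliberately kept separate from student-level variables, and related to outcomes by subject and domain:

import java.util.List;

public class OtlFramework {
    // The three levels at which OTL indicators are defined in Fig. 1.2.
    enum Level { SCHOOL, TEACHER, FAMILY }

    // Outcome domains that different OTL indicators may predict differently.
    enum Domain { CONCEPTUAL_UNDERSTANDING, HANDS_ON_SKILLS, CRITICAL_THINKING }

    // One OTL indicator, e.g., "content emphasis" at the TEACHER level.
    record Indicator(Level level, String name) {}

    // One student outcome, e.g., a science score in the hands-on skills domain.
    record Outcome(String subject, Domain domain, double score) {}

    public static void main(String[] args) {
        // There is intentionally no STUDENT level: OTL concerns factors
        // beyond student control, as emphasized in the text above.
        List<Indicator> indicators = List.of(
            new Indicator(Level.SCHOOL, "laboratory equipment"),
            new Indicator(Level.TEACHER, "content emphasis"),
            new Indicator(Level.FAMILY, "books in the home"));
        System.out.println(indicators);
    }
}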
Summary

In this chapter, I have described the origin of the current standard-based science education movement and analyzed the four aspects of competence. I have also developed a conceptual framework of "opportunity to learn" that includes three levels of indicators and student outcome domains. The next chapter, Chapter 2, will operationalize the conception of OTL in Fig. 1.2 through a notion of models of competence, and introduce a new method called data mining.
Chapter 2
Models of Competence and Data Mining
Although we know there is a relationship between OTL and student achievement, and such a relationship is multidimensional and domain-specific, a natural question to ask is how OTL is related to student competence. Because competence is categorical, i.e., reached or not reached, in order to differentiate the relationship between two continuous variables (e.g., per pupil expenditure and student science achievement scores) from the relationship between categorical variables (e.g., competence and teaching practices), I call the relationship between OTL and competence models of competence, or competence models.

Many methods are available to develop models of competence. If models of competence assume a causal relationship between OTL and competence, randomized experimental designs or their approximations (such as quasi-experimental designs) are ideal but infeasible. For example, withholding OTL from some students is unethical and should never be done in educational research. The infeasibility of experimental studies leaves statistical methods, especially causal-comparative methods, as the main methodological option. Causal-comparative methods are based on correlations, supplemented with conceptual justification for causality. Examples of such correlation-based statistical methods are logistic regression, discriminant analysis, and profile analysis.

Although there are apparent advantages to using statistical methods to establish competence models, there are also limitations. First, like any statistical method, statistical significance is sensitive to sample size. This can become a problem because competence model studies typically use large samples from national surveys, and a large sample size tends to produce more statistically significant predictors. Another limitation is missing data. Missing data is common in large-scale national surveys. Although various missing data methods are available, such as replacing missing values with the median or mean, or through interpolation or multiple imputation, the effect of missing data on the competence model as well as the real nature of the missing data remain unknown when using these methods. Besides the above two main limitations, statistical methods produce mathematical models; translating mathematical models into action plans to improve practices is not always straightforward.

This book introduces a new methodology to develop competence models. This methodology is called data mining. Data mining is a pattern recognition approach.
Rather than using the mean and standard deviation as basic units of analysis, as is the case in statistical analysis, data mining uses both conventional and statistical logic to analyze individual cases or instances. Because every case is treated equally and no group statistics are used, missing data is not a problem. Also, because classification accuracy is measured with descriptive statistics, such as the percentage of cases classified correctly, no statistical testing is needed and inflation of statistical power is not an issue. Finally, because data mining produces a classification scheme in a binary format that can be represented visually, the model is easy to understand. What follows is a brief conceptual introduction to data mining. Appendices E and F provide a more detailed technical background of data mining, i.e., data mining algorithms and a step-by-step tutorial on a free Java-based computer program called Weka commonly used for data mining.

Data mining is also called knowledge discovery in databases (KDD; Han & Kamber, 2001; Witten & Frank, 2005). Developed from database management systems technology and descriptive statistics, data mining goes beyond retrieving, analyzing, and representing information in databases; it focuses particularly on uncovering hidden patterns in large data sets. Today, data mining involves not only databases and statistics, but also machine learning, information science, and visualization. It is being applied in the sciences (e.g., bioinformatics), business, Internet security, and many other fields. Data mining performs two functions: one is to identify patterns among data records (e.g., concept clustering, concept comparison, and discrimination), and the other is to find relations among variables in the data that will predict unknown or future values of the variables. Unlike descriptive and inferential statistical analyses that rely on means and standard deviations, data mining uses both logical and mathematical (deterministic and statistical) reasoning to analyze data records.

Data mining is both a bottom-up and a top-down approach to discovering patterns. An example of a bottom-up approach is market basket analysis, which generates association rules to identify frequent patterns or correlations in sales data in order to help merchants display certain products in clusters. An example of a top-down approach is using a predefined hierarchy to generate frequent patterns. Top-down data mining is useful in problems involving a large number of variables (or dimensions) but a relatively small number of cases. In this case, top-down data mining can group variables that correlate in order to reduce the dimensionality, so that the new variables will have a larger set of cases to support the discovery of patterns (Liu et al., 2006).

Data mining combines quantitative and qualitative reasoning. In qualitative data analysis, a typical process starts with breaking qualitative data into small units, e.g., sentences or phrases, and then coding them. The codes are then analyzed to create more general categories. The process continues with creating patterns among the categories. This process is inductive in that patterns may not be predetermined; instead, they emerge from the above process of categorization. On the other hand, statistical analysis is deductive. A statistical test, such as a t-test, starts with a meaningful hypothesis derived from a theory, and then the statistical test is applied to reject or retain the hypothesis.
Data mining uses both inductive and deductive reasoning. A very powerful strategy used in data mining is to divide the database into two equivalent sub-databases. One sub-database is then used to create patterns (inductive reasoning, or training/learning), and the other is used to test predictions of the identified patterns (deductive reasoning, or testing). Because data mining deals with large databases, dividing a database into two equivalent sub-databases is not a problem. Dividing a database into multiple sub-databases, such as in the tenfold cross-validation to be discussed later, is also possible.

Let us use a hypothetical data set to demonstrate the process of data mining in developing a model of competence. Table 2.1 presents the hypothetical data set.

Table 2.1 Hypothetical data set for data mining

Instance   Competence   Content exposure   Content emphasis   Inquiry
1          Yes          No                 Yes                Yes
2          Yes          Yes                Yes                Yes
3          Yes          No                 Yes                Yes
4          No           No                 No                 Yes
5          No           No                 Yes                No
6          No           Yes                No                 No

In this hypothetical data set, information about six students, or instances, is available. Students 1–3 reached competence (Class Yes), but students 4–6 did not (Class No). Differences in OTL among the six students are represented by their content exposure (whether topics were taught), content emphasis (whether intensive time was spent), and inquiry (how teaching was conducted). A visual inspection of the differences in OTL between the two classes of students, those who reached competence (competence Yes) and those who did not (competence No), suggests that inquiry is a significant predictor. Thus, a viable competence model can be as follows (Fig. 2.1).

Fig. 2.1 A preliminary competence model: Inquiry? yes → Competent (4/1); no → Incompetent (2/0)

Figure 2.1 indicates that among the four who conducted inquiry, only one did not reach competence (the other three reached competence). The two who did not conduct inquiry did not reach competence. Thus, inquiry is a reasonable predictor of reaching competence. However, the above model is not perfect; there is one error – student 4, who conducted inquiry, did not reach competence. This shows that other factors may also be necessary for predicting competence. If we want to further improve the accuracy of the model, then a second predictor needs to be introduced. The next predictor seems to be content emphasis. All those who reached competence had content emphasis, and student 4 did not have content emphasis and did not reach competence. Thus, an improved competence model can be as in Fig. 2.2.

Fig. 2.2 An improved competence model: Inquiry? no → Incompetent (2/0); yes → Content emphasis? no → Incompetent (1/0); yes → Competent (3/0)

Figure 2.2 shows that inquiry is necessary but not sufficient to predict competence. In order to become competent, students need OTL in both inquiry and content emphasis. Content exposure does not seem to be a significant predictor. The revised competence model is now 100% accurate in explaining the six instances. Imagine a data set that contains hundreds, thousands, or even tens of thousands of instances and hundreds of OTL variables; the difficulty in recognizing patterns becomes much more challenging, and computers become necessary. Data mining makes use of the fast computation capacity of computers to identify patterns. When computers are used to search for and identify patterns, specific procedures are needed. In data mining, these procedures are called algorithms. There are many established, effective, and efficient algorithms in data mining. The algorithms implemented in Weka are listed in Appendix F.

Data mining can also incorporate prediction in the creation of models. Because models are created specifically for making predictions, data mining typically adopts tenfold cross-validation (Witten & Frank, 2005). The tenfold cross-validation approach divides the whole data set randomly into roughly ten equal subsets; the data mining algorithm then uses a combination of nine subsets of data to learn the rules that can be used to build a decision tree or model. This decision tree is then tested on the remaining subset of data in order to evaluate the accuracy of the predictions. The computer algorithm continues the process using all possible combinations of nine subsets of data to produce the best decision tree with the highest classification and prediction accuracy. Measures of classification and prediction accuracy used in data mining are explained in Appendix D. A step-by-step tutorial on Weka 3.10, a free data mining software package, is provided in Appendix E.
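To make this workflow concrete, the sketch below shows how a decision-tree competence model could be built and evaluated with the Weka Java API described in Appendices E and F. It is an illustration under assumptions, not the book's actual analysis code: the file name competence.arff is hypothetical, and the competence status (Yes/No) is assumed to be the first attribute in that file.

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class CompetenceModelDemo {
    public static void main(String[] args) throws Exception {
        // Load OTL variables and competence status from an ARFF file
        // (hypothetical file name; a real analysis would use NAEP-derived data).
        Instances data = new DataSource("competence.arff").getDataSet();
        data.setClassIndex(0); // assume the competence attribute (Yes/No) comes first

        // Build a J48 decision tree (Weka's implementation of C4.5).
        J48 tree = new J48();
        tree.buildClassifier(data);
        System.out.println(tree); // prints branches such as inquiry -> content emphasis

        // Evaluate with tenfold cross-validation, as described above.
        // (A data set as tiny as Table 2.1 would need fewer folds.)
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(new J48(), data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}

J48 is only one of the algorithms listed in Appendix F; any of the others could be substituted in the two lines that create the classifier.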
The data sets used to develop the competence models in this book are from the 1996 NAEP Science assessment, Grades 4 and 8. In the 1996 NAEP, competence is reached when students score 170 or above, i.e., at the proficient level. A detailed definition of the competence levels for grade 4 is given in Fig. 1.1. According to Fig. 1.1, students who reached the proficiency level demonstrate the following competences:

1. They are able to create, interpret, and make predictions from charts, diagrams, and graphs based on information provided to them or from their own investigations.
2. They are able to design an experiment and have an emerging understanding of variables and controls.
3. They are able to read and interpret geographic and topographic maps.
4. They have an emerging ability to use and understand models.
5. They can partially formulate explanations of their understanding of scientific phenomena.
6. They can design plans to solve problems.
7. They can begin to identify forms of energy and describe the role of energy transformation in living and nonliving systems.
8. They have knowledge of organization, gravity, and motion within the solar system and can identify some factors that shape the surface of the earth.
9. They have some understanding of properties of materials and an emerging understanding of the particulate nature of matter, especially the effect of temperature on the state of matter.
10. They know that light and sound travel at different speeds and can apply their knowledge of force, speed, and motion.
11. They demonstrate a developmental understanding of the flow of energy from the sun through living systems, especially plants.
12. They know that organisms reproduce and that characteristics are inherited from previous generations.
13. They understand that organisms are made up of cells and that cells have subcomponents with different functions.
14. They are able to develop their own classification system based on physical characteristics.
15. They can list some effects of air and water pollution as well as demonstrate knowledge of the advantages and disadvantages of different energy sources in terms of how they affect the environment and the economy.

Due to the design of NAEP, every student's science achievement is estimated five times, which results in five achievement measures. It is necessary to use all five measures in any secondary analysis, based on the NAEP data use guideline (NCES, 1999). Due to the equivalence of the NAEP achievement measures, i.e., plausible values, data mining was conducted separately for each of the five plausible values, but only the competence model with the best accuracy measures is presented in the subsequent chapters.

One practical challenge in presenting a model created by data mining is its enormous size. The sample size for NAEP science is typically tens of thousands. Because each data mining process classifies all students by searching for the best combinations of variables that maximally differentiate students in one group (e.g., having reached the proficient level) from another (i.e., having not reached the proficient level), each model may contain over 100 branches (i.e., leaves, in data mining terms) and hundreds of nodes (i.e., the size of the decision tree). In order to focus on the important variables that differentiate most students, only branches with leaves containing more than 100 instances will be presented in detail in a competence model.
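As an illustration of the plausible-value procedure just described (a sketch under assumptions, not the book's actual code), the loop below builds and cross-validates one decision tree per plausible value and reports which plausible value yields the most accurate model. The ARFF file names are hypothetical, and each file is assumed to already contain the binary competence status (score of 170 or above = Yes) as its last attribute.

import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class BestPlausibleValueModel {
    public static void main(String[] args) throws Exception {
        double bestAccuracy = -1.0;
        int bestPlausibleValue = -1;

        // One prepared data set per plausible value; the class attribute is the
        // binary competence status derived from the 170 cutoff.
        for (int pv = 1; pv <= 5; pv++) {
            Instances data = new DataSource("naep_science_pv" + pv + ".arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1); // competence is the last attribute

            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(new J48(), data, 10, new Random(1));

            double accuracy = eval.pctCorrect(); // percentage of correctly classified instances
            System.out.printf("Plausible value %d: %.1f%% correct%n", pv, accuracy);
            if (accuracy > bestAccuracy) {
                bestAccuracy = accuracy;
                bestPlausibleValue = pv;
            }
        }
        System.out.println("Best-performing model: plausible value " + bestPlausibleValue);
    }
}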
Summary

In this chapter, I have developed a notion of models of competence that is conceptualized as the relationship between the status of reaching competence and OTL variables. I have also introduced data mining, the research method used to develop models of competence in this book. In the subsequent chapters, I will present the models of competence in terms of OTL related to teachers (Chapter 3), home (Chapter 4), and school (Chapter 5). Finally, Chapter 6 will discuss how the models of competence developed in this book can inform science education policy and classroom teaching and learning.
Chapter 3
Models of Competence and Opportunities to Learn in the Classroom
This chapter will present models of competence based on teacher teaching practices. In the 1996 NAEP teacher questionnaire for both 4th and 8th grades, there were 61 questions related to science teaching practices pertaining to teaching methods, computer uses, assessment, science subject emphases, and so on. Table 3.1 lists sample teaching practice variables; a complete list of the 61 variables is available in Appendix A. As can be expected, there are a wide variety of teaching practices in science classrooms. Even the 61 questions, or variables, included in the NAEP survey may not capture such a variety of teaching practices. On the other hand, these 61 teaching practices do not act independently in science classrooms. For example, performance assessment typically takes place within an inquiry approach to science teaching. If we think of possible interactions among the 61 variables, i.e., different combinations of the 61 teaching practices, these 61 variables can potentially represent a large number of teaching scenarios, or science teaching profiles. This chapter will identify significant teaching profiles associated with reaching or failing to reach NAEP competence. Before presenting specific competence models, I will first review general principles for good teaching practices so that the competence models may be interpreted within the research context. A National Research Council committee summarizes effective teaching to be based on the following principles: • Teachers must draw out and work with the preexisting understanding that their students bring with them. • Teachers must teach some subject matter in depth, providing many examples in which the same concept is at work and providing a firm foundation of factual knowledge. • The teaching of metacognitive skills should be integrated into the curriculum in a variety of subject areas (Bransford et al., 2000). The first principle above calls for science teachers to focus on student preconceptions and facilitate conceptual change; the second principle calls for science teachers to develop fundamental knowledge and skills in students and help them to organize knowledge and skills in a meaningful way for easy transfer and appli-
Table 3.1 Sample 1996 NAEP teaching practice variables

NAEP variable   Grade   NAEP variable label                                                     Recoded NAEP variable values
T060607         4, 8    How often do students do hands-on science activities?                   1 = almost everyday; 2 = once or twice a week; 3 = once or twice a month; 4 = never or hardly ever; others = missing
T061102         4, 8    How much emphasis do you place on understanding key science concepts?   1 = heavy emphasis; 2 = moderate emphasis; 3 = little/no emphasis; others = missing
T060304         4, 8    How often do you assess students using group projects?                  1 = once or twice a week; 2 = once or twice a month; 3 = once a grading period; 4 = once or twice a year; 5 = never or hardly ever; others = missing
T061621         4, 8    Do you use computers for science by simulations and modeling?           1 = yes; 0 = no; others = missing
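The recoded values in Table 3.1 collapse the raw NAEP response codes into the categories used throughout this chapter, with all other codes treated as missing. A minimal sketch of this kind of recoding follows, assuming the raw teacher responses are stored in a data frame under the NAEP variable names; the data frame layout and helper function are hypothetical, not the book's actual processing code.

    import pandas as pd

    # Value labels for T060607 ("How often do students do hands-on science activities?"),
    # following the recoding shown in Table 3.1; any other code becomes missing (NaN).
    T060607_LABELS = {
        1: "almost everyday",
        2: "once or twice a week",
        3: "once or twice a month",
        4: "never or hardly ever",
    }

    def recode(raw: pd.Series, labels: dict) -> pd.Series:
        """Map valid response codes to category labels; unmapped codes become NaN."""
        return raw.map(labels)

    # Example (hypothetical data frame `df` holding the raw teacher questionnaire):
    # df["T060607_cat"] = recode(df["T060607"], T060607_LABELS)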
In order to implement the above principles, the NRC committee further identifies the following learning environments as essential for effective teaching to take place:
• Student-centered: Schools and classrooms must be organized around students.
• Knowledge-centered: Attention must be given to what is taught, why it is taught, and what competence or mastery looks like.
• Assessment-centered: There should be regular formative assessment (ongoing assessments designed to make students' thinking visible to both teachers and students) and summative assessment (assessments at the end of a learning unit to find out how well students have achieved the standards).
• Community-centered: Norms for the classroom and school should be developed, as well as connections to the outside world that support core learning values (Bransford et al., 2000).
Although the above principles and learning environments are meaningful, implementing them in the science classroom is no easy task. One approach currently promoted in science education reforms around the world is the inquiry approach to science teaching. The National Science Education Standards (NRC, 1996) state:
Inquiry is a multifaceted activity that involves making observations; posing questions; examining books and other sources of information to see what is already known; planning investigations; reviewing what is already known in light of experimental evidence; using tools to gather, analyze, and interpret data; proposing answers, explanations, and predictions; and communicating the results. (NRC, 1996, p. 23)
Science Teaching Standard B in the National Science Education Standards (NRC, 1996) requires teachers to guide and facilitate learning through inquiry by focusing on and supporting inquiries while interacting with students, orchestrating discourse among students about scientific ideas, challenging students to accept and share responsibility for their own learning, recognizing and responding to student diversity, encouraging all students to participate fully in science learning, and encouraging and modeling the skills of scientific inquiry, as well as the curiosity, openness to new ideas and data, and skepticism that characterize science. Another National Research Council committee on inquiry identified the following essential features of inquiry science teaching (NRC, 2000):
• Learners are engaged by scientifically oriented questions.
• Learners give priority to evidence, which allows them to develop and evaluate explanations that address scientifically oriented questions.
• Learners formulate explanations from evidence to address scientifically oriented questions.
• Learners evaluate their explanations in light of alternative explanations, particularly those reflecting scientific understanding.
• Learners communicate and justify their proposed explanations.
Research has found that frequent uses of standards-based science teaching practices account for statistically significant amounts of variance in student science achievement scores (Fraser & Kahle, 2007). In a study by Johnson et al. (2007), students' science achievements over 3 years were examined in terms of teachers who were classified as effective or ineffective based on the teaching standards in the National Science Education Standards. They found that students who had effective science teachers did significantly better than students who did not. Moreover, effective science teachers could improve students' achievement regardless of ethnicity. However, Von Secker and Lissitz (1999) cautioned that while the instructional policies recommended by the Standards may be associated with higher achievement overall, they are equally likely to have the unintended consequence of contributing to greater achievement gaps among students with different demographic profiles. Laboratory skills are important components of science inquiry. Although science inquiry takes place in many forms, in many contexts, and over extended time, it commonly involves the use of standard science equipment and tools – the manipulative skills. In addition, students must also actively engage in mental activities related to observing, analyzing and interpreting, and concluding – the thinking skills. Laboratory skills refer to both the manipulative and thinking skills involved in laboratory activities; they are also called process skills. Thus, operation of laboratory tools is only one component of laboratory skills. Other laboratory skills involve more reasoning than manipulating tools, such as generating testable hypotheses, designing controlled experiments, making accurate observations, analyzing and interpreting data, and drawing valid conclusions. Previous research on the effectiveness of student laboratory experiences on learning has been inconclusive. Earlier reviews of literature concluded that lecture,
demonstration, and laboratory work were equally effective in helping students acquire knowledge (Bates, 1978). Blosser (1983) reported that, among the quantitative studies comparing laboratory work to other teaching methods, only 29 produced significant positive effects in favor of laboratories, 16 produced mixed results, and 139 showed no significant difference. A later review on the effectiveness of laboratories showed the same mixed results (Lazarowitz & Tamir, 1994). Therefore, simply having laboratories may not be enough. As Nakhleh et al. (2002) pointed out: "[T]eaching laboratory is a complex environment. In this environment, there are interactions between students and the activity, students and the equipment, students and laboratory instructors, and students and each other" (p. 79). Because of the complex nature of the laboratory environment, laboratory teaching and learning can take place in many different ways, with some more effective than others. Thus, the question is not simply whether to conduct laboratory work, but how to do it. Effective laboratory teaching practice is not a simple step-by-step process; the traditional recipe type of laboratories in which students simply follow instructions to verify a conclusion is not enough to produce effective student learning. In simple terms, hands-on must be accompanied by minds-on. Also, the notion of laboratory skills is always changing as technology advances. For example, probeware-based hands-on laboratory experiments and computer modeling have recently become available and been promoted in elementary and secondary school science teaching (Buckley et al., 2004; Edelson, 1998; Lee & Songer, 2003; Metcalf & Tinker, 2004; Parr et al., 2004; Songer, 1998). This type of technology brings students closer to what scientists are doing (Songer, 1998). Edelson believes that this type of technology can make school science teaching and learning more authentic. According to Edelson (1998), authentic science possesses four essential features: (a) inherent uncertainty in research questions, (b) a commitment to pursue answers, (c) using modern tools and techniques, and (d) engaging in social interaction. Probeware-based hands-on experiences contain all of the above features. Another important component of science inquiry is science demonstration. A demonstration is "a process of showing something to another person or group" (Trowbridge et al., 2004, p. 192). According to Trowbridge et al. (2004), there are many types of demonstrations depending on what aspects of demonstration are used to classify them. Based on the reasoning involved, demonstrations can be descriptive, i.e., show-and-tell, inductive, or deductive. Based on who does the demonstration, it can be teacher demonstration, student demonstration, student–teacher demonstration, student-group demonstration, or guest demonstration. Demonstration is not a stand-alone teaching method; it needs to be used in conjunction with other teaching strategies. For this reason, demonstration may be considered an advance organizer for structuring subsequent information or activities into a meaningful instructional framework (Chiappetta & Koballa, 2006). An effective demonstration can achieve many functions, such as: (a) focusing students' attention, (b) motivating and interesting students, (c) illustrating key concepts, (d) uncovering misconceptions, and (e) initiating inquiry and problem solving (Chiappetta & Koballa, 2006; Treagust, 2007).
Demonstration may also be considered a context in which both students and teachers are engaged cognitively,
emotionally, and physically into a ritual in order for students to experience science, talk about science experiences, propose questions, suggest patterns, and test those questions and patterns (Milne & Otieno, 2007). Demonstration is not a simple teaching method; the key for a successful demonstration is interaction between the teacher and students. In order to effectively initiate and sustain interaction during a demonstration, a commonly shared experience and knowledge base between the teacher and students is critical. Successful demonstration needs to be an integral part of a course's objectives and build on students' preconceptions and experiences (Roth et al., 1997). This is because students perceive demonstration from their own knowledge and experience, i.e., seeing as instead of gazing or observing (Milne & Otieno, 2007). Objects and equipment unfamiliar to students are unlikely to initiate interaction, and questions failing to connect to students' prior experiences are unlikely to receive satisfactory responses. One example of how focusing on interaction during demonstration can promote science learning is to incorporate explanation techniques into demonstration (Treagust, 2007). Many effective explanation models are available in science teaching, such as Prediction, Observation, and Explanation (POE) (White & Gunstone, 1992). Explanation during demonstration can help students make meaning by (a) creating a difference between the teacher and students in knowledge, interest, etc., (b) constructing new entities for the observed phenomenon, (c) transforming the newly constructed knowledge through narratives, analogies, etc., and (d) associating meanings with concrete matter/objects (Ogborn et al., 1996; Treagust, 2007). As technology advances, the role of technology in science demonstration may become more and more visible (Treagust, 2007). Technology can be used to introduce a demonstration, or to discuss demonstration results. Technology can greatly enrich science demonstration by bringing into the classroom science phenomena that would otherwise be too expensive, dangerous, or time-consuming to demonstrate. Technology-enhanced demonstrations can also broaden the scope of science content and increase the shared experiences between the teacher and students during demonstration. Assessment, particularly formative assessment, also plays an important role in inquiry science teaching. Research has established both a strong theoretical foundation and empirical evidence that formative assessment improves science achievement (Gallagher, 2007; NRC, 2003). Formative assessment is an essential component of teaching for understanding, and has proved effective in raising student scores on external examinations (Gallagher, 2007). Black and William (1998a, b) reviewed more than 250 books and articles in research on formative assessment. They concluded that formative assessment by teachers, combined with appropriate feedback to students, could have significant positive effects on student achievement. Formative assessment could raise the achievement of students, particularly underachieving students, by producing achievement gains with effect sizes between 0.4 and 0.7. For example, in one intervention study funded by the Nuffield Foundation involving 24 math and science teachers in six schools in the United Kingdom, the formative assessment intervention classes yielded a mean effect size of 0.32, with a 95% confidence interval of 0.16 to 0.48 (Black & William, 2003; William et al., 2004). Gallagher reported an increase in
student proficiency on a middle school state exam by 9% over 4 years through the use of formative assessment (Gallagher, 2007). When quizzes are used as formative assessment, however, the limitations of the paper-and-pencil multiple-choice question format commonly used in quizzes make the above positive effects unlikely. As can be seen from the above, effective teaching practices in the classroom are multifaceted, and they must be carried out systematically. In an effort to synthesize research on the effectiveness of various science teaching strategies on student achievement, Schroeder et al. (2007) conducted a meta-analysis of US research published from 1980 to 2004. Sixty-one studies were included. The following eight categories of teaching strategies were found to have a significant effect on students' achievement (effect sizes in parentheses): questioning strategies (0.74); manipulation strategies (0.57); enhanced material strategies (0.29); assessment strategies (0.51); inquiry strategies (0.65); enhanced context strategies (1.48); instructional technology (IT) strategies (0.48); and collaborative learning strategies (0.95). In another study, Odom et al. (2007) found that student-centered teaching practices have a positive association with student achievement (p < 0.01) and a negative association with teacher-centered teaching practices (p < 0.01). Additionally, student attitudes about science were positively associated with student-centered teaching practices (p < 0.01) and negatively associated with teacher-centered teaching practices (p < 0.01). In particular, near-daily implementation of group experiments and reduction of extensive note-copying during class yielded the greatest positive impact on student achievement. Keeping in mind the above general guidelines for effective science teaching, next I will present various scenarios, or teaching profiles, that are associated with reaching or failing to reach NAEP competence. I will first present the competence model for the 4th grade, followed by the competence model for the 8th grade. I will then conclude this chapter by summarizing the commonalities and differences between the 4th- and 8th-grade models. Those who are interested in technical details of accuracy measures associated with each of the models may refer to Appendix D.
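One convenient way to read the effect sizes quoted above is as a percentile shift for an average student, under the simplifying assumption of normally distributed achievement scores. The short sketch below is an illustration added here, not part of the cited analyses, and uses three effect sizes reported above.

    # Interpreting an effect size d as the percentile an "average" (50th-percentile)
    # student would move to after a gain of d standard deviations, assuming normality.
    from math import erf, sqrt

    def percentile_after_gain(d: float) -> float:
        """Normal CDF evaluated at d, expressed as a percentile."""
        return 100 * 0.5 * (1 + erf(d / sqrt(2)))

    for label, d in [("formative assessment (Nuffield intervention)", 0.32),
                     ("questioning strategies", 0.74),
                     ("enhanced context strategies", 1.48)]:
        print(f"{label}: d = {d:.2f} -> percentile ≈ {percentile_after_gain(d):.0f}")
    # Prints roughly 63, 77, and 93, respectively.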
Grade 4 Competence Model

Figure 3.1 presents the 4th grade model of competence. In the model, "S" refers to successfully reaching competence, while "U" refers to failing to reach competence. As stated in Chapter 2, branches with leaves containing fewer than 100 instances are not shown; those branches are represented by a "×" symbol over the oval. As shown in Fig. 3.1, the 4th-grade model of competence contains 12 profiles, among which only 1 profile, P#1, predicts the desirable outcome, i.e., reaching NAEP competence; all other profiles predict the undesirable outcome, i.e., failing to reach NAEP competence.
Fig. 3.1 Grade 4 model of competence based on teaching practices
P#1: {[Bring guest speakers to science class three or more times a year] and [Students talk about hands-on results once or twice a week or less] and [Assign projects that take more than a week] and [Take students on a field trip once or twice a year or more]} PREDICT [Reaching competence] Profile #1 predicts the desirable learning outcome – reaching NAEP competence. This profile has 248 instances, among which 80 instances (32%) were misclassified,
giving a prediction accuracy of 68%. Although the prediction accuracy is not very high, this profile highlights some good science teaching practices that provide students with opportunities to learn science. Two of the four teaching practices in the profile, i.e., bringing guest speakers and taking field trips, pertain to community-centered learning environment, while two other teaching practices, i.e., discussing hands-on results and assigning projects that take more than a week, pertain to open-ended inquiry. Tables 3.2–3.5 show cross-tabulation between each of the four teaching practices and students’ status of reaching the NAEP competence level. Table 3.2 Cross-tabulation between bringing guest speakers to class and reaching NAEP competence
                                                     NAEP competence status
How often do you bring guest speakers?               No               Yes              Total
3 or more times a year                               145 (2.2%)       205 (3.2%)       350 (5.4%)
Once or twice a year                                 1,673 (25.9%)    720 (11.1%)      2,393 (37.0%)
Never or hardly ever                                 2,839 (43.9%)    888 (13.7%)      3,727 (57.6%)
Total                                                4,657 (72.0%)    1,813 (28.0%)    6,470 (100.0%)

Table 3.3 Cross-tabulation between discussing hands-on results and reaching NAEP competence
                                                     NAEP competence status
How often do students talk about hands-on results?   No               Yes              Total
Almost everyday                                      320 (5.0%)       103 (1.6%)       423 (6.6%)
Once or twice a week                                 1,732 (26.9%)    828 (12.9%)      2,560 (39.8%)
Once or twice a month                                2,055 (32.0%)    712 (11.1%)      2,767 (43.0%)
Never or hardly ever                                 522 (8.1%)       159 (2.5%)       681 (10.6%)
Total                                                4,629 (72.0%)    1,802 (28.0%)    6,431 (100.0%)

Table 3.4 Cross-tabulation between assigning extended projects and reaching NAEP competence
                                                     NAEP competence status
Do you assign projects that take more than a week?   No               Yes              Total
Yes                                                  3,385 (52.9%)    1,412 (22.1%)    4,797 (74.9%)
No                                                   1,213 (18.9%)    392 (6.1%)       1,605 (25.1%)
Total                                                4,598 (71.8%)    1,804 (28.2%)    6,402 (100.0%)

Table 3.5 Cross-tabulation between taking students for field trips and reaching NAEP competence
                                                     NAEP competence status
How often do students go on science field trips?     No               Yes              Total
3 or more times a year                               276 (4.3%)       161 (2.5%)       437 (6.8%)
Once or twice a year                                 2,177 (33.6%)    958 (14.8%)      3,135 (48.5%)
Never or hardly ever                                 2,204 (34.1%)    694 (10.7%)      2,898 (44.8%)
Total                                                4,657 (72.0%)    1,813 (28.0%)    6,470 (100.0%)
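Throughout the remainder of this chapter, the term odds-ratio is used for the probability of reaching competence divided by the probability of failing to reach it within a response category, computed from cross-tabulations like those above. A minimal sketch of this computation, using the Table 3.2 counts, is shown below; the data frame is simply a hand-entered copy of the table, not the NAEP file itself.

    import pandas as pd

    # Counts from Table 3.2 (bringing guest speakers vs. reaching NAEP competence).
    table_3_2 = pd.DataFrame(
        {"No": [145, 1673, 2839], "Yes": [205, 720, 888]},
        index=["3 or more times a year", "Once or twice a year", "Never or hardly ever"],
    )

    # The chapter's "odds-ratio" for a category: probability of reaching competence
    # divided by probability of failing to reach it. Computing it from raw counts is
    # equivalent to the ratio of whole-sample percentages used in the text.
    odds = table_3_2["Yes"] / table_3_2["No"]
    print(odds.round(2))
    # 3 or more times a year    1.41  (reported as 1.46 in the text, from rounded percentages)
    # Once or twice a year      0.43
    # Never or hardly ever      0.31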
From Table 3.2 we see that there are only 350 (5.4%) students whose teachers bring guest speakers to class three or more times a year, and among them 205 (59%) have reached NAEP competence. Although the overall odds-ratio, i.e., the probability of reaching competence over the probability of failing to reach competence, for all students is 0.39 (28%/72%), different frequencies of bringing guest speakers to class are associated with different odds-ratios. For students whose teachers bring guest speakers to class three or more times a year, the odds-ratio is 1.46 (3.2%/2.2%); this ratio drops to 0.43 (11.1%/25.9%) for students whose teachers bring guest speakers to class only one or two times a year, and this ratio further drops to 0.31 (13.7%/43.9%) for students whose teachers never or hardly ever bring guest speakers to class. Thus, bringing guest speakers to class does increase the likelihood for students to reach competence. Table 3.3 shows that most science teachers talk about hands-on results once or twice a week (2,560 or 40%) or once or twice a month (2,767 or 43%). In terms of odds-ratios associated with different frequencies of talking about hands-on results, the highest odds-ratio is associated with “talking about hands-on results once or twice a week.” More specifically, for students who talk about hands-on results almost everyday, the odds-ratio is 0.32 (1.6%/5.0%); for those who talk about it once or twice a week, it is 0.48 (12.9%/26.9%); for those who talk about it once or twice a month, it is 0.34 (11.1%/32.0%); and for those who never or hardly ever talk about it, it is 0.31 (2.5%/8.1%). All the above odds-ratios are not very different from the overall odds-ratio, which is 0.38 (28%/72.0%). Similarly, Table 3.4 shows that 75% of students have teachers who have assigned projects that take more than a week. The odds-ratio for students whose teachers assign projects that take more than a week is 0.42 (22.1%/52.9%), while for students whose teachers do not assign projects that take more than a week it is 0.32 (6.1%/18.9%). Thus, extended projects seem to increase students’ odds to reach competence. Finally, from Table 3.5, we see that while a large number of students (44.8%) never go on a field trip, about an equally large number (48.5%) do go on a field trip once or twice a year, and only a few students (6.8%) go on field trips three or more times a year. The odds-ratios for students who go on field trips three or more times a year is 0.58 (2.5%/4.3%), while for those who go on a field trip once or twice a
year it is 0.44 (14.8%/33.6%), and for those who never or hardly ever go on a field trip it is 0.31 (10.7%/34.1%). Thus, field trips do seem to increase the odds for students to reach competence. P#2 to P#12 all predict the undesirable outcome – failing to reach the NAEP competence level. P#2: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [Easily accessible computer lab for science] and [Never or hardly ever talk about hands-on results]} PREDICT [Failing to reach competence] Profile #2 predicts the undesirable outcome – failing to reach NAEP competence. Key differences between profile #1 and profile #2 are that in the latter teachers bring guest speakers to class once or twice a year or less and students never or hardly ever talk about hands-on results. The effects of bringing guest speakers to class and talking about hands-on results have been discussed earlier. Other teaching practices in profile #2 are not clearly negative. P#3: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [Easily accessible computer lab for science] and [Talk about hands-on results at least once or twice a month] and [Assess students using self- or peer-evaluation once or twice a week]} PREDICT [Failing to reach competence] Profile #3 also predicts the undesirable outcome – failing to reach NAEP competence. Profile #3 shares many common features with profile #2. The two new features profile #3 has are: (a) talking about hands-on results at least once or twice a month, and (b) assessing students using self- or peer-evaluation once or twice a week. The effect of talking about hands-on results has been discussed earlier. As for self- or peer-evaluation, Table 3.6 shows the cross-tabulation between student self- and peer-evaluation and reaching the NAEP competence status.
Table 3.6 Cross-tabulation between self- and peer-evaluation and reaching NAEP competence
                                                                     NAEP competence status
How often do you assess students using self- and peer-evaluation?   No               Yes              Total
Once or twice a week                                                 197 (3.1%)       48 (0.7%)        245 (3.8%)
Once or twice a month                                                551 (8.6%)       242 (3.8%)       793 (12.3%)
Once a grading period                                                639 (9.9%)       290 (4.5%)       929 (14.5%)
Once or twice a year                                                 725 (11.3%)      296 (4.6%)       1,021 (15.9%)
Never or hardly ever                                                 2,510 (39.1%)    929 (14.5%)      3,439 (53.5%)
Total                                                                4,622 (71.9%)    1,805 (28.1%)    6,427 (100.0%)
From Table 3.6 we can see that assessing students using self- and peer-evaluation is not a frequent practice among teachers (only 3.8% of students have teachers who do this once or twice a week, as compared to 53.5% of students whose teachers never or hardly ever do). Examining the odds-ratios associated with different frequencies of this teaching practice, we see that "once or twice a week" is associated with the lowest odds-ratio (0.23, or 0.7%/3.1%), and "once a grading period" is associated with the highest odds-ratio (0.46, or 4.5%/9.9%). Thus, similar to the effect of talking about hands-on results, too infrequent (e.g., once or twice a year) or too frequent (once or twice a week) use of this teaching practice decreases students' odds to reach NAEP competence. P#4: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [Easily accessible computer lab for science] and [Talk about hands-on results at least once or twice a month] and [Assess students using self- or peer-evaluation once or twice a month] and [Assess students using short or long written responses once or twice a week] and [Place at least moderate emphasis on science relevance to society/technology] and [Place little or no emphasis on communicating science ideas] and [Talk to class about science almost everyday] and [Spend at least some time on life science] and [Assess students using group projects at least once or twice a year] and [Place heavy emphasis on developing interest in science] and [Give at least 1 h a week on homework]} PREDICT [Failing to reach competence] Profile #4 is a low-frequency scenario, with only 201 instances. The prediction accuracy is also relatively low (only 65%). Thus, this profile should be considered with caution. In addition to many of the same teaching practices as profiles #2 and #3, profile #4 contains a few new teaching practices. However, the additional teaching practices contained in profile #4 do not seem to be clearly positive or negative, which may explain why the prediction accuracy is relatively low. P#5: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place
Table 3.7 Cross-tabulation between giving oral reports and reaching NAEP competence
                                                     NAEP competence status
How often do students give oral reports?             No               Yes              Total
Almost everyday                                      30 (0.5%)        3 (0%)           33 (0.5%)
Once or twice a week                                 277 (4.3%)       70 (1.1%)        347 (5.4%)
Once or twice a month                                1,743 (27.0%)    700 (10.8%)      2,443 (37.8%)
Never or hardly ever                                 2,604 (40.3%)    1,030 (16.0%)    3,634 (56.3%)
Total                                                4,654 (72.1%)    1,803 (27.9%)    6,457 (100.0%)
at least moderate emphasis on developing interest in science] and [Easily accessible computer lab for science] and [Talk about hands-on results at least once or twice a month] and [Assess students using self- or peer-evaluation once or twice a month] and [Assess students using short or long written response once or twice a month or less] and [Students give oral report at least once or twice a week]} PREDICT [Failing to reach competence] Profile #5 predicts the undesirable outcome – failing to reach competence. It is also a low-frequency scenario with only 133 instances. The prediction accuracy for this profile is 89%, which is high. Profile #5 is similar to profile #3 except that profile #5 has the following additional features: (a) assessing students using self- or peer-evaluation once or twice a month, (b) assessing students using short or long written responses once or twice a month or less, and (c) students giving oral report at least once or twice a week. The effect of practice (a) has been discussed earlier on. Practice (b) does not seem to be clearly negative or positive given that assessment is always a tool and its appropriate use depends on learning objectives and context. Practice (c) is a key new feature of this profile. Table 3.7 shows the relationship between giving oral reports by students and reaching NAEP competence. From Table 3.7, we see that most students (56.3%) never or hardly ever give oral reports; a large number of students (37.8%) give oral reports once or twice a month. The odds-ratio for reaching competence is 0.26 (1.1%/4.3%) for students who give oral reports once or twice a week; but it increases to 0.40 (10.8%/27.0% and 16.0%/40.3%) for students who give oral reports once or twice a month or not at all. Thus, frequent oral reports by students, i.e., once or twice a week, seems to decrease the chance of reaching competence. P#6: {[Bring guest speakers to science class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [Easily accessible computer lab for science] and [Students talk about hands-on results at least once or twice a month] and [Assess students using self- or peer-evaluation once or twice a month or less] and [Assess students using short or long written responses once or twice a month or less] and [Students give oral reports once or twice a month or less] and [Talk to class about science almost everyday] and [Students write science reports once or twice a month or less]} PREDICT [Failing to reach competence] Profile #6 contains most instances, i.e., 1,620, with a prediction accuracy of 73%. It predicts the undesirable outcome – failing to reach competence. Most teaching practices are the same as in profiles #2 to #5, which has already been discussed. Other teaching practices do not seem to be clearly negative. P#7: {[Bring guest speakers to science class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [At least easily accessible computer lab for science] and [Students talk about handson results at least once or twice a month] and [Assess students using self- or peer-evaluation once or twice a month or less] and [Assess students using short
or long written responses once or twice a month or less] and [Students give oral reports once or twice a month or less] and [Talk to class about science once or twice a week or less] and [Spend at least some time on earth science]} PREDICT [Failing to reach competence] Profile #7 also predicts the undesirable outcome – failing to reach competence. It is similar to profile #6, except that it has many fewer instances (877) and a higher prediction accuracy (80%). Compared to profile #6, one key new feature in profile #7 is talking to class about science once or twice a week or less. Table 3.8 shows the relationship between talking to class about science and reaching NAEP competence. Talking to class about science refers to teacher lecturing. Given that it is impossible for teachers not to talk in any science class no matter what teaching approach is used, talking to class about science may also indicate the frequency at which science is taught at the 4th grade. From Table 3.8, we see that most students (57.9%) have teachers talking to them almost everyday. However, there are also a large number of students (36.5%) who only have their teachers talking to them once or twice a week. The odds-ratio for students whose teachers talk to them almost everyday is 0.46 (18.3%/39.6%), as compared to that for students whose teachers only talk to them once or twice a week, which is 0.31 (8.7%/27.8%). For students whose teachers only talk to them once or twice a month, the odds-ratio is even smaller (0.21, 0.9%/4.2%). Thus, the more frequently the teacher talks to the class about science, the higher the odds are for students to reach competence. P#8: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [At least one computer in the classroom] and [At least moderate emphasis on science relevance to society/technology] and [Assign at least 0.5 h a week on homework]} PREDICT [Failing to reach competence] Profile #8 has the second-largest number of instances (i.e., 1,538) among all the profiles. The profile predicts the undesirable outcome – failing to reach competence, with a prediction accuracy of 73%. Most of the teaching practices have already been discussed
Table 3.8 Cross-tabulation between talking to class about science and reaching NAEP competence
                                                     NAEP competence status
How often do you talk to class about science?        No               Yes              Total
Almost everyday                                      2,563 (39.6%)    1,184 (18.3%)    3,747 (57.9%)
Once or twice a week                                 1,799 (27.8%)    563 (8.7%)       2,362 (36.5%)
Once or twice a month                                270 (4.2%)       60 (0.9%)        330 (5.1%)
Never or hardly ever                                 31 (0.5%)        7 (0.1%)         38 (0.6%)
Total                                                4,663 (72.0%)    1,814 (28.0%)    6,477 (100.0%)
Table 3.9 Cross-tabulation between homework and reaching NAEP competence
                                                     NAEP competence status
Time per week students should spend on homework      No               Yes              Total
None                                                 801 (12.8%)      267 (4.3%)       1,068 (17.1%)
0.5 h                                                1,712 (27.4%)    614 (9.8%)       2,326 (37.3%)
1 h                                                  1,565 (25.1%)    719 (11.5%)      2,284 (36.6%)
2 h                                                  327 (5.2%)       139 (2.2%)       466 (7.5%)
More than 2 h                                        82 (1.3%)        17 (0.3%)        99 (1.6%)
Total                                                4,487 (71.9%)    1,756 (28.1%)    6,243 (100.0%)
earlier. One new feature in this profile is the frequency of homework. Table 3.9 shows the relationship between homework and reaching NAEP competence. From Table 3.9, we see that most students’ teachers think that students should have 0.5 or 1 h a week of homework (about 37%), as compared to 17% who opt for no homework and 7.5% for 2 h of homework. However, the highest odds-ratio is associated with students having 1 h a week of homework (0.46, 11.5%/25.1%), followed by 0.42 (2.2%/5.2%) for students having 2 h a week of homework. No homework has an odds-ratio of 0.34, while having homework of more than 2 h has a lowest odds-ratio of 0.23. Thus, 1 h a week for homework is associated with the highest odds-ratio for students to reach competence. P#9: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework at least once a grading period] and [Place at least moderate emphasis on developing interest in science] and [At least one computer in the classroom] and [Place little or no emphasis on science relevance to society/technology] and [Students use the library at least once or twice a month] and [Place moderate emphasis or less on data analysis skills] and [Place at least moderate emphasis on lab skills]} PREDICT [Failing to reach competence] Profile #9 has 164 instances, and a very low prediction accuracy rate (57%). This profile is not very informative. P#10: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework less than once a grading period] and [Students take science tests/quizzes once or twice a month or less]} PREDICT [Failing to reach competence] P#11: {[Bring guest speakers to class once or twice a year or less] and [Assess students using homework less than once a grading period] and [Students take science tests/quizzes at least once or twice a week] and [Assess students using multiple-choice tests at least once or twice a year]} PREDICT [Failing to reach competence]
P#12: {[Bring guest speakers to class one or two times a year or less] and [Never or hardly ever assess students using homework] and [Students take science tests/quizzes at least once or twice a week] and [Never or hardly ever assess students using multiple-choice tests] and [Assess students using portfolios once a grading period or less]} PREDICT [Failing to reach competence] Profiles #10 to #12 all predict the undesirable outcome – failing to reach competence. Most of the teaching practices contained in these profiles relate to assessment practices. Generally speaking, assessment is only a tool to support learning. The three scenarios described in these profiles do not seem to indicate that assessment is used to support learning; it is unlikely that science instruction in these scenarios is assessment-centered.
Grade 8 Competence Model

Figure 3.2 presents the 8th grade competence model. As in the 4th-grade model of competence in Fig. 3.1, "S" in the 8th-grade model of competence presented in Fig. 3.2 refers to "Reaching the NAEP Competence Level," and "U" refers to "Failing to Reach the NAEP Competence Level." Branches containing leaves with fewer than 100 instances are not shown. As shown in Fig. 3.2, the 8th-grade model contains eight branches or profiles. Among the eight profiles, only one, profile #5, predicts the desirable outcome – reaching NAEP competence; all other profiles predict the undesirable outcome – failing to reach NAEP competence. The variable at the root is about emphasis on science laboratory skills; profile #1 is solely predicted by this variable. P#1: [Little or no emphasis on laboratory skills] PREDICTS [Failing to reach competence] Profile #1 is about a teacher's emphasis on science laboratory skills. There are two benefits of placing sufficient emphasis on laboratory skills. First, because a portion of NAEP assessment items are performance assessment tasks requiring students to use tools and reasoning to perform, an adequate emphasis on laboratory skills can directly help students achieve well on NAEP. Second, an adequate emphasis on laboratory skills may help students develop conceptual understanding of science concepts, in addition to increasing student interest and motivation in learning science. Table 3.10 shows the relationship between the emphasis on laboratory skills and reaching NAEP competence. From Table 3.10, we see that among the 18% of students whose teachers place little or no emphasis on developing laboratory skills, only 3% reached NAEP competence and 15% did not, a 0.19 odds-ratio. In comparison, the odds-ratios for "Moderate emphasis" and "Heavy emphasis" are 0.42 (12.7%/30.6%) and 0.40 (11.1%/27.6%), respectively. Thus, placing little or no emphasis on developing laboratory skills is associated with the lowest likelihood for students to reach competence.
Fig. 3.2 Grade 8 model of competence based on teaching practices
P#2: {[Place at least moderate emphasis on laboratory skills] and [Use computers for science at least once or twice a week] and [Talk to class about science at least once or twice a week] and [Do science demonstration at least once or twice a month] and [Assess students using portfolios at least once or twice a year]} PREDICT [Failing to reach competence] Profile #2 has the largest predicted instances (2,968) for failing to reach the NAEP competence level, with a 77% prediction accuracy. The five teaching practices contained in the profile involve a wide variety of aspects of science teaching, including
Table 3.10 Cross-tabulation between emphasis on laboratory skills and reaching NAEP competence
                                                     NAEP competence status
Emphasis on developing laboratory skills             No               Yes              Total
Heavy emphasis                                       1,765 (27.6%)    711 (11.1%)      2,476 (38.7%)
Moderate emphasis                                    1,959 (30.6%)    813 (12.7%)      2,772 (43.3%)
Little/no emphasis                                   968 (15.1%)      184 (2.9%)       1,152 (18.0%)
Total                                                4,692 (73.3%)    1,708 (26.7%)    6,400 (100.0%)
the curriculum emphasis, using computers, talking to class, science demonstration, and portfolio assessment. All these teaching practices do not seem to be negative, although may not be exemplary. This profile may represent a scenario in a typical science classroom, which is not associated with very high likelihood for students to reach competence, because the overall percentage of 8th graders who have reached competence is only around 27%, an odds-ratio of 0.37 (27%/73%). P#3: {[Place at least moderate emphasis on laboratory skills] and [Use computers for science at least once or twice a week] and [Talk to class about science at least once or twice a week] and [Do science demonstration at least once or twice a month] and [Never or hardly ever assess students using portfolios] and [Place little or no emphasis on data analysis]} PREDICT [Failing to reach competence] Profile #3 shares most features of profile #2; both predict the undesirable learning outcome – failing to reach competence. One additional feature in profile #3 is emphasis on data analysis. Table 3.11 shows the relationship between emphasis on data analysis and reaching NAEP competence. From Table 3.11, we see that while most students (61%) have teachers who place moderate emphasis on developing data analysis skills, 25% students have teachers who place a heavy emphasis on developing data analysis skills. Comparing the odds-ratios among the three emphases, we see that the highest odds-ratio is associated with “heavy emphasis” (0.42, 7.4%/17.5%), followed by 0.38 (16.8%/43.9%) for “moderate emphasis,” and the lowest of 0.20 (2.4%/11.9%) for “little/no emphasis.” Given the importance of data analysis in science inquiry, the inclusion of emphasis on developing data analysis in this profile is not a surprise. P#4: {[Place at least moderate emphasis on laboratory skills] and [Use computers for science at least once or twice a week] and [Talk to class about science at least once or twice a week] and [Do science demonstration at least once or twice a month] and [Never or hardly ever assess students using portfolios] and [Place at least moderate emphasis on data analysis] and [Students read textbooks almost everyday] and [Assess students using lab notebooks/journals at least once or twice a year]} PREDICT [Failing to reach competence] Profile #4 predicts the undesirable outcome again – failing to reach competence. This profile has 634 instances with a prediction accuracy of 70%. In addition to the teaching practices in profile #3, one key feature in this profile is related to
Table 3.11 Cross-tabulation between emphasis on data analysis and NAEP competence status
                                                     NAEP competence status
Emphasis on developing data analysis skills          No               Yes              Total
Heavy emphasis                                       1,119 (17.5%)    475 (7.4%)       1,594 (24.9%)
Moderate emphasis                                    2,812 (43.9%)    1,077 (16.8%)    3,889 (60.8%)
Little or no emphasis                                762 (11.9%)      156 (2.4%)       918 (14.3%)
Total                                                4,693 (73.3%)    1,708 (26.7%)    6,401 (100.0%)

Table 3.12 Cross-tabulation between reading textbooks and reaching NAEP competence
                                                     NAEP competence status
How often do students read science textbooks?        No               Yes              Total
Almost everyday                                      1,561 (25.0%)    597 (9.6%)       2,158 (34.6%)
Once or twice a week                                 1,926 (30.9%)    663 (10.6%)      2,589 (41.5%)
Once or twice a month                                598 (9.6%)       254 (4.1%)       852 (13.7%)
Never or hardly ever                                 473 (7.6%)       169 (2.7%)       642 (10.3%)
Total                                                4,558 (73.0%)    1,683 (27.0%)    6,241 (100.0%)
having students read textbooks. This teaching practice is a typical characteristic of transmissive approach to science teaching, not particularly beneficial for students to develop understanding. Table 3.12 shows the relationship between reading textbooks and reaching NAEP competence. Table 3.12 shows that 42% students read textbooks once or twice a week, 35% almost everyday, 14% once or twice a month, and 10% never or hardly ever read textbooks. The odds-ratios of reaching competence among the categories are 0.43 (4.1%/9.6%) for “once or twice a month,” 0.38 (9.6%/25.0%) for “almost everyday,” 0.35 (2.7%/7.6%) for “never or hardly ever,” and 0.34 (10.6%/30.9%) for “once or twice a week.” Therefore, infrequently reading textbooks (i.e., once or twice a month) is associated with the highest odds for students to reach competence. P#5: {[Place at least moderate emphasis on developing laboratory skills] and [Use computer for science at least once or twice a week] and [Talk to class about science at least once or twice a week] and [Do science demonstrations at least once or twice a week] and [Never or hardly ever assess students using portfolios] and [Place at least moderate emphasis on developing data analysis skills] and [Students read textbooks almost everyday] and [Never or hardly ever assess students using laboratory notebooks/journals] and [Place heavy
emphasis on scientific facts/terminology] and [Students spend at least 1 h per week on homework]} PREDICT [Reaching NAEP competence] Profile #5 predicts the desirable outcome – reaching NAEP competence; it is the only pattern in the model that predicts the desirable outcome. Profile #5 contains ten teaching practices, among which three are related to curriculum emphases (i.e., laboratory skills, data analysis, and science facts and terms), three are related to teacher-centered teaching (i.e., talking to class, science demonstrations, and reading textbooks), two are related to assessment methods (i.e., assess students using portfolios and laboratory notebooks/journals), one is related to using computers, and one is related to student homework. Keep in mind that this pattern is not a popular one, with only 164 instances, and 71 out of the 164 (43%) actually failed to reach competence level. With such a low accuracy rate (57%), the pattern should be interpreted with great caution. Because 57% is close to a 50:50 chance, i.e., either reaching or failing to reach competence, this profile may in fact represent a teaching scenario in a typical science classroom. P#6: {[Place at least moderate emphasis on developing laboratory skills] and [Use computer for science at least once or twice a week] and [Talk to class about science at least once or twice a week] and [Do science demonstrations at least once or twice a week] and [Never or hardly ever assess students using portfolios] and [Place at least moderate emphasis on developing data analysis skills] and [Students read textbooks once or twice a week or less] and [Students work with others on activities at least once or twice a month] and [No computers available for science] and [Place heavy emphasis on problem solving] and [Assess students using in-class essays once a grading period or less]} PREDICT [Failing to reach NAEP competence] Profile #6 departs from profiles #4 and #5 on how often students read textbooks. One new feature with this profile is related to computer availability. Table 3.13 shows the relationship between computer availability and reaching NAEP competence. Table 3.13 shows that while most students have access to computers either in computer labs (46%) or in classrooms (37%), some (16.7%) do not have computer access at all. The odds-ratio for students who do not have access to computers is 0.33 (4.1%/12.6%), which is slightly lower than 0.35 (9.5%/27.4%) for “computers
Table 3.13 Cross-tabulation between computer availability and reaching the NAEP competence
                                                     NAEP competence status
Computer availability for science                    No               Yes              Total
None available                                       741 (12.6%)      243 (4.1%)       984 (16.7%)
Available in classroom                               1,616 (27.4%)    559 (9.5%)       2,175 (36.9%)
Available in computer lab                            1,991 (33.8%)    735 (12.5%)      2,726 (46.3%)
Total                                                4,348 (73.9%)    1,537 (26.1%)    5,885 (100.0%)
available in the classroom” and 0.37 (12.5%/33.8%) for “computers available in the computer lab.” P#7: {[Place at least moderate emphasis on developing laboratory skills] and [Use computer for science at least once or twice a week] and [Talk to class about science at least once or twice a week] and [Do science demonstrations at least once or twice a week] and [Never or hardly ever assess students using portfolios] and [Place at least moderate emphasis on developing data analysis skills] and [Students read textbooks once or twice a week or less] and [Students work with others on activities at least once or twice a month] and [At least one computer lab is available for science] and [Assign 1 h or less per week on homework]} PREDICT [Failing to reach NAEP competence] Profile #7 also predicts the undesirable outcome – failing to reach competence. In addition to common teaching practices contained in previous profiles, one key feature in this profile is the frequency of student homework. Table 3.14 shows the relationship between student homework and reaching the NAEP competence status. Table 3.14 shows that most students (36.6%) have teachers who think that students should spend 1 h per week on homework, and a large number (32.7%) have teachers who think that students should spend 2 h on homework. Examining the odds-ratio between the probability of reaching competence and failing to reach competence, we can see an overall increase trend that the more homework students do, the higher is the odds-ratio. Specifically, the odds-ratios are: 0.21 (0.7%/3.4%) for “none,” 0.25 (2.5%/9.9%) for “0.5 h,” 0.37 (9.9%/26.8%) for “1 h,” 0.41 (9.5%/23.2%) for “2 h,” and 0.51 (4.8%/9.4%) for “more than 2 h.” Thus, in general, more time on homework is preferable. P#8: {[Place at least moderate emphasis on developing laboratory skills] and [Use computer for science at least once or twice a week] and [Talk to class
Table 3.14 Cross-tabulation between student homework and reaching NAEP competence
                                                     NAEP competence status
Time per week students should spend on homework      No               Yes              Total
None                                                 203 (3.4%)       41 (0.7%)        244 (4.0%)
0.5 h                                                600 (9.9%)       151 (2.5%)       751 (12.4%)
1 h                                                  1,618 (26.8%)    595 (9.9%)       2,213 (36.6%)
2 h                                                  1,400 (23.2%)    576 (9.5%)       1,976 (32.7%)
More than 2 h                                        569 (9.4%)       287 (4.8%)       856 (14.2%)
Total                                                4,390 (72.7%)    1,650 (27.3%)    6,040 (100.0%)
about science at least once or twice a week] and [Do science demonstrations at least once or twice a week] and [Never or hardly ever assess students using portfolios] and [Place at least moderate emphasis on developing data analysis skills] and [Students read textbooks once or twice a week or less] and [Students work with others on activities at least once or twice a month] and [At least one computer lab is available for science] and [Assign 2 h or more per week on homework] and [Students do hands-on at least once or twice a week] and [Spend at least a little time on life science] and [Place at least moderate emphasis on communicating science ideas]} PREDICT [Failing to reach NAEP competence] Profile #8 also predicts the undesirable outcome – failing to reach the competence status. Different from profile #7, profile #8 includes the teaching practice of assigning 2 h or more per week for homework. One significant additional teaching practice is related to the frequency of student hands-on activities. Table 3.15 shows the relationships between student hands-on activities and reaching NAEP competence. From Table 3.15 we can see that most students (55.4%) have teachers who engage students in hands-on activities once or twice a week. Examining the oddsratios between failing to reach competence and reaching competence, we can see a clear trend that the more frequent the hands-on activities by students, the higher is the odds-ratio. Specifically, the odds-ratios for the different categories in Table 3.15 are: 0.45 (6.4%/14.1%) for “almost everyday,” 0.42 (16.3%/39%) for “once or twice a week,” 0.24 (4.1%/17.4%) for “once or twice a month,” and 0.08 (0.2%/2.5%) for “never or hardly ever.” Profile #8 contains the practice of assigning 2 h or more homework a week, which is preferable according to the odds-ratios discussed earlier. Individually, the teaching practices included in profile #8 are in general positive. But the fact that profile #8 still predicts failing to reach competence indicates that average teaching practices are not enough; better-than-average or best practices are needed to help students reach competence.
Table 3.15 Cross-tabulation between student hands-on and reaching NAEP competence
(Columns give counts by NAEP competence status)
How often do students do hands-on science activities? | No | Yes | Total
Almost every day | 886 (14.1%) | 402 (6.4%) | 1,288 (20.6%)
Once or twice a week | 2,446 (39.0%) | 1,022 (16.3%) | 3,468 (55.4%)
Once or twice a month | 1,087 (17.4%) | 255 (4.1%) | 1,342 (21.4%)
Never or hardly ever | 155 (2.5%) | 12 (0.2%) | 167 (2.7%)
Total | 4,574 (73.0%) | 1,691 (27.0%) | 6,265 (100.0%)
Summary

There are 12 profiles in the 4th-grade competence model, each describing one combination of teaching practices. The most frequent profile, with the largest number of instances, is profile #6; it predicts the undesirable outcome – failing to reach competence. Forming this teaching profile are the following teaching practices: (a) bringing guest speakers to science class once or twice a year or less, (b) assessing students using homework at least once a grading period, (c) placing at least moderate emphasis on developing interest in science, (d) easily accessible computer labs for science, (e) students talking about hands-on results at least once or twice a month, (f) assessing students using self- or peer-evaluation once or twice a month or less, (g) assessing students using short or long written responses once or twice a month or less, (h) students giving oral reports once or twice a month or less, (i) talking to class about science almost every day, and (j) students writing science reports once or twice a month or less. Although these teaching practices individually may not be considered ineffective, together they represent a typical or average science teaching profile that is associated with a low likelihood for students to reach competence. In other words, average teaching practices are not enough for students to reach competence.

The only profile in the 4th-grade competence model predicting reaching competence is profile #1. This profile contains the following teaching practices: (a) bringing guest speakers to science class three or more times a year, (b) students talking about hands-on results once or twice a week or less, (c) assigning projects that take more than a week, and (d) taking students on a field trip once or twice a year or more. These teaching practices individually may not be sufficient for helping students to reach competence, but together they represent a teaching profile that is associated with a high chance for students to reach competence. In addition to the general teaching profiles contained in the 4th-grade competence model, the following individual teaching practices are found to be associated with higher odds-ratios for students to reach competence: (a) talking about hands-on results once or twice a week, (b) assigning projects that take more than a week, (c) taking students for field trips at least once or twice a year (preferably three or more times a year), (d) assessing students using self- or peer-evaluation once or twice a grading period, (e) students giving oral reports once or twice a month or less, (f) talking to class about science every day, and (g) having students do homework 1–2 h a week. Profile #1 contains some of these teaching practices.

There are eight profiles in the 8th-grade competence model. Like the 4th-grade model, most profiles predict the undesirable outcome – failing to reach competence. The most frequent profile with the largest number of instances is profile #2; it predicts failing to reach competence. Profile #2 contains the following teaching practices: (a) placing at least moderate emphasis on developing laboratory skills, (b) using computers for science at least once or twice a week, (c) talking to class about science at least once or twice a week, (d) doing science demonstrations at least once or twice a month, and (e) assessing students using portfolios at least once
or twice a year. Similar to the most common profile in the 4th-grade competence model, the above teaching practices individually may not be considered ineffective; however, together they represent an average teaching profile that is associated with a low likelihood for students to reach competence. Although profile #5 in the 8th-grade competence model predicts the desirable outcome – reaching competence, the prediction accuracy is low, and thus this profile may not be informative. Individual teaching practices that are found to be associated with high odds-ratios are: (a) placing a moderate emphasis on developing laboratory skills, (b) placing a heavy emphasis on developing data analysis skills, (c) infrequent reading (i.e., once or twice a month) of science textbooks, (d) computers available either in classrooms or computer labs, (e) spending more than 2 h a week on homework, and (f) students doing hands-on activities almost every day. Most of these teaching practices are different from those found in the 4th-grade model. The only common teaching practice found for both 4th and 8th grades is related to homework. For the 4th grade, the highest odds-ratio is associated with students spending 1–2 h per week on homework, while for the 8th grade, the highest odds-ratio is associated with students spending at least 2 h a week on homework. Overall, predicting students reaching competence for the 4th grade and 8th grade requires different sets of teaching practices.
Chapter 4
Models of Competence and Opportunities to Learn at Home
Various learning opportunities are available at home. According to Bransford et al. (2000), during an academic year of 180 school days and 6.5 h per school day, a typical American child spends 53% of the time at home and in the community, 33% sleeping, and only 14% in schools. From watching television at home to visiting a science museum with family, students learn science at all times. These types of learning opportunities have been termed free-choice learning (Falk, 2001). However, these learning opportunities vary from family to family. Which learning opportunities at home may help with student attainment of competence? What variables representing these learning opportunities are significant predictors of student attainment of competence? These are questions the present chapter will answer.

As a part of the student achievement survey, NAEP also collects information on student family background and home environment deemed relevant to students’ science achievement. The questions pertain to student activities and resources at home and are related to parents’ education; language spoken at home; available books, magazines, and newspapers; time spent on watching television/video; extra reading not connected to school; and parents’ employment status. Altogether, there are 20 such questions in the 1996 4th- and 8th-grade science surveys. Table 4.1 presents sample NAEP background questions; for a complete list of the 20 questions please refer to Appendix B. It is noted that the present categorization of race and ethnicity is different from that used in the 1996 NAEP science survey, as shown in Table 4.1. For example, we currently use “African-American” instead of “Black.” Since the data source for this book is the 1996 NAEP science survey, I will still use the 1996 NAEP categorization for consistency and simplicity. It is apparent that NAEP background questions are selective; they have been included based on the best knowledge available about the relationship between students’ science achievement and their family background and home environment.

Before presenting the models of competence based on student family background and home environment, let us briefly review the literature pertaining to the relationship between students’ home environments and their achievement.
Table 4.1 Sample 1996 NAEP science survey variables pertaining to family background and home environment
NAEP variable | Grade | NAEP variable label | Recoded NAEP variable values
B003001A | 4, 8 | Which race/ethnicity best describes you? | 1 = White; 2 = Black; 3 = Hispanic; 4 = Asian/Pacific; 5 = American Indian; 6 = Others
B003201A | 4, 8 | How often is a language other than English spoken at home? | 1 = Never; 2 = Sometimes; 3 = Always; Others = missing
B008601A | 4, 8 | How much education did your mother receive? | 1 = Did not finish high school; 2 = Graduated from high school; 3 = Some education after high school; 4 = Graduated from college; Others = missing
B008801A | 4, 8 | About how many books are in your home? | 1 = None; 2 = 1–10; 3 = 11–25; 4 = 26–100; 5 = more than 100; Others = missing
B009101A | 4, 8 | How many hours of extra reading per week not connected with school? | 1 = none; 2 = 1–2 h; 3 = 3–4 h; 4 = 5–6 h; 5 = 7–8 h; 6 = 9–10 h; 7 = more than 10 h; Others = missing
B009301A | 4, 8 | How often do you use a home computer for schoolwork? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = missing
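The recoded values in Table 4.1 describe how raw NAEP response codes are collapsed into analysis categories, with out-of-range codes treated as missing. A minimal sketch of one such recode is shown below, assuming pandas and made-up response codes; only the code-to-label mapping is taken from Table 4.1.

import pandas as pd

# Hypothetical raw responses to NAEP item B008801A
# ("About how many books are in your home?").
raw = pd.Series([1, 3, 5, 4, 8, 2], name="B008801A")

books_labels = {
    1: "None",
    2: "1-10",
    3: "11-25",
    4: "26-100",
    5: "More than 100",
}

# Codes outside the listed values (e.g., 8) become missing (NaN),
# mirroring the "Others = missing" convention in Table 4.1.
recoded = raw.map(books_labels)
print(recoded)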
One consistent theme of research related to the relationship between student achievement and family background is the achievement gap among students of different races and ethnicities. It has been consistently documented over the past four decades that there exist wide gaps, i.e., differences in achievement scores, among students of different racial or ethnic groups, with White and Asian-American students consistently outperforming Black and Hispanic students (Coleman et al., 1966; Jencks & Phillips, 1998; Peng & Hill, 1995; Rodriquez, 1998). For example, based on the 2000 NAEP science assessment, while about 39% of White 4th graders, 40% of White 8th graders, and 22% of White 12th graders reached the proficiency level, only 9% of Black 4th graders, 8% of Black 8th graders, and 2% of Black 12th graders did so (National Center for Education Statistics [NCES], 2001).
More focused analysis of the achievement gaps at different performance levels revealed that Black and Hispanic students underperformed White students even more at the higher quantiles (i.e., 50th, 75th, or 90th percentiles; Haile & Nguyen, 2008). Research has also pointed out that although the achievement gaps gradually narrowed during the 1970s and 1980s, this narrowing trend stabilized and then began to reverse toward widening again during the 1990s (J. Lee, 2002).

One theory explaining the achievement gap among different racial and ethnic groups is cultural discontinuity (Tyler et al., 2008). This theory hypothesizes that for minority students (e.g., African-American, Hispanic) there is a significant difference between the cultural values of their homes and the cultural values practiced in schools. Because school cultural values represent those of the majority group, i.e., White students, minority students often experience a cultural discontinuity in schools, which places them at a disadvantaged position, and thus contributes to their underachievement. Although this theory is conceptually sound and intuitively appealing, systemic and conclusive empirical studies are missing (Tyler et al., 2008).

Various studies have identified determinants of achievement gaps. Muller et al. (2001) found socioeconomic status to be strongly correlated with 8th-grade achievement across all racial–ethnic and gender subgroups. In a comprehensive literature review on the relationship between family environment and children’s school outcomes, Marjoribanks (2005) conceptualized a family model consisting of family background (e.g., human and economic capital, parents’ aspirations, and cultural contexts); intermediate family settings (e.g., sibling structure, family type, and family disruption); and immediate family settings (e.g., family social and cultural capital). This family model could explain the differential learning opportunities available among students. According to Bourdieu (1977, 1986), cultural capital validates individual positions in the social structure, and in turn guides social mobility, social interaction, and the accumulation of social capital. Cultural capital manifests itself in three ways: (a) the habits and tastes acquired by individuals as they grow up in different family settings; (b) cultural objects such as paintings, antiques, and books accumulated by individuals or families; and (c) formal educational qualifications attained by individuals. Bourdieu (1984, 1998) considers economic capital (i.e., financial resources and assets) to be a component in a family social space. A family social space consists of two dimensions: one is the total volume of economic and cultural capital possessed, and the other is the relative amounts of economic and cultural capital. Educational outcomes and social mobility are related to the availability of capital resources. Changes in capital resources result in changes in educational outcomes and social positions – social mobility. Specifically, individuals in a given family space are thought to be conditioned to certain dispositions, tastes, habits, preferences, lifestyles, etc., and as a result, they are expected to attain certain educational and occupational outcomes. Thus, “their habitus begins to form and develop in response to the consistent or changing nature of their social condition” (Marjoribanks, 2005, p. 650). Similarly, Coleman (1988, 1990) differentiated three types of family capital: economic, human, and social. Researchers have argued that family capital needs to be activated in order to
influence student achievement (Lareau & Horvat, 1999). The amount and quality of academically oriented interaction between parents and children at home provide children with access to parents’ human capital (Coleman, 1997). Parent–child interactions in families are found to be more powerful predictors of children’s school outcomes than measures of family capital (Marjoribanks, 2005). Lareau (2000, 2003) showed that parents of different social classes vary in their use of time for children’s leisure activities, language use in the home, and interventions in schooling. J.-S. Lee and Bowen (2006) found that for 3rd, 4th, and 5th graders, parents with different demographic characteristics exhibited different types of involvement in children’s learning, and the types of involvement exhibited by parents of the dominant group (i.e., White) had the strongest positive association with achievement. Poverty and parent education were significant predictors of students’ achievement, i.e., higher achievement was associated with students from families not in poverty, with a European-American background, and with higher parent education. Similarly, Zady (2001) found that mothers of low achievers showed an overdependence on the printed directions while helping their children to complete homework-like science tasks. She suggested that this dependence is related to limited parental literacy, resulting in mediational constraints that serve as a major mechanism through which the achievement gap is reconstructed in each generation of children from at-risk populations despite their potential competence. On the other hand, among minority students, i.e., non-White students, Smith and Hausafus (1998) found that students have higher test scores if parents help them see the importance of taking advanced science and mathematics courses, emphasize the importance of mathematics in today’s careers, set limits, and visit science/mathematics exhibits and fairs with their child. However, McNeal (1999) found that parental involvement generally explained behavioral but not cognitive outcomes and had greater effects for more affluent and White students.

More evidence on the relationship between home and student learning outcomes is available. Kellaghan et al. (1993) identified the following five significant predictors of students’ learning outcomes: (a) work habits of the family (i.e., preference for educational activities over other activities), (b) academic guidance and support (i.e., parents’ guidance on school matters and the provision of facilities in the home for school learning), (c) stimulation to explore and discuss ideas and events, (d) the language environment of homes, and (e) parents’ academic aspirations. Kingston (2001) found cultural capital measured by reading patterns and the number of books at home to be a significant predictor of children’s educational attainment. Using National Education Longitudinal Study (NELS:2000) data sets, Haile and Nguyen (2008) found that family background factors (parental education, parental employment status, parental occupation, family structure, etc.) statistically significantly predict students’ achievement across all performance proficiency levels for both male and female students, and such positive effects become stronger as the performance proficiency level goes up (i.e., higher quantiles).
In particular, the father’s occupation is found to have a significant effect on achievement for both female and male students, although this effect is significantly larger at lower quantiles for females and at higher quantiles for males.
Students from single-parent families significantly underperformed compared to students from two-parent families in 3rd- and 4th-grade science (Pong, 2003). Educational attainment of parents is related to the educational achievement of students; the more television White students view, the lower is their science achievement (Gorman & Yu, 2000). Parental education and encouragement are also important factors in the improvement of student achievement over the years (Wang & Wildman, 1995). The relationship between home and student learning outcomes has also been reported in other countries. For example, Tamir (1993) analyzed the differences between the top 5% of science achievers among 9th- and 12th-grade students in Israel and other students, and found that high achievers tended to be male, to come from small families, and to have parents with more formal education (often in sciences). Norberg-Schonfeldt (2008) analyzed Swedish upper secondary school students’ achievement and their family background, and found that there was a statistically significant positive relationship between parental income and grade point average. Further, mothers working less than full time had a statistically significant positive effect on children’s grades throughout schooling, whereas a significant effect of fathers’ working hours was found only during upper secondary school. However, based on the results of the 2006 Program for International Student Assessment (PISA), which assesses 15-year-old students’ science literacy in terms of their ability to apply science knowledge and skills to reason and solve everyday science problems, Cavanagh (2007) claimed that among the industrialized countries, American students’ academic achievement was more likely to be affected by their family wealth or poverty and family background. Specifically, an estimated 18% of the variation in American students’ science scores was related to students’ socioeconomic status measured by such variables as family possessions (TV, computers, books, etc.), parents’ education and occupation, wealth, family lives, etc., compared to an average of 14% for all industrialized countries.

Within the above research context, in the rest of this chapter I will present the 4th-grade and 8th-grade competence models based on students’ family background and home environment variables. As with the competence models related to teaching opportunities to learn (OTL), I will only present profiles that contain leaves with more than 100 instances. Those who are interested in technical details of the model accuracy can refer to Appendix D.
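The reporting rule just mentioned, presenting only profiles whose leaves contain more than 100 instances, amounts to a simple filter on leaf summaries together with each leaf's prediction accuracy. A minimal sketch of that step is given below; the first leaf uses the counts reported for profile #1 of the 4th-grade model in the next section, and the others are hypothetical.

# Each leaf is summarized as (label, number of instances, number misclassified).
leaves = [
    ("profile #1 (4th grade)", 1668, 143),   # counts reported in the text
    ("hypothetical leaf A", 112, 46),
    ("hypothetical leaf B", 57, 12),          # too small to be reported
]

MIN_INSTANCES = 100  # reporting threshold used in this book

for label, n, errors in leaves:
    if n <= MIN_INSTANCES:
        continue  # leaf is not treated as a prevalent profile
    accuracy = 1 - errors / n
    print(f"{label}: {n} instances, prediction accuracy {accuracy:.1%}")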
Grade 4 Competence Model

Figure 4.1 presents the 4th-grade competence model. As can be seen from the figure, there are 13 prevalent profiles (P#1–P#13), among which 4 (i.e., P#9–P#12) are associated with successfully reaching competence, and 9 (i.e., P#1–P#8 and P#13) are associated with failing to reach competence. Profile #1 has the most instances (1,668), accounting for 23% of the total sample. This profile has a prediction accuracy of 91.4%, with only 143 out of 1,668 instances predicted wrong.
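The competence models in this chapter are presented as decision trees whose root-to-leaf paths form the profiles. Purely as an illustration of that structure, the sketch below fits a CART-style classifier with scikit-learn and prints its paths; the data, column names, and parameters are all invented for the example and do not reproduce the book's actual mining procedure.

import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented, coded home-OTL variables and a competence flag (toy data only).
data = pd.DataFrame({
    "books_at_home":      [1, 2, 5, 4, 5, 3, 1, 4, 5, 2] * 30,  # coded 1-5
    "discuss_studies":    [4, 3, 1, 2, 1, 3, 4, 2, 1, 3] * 30,  # coded 1-4
    "reached_competence": [0, 0, 1, 1, 1, 0, 0, 0, 1, 0] * 30,
})
X = data[["books_at_home", "discuss_studies"]]
y = data["reached_competence"]

# min_samples_leaf loosely mirrors the practice of reporting only leaves
# with a substantial number of instances.
tree = DecisionTreeClassifier(min_samples_leaf=50, random_state=0).fit(X, y)

# Each printed root-to-leaf path is analogous to a "profile": a conjunction
# of conditions that predicts reaching or failing to reach competence.
print(export_text(tree, feature_names=list(X.columns)))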
Fig. 4.1 4th-grade competence model based on home OTL variables
P#1: [25 or fewer books at home] PREDICTS [Failing to reach competence]

Profile #1 shows that the number of books in a home is a significant predictor of students’ competence. More specifically, students with 25 or fewer books at home are associated with failing to reach competence. This profile does not suggest a causal relationship whereby purchasing more than 25 books would enable students to reach competence; instead, it suggests that the learning opportunities available at home, as reflected by the number of books, may affect whether students achieve the competence status.
The number of books available at home could imply such learning opportunities as reading on topics of interest, family culture and orientation toward learning, parents’ knowledge and skills, availability of resources relevant to schoolwork, and so on. Table 4.2 presents the relationship between the number of books available at home and students’ status of reaching NAEP competence. From Table 4.2, we see that the total percentage of students who had 25 or fewer books at home and also reached NAEP competence was 1.9% compared to 20.9% failing to reach competence. For this group of students, the odds-ratio, i.e., the probability of reaching competence over the probability of failing to reach competence, was 0.09. However, the percentage of students who had more than 25 books at home and reached NAEP competence was 25.4% compared to 51.8% failing to reach competence. For this latter group, the odds-ratio was 0.49, a more than fivefold increase from the previous group. Thus, the number of books at home is indeed a good predictor of students’ competence status.

P#2: {[More than 25 books at home] and [Are Black]} PREDICT [Failing to reach competence]

P#3: {[More than 25 books at home] and [Are Hispanic]} PREDICT [Failing to reach competence]

Profiles #2 and #3 predict that Black and Hispanic students, even with more than 25 books at home, could still likely fail to reach NAEP competence. The prediction accuracy was 91.5% for Blacks and 86.1% for Hispanics. Compared to other profiles for Whites and Asian-Americans to be discussed later, profiles #2 and #3 show that Black and Hispanic students have a lower likelihood of reaching NAEP competence. How does the status of reaching NAEP competence differ in general among students of different races and ethnicities? Table 4.3 presents the relationship between students’ race or ethnicity and NAEP competence status.
Table 4.2 Cross-tabulation of the number of books at home and the NAEP competence status
(Columns give counts by NAEP competence status)
How many books are in your home? | No | Yes | Total
None | 110 (1.5%) | 1 (0%) | 111 (1.5%)
1–10 (Few) | 467 (6.5%) | 24 (0.3%) | 491 (6.8%)
11–25 (Fill a shelf) | 930 (12.9%) | 113 (1.6%) | 1,043 (14.5%)
26–100 (Fill a case) | 1,604 (22.3%) | 605 (8.4%) | 2,209 (30.7%)
More than 100 | 2,125 (29.5%) | 1,221 (17.0%) | 3,346 (46.5%)
Total | 5,236 (72.7%) | 1,964 (27.3%) | 7,200 (100.0%)
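The 0.09 and 0.49 odds-ratios discussed for profile #1 come from collapsing the five book categories of Table 4.2 into "25 or fewer" versus "more than 25" books. A small sketch of that aggregation from the table's counts is given below; names are illustrative, and minor differences from the text reflect rounding.

# Counts (reached, failed) from Table 4.2, keyed by book category.
table_4_2 = {
    "None": (1, 110),
    "1-10": (24, 467),
    "11-25": (113, 930),
    "26-100": (605, 1604),
    "More than 100": (1221, 2125),
}

def odds_ratio(categories):
    reached = sum(table_4_2[c][0] for c in categories)
    failed = sum(table_4_2[c][1] for c in categories)
    return reached / failed

print(f"25 or fewer books: {odds_ratio(['None', '1-10', '11-25']):.2f}")       # about 0.09
print(f"More than 25 books: {odds_ratio(['26-100', 'More than 100']):.2f}")    # about 0.49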
Table 4.3 Cross-tabulation of race/ethnicity and the NAEP competence status
(Columns give counts by NAEP competence status)
Which race/ethnicity best describes you? | No | Yes | Total
Unknown | 145 (2.0%) | 38 (0.5%) | 183 (2.5%)
Asian/Pacific Islander | 268 (3.7%) | 98 (1.3%) | 366 (5.0%)
Black | 1,199 (16.4%) | 74 (1.0%) | 1,273 (17.4%)
Hispanic | 750 (10.3%) | 84 (1.2%) | 834 (11.4%)
American Indian | 224 (3.1%) | 65 (0.9%) | 289 (4.0%)
Other | 294 (4.0%) | 85 (1.2%) | 379 (5.2%)
White | 2,437 (33.4%) | 1,541 (21.1%) | 3,978 (54.5%)
Total | 5,317 (72.8%) | 1,985 (27.2%) | 7,302 (100.0%)
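The within-group rates quoted in the next paragraph (e.g., 5.8% of Black students and 38.7% of White students reaching competence) are row percentages of Table 4.3. A minimal sketch of that calculation from the table's counts, with illustrative names:

# (reached competence, group total) counts for selected groups in Table 4.3.
table_4_3 = {
    "Black": (74, 1273),
    "Hispanic": (84, 834),
    "Asian/Pacific Islander": (98, 366),
    "White": (1541, 3978),
}

for group, (reached, total) in table_4_3.items():
    print(f"{group}: {reached / total:.1%} reached competence")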
From Table 4.3, we see that although the overall percentage of students who reached NAEP competence was 27.2% (i.e., 1,985 out of 7,302), the percentages of students reaching competence differed greatly among students of different races and ethnicities. For Black students, the percentage of students reaching competence was 5.8% (i.e., 74 out of 1,273); for Hispanic students, the percentage was 10.1% (i.e., 84 out of 834). However, the percentage for White students was 38.7% (i.e., 1,541 out of 3,978), and for Asian/Pacific Islander students 26.8% (i.e., 98 out of 366). It is clear that students of different races and ethnicities have different likelihoods for reaching NAEP competence.

P#4: {[More than 25 books at home] and [Are White] and [Do not live with mother/stepmother]} PREDICT [Failing to reach competence]

P#5: {[More than 25 books at home] and [Are White] and [Live with mother/stepmother] and [Always speak language other than English at home]} PREDICT [Failing to reach competence]

P#6: {[More than 25 books at home] and [Are White] and [Live with mother/stepmother] and [Sometimes or never speak language other than English at home] and [Have lived in the United States for more than 5 years]} PREDICT [Failing to reach competence]

P#7: {[More than 25 books at home] and [Are White] and [Live with mother/stepmother] and [Sometimes or never speak language other than English at home] and [Have lived in the United States all through life] and [Never or hardly ever discuss studies at home]} PREDICT [Failing to reach competence]
P#8: {[More than 25 books at home] and [Are White] and [Live with mother/stepmother] and [Sometimes or never speak language other than English at home] and [Have lived in the United States all through life] and [Discuss studies at home at least once or twice a month] and [Spend 1–6 h per week on reading not connected to school] and [Mother had high school or less education] and [Have own study desk at home]} PREDICT [Failing to reach competence]

Profiles #4 to #8 all predict the undesirable outcome – failing to reach NAEP competence. Keep in mind that all these profiles pertain to White students who have more than 25 books at home. These profiles show the importance of opportunities to learn available at home represented by such variables as living with the mother/stepmother, the mother’s education, being born in the United States, speaking English at home, discussion of schoolwork at home, and reading beyond schoolwork. For example, not living with the mother could indicate less attention by parents to schoolwork, less discussion of schoolwork at home, fewer family vacations, etc. For those not born in the United States or still speaking a language other than English at home, it could mean that these students have fewer home practice opportunities for the language and literacy (including science literacy) skills learned in school. For those whose mothers’ education is high school or less, the lack of learning opportunities could be less help from mothers on homework, little attention to schoolwork, and fewer resources (e.g., computers) available at home for schoolwork. Finally, for those who read only 1–6 h per week on things not connected to school, it could mean that they do not have sufficient opportunities to broaden their knowledge and skills related to school subjects, and as a result their potential to develop knowledge and interest could be limited.

Table 4.4 presents the relationship between living with the mother/stepmother and the NAEP competence status for White students. As can be seen from Table 4.4, the majority of White students lived with the mother/stepmother (94.4%). Although the overall odds-ratio for White students to reach competence is 0.64 (38.9%/61.1%), the odds-ratio for those who live with the mother/stepmother is 0.68 (38.1%/56.3%), compared to only 0.17 (0.8%/4.8%) for those who do not. Thus, living with the mother/stepmother is associated with a higher likelihood for reaching competence.

Table 4.4 Cross-tabulation of living with mother/stepmother and the NAEP competence status for White students only
(Columns give counts by NAEP competence status)
Does your mother or stepmother live at home with you? | No | Yes | Total
Yes | 2,218 (56.3%) | 1,502 (38.1%) | 3,720 (94.4%)
No | 188 (4.8%) | 32 (0.8%) | 220 (5.6%)
Total | 2,406 (61.1%) | 1,534 (38.9%) | 3,940 (100.0%)

Table 4.5 presents the relationship between speaking English at home and NAEP competence for White students only. From Table 4.5, we can see that most White students (69%) never spoke a language other than English at home.
Table 4.5 Cross-tabulation of English speaking at home and the NAEP competence status for White students only
(Columns give counts by NAEP competence status)
How often is a language other than English spoken at home? | No | Yes | Total
Never | 1,620 (41.1%) | 1,112 (28.2%) | 2,732 (69.3%)
Sometimes | 611 (15.5%) | 395 (10.0%) | 1,006 (25.5%)
Always | 179 (4.5%) | 27 (0.7%) | 206 (5.2%)
Total | 2,410 (61.1%) | 1,534 (38.9%) | 3,944 (100.0%)
Table 4.6 Cross-tabulation of length of living in United States and the NAEP competence status for White students only
(Columns give counts by NAEP competence status)
How long have you lived in the United States? | No | Yes | Total
All my life | 2,180 (55.0%) | 1,485 (37.4%) | 3,665 (92.4%)
More than 5 years | 167 (4.2%) | 39 (1.0%) | 206 (5.2%)
5 years or less | 80 (2.0%) | 16 (0.4%) | 96 (2.4%)
Total | 2,427 (61.2%) | 1,540 (38.8%) | 3,967 (100.0%)
The odds-ratios for reaching competence are 0.69 (28.2%/41.1%) for those “never speaking a language other than English at home,” 0.65 (10.0%/15.5%) for those “sometimes speaking a language other than English at home,” and 0.16 (0.7%/4.5%) for those “always speaking a language other than English at home.” Therefore, always speaking a language other than English at home is associated with a much lower likelihood for reaching competence. Table 4.6 presents the relationship between the length of time living in the United States and the status of reaching competence for White students only. As can be seen from Table 4.6, a majority of White students (92.4%) have lived in the United States all their lives, while about 7.6% of White students have not lived in the United States since birth. Comparing the odds-ratios for reaching competence, students who have lived in the United States all their lives had an odds-ratio of 0.68 (37.4%/55%), students who have lived in the United States for more than 5 years but not all their lives had an odds-ratio of 0.24 (1.0%/4.2%), and those who have lived in the United States for 5 years or less had an odds-ratio of 0.2 (0.4%/2.0%). Thus, there is a huge drop in the odds-ratio for those who have not lived in the United States since birth. Table 4.7 presents the relationship between discussing studies at home and the NAEP competence status for White students only.
Table 4.7 Cross-tabulation of discussing studies at home and the NAEP competence status for White students
(Columns give counts by NAEP competence status)
How often do you discuss studies at home? | No | Yes | Total
Almost every day | 1,285 (32.4%) | 941 (23.7%) | 2,226 (56.1%)
Once or twice a week | 492 (12.4%) | 334 (8.4%) | 826 (20.8%)
Once or twice a month | 162 (4.1%) | 99 (2.5%) | 261 (6.6%)
Never or hardly ever | 486 (12.3%) | 166 (4.2%) | 652 (16.4%)
Total | 2,425 (61.2%) | 1,540 (38.8%) | 3,965 (100.0%)
Table 4.8 Cross-tabulation of extracurricular reading at home and the NAEP competence status for White students
(Columns give counts by NAEP competence status)
How many hours of extracurricular reading per week? | No | Yes | Total
None | 458 (11.6%) | 154 (3.9%) | 612 (15.4%)
1–2 h | 1,183 (29.8%) | 656 (16.5%) | 1,839 (46.4%)
3–4 h | 341 (8.6%) | 289 (7.3%) | 630 (15.9%)
5–6 h | 149 (3.8%) | 125 (3.2%) | 274 (6.9%)
7–8 h | 87 (2.2%) | 96 (2.4%) | 183 (4.6%)
9–10 h | 62 (1.6%) | 69 (1.7%) | 131 (3.3%)
>10 h | 149 (3.8%) | 147 (3.7%) | 296 (7.5%)
Total | 2,429 (61.3%) | 1,536 (38.7%) | 3,965 (100.0%)
Table 4.7 shows that although over half of the students (56.1%) discuss studies at home almost every day, there are also a large number (16.4%) who never or hardly ever discuss studies at home. The odds-ratio for reaching competence for students who discuss studies at home almost every day is 0.73 (23.7%/32.4%), while for those who never or hardly ever discuss studies at home it is only 0.34 (4.2%/12.3%). There is a clear trend that the more frequent the discussion of studies at home, the higher the odds-ratio. Table 4.8 presents the relationship between reading at home on things not connected to school and the NAEP competence status for White students only.
We can see from Table 4.8 that students spent various amounts of time on extracurricular reading, with most students (46.4%) spending 1–2 h a week. The odds-ratios for reaching competence associated with different amounts of extracurricular reading are as follows: 0.34 (3.9%/11.6%) for “None,” 0.55 (16.5%/29.8%) for “1–2 h,” 0.85 (7.3%/8.6%) for “3–4 h,” 0.84 (3.2%/3.8%) for “5–6 h,” 1.1 (2.4%/2.2%) for “7–8 h,” 1.1 (1.7%/1.6%) for “9–10 h,” and 0.97 (3.7%/3.8%) for “more than 10 h.” Thus, there is a clear general trend that a higher odds-ratio is associated with more extracurricular reading. The highest odds-ratios are associated with reading for 7–10 h a week.

Table 4.9 presents the relationship between mother’s education and the NAEP competence status for White students. From Table 4.9 we can see that most students (58.1%) had a mother who graduated from college. The odds-ratios associated with the various levels of mother’s education are as follows: 1.1 (30.6%/27.6%) for “Graduated from college,” 0.94 (5.8%/6.2%) for “Some education after high school,” 0.5 (8.0%/16.0%) for “Graduated from high school,” and 0.21 (1.0%/4.9%) for “Did not finish high school.” It is clear that there is a consistent trend between mother’s education and the odds-ratio for reaching competence, i.e., the higher the mother’s education, the higher the odds-ratio. Table 4.10 presents the relationship between having one’s own study desk at home and the NAEP competence status among White students.

Table 4.9 Cross-tabulation of mother’s education and the NAEP competence status for White students
(Columns give counts by NAEP competence status)
How much education did your mother receive? | No | Yes | Total
Did not finish high school | 118 (4.9%) | 23 (1.0%) | 141 (5.9%)
Graduated from high school | 386 (16.0%) | 193 (8.0%) | 579 (24.0%)
Some education after high school | 149 (6.2%) | 140 (5.8%) | 289 (12.0%)
Graduated from college | 664 (27.6%) | 737 (30.6%) | 1,401 (58.1%)
Total | 1,317 (54.6%) | 1,093 (45.4%) | 2,410 (100.0%)
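The trend just described, odds-ratios rising with the mother's education, can be checked directly from the counts in Table 4.9. A brief sketch of that check is below; names are illustrative, and the small differences from the text's values come from the text dividing rounded percentages rather than raw counts.

# (reached, failed) counts from Table 4.9, ordered from lowest to highest
# level of mother's education.
mother_education = [
    ("Did not finish high school", 23, 118),
    ("Graduated from high school", 193, 386),
    ("Some education after high school", 140, 149),
    ("Graduated from college", 737, 664),
]

ratios = [(label, reached / failed) for label, reached, failed in mother_education]
for label, ratio in ratios:
    print(f"{label}: {ratio:.2f}")

values = [ratio for _, ratio in ratios]
print("Odds-ratio increases with education:", all(a <= b for a, b in zip(values, values[1:])))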
Table 4.10 Cross-tabulation of having own study desk at home and the NAEP competence status among White students
(Columns give counts by NAEP competence status)
Do you have your own study desk or table at home? | No | Yes | Total
Yes | 1,835 (46.7%) | 1,218 (31.0%) | 3,053 (77.7%)
No | 565 (14.4%) | 309 (7.9%) | 874 (22.3%)
Total | 2,400 (61.1%) | 1,527 (38.9%) | 3,927 (100.0%)
Table 4.10 shows that while 77.7% of students had a study desk or table at home, 22.3% did not. The odds-ratio for students having a study desk at home is 0.66 (31.0%/46.7%), but 0.56 (7.9%/14.4%) for students not having a study desk at home. Thus, having a study desk at home is associated with a slightly higher odds-ratio.

P#9: {[More than 100 books at home] and [Are White] and [Live with mother/stepmother] and [Always speak English at home] and [Have lived in the United States all through life] and [Discuss studies at home at least once or twice a month] and [Spend 6 or fewer hours per week on reading not connected to school] and [Mother had postsecondary education] and [Family gets magazines regularly] and [Family gets newspaper regularly] and [Have never changed school since 1st grade] and [Father had postsecondary education] and [Spend 0.5 h or less on homework] and [The mother/stepmother works at a job for pay]} PREDICT [Reaching competence]

P#10: {[More than 100 books at home] and [Are White] and [Live with mother/stepmother] and [Sometimes or never speak language other than English at home] and [Have lived in the United States all through life] and [Discuss studies at home at least once or twice a week] and [Spend 6 or fewer hours per week on reading not connected to school] and [Mother had postsecondary education] and [Family gets magazines regularly] and [Family gets newspaper regularly] and [Have changed school since 1st grade at most once] and [Father had postsecondary education] and [Spend at least 1 h per week on homework] and [There is an encyclopedia at home] and [Using home computer for homework at least once or twice a month] and [Have own study desk]} PREDICT [Reaching competence]

P#11: {[More than 100 books at home] and [Are White] and [Live with mother/stepmother] and [Sometimes or never speak language other than English at home] and [Have lived in the United States all through life] and [Discuss studies at home at least once or twice a month] and [Spend 6 or fewer hours per week on reading not connected to school] and [Mother had postsecondary education] and [Family gets magazines regularly] and [Family gets newspaper regularly] and [Have changed school since 1st grade at most once] and [Father had postsecondary education] and [Spend at least 1 h per week on homework] and [There is an encyclopedia at home] and [Never or hardly ever used home computer for homework]} PREDICT [Reaching competence]

P#12: {[More than 100 books at home] and [Are White] and [Live with mother/stepmother] and [Sometimes to never speak language other than English at home] and [Have lived in the United States all through life] and [Discuss studies at home at least once or twice a month] and [Spend 7 or more hours on reading not connected to school] and [Watch television/video 4 or fewer hours on a school day] and [There is an encyclopedia at home] and [Father had postsecondary education] and [The mother/stepmother works at a job for pay]} PREDICT [Reaching competence]

Profiles #9 to #12 are associated with the desirable outcome – reaching NAEP competence. Again, these profiles pertain to White students. These profiles consist of variables that afford a wide variety of learning opportunities available at home that could facilitate students reaching competence.
These variables include: (a) adequate resources at home such as a sufficient number of books, encyclopedias, study desks, subscriptions to magazines and newspapers, and home computers; (b) living with both parents who have postsecondary education, who care about their children’s education by regularly discussing schoolwork at home, and who have stable employment with infrequent moving; (c) being born in the United States with English as the native language; and (d) spending an adequate amount of time on extra reading and homework instead of watching an excessive amount of television/video. The learning opportunities afforded by the above variables include material resources, time for study, and home language and literacy practices. All of the above are conducive to student learning at home, which in turn is beneficial for students to reach NAEP competence.

P#13: {[More than 25 books at home] and [Are Asian/Pacific Islanders] and [Discuss studies at least once or twice a month] and [Live with father/stepfather] and [Watch 2 or more hours television/video on a school day]} PREDICT [Failing to reach competence]

Profile #13 pertains to students of Asian and Pacific Islander origin. This profile predicts the undesirable learning outcome – failing to reach the NAEP competence level. Profile #13 suggests that for Asian and Pacific Islander students, even though they may have a sufficient number of books at home, parents may care about their education by discussing schoolwork regularly, and they may live with their fathers, watching too much television/video on a school day could indicate a loss of learning opportunities for such activities as doing homework, reading books and magazines, participating in extracurricular activities, etc. Living with their fathers was included in this profile due to the fact that in the Asian tradition fathers are typically the family’s main income source; thus, living with their fathers could mean better resources at home. Table 4.11 presents the relationship between time spent watching television/video and the NAEP competence status for Asian students.

Table 4.11 Cross-tabulation of amount of time on TV/video and the NAEP competence status for Asian students
(Columns give counts by NAEP competence status)
How much television/video do you watch on a school day? | No | Yes | Total
None | 15 (4.1%) | 15 (4.1%) | 30 (8.2%)
Less than 1 h | 56 (15.4%) | 32 (8.8%) | 88 (24.2%)
2 h | 47 (12.9%) | 22 (6.0%) | 69 (19.0%)
3 h | 45 (12.4%) | 14 (3.8%) | 59 (16.2%)
4 h | 29 (8.0%) | 6 (1.6%) | 35 (9.6%)
5 h | 23 (6.3%) | 6 (1.6%) | 29 (8.0%)
More than 6 h | 51 (14.0%) | 3 (0.8%) | 54 (14.8%)
Total | 266 (73.1%) | 98 (26.9%) | 364 (100.0%)
Table 4.11 shows that the Asian student sample is very small, with only 364 students. Among these students, the largest proportion (24.2%) watched less than 1 h of television/video on a school day. The odds-ratio for students who do not watch television/video at all is 1.0 (4.1%/4.1%), 0.57 (8.8%/15.4%) for watching less than 1 h on a school day, and 0.47 (6.0%/12.9%) for watching 2 h on a school day. The odds-ratios become smaller and smaller as more and more time is spent on watching television/video on a school day. Thus, for Asian students, less time spent on watching television/video on a school day is associated with a higher odds-ratio for reaching competence.
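Because the Asian/Pacific Islander sample in Table 4.11 is small, several cells contain few students, and the corresponding odds-ratios should be read cautiously. A brief sketch that reproduces the ratios from the table's counts and flags categories with small cells; the threshold of 30 is arbitrary and purely illustrative.

# (reached, failed) counts from Table 4.11 for Asian/Pacific Islander students.
tv_watching = [
    ("None", 15, 15),
    ("Less than 1 h", 32, 56),
    ("2 h", 22, 47),
    ("3 h", 14, 45),
    ("4 h", 6, 29),
    ("5 h", 6, 23),
    ("More than 6 h", 3, 51),
]

MIN_CELL = 30  # arbitrary illustrative threshold for flagging small cells

for label, reached, failed in tv_watching:
    note = " (small cells)" if min(reached, failed) < MIN_CELL else ""
    print(f"{label}: {reached / failed:.2f}{note}")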
Grade 8 Competence Model

Figure 4.2 presents the grade 8 competence model based on home OTL variables. From Fig. 4.2, we see that there are 13 profiles in the competence model. As in the grade 4 competence model, the variable at the root is the number of books available at home, followed by the variable on race and ethnicity. Among the 13 profiles, 11 (i.e., profiles #1 to #10 and profile #13) predict the undesirable outcome – failing to reach competence; two of the profiles, i.e., profiles #11 and #12, are associated with the desirable outcome – reaching competence.

P#1: {[Less than 100 books at home] and [Are Black]} PREDICT [Failing to reach competence]

P#2: {[Less than 100 books at home] and [Are Hispanic]} PREDICT [Failing to reach competence]

P#3: {[Less than 100 books at home] and [Are American Indian]} PREDICT [Failing to reach competence]

Profiles #1 to #3 predict the undesirable outcome – failing to reach competence. Among all those students who have fewer than 100 books at home (close to half of the total sample), race and ethnicity is a main predictor of student competence status. More specifically, Black, Hispanic, and American Indian students are predicted not to reach competence. The prediction accuracy is 97.6% for Black students, 92.1% for Hispanic students, and 86.3% for American Indian students. Profiles #1 to #3 suggest that being Black, Hispanic, or American Indian, compared to being White or Asian/Pacific Islander, is associated with a lower likelihood for reaching competence. Table 4.12 presents the relationship between students’ race or ethnicity and their NAEP competence status among only those students who have fewer than 100 books at home. From Table 4.12, we see that among 4,517 students (a little more than 50% of the total sample) who had fewer than 100 books at home, the overall percentage reaching the NAEP competence status was 15.7% compared to 84.3% failing to reach competence. However, this ratio of reaching competence over failing to reach competence, i.e., the odds-ratio, varies greatly among different races and ethnicities.
Fig. 4.2 8th-grade competence model based on home OTL variables
Specifically, the odds-ratios are 0.021 (0.5%/23.4%) for Black, 0.082 (1.5%/18.2%) for Hispanic, 0.16 (0.3%/1.9%) for American Indian, 0.23 (1.1%/4.4%) for Asian/Pacific Islander, and 0.35 (11.6%/33.0%) for White students. Students of different races and ethnicities clearly have different likelihoods of reaching the competence level, suggesting that there are differential opportunities to learn at home among students of different races and ethnicities.

P#4: {[More than 100 books at home] and [Are Black]} PREDICT [Failing to reach competence]
Table 4.12 Cross-tabulation of race/ethnicity and the NAEP competence status among students who have fewer than 100 books at home
(Columns give counts by NAEP competence status)
Which race/ethnicity best describes you? | No | Yes | Total
Unknown | 44 (1.0%) | 7 (0.2%) | 51 (1.1%)
Asian/Pacific Islander | 200 (4.4%) | 48 (1.1%) | 248 (5.5%)
Black | 1,055 (23.4%) | 24 (0.5%) | 1,079 (23.9%)
Hispanic | 823 (18.2%) | 70 (1.5%) | 893 (19.8%)
American Indian | 84 (1.9%) | 14 (0.3%) | 98 (2.2%)
Other | 110 (2.4%) | 22 (0.5%) | 132 (2.9%)
White | 1,490 (33.0%) | 526 (11.6%) | 2,016 (44.6%)
Total | 3,806 (84.3%) | 711 (15.7%) | 4,517 (100.0%)
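With record-level data, cross-tabulations like Table 4.12 are typically built by restricting to a subgroup and tabulating an outcome against a background variable. A minimal sketch of that step with pandas is given below; the tiny data frame is entirely invented, and only the roles of the variables mirror the analysis here.

import pandas as pd

# Invented record-level data: race/ethnicity, books at home (coded 1-5 as in
# Table 4.1), and whether the student reached competence.
students = pd.DataFrame({
    "race":    ["Black", "White", "White", "Hispanic", "Black", "White"] * 50,
    "books":   [2, 5, 3, 2, 1, 4] * 50,
    "reached": [0, 1, 0, 0, 0, 1] * 50,
})

# Restrict to students with fewer than 100 books (codes 1-4) and
# cross-tabulate race against the competence flag, as in Table 4.12.
subset = students[students["books"] <= 4]
crosstab = pd.crosstab(subset["race"], subset["reached"])
print(crosstab)

# Odds-ratio per group: reached (column 1) over failed (column 0).
print(crosstab[1] / crosstab[0])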
P#5: {[More than 100 books at home] and [Are Hispanic]} PREDICT [Failing to reach competence]

Profiles #4 and #5 are associated with the undesirable outcome for both the Black and Hispanic students who have more than 100 books at home. Although the total instances for these two profiles are low compared to those for profiles #1 and #2, the prediction accuracy is overall high (89.4% for Black and 77.2% for Hispanic). Profiles #4 and #5 suggest that even with more than 100 books at home, Black and Hispanic students still have a lower chance of reaching competence. Table 4.13 presents the relationship between students’ race or ethnicity and their NAEP competence status among only those students who have more than 100 books at home. From Table 4.13, we see that although the overall percentage for reaching the NAEP competence status among the students who have more than 100 books at home is 40.9%, the percentages of reaching competence for students of different races and ethnicities differ greatly. Specifically, for Black students, the odds-ratio between reaching competence and failing to reach competence is 0.12 (1.3%/11.2%). For Hispanic students the odds-ratio is 0.29 (1.9%/6.5%), and for American Indian students 0.43 (0.6%/1.4%), compared to 0.98 (33.6%/34.4%) for White students, and 0.76 (1.6%/2.1%) for Asian/Pacific Islander students. As for the students who have fewer than 100 books at home discussed earlier, among students who have more than 100 books at home the chance for White and Asian/Pacific Islander students to reach competence is much higher than that for other racial and ethnic groups.
Table 4.13 Cross-tabulation of race/ethnicity and the NAEP competence status among students who have more than 100 books at home
(Columns give counts by NAEP competence status)
Which race/ethnicity best describes you? | No | Yes | Total
Unknown | 32 (1.0%) | 13 (0.4%) | 45 (1.4%)
Asian/Pacific Islander | 66 (2.1%) | 51 (1.6%) | 117 (3.8%)
Black | 349 (11.2%) | 41 (1.3%) | 390 (12.5%)
Hispanic | 201 (6.5%) | 60 (1.9%) | 261 (8.4%)
American Indian | 42 (1.4%) | 20 (0.6%) | 62 (2.0%)
Other | 81 (2.6%) | 40 (1.3%) | 121 (3.9%)
White | 1,069 (34.4%) | 1,046 (33.6%) | 2,115 (68.0%)
Total | 1,840 (59.1%) | 1,271 (40.9%) | 3,111 (100.0%)
P#6: {[10–100 books at home] and [Are Asian/Pacific Islanders] and [Live with mother/stepmother] and [Sometimes to always speak language other than English at home] and [Spend 2 or fewer hours per week on extra reading not connected to school]} PREDICT [Failing to reach competence]

Profile #6 pertains to Asian/Pacific Islander students only. This profile has only 135 instances, with 20 of them (14.8%) classified incorrectly. The key features in this profile are sometimes or always speaking a language other than English at home, and spending 2 or fewer hours per week on extra reading not connected to school. This seems to describe an ESL Asian family that may not have been in the United States for long. The lack of opportunities to learn at home for these students seems to be primarily associated with not speaking English at home and doing little extracurricular reading. Table 4.14 presents the relationship between English speaking at home and the NAEP competence status for Asian students. From Table 4.14, we can see that the Asian/Pacific Islander student sample in the NAEP survey is very small, with only 363 students. Although the overall odds-ratio for this group of students to reach competence is 0.38 (27.3%/72.7%), the odds-ratios for students with different levels of English spoken at home vary greatly. Specifically, the odds-ratio for students who never speak a language other than English at home is 0.60 (3.0%/5.0%), but for those who sometimes speak a language other than English at home it is 0.30 (11.6%/39.1%), and for students who always speak a language other than English at home it is 0.44 (12.7%/28.7%). Thus, never speaking a language other than English at home is associated with the highest likelihood for reaching competence. Table 4.15 presents the relationship between the amount of extracurricular reading per week and the NAEP competence status for Asian students.
Table 4.14 Cross-tabulation of English-speaking at home and the NAEP competence status for Asian students
(Columns give counts by NAEP competence status)
How often is a language other than English spoken at home? | No | Yes | Total
Never | 18 (5.0%) | 11 (3.0%) | 29 (8.0%)
Sometimes | 142 (39.1%) | 42 (11.6%) | 184 (50.7%)
Always | 104 (28.7%) | 46 (12.7%) | 150 (41.3%)
Total | 264 (72.7%) | 99 (27.3%) | 363 (100.0%)
Table 4.15 Cross-tabulation of extracurricular reading and the NAEP competence status for Asian students
(Columns give counts by NAEP competence status)
How many hours per week on reading not connected to school? | No | Yes | Total
2 h or less | 193 (52.9%) | 50 (13.7%) | 243 (66.6%)
More than 2 h | 73 (20.0%) | 49 (13.4%) | 122 (33.4%)
Total | 266 (72.9%) | 99 (27.1%) | 365 (100.0%)
From Table 4.15, we see that most Asian students spend 2 h or less on extracurricular reading. The odds-ratio of reaching competence for students who read 2 h or less is 0.26 (13.7%/52.9%), compared to an odds-ratio of 0.67 (13.4%/20.0%) for students who read more than 2 h a week. More extracurricular reading is associated with a higher odds-ratio for reaching the competence status.

P#7: {[100 or fewer books at home] and [Are White] and [Father has high school or less education]} PREDICT [Failing to reach competence]

P#8: {[100 or fewer books at home] and [Are White] and [Father has postsecondary education] and [Never or hardly ever use home computer for schoolwork]} PREDICT [Failing to reach competence]

P#9: {[25 or fewer books at home] and [Are White] and [Father has postsecondary education] and [Use home computer for schoolwork at least once or twice a month] and [Mother has postsecondary education]} PREDICT [Failing to reach competence]

P#10: {[25–100 books at home] and [Are White] and [Father has postsecondary education] and [Use home computer for schoolwork once or twice a week or less] and [Mother has postsecondary education] and [Spend no time on extra reading per week not connected with school]} PREDICT [Failing to reach competence]

Profiles #7 to #10 pertain to White students only. All these profiles are associated with the undesirable outcome – failing to reach competence.
Table 4.16 Cross-tabulation of number of books at home and the NAEP competence status for White students
(Columns give counts by NAEP competence status)
About how many books are in your home? | No | Yes | Total
0–25 | 502 (12.1%) | 108 (2.7%) | 610 (14.8%)
26–100 | 988 (23.9%) | 418 (10.1%) | 1,406 (34.0%)
More than 100 | 1,069 (25.9%) | 1,046 (25.3%) | 2,115 (51.2%)
Total | 2,559 (61.9%) | 1,572 (38.1%) | 4,131 (100.0%)
Table 4.17 Cross-tabulation of extracurricular reading and the NAEP competence status for White students
(Columns give counts by NAEP competence status)
How many hours per week on reading not connected to school? | No | Yes | Total
None | 919 (22.4%) | 353 (8.6%) | 1,272 (31.0%)
1–2 h | 1,060 (25.8%) | 642 (15.6%) | 1,702 (41.4%)
3 h or more | 561 (13.6%) | 573 (14.0%) | 1,134 (27.6%)
Total | 2,540 (61.8%) | 1,568 (38.2%) | 4,108 (100.0%)
Two key features in these profiles that could reduce the likelihood of reaching the competence status are having fewer books at home and infrequently doing extracurricular reading at home. Tables 4.16 and 4.17 show the relationships between these two features and the NAEP competence status. Table 4.16 shows that although most White students (51.2%) had more than 100 books at home, a large number of students had fewer than 100. The odds-ratio for students who have more than 100 books at home is 0.98 (25.3%/25.9%), but for those who have 26–100 books it is only 0.42 (10.1%/23.9%), and 0.22 (2.7%/12.1%) for those who have 25 or fewer books. Thus, having more books at home is associated with a higher likelihood for reaching the competence status. From Table 4.17, we see that most students (41.4%) read 1–2 h per week on things not connected with school. Students who read 3 or more hours per week have an odds-ratio of 1.0 (14.0%/13.6%) for reaching competence, compared to 0.60 (15.6%/25.8%) for students who read 1–2 h per week and 0.38 (8.6%/22.4%) for students who do not read at all. Thus, the more time spent on reading not connected with school, the higher the likelihood for reaching competence.

P#11: {[More than 100 books at home] and [Are White] and [Father has at least college education] and [Discuss studies at home at least once or twice a week] and [Sometimes to never speak language other than English at home] and [Watch television/video 1 h or less on a school day] and [Get newspaper regularly] and [Spend 0.5 h or less on homework]} PREDICT [Reaching competence]
P#12: {[More than 100 books at home] and [Are White] and [Father has postsecondary education] and [Discuss studies at home at least once or twice a week] and [Sometimes to never speak language other than English at home] and [Watch television/video 1 h or less on a school day] and [Get newspaper regularly] and [Spend more than 1 h a week on homework] and [Live with father/stepfather]} PREDICT [Reaching competence]

Profiles #11 and #12 once again pertain to White students only, but they are associated with the desirable outcome – reaching competence. Key features that differentiate them from profiles #7 to #10 are (a) the father's or parents' postsecondary education, (b) speaking English at home, (c) regular discussion of studies at home, (d) spending an adequate amount of time on homework and not watching television/video excessively, and (e) more resources at home (more than 100 books and a newspaper subscription). All of the above should indicate more opportunities to learn at home.

P#13: {[More than 100 books at home] and [Are White] and [Father has high school or less education] and [Watch television/video 5 h or less a week] and [Sometimes to never speak language other than English] and [Spend no time on extra reading not connected to schoolwork]} PREDICT [Failing to reach competence]

Profile #13 once again predicts the undesirable outcome – failing to reach competence. This profile pertains to White students who have more than 100 books at home. Key features in this profile are the father's high school or less education and spending no time on extra reading not connected to schoolwork. This profile seems to suggest that even if there are sufficient resources at home, as indicated by more than 100 books, children will lose valuable learning opportunities if parents do not encourage them to read beyond schoolwork, thus reducing the likelihood of reaching competence.
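The likelihood comparisons reported for Tables 4.16 and 4.17 (and for the similar tables in the next chapter) are simple ratios computed within each row of the cross-tabulation: the percentage reaching competence divided by the percentage failing to reach it. The sketch below is a minimal illustration of that arithmetic only; it assumes a pandas DataFrame with hypothetical column names (books_at_home, competence) and is not the author's analysis code.

```python
# Sketch: build a cross-tabulation and compute the per-group "odds-ratio"
# (share reaching competence divided by share failing) as reported in Table 4.16.
import pandas as pd

def group_odds(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.DataFrame:
    # Percentages of the grand total, as printed in the book's tables
    pct = pd.crosstab(df[group_col], df[outcome_col], normalize="all") * 100
    # Outcome assumed coded 1 = reaching competence, 0 = failing to reach it
    pct["odds_ratio"] = (pct[1] / pct[0]).round(2)   # e.g., 25.3 / 25.9 -> 0.98 for "More than 100"
    return pct

# Hypothetical usage with recoded NAEP home-background data:
# df = pd.read_csv("naep_home_otl_grade8.csv")
# print(group_odds(df, "books_at_home", "competence"))
```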
Summary

The 4th-grade competence model has 13 prevalent profiles. Profile #1 has the most instances and predicts the undesirable outcome – failing to reach competence. The significant predictor for this profile is having 25 or fewer books at home. Black and Hispanic students have a much lower likelihood to reach competence than White students. The most prevalent profile predicting the desirable outcome, i.e., reaching competence, is P#12. This profile contains the following variables: (a) more than 100 books at home, (b) being White, (c) living with mother/stepmother, (d) sometimes or never speaking a language other than English at home, (e) having lived in the United States all their life, (f) discussing studies at home at least once or twice a month, (g) 7 or more hours a week on reading not connected to school, (h) watching television/video less than 4 h on a school day, (i) an encyclopedia at home, (j) father with postsecondary education, and (k) mother/stepmother working at a job for pay.
For White students, variables associated with a higher likelihood for reaching competence are: (a) more than 25 books at home, (b) living with mother/stepmother, (c) never speaking a language other than English at home, (d) being US born, (e) discussing studies at home almost every day, (f) extracurricular reading of 7–10 h a week, (g) mother with college education, and (h) having their own study desk. For Asian/Pacific Islander students, a higher likelihood for reaching competence is associated with watching less television/video during a school day.

The 8th-grade competence model also has 13 prevalent profiles. Profile #1 has the most instances and predicts the undesirable outcome – failing to reach competence. This profile contains the following variables: (a) fewer than 100 books at home and (b) being Black. Profile #12 has the most instances among the profiles predicting reaching competence. This profile contains the following variables: (a) more than 100 books at home, (b) being White, (c) father with postsecondary education, (d) discussing studies at home at least once or twice a week, (e) sometimes or never speaking a language other than English at home, (f) watching television/video 1 h or less on a school day, (g) getting a newspaper regularly, (h) more than 1 h a week on homework, and (i) living with father/stepfather. White and Asian students have a much higher likelihood for reaching the competence level. For White students, a higher likelihood for reaching competence is associated with more than 100 books at home and extracurricular reading of more than 3 h per week. For Asian students, a higher likelihood for reaching competence is associated with never speaking a language other than English at home and extracurricular reading of more than 2 h per week.

There are many common significant predictors between the 4th- and 8th-grade competence models. These common predictors are related to race or ethnicity, number of books at home, whether or not living with father or mother/stepmother, whether or not speaking a language other than English at home, frequency of discussing studies at home, amount of time watching television/video on a school day, father's or mother's/stepmother's education, and the amount of extracurricular reading.
Chapter 5
Models of Competence and Opportunities to Learn in Schools
The NAEP survey collects information on a wide range of variables pertaining to schools. These variables represent opportunities to learn due to such school aspects as student grouping (by ability, math, reading, etc.), school curriculum emphasis and priority (science, art, etc.), frequencies of science instruction, computer availability, curricula, frequency of field trips, library staffing, parent participation, student behavior, teacher morale, and so on. Table 5.1 presents sample questions from the NAEP school background questionnaire and variables created based on them; a complete list of the variables is available in Appendix C. When students are in schools, they are placed in individual classrooms. However, individual classrooms exist within a school context. A school as a unit has its own culture that affords various opportunities to learn. This varying effect of school culture on student learning has been called hidden curricula or implicit curricula as compared to explicitly stated curricula implemented in classrooms (Cornbleth, 1984). “In school, students seem to learn much that is not publicly set forth in official statements of school philosophy or purpose or in course guides and syllabi. … Implicit curricula consist of the messages imparted by the classroom and school environment” (Cornbleth, 1984, p. 29). Implicit curricula in schools convey a variety of messages, sometimes contradictory to each other. It is up to individual students to make sense of them. Besides school curricula that may be formal, informal, or hidden, schools impact students through their organization (staffing plans, physical organization, etc.) and curriculum orientations (traditional, constructivist, and personal relevance) (Barbour & Barbour, 1997). Schools can differ greatly in terms of their educational policies, and such between-school differences are particularly evident in the United States in which education is overall decentralized in terms of governance, curriculum, and testing (Burstein et al., 1980). It is commonly known that schools in the United States have varying sources of funding due to their varying tax bases from neighborhood to neighborhood. As a result, schools impact on students through “a combination of peer effects from classmates such as role models and norms, the advantages of the school such as teacher quality and resources, and the advantages of the neighborhood such as better infrastructure, adult role models and access to jobs” (Levine & Painter, 2008, p. 460). In a study using the National Education Longitudinal Survey (NELS) data
Table 5.1 Sample 1996 NAEP science survey variables pertaining to schools

NAEP variable     Grade   NAEP variable label                                          Recoded NAEP variable values
C034512/C031205   8, 4    How often do 8th/4th graders receive science instruction?    1 = Everyday; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C031607           8, 4    Has science been identified as a priority?                   1 = Yes; 0 = No; Others = Missing
C035701           8, 4    Are computers available all the time in the classroom?       1 = Yes; 2 = No; Others = Missing
C037302           8, 4    Do you follow district/state science curriculum?             1 = Yes; 2 = No; Others = Missing
C039602           8, 4    Are there 8th/4th graders in summer programs for science?    1 = Yes; 2 = No; Others = Missing
C032409           8, 4    Is lack of parent involvement a problem in school?           1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
sets, Levine and Painter (2008) found that a one standard deviation (SD) increase in average school test scores raises the prediction of an individual student’s test scores by 0.82 SD, and even after further controlling for the effects of family backgrounds, such school effect remains significant (with a standardized regression coefficient of 0.43). Further, students moving to a better high school than their junior high school could result in an increase in their test scores by 0.34 SD. School resources have been found to correlate significantly with students’ achievement; for example, there is a positive relationship between the number of computers available in schools and students’ science achievement (Wen et al., 2002). Technology in general does not automatically increase student achievement in science. Based on analyses of data sets from multiple surveys, Hilton (2003) found that technology training for teachers increased their use of computers for instruction but students’ final science course grades did not improve. Students’ NAEP science test scores were found to be significantly correlated with school library’s media center video collection, although school library media specialists were not consistently and confidently poised to be instructional collaborators to directly impact on students’ achievement (Mardis, 2005). Other school context variables also impact on student achievement. When faculty in a school science department form collegial relationships, the effects of their instructional practices on students’ achievement growth intensify (Yasumoto et al., 2007). Based on
statewide science assessment scores at grades 3, 5, and 8, Miller-Whitehead (2001) found that free and reduced lunch accounted for 54% of variance. At grade 5, meeting class-size standards accounted for the greatest amount of explained variability across schools in science-scale scores. Per pupil expenditure had a positive effect on the science-scale score achievement for grade 5 students as well. Parent involvement in schools has been found to have a strong positive effect on students’ achievement, and such an effect is relatively large compared to that of school resources (Houtenville & Conway, 2008). There are various ways for parents to be involved in schools. Epstein (1994) identified six reciprocal parental and school practices for parent involvement. They are: (a) basic parental obligations (i.e., providing safe and healthy conditions at home), (b) effective communication (i.e., parents receiving information about children and school activities), (c) parent volunteers/audiences (i.e., parents attending school activities and acting as volunteers in the classroom and school), (d) family involvement in the home (i.e., parent involvement in student homework and other academic programs), (e) parent decision making (i.e., parent participation in advocacy, leadership, and school decision-making processes through such organizations as parent–teacher association, school-site management, etc.), and (f) community exchange and collaboration (i.e., parent participation in forming a wider community involving home, school, and other organizations). Similarly, Jordan et al. (2001) identified the following 11 types of parental connections with school: (a) homework help, (b) supportive home environment, (c) home–school communication and interaction, (d) parent participation in school activities, (e) literacy development in the home, (f) parents as tutors, (g) parental academic and emotional support and future expectation, (h) out-of-school learning opportunities, (i) home discussion about school issues, (j) parents as role models, and (k) parents as educational advocates. Empirical evidence exists supporting the relationship between parent involvement in schools and student academic achievement. In a study involving a representative sample of 3rd through 5th graders in a suburban community, Lee and Bowen (2006) found that parent involvement at school significantly predicted students’ academic achievement. However, European American parents reported more frequent involvement at school and less frequent efforts to manage their children’s time use at home than both Hispanic/Latino and African-American parents. Relative to parents whose children did not take part in the lunch program, parents whose children received free or reduced-price lunches at school reported less frequent involvement at school. Further, parents who had 2-year college education or higher post-secondary education degrees reported significantly more frequent involvement at school. Parent involvement in student homework has also been found to significantly increase student grades. For example, in a quasi-experimental study involving 6th and 8th grade students, van Voorhis (2003) found that students who were in the Teachers Involve Parents in Schoolwork (TIPS) program received significantly higher science grades than students who were not. Thus, the effect of schools on student learning is complex; various factors interact. How do various interactions of factors within a school impact on students’ competence status? 
This chapter will identify patterns that significantly predict
students' competence status based on school context variables. In the remainder of this chapter, I will present the 4th grade and 8th grade competence models based on these variables. As with the competence models presented in previous chapters, I will only present profiles that contain leaves of more than 100 instances. Accuracy measures of these models are available in Appendix D.
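To make the structure of these models concrete, the sketch below shows how a competence model of this kind might be fit and how prevalent profiles could be enumerated. It is a minimal illustration only: it assumes a pandas DataFrame of already-recoded school OTL variables (the column and file names are hypothetical) and uses scikit-learn's decision tree as a stand-in for the data-mining tool described earlier in the book; it is not the author's actual procedure.

```python
# Minimal sketch: fit a classification tree on recoded school OTL variables and
# list "prevalent profiles" (root-to-leaf paths whose leaves hold >= 100 cases).
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

def prevalent_profiles(df: pd.DataFrame, outcome: str, min_leaf: int = 100):
    X = pd.get_dummies(df.drop(columns=[outcome]))   # one-hot encode categorical OTL variables
    y = df[outcome]                                   # 1 = reaching competence, 0 = failing
    tree = DecisionTreeClassifier(min_samples_leaf=min_leaf, random_state=0).fit(X, y)
    t = tree.tree_
    profiles = []

    def walk(node, conditions):
        if t.children_left[node] == -1:               # leaf node: one profile
            counts = t.value[node][0]                 # class counts (or fractions) at this leaf
            n = int(t.n_node_samples[node])
            predicted = tree.classes_[counts.argmax()]
            accuracy = counts.max() / counts.sum()    # e.g., (2,815 - 428) / 2,815 = 0.848
            profiles.append({"conditions": conditions, "instances": n,
                             "prediction": predicted, "accuracy": round(accuracy, 3)})
            return
        name = X.columns[t.feature[node]]
        thr = t.threshold[node]
        walk(t.children_left[node], conditions + [f"{name} <= {thr:.2f}"])
        walk(t.children_right[node], conditions + [f"{name} > {thr:.2f}"])

    walk(0, [])
    return sorted(profiles, key=lambda p: p["instances"], reverse=True)

# Hypothetical usage:
# df = pd.read_csv("naep_school_otl_grade4.csv")   # variables recoded as in Table 5.1
# for p in prevalent_profiles(df, outcome="competence"):
#     print(p["instances"], p["prediction"], p["accuracy"], p["conditions"])
```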
Grade 4 Competence Model

Figure 5.1 presents the 4th grade model of competence based on school OTL variables. From Fig. 5.1, we see that the model has 13 prevalent profiles, i.e., profiles with more than 100 instances each. Among the 13 profiles, only two (profiles #11 and #13) predict the desirable outcome – reaching competence; all other profiles predict the undesirable outcome – failing to reach competence.

P#1: {[Less than 76% parents attending the open house/back to school night] and [Lack of parent involvement a minor to serious problem in school]} PREDICT [Failing to reach competence]

Profile #1 has the most instances, i.e., 2,815, which is over one third of the total sample. The accuracy rate for profile #1 is 84.8%, which is quite high. What this profile highlights is parents' involvement in school activities such as the open house, back to school night, and so on. What are the possible learning opportunities implied by active parent involvement in schools? First, active parent participation in school activities may indicate that parents value their children's learning, which may translate into activities supporting children's learning such as monitoring homework completion, making available necessary learning resources (e.g., school supplies, books of wide interest), and organizing school-wide extracurricular activities (e.g., field trips, fun nights). Second, active parent involvement in schools may indicate better communication between classroom teachers and parents. This better communication can enhance student learning at home through closer monitoring and support of schoolwork, and it can inform the teacher's planning and conduct of instructional activities. Third, active parent participation may also imply better school resources; for example, parents may volunteer to staff the school library or after-school clubs. Thus, parent involvement in schools could indicate a wide variety of learning opportunities for students, making it an important predictor of students' competence status. Unfortunately, a large number of schools experience a lack of active parent involvement. Table 5.2 presents the cross-tabulation between the percentage of parent involvement and the status of students reaching NAEP competence. From Table 5.2, we see that although the overall odds-ratio for reaching NAEP competence is 0.37 (27.1%/72.9%), this ratio decreases rapidly as the percentage of parents attending the open house/back to school night decreases. For the category of 76–100% participation, the ratio is 0.54 (19.1%/35.4%). However, for the category of 51–75% participation, the ratio is 0.23; for the categories of 0–25% and 26–50%
Fig. 5.1 Model of competence for 4th grade based on school OTL variables
participation, the ratio for both is 0.17. The decrease in odds-ratio from 76% or more parent participation to less than 76% participation is thus more than half. Table 5.3 presents the cross-tabulation between the perceived degree of problem with parent involvement in schools and students' status of reaching NAEP competence.
Table 5.2 Cross-tabulation between parents attending the open house/back to school night and the status of reaching NAEP competence

Percentage of parents in open house/    NAEP competence status
back to school night                    No               Yes              Total
0–25%                                   126 (1.8%)       18 (0.3%)        144 (2.1%)
26–50%                                  559 (8.1%)       94 (1.4%)        653 (9.5%)
51–75%                                  1,899 (27.5%)    444 (6.4%)       2,343 (34.0%)
76–100%                                 2,441 (35.4%)    1,315 (19.1%)    3,756 (54.5%)
Total                                   5,025 (72.9%)    1,871 (27.1%)    6,896 (100.0%)
Table 5.3 Cross-tabulation between perceived problem with parent involvement and status of students reaching NAEP competence

Is lack of parent involvement           NAEP competence status
a problem in school?                    No               Yes              Total
Serious                                 453 (6.6%)       54 (0.8%)        507 (7.4%)
Moderate                                1,636 (23.9%)    358 (5.2%)       1,994 (29.1%)
Minor                                   1,692 (24.7%)    736 (10.8%)      2,428 (35.5%)
Not a problem                           1,193 (17.4%)    721 (10.5%)      1,914 (28.0%)
Total                                   4,974 (72.7%)    1,869 (27.3%)    6,843 (100.0%)
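The collapsed "minor to serious" figures cited in the discussion that follows are obtained by summing the corresponding rows of Table 5.3. A quick check of the arithmetic (a sketch using the percentages printed in the table):

```python
# Verify the collapsed odds-ratio for "minor to serious" parent-involvement problems (Table 5.3).
yes_pct = 0.8 + 5.2 + 10.8      # Serious + Moderate + Minor, reaching competence
no_pct = 6.6 + 23.9 + 24.7      # Serious + Moderate + Minor, failing to reach competence
print(round(yes_pct, 1), round(no_pct, 1), round(yes_pct / no_pct, 2))
# -> 16.8 55.2 0.3   (the 0.30 ratio quoted below, versus 0.60 when involvement is not a problem)
```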
Table 5.3 shows that when there is no perceived problem with parent involvement in school, the odds-ratio for students to reach NAEP competence is 0.60 (10.5%/17.4%). However, when it is perceived that there is a minor to serious problem with parent involvement in schools, the odds-ratio decreases to 0.30 (16.8%/55.2%), a 50% drop.

P#2: {[Less than 76% parents attending open house/back to school night] and [Lack of parent involvement is not a problem] and [There is a parent volunteer program] and [Parent support for student achievement is very positive] and [Teacher absenteeism is a minor to serious problem]} PREDICT [Failing to reach competence]

P#3: {[Less than 76% parents attending open house/back to school night] and [Lack of parent involvement is not a problem] and [There is a parent volunteer program] and [Parent support for student achievement is very positive] and [Teacher absenteeism is not a problem] and [Student health is not a problem]} PREDICT [Failing to reach competence]
Profiles #2 and #3 both predict the undesirable outcome – failing to reach competence. Because the number of instances belonging to these two profiles is small (104 and 212), the significance of these two profiles may not be high. A key feature in profile #2 is teacher absenteeism. It can be anticipated that if teachers are often absent from teaching, student learning can be expected to suffer. On the other hand, profile #3 appears to indicate overall positive school culture. Because there are only 212 instances, and the prediction accuracy is relatively low (61%), this profile may not be very informative. P#4: {[76% or more parents attend the open house/back to school night] and [School receives Chapter I/Title I funding] and [Physical conflict is a minor to no problem] and [School is in the national lunch program] and [Reading is identified as priority]} PREDICT [Failing to reach competence] P#5: {[76% or more parents attend the open house/back to school night] and [School receives Chapter I/Title I funding] and [Physical conflict is a minor to no problem] and [School is not in the national lunch program] and [Math is identified as a priority]} PREDICT [Failing to reach competence] P#6: {[76% or more parents attend the open house/back to school night] and [School receives Chapter I/Title I funding] and [Physical conflict is a minor to no problem] and [School is in the national lunch program] and [Reading is not identified as a priority] and [School involves parents as aides in classroom] and [Subject integration is not a school priority] and [95% or less students are still enrolled at the end of the school]} PREDICT [Failing to reach competence] Profiles #4 to #6 pertain to those schools that received Title I/Chapter I federal funding. In 1965, Title I was enacted “to provide financial assistance to local education agencies serving areas with concentrations of children from low-income families to expand and improve their education programs by various means” (Public Law 89–10). In 1981, Title I was replaced by Chapter 1. Therefore, these three profiles apply to those schools with a high concentration of children from low-income families. Profile #4 describes a scenario in that the school has high parent participation in school activities and physical conflict is almost not a problem, but the school is participating in the national lunch program and has identified reading to be a priority. Identifying reading as a school priority is a common practice in many elementary schools. Having reading as a school priority may not necessarily undermine learning opportunities for other school subjects. However, it is possible that, when teachers overemphasize reading at the expense of other subjects such as science, learning opportunities for science will be lost. Table 5.4 presents the cross-tabulation between identifying reading as a school priority and students’ status of reaching NAEP competence. From Table 5.4, we see that 82% of students are in schools that identify reading as a priority, compared to only 18% who are in schools not identifying reading as a priority. Comparing the odds-ratio between the two groups, we can see that not identifying reading as a priority has a slightly higher odds-ratio, i.e., 0.55 (6.4%/11.6%) compared to 0.34 (20.7%/61.3%). Profile #5 differs from profile #4 in that schools are not in the national lunch program, indicating that students are from relatively higher-income families. The
Table 5.4 Cross-tabulation between reading as a priority and status of reaching NAEP competence

Has reading been identified             NAEP competence status
as a priority?                          No               Yes              Total
Yes                                     4,174 (61.3%)    1,409 (20.7%)    5,583 (82.0%)
No                                      789 (11.6%)      434 (6.4%)       1,223 (18.0%)
Total                                   4,963 (72.9%)    1,843 (27.1%)    6,806 (100.0%)
Table 5.5 Cross-tabulation between identifying math as a school priority and the status of reaching NAEP competence

Has math been identified                NAEP competence status
as a priority?                          No               Yes              Total
Yes                                     3,825 (57.4%)    1,310 (19.7%)    5,135 (77.1%)
No                                      1,010 (15.2%)    519 (7.8%)       1,529 (22.9%)
Total                                   4,835 (72.6%)    1,829 (27.4%)    6,664 (100.0%)
key feature in this profile is that schools have math as a school priority. Similar to the scenario in profile #4, having math as a school priority does not necessarily reduce learning opportunities for science, unless teachers overemphasize math at the expense of science, which could be the case profile #5 refers to. Keep in mind that profile #5 is a low-instance scenario with only 186 students, and the prediction accuracy is only 55%. Table 5.5 presents the cross-tabulation between identifying math as a school priority and the status of reaching NAEP competence. From Table 5.5, we see that 77% of students attend schools that identify math as a priority. Similar to the scenario in which reading is identified as a priority, the odds-ratio for students to reach competence when math is identified as a priority is slightly higher than that when math is not identified as a priority, i.e., 0.51 (7.8%/15.2%) compared to 0.34 (19.7%/57.4%). Profile #6 has a few additional features not contained in profiles #4 and #5. Involving parents as aides in the classroom should be a positive factor for students to reach competence. Subject integration is not a school priority, which means that there is no possibility for science to be buried under other subjects. The less than 95% enrollment of students at the end of the school year could be a critical variable in this profile predicting students failing to reach competence. Profile #6 is not a common scenario, because there are only 228 instances. What profile #6 contains could be a more serious and negative factor impacting student learning – dropping out or moving. When more than 5% of students are not enrolled at the end of the school year, there could be many unhelpful events happening in the community related to overall employment, family stability, school morale, and so on.
P#7: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [Some 4th graders are held back]} PREDICT [Failing to reach competence] P#8: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [No 4th graders are held back] and [Student absenteeism is a minor or no problem] and [Math is identified as a school priority] and [Assign homework to do with parents] and [25% or less parents in volunteer program]} PREDICT [Failing to reach competence] P#9: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [No 4th graders are held back] and [Student absenteeism is a minor or no problem] and [Math is identified as a school priority] and [Assign homework to do with parents] and [More than 25% parents are in volunteer program] and [Receive arts instruction everyday]} PREDICT [Failing to reach competence] P#10: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [No 4th graders are held back] and [Student absenteeism is a minor or no problem] and [Math is identified as a school priority] and [Assign homework to do with parents] and [More than 25% parents are in volunteer program] and [Receive arts instruction less than everyday] and [Parents occasionally do or do not sign/review homework]} PREDICT [Failing to reach competence] Profiles #7 to #10 all pertain to schools that do not receive Chapter I/Title I funding. Thus, students in these profiles are from relatively moderate- to high-income families. Keep in mind that all these profiles have fewer than 350 instances. Profile #7 suggests that retaining students at the 4th grade negatively predicts students’ status of reaching NAEP competence. Table 5.6 presents the relationship between student retention and the status of reaching competence.
Table 5.6 Cross-tabulation between 4th grade retention and reaching the NAEP competence status

Percentage of 4th graders held back     NAEP competence status
and repeating 4th grade                 No               Yes              Total
0%                                      3,202 (46.4%)    1,421 (20.6%)    4,623 (67.0%)
1–2%                                    1,455 (21.1%)    377 (5.5%)       1,832 (26.6%)
3–5%                                    343 (5.0%)       67 (1.0%)        410 (5.9%)
6–10%                                   25 (0.4%)        6 (0.1%)         31 (0.4%)
Total                                   5,025 (72.9%)    1,871 (27.1%)    6,896 (100.0%)
From Table 5.6, we see that while 67% of students are in schools that do not retain 4th graders to repeat the grade, the remaining 33% are in schools that do. In terms of the odds-ratio between reaching competence and failing to reach competence, schools with no retention have an odds-ratio of 0.44 (20.6%/46.4%), compared to 0.26 for schools with 1–2% retention (5.5%/21.1%), 0.2 for schools with 3–5% retention (1.0%/5.0%), and 0.24 for schools with 6–10% retention (0.1%/0.4%). Thus, retaining 4th graders to repeat the grade tends to decrease the chance of reaching NAEP competence. Profiles #8 to #10 also predict the undesirable outcome – failing to reach competence. Some of the key features in these profiles deal with parent involvement (e.g., percentage of parents in volunteer programs) and subject priority (e.g., math and arts). Because of the low instances of these profiles, the patterns may not be significant. P#11: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [No 4th graders are held back] and [Student absenteeism is a minor or no problem] and [Math is identified as a school priority] and [Assign homework to do with parents] and [More than 25% parents are in volunteer program] and [Receive arts instruction less than everyday] and [Parents routinely sign/review homework] and [Teacher morale is very positive] and [School is in the national school lunch program]} PREDICT [Reaching competence] Profile #11 predicts the desirable outcome – reaching competence. This profile includes a number of positive factors that define a conducive school learning environment. Some of these positive factors are that more than 25% of parents are in volunteer programs, there is no grade retention policy, teachers assign homework to do with parents, parents routinely sign or review homework, and teacher morale is high. However, this profile has only 247 instances, and 104 of them were classified wrong (i.e., only 58% prediction accuracy); this profile should be considered with caution. P#12: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [No 4th graders are held back] and [Student absenteeism is a minor or no problem] and [Math is not identified as a school priority] and [Student health is not a problem] and [Student tardiness is a problem]} PREDICT [Failing to reach competence] P#13: {[76% or more parents attend open house/back to school night] and [The school does not receive Chapter I/Title I funding] and [Has a parent volunteer program] and [No 4th graders are held back] and [Student absenteeism is a minor or no problem] and [Math is not identified as a school priority] and [Student health is not a problem] and [Student tardiness is not a problem]} PREDICT [Reaching competence] Profile #12 predicts failing to reach competence, and profile #13 predicts reaching competence. The key difference between profile #12 and profile #13 is whether or not student tardiness is a problem. A problem with student tardiness may indicate such issues as lack of student and parent commitment to learning or lack of school and/or teacher discipline. These issues would result in less opportunities
for students to learn. However, profiles #12 and #13 are both low-instance profiles, with fewer than 150 instances each, and the prediction accuracy for both is relatively low (67% for profile #12 and 61% for profile #13). These two profiles should be considered with caution.
Grade 8 Competence Model

Figure 5.2 presents the 8th grade model of competence based on the school OTL variables. As can be seen from Fig. 5.2, the 8th grade model contains six profiles, among which profile #5 predicts the desirable learning outcome – reaching competence; all other profiles predict the undesirable outcome – failing to reach competence.
Fig. 5.2 Competence model for the 8th grade based on school OTL
P#1: {[School is in the national lunch program] and [50% or less parents are in volunteer programs] and [Less than 98% students are still enrolled at the end of the school year]} PREDICT [Failing to reach competence]

Profile #1 is a very common scenario with 4,598 instances, over 65% of the total sample. The prediction accuracy of this profile is 82%, which is relatively high. Profile #1 highlights two key characteristics for predicting failing to reach the competence status: (a) lack of parent participation and (b) failure to keep nearly all students enrolled through the school year. When these two characteristics are combined with such factors as low family income, indicated by participation in the national lunch program, the implied school culture is not a conducive one for student learning. Table 5.7 shows the relationship between the percentage of parents in school volunteer programs and the status of students reaching competence. From Table 5.7, we can see that 58.5% of students attend schools that have only 25% or fewer parents in volunteer programs. For schools with more than 50% of parents in volunteer programs, the odds-ratio for students to reach NAEP competence is 0.76 (5.9%/7.8%). On the other hand, schools with 50% or fewer parents in volunteer programs have an odds-ratio of only 0.29 (19.6%/66.7%). Thus, having more than 50% of parents in volunteer programs could more than double the odds for students to reach competence compared with having 50% or fewer parents in volunteer programs. Table 5.8 presents the relationship between the percentage of students who are still enrolled at the end of the school year and the status of reaching competence. From Table 5.8, we see that 31.4% of students are in schools that have a 98% or above retention rate during the school year, compared to 68.6% who are in schools that have a less than 98% retention rate. The odds-ratio for students to reach competence when there is a 98% or above retention rate is 0.60 (11.8%/19.6%); the odds-ratio when there is a less than 98% retention rate is 0.25 (13.8%/54.8%), a drop of more than half. A less than 98% retention rate during the school year at 8th grade could indicate important lost opportunities for students to learn. Possibilities for lost learning opportunities when there is a less than 98% retention rate are student dropout, expulsion from school due to misbehavior, a higher than usual frequency of family relocation, etc.
Table 5.7 Cross-tabulation between percentage of parents in volunteer programs and the status of reaching competence

Percentage of parents                   NAEP competence status
in volunteer programs                   No               Yes              Total
0–25%                                   3,279 (47.1%)    788 (11.3%)      4,067 (58.5%)
26–50%                                  1,365 (19.6%)    575 (8.3%)       1,940 (27.9%)
51–75%                                  359 (5.2%)       292 (4.2%)       651 (9.4%)
76–100%                                 179 (2.6%)       119 (1.7%)       298 (4.3%)
Total                                   5,182 (74.5%)    1,774 (25.5%)    6,956 (100.0%)
P#2: {[School is in the national lunch program] and [50% or less parents are in volunteer programs] and [98% or more students are still enrolled at the end of the school year] and [Student cheating is a minor or no problem] and [Student misbehavior is a minor to serious problem]} PREDICT [Failing to reach competence] Profile #2 also predicts the undesirable outcome – failing to reach competence. This profile has 1,216 instances, among which 318 were predicted wrong. Thus, the accuracy rate for profile #2 is 74%. Key features in profile #2 that contribute to predicting the undesirable outcome are lack of parent participation in volunteer programs and student misbehavior. The relationship between percentage of parents in volunteer programs and status of students reaching competence has already been discussed earlier. The negative relationship between student misbehavior and the status of reaching competence should be no surprise. There are many possible losses of learning opportunities due to student misbehavior, such as time spent on classroom management, lack of student motivation and interest in learning, to name a few. Table 5.9 shows the relationships between student misbehavior and status of reaching competence.
Table 5.8 Cross-tabulation between percentage of student enrollment and the status of competence

Percentage of students still            NAEP competence status
enrolled at the end of the year         No               Yes              Total
≥ 98%                                   1,369 (19.6%)    823 (11.8%)      2,192 (31.4%)
< 98%                                   3,834 (54.8%)    966 (13.8%)      4,800 (68.6%)
Total                                   5,203 (74.4%)    1,789 (25.6%)    6,992 (100.0%)
Table 5.9 Cross-tabulation between degrees of student misbehavior and status of reaching NAEP competence

Is student misbehavior                  NAEP competence status
a problem in your school?               No               Yes              Total
Serious                                 36 (0.5%)        1 (0.0%)         37 (0.5%)
Moderate                                1,924 (27.6%)    409 (5.9%)       2,333 (33.4%)
Minor                                   2,989 (42.8%)    1,187 (17.0%)    4,176 (59.9%)
Not a problem                           245 (3.5%)       185 (2.7%)       430 (6.2%)
Total                                   5,194 (74.5%)    1,782 (25.5%)    6,976 (100.0%)
From Table 5.9, we can see that only 6.2% of students are in schools that do not have student misbehavior problems; almost 60% are in schools that have minor student misbehavior problems, and 33.4% are in schools that have moderate student misbehavior problems. For schools with no student misbehavior problems, the odds-ratio for students to reach competence is 0.77 (2.7%/3.5%). For schools with minor student misbehavior, the odds-ratio is 0.40 (17.0%/42.8%). For schools with moderate student misbehavior, the odds-ratio is 0.21 (5.9%/27.6%). The trend is clear: the more serious the student misbehavior, the less likelihood for students to reach competence. P#3:{[School is in the national lunch program] and [More than 50% parents are in volunteer programs] and [8th graders are not assigned to arts by ability]} PREDICT [Failing to reach competence] Profile #3 predicts the undesirable outcome – failing to reach competence. None of the three characteristics contained in profile #3 seems to be apparently negative. The prediction accuracy is 67%, which is moderate. This profile may not be considered very informative. P#4: {[School is not in the national lunch program] and [Lack of parent involvement is a minor to serious problem] and [Math is identified as a priority]} PREDICT [Failing to reach competence] Profile #4 pertains to schools with students coming primarily from middle- to high-income families because schools are not in the national lunch program. The two other characteristics contained in profile #4 are lack of parent involvement and math is identified as a priority. Table 5.10 shows the relationship between parent involvement and the status of students reaching competence. From Table 5.10, we can see that most schools have minor to moderate problems with parent involvement. Comparing the odds-ratios between schools with no problem of parent involvement and those with minor to serious problems of parent involvement, we can see a major difference. For schools with no problem of parent involvement, the odds-ratio for students to reach competence is 0.65 (7.9%/12.2%). For other schools with minor to serious problems of parent
Table 5.10 Cross-tabulation between parent involvement and the status of reaching NAEP competence

Is lack of parent involvement           NAEP competence status
a problem in school?                    No               Yes              Total
Serious                                 568 (8.1%)       104 (1.5%)       672 (9.6%)
Moderate                                1,844 (26.4%)    403 (5.8%)       2,247 (32.2%)
Minor                                   1,928 (27.6%)    727 (10.4%)      2,655 (38.1%)
Not a problem                           854 (12.2%)      548 (7.9%)       1,402 (20.1%)
Total                                   5,194 (74.5%)    1,782 (25.5%)    6,976 (100.0%)
Table 5.11 Cross-tabulation between identification of math as a school priority and status of reaching competence

Has math been identified                NAEP competence status
as a priority?                          No               Yes              Total
Yes                                     3,987 (57.8%)    1,215 (17.6%)    5,202 (75.4%)
No                                      1,150 (16.7%)    545 (7.9%)       1,695 (24.6%)
Total                                   5,137 (74.5%)    1,760 (25.5%)    6,897 (100.0%)
involvement, this odds-ratio is 0.29 (17.7%/62.1%), an almost 50% drop from the previous one. Therefore, parent involvement increases the likelihood for students to reach competence. As for identifying math as a school priority, Table 5.11 presents the relationship between identifying math as a school priority and reaching competence. From Table 5.11, we can see that 75% of schools have identified math as a school priority. In terms of the odds-ratio between reaching competence and failing to reach competence, schools without math as a school priority have a slightly higher odds-ratio than schools with math as a school priority (the odds-ratios are 0.47 and 0.30, respectively). Given the importance of all core subjects at the 8th grade level, having math as a school priority could suggest that students may be struggling with math, and by placing more emphasis on math could inadvertently compromise learning opportunities for science. Keep in mind that this effect is a much smaller one compared to the effect of parent involvement in this profile. P#5: {[School is not in the national lunch program] and [Lack of parent involvement is not a problem] and [The library is primarily staffed by parttime or full-time employees] and [The school offers 8th grade algebra for high school credit] and [Student attitude toward academic achievement is very positive]} PREDICT [Reaching competence] Profile #5 is the only one predicting the desirable outcome – reaching NAEP competence. Although the prediction accuracy is 63%, not particularly high, the profile does highlight a number of positive factors that could benefit students for reaching competence. These factors include students from middle- to high-income families, high parent involvement, library staffed by paid employees, offering 8th grade algebra for high school credit, and positive school attitude toward achievement. These factors together could indicate a positive school environment that creates ample learning opportunities for students to learn science. P#6: {[School is not in the national lunch program] and [Lack of parent involvement is not a problem] and [The library is primarily staffed by fulltime employees] and [The school offers 8th grade algebra for high school credit] and [Student attitude toward academic achievement is less than very positive] and [No homework is assigned to do with parents]} PREDICT [Failing to reach competence]
Table 5.12 Cross-tabulation between assigning homework to do with parents and the status of reaching the NAEP competence

Assign homework to do                   NAEP competence status
with parents                            No               Yes              Total
Yes                                     3,756 (54.9%)    1,332 (19.5%)    5,088 (74.4%)
No                                      1,320 (19.3%)    435 (6.3%)       1,755 (25.6%)
Total                                   5,076 (74.2%)    1,767 (25.8%)    6,843 (100.0%)
Profile #6 predicts the undesirable outcome – failing to reach competence. This is a low-instance profile with only 103 instances. The key difference between profile #5 and profile #6 is that students' attitude toward academic achievement is less than very positive, and no homework is assigned to do with parents. Students' attitude could be a more important factor than assigning homework to do with parents for this profile, because the science content at 8th grade could be above some parents' comfort zone. Table 5.12 shows the distribution of schools in terms of assigning homework to do with parents. Table 5.12 shows that most schools (74%) do assign homework for students to do with parents. Schools with such a practice have an odds-ratio for reaching competence of 0.36 (19.5%/54.9%), compared to an odds-ratio of 0.32 for schools without such a practice, which is not a huge difference.
Summary

The 4th grade competence model has 13 prevalent profiles. Profile #1 has the most instances, with a prediction accuracy rate of 84.8%. Significant predictors for failing to reach the competence status identified in this profile are less than 76% of parents attending the open house/back to school night and lack of parent involvement being a minor to serious problem in school. The most prevalent profile associated with reaching competence is profile #11. However, this profile only has a 58% prediction accuracy rate, and thus should be accepted with caution. Significant predictors for reaching competence identified in this profile are: (a) 76% or more parents attend open house/back to school night; (b) the school does not receive Chapter I/Title I funding; (c) the school has a parent volunteer program; (d) no 4th graders are held back; (e) student absenteeism is a minor or no problem; (f) math is identified as a school priority; (g) teachers assign homework to do with parents; (h) more than 25% of parents are in volunteer programs; (i) students receive arts instruction less than everyday; (j) parents routinely sign/review homework; (k) teacher morale is very positive; and (l) the school is in the national school lunch program. In addition to the above general profiles contained in the 4th grade competence model, the following individual school aspects are found to be associated with higher odds-ratios for students to reach competence: (a) over 75% of parents
attending the open house or back to school night; (b) involving parents in school activities and events; (c) reducing 4th grade retention or holding back to repeat the grade; and (d) not identifying reading or math as a school priority. The 8th grade competence model has six prevalent profiles. Profile #1 has the most instances and predicts the undesirable outcome – failing to reach competence. This profile accounts for 65% of the total sample and has a prediction accuracy rate of 82%. Significant predictors for failing to reach the competence status identified in this profile are: (a) the school is in the national lunch program; (b) 50% or less parents are in volunteer programs; and (c) less than 98% students are still enrolled at the end of the school year. The most prevalent profile associated with reaching competence is profile #5. Profile #5 has only a moderate prediction accuracy rate of 63%. Significant predictors for reaching the competence status identified in this profile are (a) the school is not in the national lunch program; (b) lack of parent involvement is not a problem; (c) the library is primarily staffed by part-time or full-time employees; (d) the school offers 8th grade algebra for high school credit; and (e) student attitudes toward academic achievement is very positive. In addition to the above general profiles contained in the 8th grade competence model, the following individual school aspects are found to be associated with higher odds-ratios for students to reach competence: (a) maintaining more than 50% of parents in the volunteer programs; (b) maintaining a 98% student retention rate during the school year; (c) reducing student behavior problems, (d) assigning students homework to do with parents; and (e) not identifying math as a school priority. The 4th grade competence model and 8th grade competence model are quite distinct with a few common predictors. One common theme present in many profiles associated with failing to reach competence in both the 4th and 8th grade competence models is parent involvement in school activities and events such as participation in parent volunteer programs and attending the school open house or back to school night. Similarly, a few predictors are common in the profiles associated with reaching competence in both the 4th and 8th grade competence models. One theme common in both the 4th and 8th grade competence models for predicting reaching competence is also related to parent involvement. Better parent involvement in schools such as participation in volunteer programs or attending open house is associated with a high likelihood for reaching competence.
Chapter 6
Pedagogical and Policy Implications
Chapters 3–5 have presented models of competence based on opportunities to learn in classrooms, at home, and in schools. Each of the models shows how various variables interact and are associated with a different outcome status – reaching or failing to reach the competence level. Before we discuss the implications of the competence models, we need to recognize the limitations of the competence models created based on NAEP data sets. First, only one data set, i.e., the NAEP, was used in creating competence models. NAEP only included a limited number of variables pertaining to teaching, home, and school; many other relevant variables may have been missed. Second, the competence models rely on data collected from responses to the NAEP background questions related to classroom teaching, home environment, and school context. While the questions have been written as unambiguously as possible, principals’, teachers’ and students’ interpretations of them may nonetheless have differed. Third, NAEP is a cross-sectional survey. Secondary analysis of NAEP data can only give an overall picture on similarities and differences on student achievement by different categories, and relationships between student achievement and various background variables; it does not answer research questions pertaining to causes and effects. Specifically, models of competence with identified significant predictors of student attainment of competence are associations in nature. Neither the existence nor the absence of associations between student performance and variables may be taken as conclusive evidence regarding the relationship among them. Fourth, the competence models reflect only a snapshot on a learning continuum, i.e., based on student performance at one time; they may not capture students’ cumulative learning experiences over the years. Finally, models of competence are general patterns on a national scale; they may not pertain to individual classrooms, homes, or schools. Given the above limitations, the models of competence presented in previous chapters should be considered tentative, or as hypotheses to be further tested empirically. As a result, pedagogical and policy recommendations may not be directly derived from the models of competence. However, significant research findings are available pertaining to the effects of classroom teaching practices, home environment, and school context, and Chapters 3–5 have reviewed a selection of those findings. Within the context of research findings in the literature, the competence models presented in Chapters 3–5 can inform science teaching and
science education policy. In this chapter, I will discuss some implications of the competence models for classroom instruction and for science education policies by simultaneously considering research findings in the literature. Even with this consideration of the literature, the implications should still be regarded as probabilistic rather than deterministic. The recommendations pertain to the population level as a whole, not to a specific ethnic group, gender, or individual classrooms. No single measure is sufficient to help students reach competence; consideration of multiple measures and their contexts is necessary when developing interventions.
Pedagogical Implications

The science classroom is the primary location where science learning opportunities are available to students, and teachers' teaching practices are the primary means for making such learning opportunities available. Different teaching practices are necessary for elementary school grades and middle school grades to help students reach competence. For elementary grades, students should have sufficient hands-on learning opportunities. Hands-on is a very broad term; it includes all teaching practices that provide students with opportunities to interact with objects, as opposed to students simply listening and watching. Examples of hands-on experiences are laboratory work, student demonstrations, outside-school inquiry projects, field trips, and science fairs. No matter what hands-on experiences are provided, it is important that students engage in a diverse set of hands-on activities. Equally important, and as part of their hands-on experiences, students should have opportunities to talk about those experiences. Such discussion may take the form of group presentations, posters, questioning, or group discussion. Student hands-on experiences should be coherent and systematic; ad hoc or isolated hands-on experiences are unlikely to help students reach competence. One way to achieve this is to assign long-term inquiry projects that take more than 1 week. Such long-term inquiry projects should be an integral part of teachers' unit planning. In addition to learning opportunities created around hands-on experiences, elementary science teachers should also make efforts to provide students with other learning opportunities. In particular, science teachers should try to make full use of learning resources and opportunities outside the school, such as bringing guest speakers into the classroom. Many parents can be resources on various topics of science; actively involving parents in science teaching can potentially provide students with valuable learning opportunities. Regularly taking students on field trips to science museums, industries, or nature preserves can create additional learning opportunities for students. In general, science teachers should consider science teaching as a continuum from formal science learning (i.e., classroom science learning) to informal science learning (e.g., free-choice science learning), thus bridging formal and informal science learning to create additional learning opportunities for students to reach competence. Other important learning opportunities that can potentially help elementary school students reach competence may include assigning homework at least 1–2 h a
week, providing science instruction every day, allowing students to give oral reports once or twice a month, and assessing students using self- or peer-evaluation once or twice a grading period. All of these learning opportunities may be planned as part of the hands-on approach to science teaching suggested earlier.

For middle school grades, hands-on learning opportunities remain vital for students to reach competence. Students should conduct hands-on activities almost every day. As part of these hands-on learning experiences, students should have adequate opportunities to develop science inquiry skills such as data analysis. This hands-on approach to science teaching should be accompanied by only infrequent reading of science textbooks in the classroom. Computers and other modern technology should also become part of student hands-on inquiry experiences, and should be available for science teaching and learning in both the classroom and the computer laboratory.

Although providing learning opportunities in the science classroom is a primary responsibility of science teachers, additional learning opportunities available at home and within the school should also be recognized and actively made use of. Effective communication between science teachers and parents, and between science teachers and school administrators, should exist so that an effective learning partnership can be formed among teachers, parents, and school administrators. Such a partnership is a prerequisite for making full use of learning opportunities available at home and in the school.

Not all parents may realize the importance and availability of learning opportunities at home. Thus, science teachers need to inform parents about how and what learning opportunities may be created at home. Specifically, for elementary grades, parents should make available various learning resources and facilities, such as a study desk, newspapers, and books (including an encyclopedia), so that children can pursue learning related to school and beyond. Parents should encourage children to read widely. Given that many competing activities may attract children's attention at home, parents need to guide their children selectively toward educative activities such as reading and family visits to museums, zoos, and nature preserves. Parents should also monitor their children's television and video game time and content to ensure that children spend no more than a certain amount of time on them on a school day (e.g., 4 h for elementary graders and 1 h for middle school graders). Frequent and ongoing parent involvement in the child's school work, such as discussing studies at home and doing homework together, increases students' likelihood of reaching competence. When parents are knowledgeable about their children's ongoing learning, they are more likely to consider educational value when planning family activities such as vacations, weekend outings, or household purchases.

Science teachers need to recognize that disparity in learning opportunities exists among families of different races and ethnicities, and that African American and Hispanic American families are particularly at a disadvantage. Thus, when communicating with parents about the importance of learning opportunities at home for their children, science teachers should be sensitive to the reality and specific conditions of individual families.
Science teachers should respect and make efforts to incorporate students' home cultural values during instruction in order to minimize the potential negative effect of "cultural border crossing" from home culture to school science
culture. Individualized instruction and assistance may be necessary to help certain families create learning opportunities at home.

Family structure can also affect students' likelihood of reaching competence, because different family structures can create different learning opportunities for children. For example, a single-mother or single-father family is less likely than a two-parent family to provide both the financial and the emotional support for children to learn through extracurricular activities. Thus, parents should be informed that a stable two-parent family is preferable for their children's development of competence. Parents whose native language is other than English should also be informed that speaking the language of instruction in school, i.e., English, increases their children's likelihood of reaching competence.

Science classrooms are not the only places in school that offer learning opportunities to students. Science teachers should also recognize that the school environment as a whole provides various learning opportunities. Thus, an active partnership and an effective communication mechanism among teachers, parents, and school administrators need to be in place. More specifically, science teachers should actively participate in developing a variety of parent volunteer programs, such as parent teaching aides, chaperones for field trips, cafeteria supervisors, and guest speakers. The more parents are involved in school volunteer programs, the better. Also, parents should be encouraged to participate actively in school events and activities such as school open houses and parent–teacher conferences. Parent involvement should also take place at home, through such activities as doing homework together with children and reviewing and signing children's homework.

Although schools may identify various subjects such as math and reading as priorities, it is important that doing so does not convey the message that other subjects such as science are unimportant. All partners, including teachers, school administrators, parents, and students, should understand that the elementary and middle school years are foundational, and that a well-rounded education in all subjects is critical for a later productive life. Some schools, particularly elementary schools, may have a retention policy that keeps students in a particular grade for another year to repeat learning. This policy may be counterproductive; it may reduce those students' learning opportunities and eventually their likelihood of reaching competence. Thus, schools should avoid retaining students to repeat grades. In addition to the above aspects of a school environment conducive to learning, middle schools need to make additional learning opportunities available to students, such as staffing the school library with paid professionals, offering more challenging courses such as algebra for high school credit, and creating a positive student attitude toward academic achievement.
Policy Implications

It is recognized that in many industrialized countries, education is not the responsibility of the central or federal government. Whether or not education is centralized, education policies should be conceptualized as multifaceted and developed at
multiple levels. Given that various learning opportunities exist in different classrooms, homes, and schools, and that students often do not choose to be in a particular classroom, home, or school, different policies at different levels of government are needed to ensure that equal learning opportunities are available for all students.

First, a national or state/provincial government or organization responsible for developing science education content standards should develop an opportunity-to-learn standard. In particular, whenever a content and student performance standard is developed to specify what students should learn and how well they should achieve, a parallel opportunity-to-learn standard should also be developed to specify what learning opportunities should be made available in the classroom, at home, and in the school. For example, exemplary practices for science teaching should be stated for teachers to consider. Essential learning resources at home should be identified so that parents are informed of their obligations and responsibilities in their children's learning. Similarly, essential learning resources and conditions at the school, such as library resources, science laboratories, and computers, should be made available. Development of content and performance standards should take place simultaneously with the development of opportunity-to-learn standards.

Race and ethnicity are important factors to consider when developing content, performance, and opportunity-to-learn standards. The student achievement gap by race and ethnicity is a historical phenomenon, and every effort should be made to reduce the corresponding disparity in learning opportunities. Special targeted assistance at the national, state/provincial, and school levels is necessary. For example, besides such programs as the national school lunch program in the United States, special assistance may be provided to low-income African American and Hispanic American families for purchasing certain essential school supplies and books. Access to informal learning opportunities such as museums, zoos, and galleries may also be subsidized. Culturally sensitive pedagogy should become an integral part of teacher education and ongoing teacher professional development.

Teachers are key to providing equitable learning opportunities for all students. In addition to solid preservice and in-service teacher education on effective science classroom instruction and assessment, teachers should receive adequate training on equity and on how to develop effective learning partnerships among teachers, parents, school administrators, and students. Teacher licensing and ongoing evaluation should include review of teacher competence in creating learning opportunities at home, in the school, and in the community. Culturally sensitive science teaching should also become an essential component of pedagogy in teacher education and teacher professional development.

While recognizing the value of local control in education, including finance, it is also important to ensure that a common set of essential learning opportunities is available for all students no matter what community a school happens to be located in.
Such a common set of essential learning opportunities should include such things as rigorous and challenging curricula, qualified teachers, library resources and staffing, adequate classroom and laboratory equipment, a sufficient budget for field trips, parent volunteer programs, and certain student extracurricular activities. Without such essential learning opportunities common to all students, it is unfair to expect all students to achieve the same level of competence.
Conclusion

Standard-based science education is currently a worldwide phenomenon. While standards commonly take the form of content and expected student performance levels, few content and performance standards are accompanied by an opportunity-to-learn standard. This book has demonstrated that different opportunities exist in different classrooms, schools, and homes, and that different opportunities are associated with different likelihoods for students to reach the expected competence. Thus, equitable standard-based science education must address the differential learning opportunities present in different classrooms, schools, and homes. Teachers have a primary responsibility to provide equitable learning opportunities in classrooms so that no student is disadvantaged by poor teaching or inadequate teacher qualifications. Equally important is a set of policies, at different levels of government and in schools, to create the conditions and environment for equal learning opportunities for all students. Only when equal learning opportunities are available for all students can standard-based science education have a chance to succeed.
Appendix A
Variables Related to Teaching Practices Measured in 1996 for Grades 4 and 8 NAEP Science
NAEP variable | Grade | NAEP variable label | Recoded NAEP variable values
T060601 | 4, 8 | How often do students read science textbook? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060602 | 4, 8 | How often do students read science book/magazine? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060603 | 4, 8 | How often do students discuss science in the news? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060604 | 4, 8 | How often do students work with others on activities? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060605 | 4, 8 | How often do students give oral science report? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060606 | 4, 8 | How often do students write science report? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060607 | 4, 8 | How often do students do hands-on science activities? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060608 | 4, 8 | How often do students talk about hands-on results? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060609 | 4, 8 | How often do students take science tests/quiz? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060610 | 4, 8 | How often do students use library resources for science? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060611 | 4, 8 | How often do students use computers for science? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060701 | 4, 8 | How often do you talk to class about science? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060702 | 4, 8 | How often do you do a science demonstration? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060703 | 4, 8 | How often do you show a science video/television program? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060704 | 4, 8 | How often do you use computers for science? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060705 | 4, 8 | How often do you use compact disks or laser disks on science? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
T060801 | 4, 8 | How often do students go on science field trips? | 1 = Three or more times/year; 2 = One or two times a year; 3 = Never or hardly ever; Others = Missing
T060901 | 4, 8 | How often do you bring a guest speaker for science? | 1 = Three or more times/year; 2 = One or two times a year; 3 = Never or hardly ever; Others = Missing
T061001 | 4, 8 | Do you save student work in portfolios for assessment? | 1 = Yes; 2 = No; Others = Missing
T061101 | 4, 8 | How much emphasis do you place on knowing science facts/terminology? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061102 | 4, 8 | How much emphasis do you place on understanding key science concepts? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061103 | 4, 8 | How much emphasis do you place on developing problem-solving skills? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061104 | 4, 8 | How much emphasis do you place on science relevance to society/technology? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061105 | 4, 8 | How much emphasis do you place on communicating science ideas? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061106 | 4, 8 | How much emphasis do you place on developing laboratory skills? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061107 | 4, 8 | How much emphasis do you place on developing interest in science? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061108 | 4, 8 | How much emphasis do you place on developing data analysis skills? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061109 | 4, 8 | How much emphasis do you place on using technology as science tool? | 1 = Heavy emphasis; 2 = Moderate emphasis; 3 = Little/no emphasis; Others = Missing
T061201 | 4, 8 | Do you assign individual/group projects that take a week or more? | 1 = Yes; 2 = No; Others = Missing
T061301/T090501 | 4, 8 | How often do you assess students using multiple-choice tests? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061302 | 4, 8 | How often do you assess students using short or long written responses? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061303 | 4, 8 | How often do you assess students using individual projects? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061304 | 4, 8 | How often do you assess students using group projects? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061305 | 4, 8 | How often do you assess students using portfolios? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061306 | 4, 8 | How often do you assess students using in-class essays? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061307 | 4, 8 | How often do you assess students using self- or peer-evaluation? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061308 | 4, 8 | How often do you assess students using laboratory notebooks/journals? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061309 | 4, 8 | How often do you assess students using homework? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061310 | 4, 8 | How often do you assess students using hands-on activities? | 1 = Once or twice a week; 2 = Once or twice a month; 3 = Once a grading period; 4 = Once or twice a year; 5 = Never or hardly ever; Others = Missing
T061401 | 4, 8 | What is the proportion of science evaluation based on hands-on? | 1 = Most or all of grade; 2 = About half of grade; 3 = Very little of grade; 4 = None of grade; Others = Missing
T061501 | 4, 8 | What is the computer availability for science? | 1 = None available; 2 = In lab – difficult access; 3 = In lab – easy access; 4 = 1 in classroom; 5 = 2–3 in classroom; 6 = 4 or more in classroom; Others = Missing
T061601 | 4, 8 | Do you use computers for science by drills and practice? | 1 = Yes; 2 = No; Others = Missing
T061611 | 4, 8 | Do you use computers for science by playing learning games? | 1 = Yes; 0 = No; Others = Missing
T061621 | 4, 8 | Do you use computers for science by simulations and modeling? | 1 = Yes; 2 = No; Others = Missing
T061631 | 4, 8 | Do you use computers for science as data analysis? | 1 = Yes; 2 = No; Others = Missing
T061641 | 4, 8 | Do you use computers for science as word processing? | 1 = Yes; 2 = No; Others = Missing
T061651 | 4, 8 | I do not use computers for science | 1 = Yes; 2 = No; Others = Missing
T062001 | 4, 8 | How much time do you spend on life science? | 1 = A lot; 2 = Some; 3 = A little; 4 = None; Others = Missing
T062002 | 4, 8 | How much time do you spend on earth science? | 1 = A lot; 2 = Some; 3 = A little; 4 = None; Others = Missing
T062003 | 4, 8 | How much time do you spend on physical science? | 1 = A lot; 2 = Some; 3 = A little; 4 = None; Others = Missing
T062201 | 4, 8 | Do students produce notebooks/reports of laboratory work? | 1 = Yes; 2 = No; Others = Missing
T062202 | 4, 8 | Do students produce reports of extended science projects? | 1 = Yes; 2 = No; Others = Missing
T062203 | 4, 8 | Do students produce reports on specific topics/issues? | 1 = Yes; 2 = No; Others = Missing
T062204 | 4, 8 | Do students produce reports/records of field trips? | 1 = Yes; 2 = No; Others = Missing
T062205 | 4, 8 | Do students produce journals/diaries/logs of ideas? | 1 = Yes; 2 = No; Others = Missing
T062206 | 4, 8 | Do students produce photos/pictures of projects? | 1 = Yes; 2 = No; Others = Missing
T062207 | 4, 8 | Do students produce audio/video records of activities? | 1 = Yes; 2 = No; Others = Missing
T062208 | 4, 8 | Do students produce reports of personal interviews? | 1 = Yes; 2 = No; Others = Missing
T062209 | 4, 8 | Do students produce three-dimensional science models? | 1 = Yes; 2 = No; Others = Missing
T062210 | 4, 8 | Do students produce computer-generated multimedia projects? | 1 = Yes; 2 = No; Others = Missing
T062301 | 4, 8 | How much time a week should students spend on science homework? | 1 = None; 2 = 0.5 h; 3 = 1 h; 4 = 2 h; 5 = More than 2 h; Others = Missing
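To illustrate how recodings like those listed in this appendix might be applied in a secondary analysis, the following sketch maps raw responses for a frequency-type teaching-practice item (such as T060601) onto the recoded values above, with any other response treated as missing. The response strings, function name, and example values are illustrative assumptions only, not the actual NAEP data codes or processing code.

# Illustrative recoding of a frequency-type teaching-practice item such as T060601.
# The raw response strings are assumptions for illustration; actual NAEP files
# carry their own response codes and are read with NAEP's secondary-use tools.
FREQUENCY_RECODE = {
    "Almost every day": 1,
    "Once or twice a week": 2,
    "Once or twice a month": 3,
    "Never or hardly ever": 4,
}

def recode_frequency(raw_response):
    """Return the recoded value 1-4, or None for 'Others = Missing'."""
    return FREQUENCY_RECODE.get(raw_response)

print(recode_frequency("Once or twice a week"))  # -> 2
print(recode_frequency("Omitted"))               # -> None, i.e., treated as missing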
Appendix B
Variables Related to Family Background and Home Environment Measured in 1996 for Grades 4 and 8 NAEP Science
NAEP variable | Grade | NAEP variable label | Recoded NAEP variable values
B003001A | 4, 8 | Which race/ethnicity best describes you? | 1 = White; 2 = Black; 3 = Hispanic; 4 = Asian/Pacific; 5 = American Indian; 6 = Others
B008001A | 4, 8 | How long have you lived in the United States? | 1 = All my life; 2 = >5 years; 3 = 3–5 years; 4 = <3 years; Others = Missing
B003201A | 4, 8 | How often are languages other than English spoken at home? | 1 = Never; 2 = Sometimes; 3 = Always; Others = Missing
B008601A | 4, 8 | How much education did your mother receive? | 1 = Did not finish high school; 2 = Graduated from high school; 3 = Some education after high school; 4 = Graduated from college; Others = Missing
B008701A | 4, 8 | How much education did your father receive? | 1 = Did not finish high school; 2 = Graduated from high school; 3 = Some education after high school; 4 = Graduated from college; Others = Missing
B008801A | 4, 8 | About how many books are there in your home? | 1 = None; 2 = 1–10; 3 = 11–25; 4 = 26–100; 5 = More than 100; Others = Missing
B000901A | 4, 8 | Does your family get a newspaper regularly? | 1 = Yes; 2 = No; Others = Missing
B000903A | 4, 8 | Is there an encyclopedia in your home? | 1 = Yes; 2 = No; Others = Missing
B000905A | 4, 8 | Does your family get magazines regularly? | 1 = Yes; 2 = No; Others = Missing
B008901A | 4, 8 | Do you have your own study desk or table at home? | 1 = Yes; 2 = No; Others = Missing
B009001A | 4, 8 | How much television/video tapes do you watch on a school day? | 1 = None; 2 = 1 h or less; 3 = 2 h; 4 = 3 h; 5 = 4 h; 6 = 5 h; 7 = 6 h or more; Others = Missing
B006601A | 4, 8 | How much time do you spend on homework? | 1 = Do not usually have; 2 = Have but do not do; 3 = 0.5 h or less; 4 = 1 h; 5 = More than 1 h; Others = Missing
B009101A | 4, 8 | How many hours of extra reading do you do per week not connected with school? | 1 = None; 2 = 1–2 h; 3 = 3–4 h; 4 = 5–6 h; 5 = 7–8 h; 6 = 9–10 h; 7 = More than 10 h; Others = Missing
B009201A/B010101A | 4, 8 | How many times have you changed school since first grade? | 1 = None; 2 = 1; 3 = 2; 4 = 3 or more; 5 = 4; 6 = 5; 7 = 6; 8 = 7 or more; Others = Missing
B005601A | 4, 8 | Does your mother or stepmother live at home with you? | 1 = Yes; 2 = No; Others = Missing
B005701A | 4, 8 | Does your father or stepfather live at home with you? | 1 = Yes; 2 = No; Others = Missing
B007401A | 4, 8 | Do you discuss your studies at home? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
B009301A | 4, 8 | How often do you use a home computer for schoolwork? | 1 = Almost every day; 2 = Once or twice a week; 3 = Once or twice a month; 4 = Never or hardly ever; Others = Missing
B009501A/B009601A | 4, 8 | Does your mother or stepmother work at a job for pay? | 1 = Yes; 2 = No; Others = Missing
B009502A/B009602A | 4, 8 | Does your father or stepfather work at a job for pay? | 1 = Yes; 2 = No; Others = Missing
Appendix C
Variables Related to School Context Measured in 1996 for Grades 4 and 8 NAEP Science
NAEP variable | Grade | NAEP variable label | Recoded NAEP variable values
C030901 | 4 | What best describes how 4th grades are organized? | 1 = Self-contained; 2 = Departmentalized; 3 = Regrouped; Others = Missing
C034201 | 8 | What best describes how 8th grades are organized? | 1 = Self-contained; 2 = Semi-departmentalized; 3 = Departmentalized; Others = Missing
C037101 | 4 | Are 4th graders assigned by ability/achievement level? | 1 = Yes; 2 = No; Others = Missing
C034402 | 8 | Are 8th graders assigned to math by ability? | 1 = Yes; 2 = No; Others = Missing
C034403 | 8 | Are 8th graders assigned to science by ability? | 1 = Yes; 2 = No; Others = Missing
C034401 | 8 | Are 8th graders assigned to English by ability? | 1 = Yes; 2 = No; Others = Missing
C034406 | 8 | Are 8th graders assigned to arts by ability? | 1 = Yes; 2 = No; Others = Missing
C034510 | 8 | How often do 8th graders receive computer science instruction? | 1 = Every day; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C034511/C031212 | 8, 4 | How often do 8th/4th graders receive math instruction? | 1 = Every day; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C034512/C031205 | 8, 4 | How often do 8th/4th graders receive science instruction? | 1 = Every day; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C034513 | 8 | How often do 8th graders receive English instruction? | 1 = Every day; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C031213 | 4 | How often do 4th graders receive English instruction? | 1 = Every day; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C034514/C031214 | 8, 4 | How often do 8th/4th graders receive arts instruction? | 1 = Every day; 2 = Three or four times a week; 3 = Once or twice a week; 4 = Less than once a week; 5 = Subject not taught; Others = Missing
C031603 | 8, 4 | Has math been identified as a priority? | 1 = Yes; 0 = No; Others = Missing
C031607 | 8, 4 | Has science been identified as a priority? | 1 = Yes; 0 = No; Others = Missing
C031611/C031601 | 8, 4 | Has English/reading been identified as a priority? | 1 = Yes; 2 = No; Others = Missing
C031610 | 8, 4 | Has arts been identified as a priority? | 1 = Yes; 2 = No; Others = Missing
C031606 | 8, 4 | Has subject integration been a priority? | 1 = Yes; 2 = No; Others = Missing
C034601 | 8 | Do schools offer 8th graders algebra for high school credit? | 1 = Yes; 2 = No; Others = Missing
C035701 | 8, 4 | Are computers available all the time in classroom? | 1 = Yes; 2 = No; Others = Missing
C035702 | 8, 4 | Are computers grouped in a separate laboratory and available? | 1 = Yes; 2 = No; Others = Missing
C035703 | 8, 4 | Are computers available to bring to the room when needed? | 1 = Yes; 2 = No; Others = Missing
C037201 | 8, 4 | Is it a school with special focus on math? | 1 = Yes; 2 = No; Others = Missing
C037202 | 8, 4 | Is it a school with special focus on science? | 1 = Yes; 2 = No; Others = Missing
C037203 | 8 | Is it a school with special focus on English? | 1 = Yes; 2 = No; Others = Missing
C037204 | 8, 4 | Is it a school with special focus on arts? | 1 = Yes; 2 = No; Others = Missing
C037205 | 8, 4 | Is it a school with special focus on other? | 1 = Yes; 2 = No; Others = Missing
C037206 | 8, 4 | Is the school not a special focus school? | 1 = Yes; 2 = No; Others = Missing
C037207 | 4 | Is it a school with a special focus on reading? | 1 = Yes; 2 = No; Others = Missing
C037301 | 8, 4 | Does it follow district/state math curriculum? | 1 = Yes; 2 = No; Others = Missing
C037302 | 8, 4 | Does it follow district/state science curriculum? | 1 = Yes; 2 = No; Others = Missing
C037303 | 4 | Does it follow district/state reading curriculum? | 1 = Yes; 2 = No; Others = Missing
C037306 | 8 | Does it follow district/state English curriculum? | 1 = Yes; 2 = No; Others = Missing
C037304 | 8, 4 | Does it follow district/state arts curriculum? | 1 = Yes; 2 = No; Others = Missing
C037305 | 8, 4 | Does it follow no district/state curriculum? | 1 = Yes; 2 = No; Others = Missing
C039401 | 8, 4 | In the past year was there an 8th/4th grade field trip for math? | 1 = Yes; 2 = No; Others = Missing
C039402 | 8, 4 | In the past year was there an 8th/4th grade field trip for science? | 1 = Yes; 2 = No; Others = Missing
C039403 | 8, 4 | In the past year was there an 8th/4th grade field trip for reading? | 1 = Yes; 2 = No; Others = Missing
C039404 | 8, 4 | In the past year was there an 8th/4th grade field trip for arts? | 1 = Yes; 2 = No; Others = Missing
C039405 | 8, 4 | In the past year was there an 8th/4th grade field trip for other subjects? | 1 = Yes; 2 = No; Others = Missing
C039406 | 8, 4 | In the past year was there no 8th/4th grade field trip for any of the above? | 1 = Yes; 2 = No; Others = Missing
C039501 | 8, 4 | Do 8th/4th graders participate in extracurricular activities for math? | 1 = Yes; 2 = No; Others = Missing
C039502 | 8, 4 | Do 8th/4th graders participate in extracurricular activities for science? | 1 = Yes; 2 = No; Others = Missing
C039503 | 8, 4 | Do 8th/4th graders participate in extracurricular activities for English language/reading? | 1 = Yes; 2 = No; Others = Missing
C039504 | 8, 4 | Do 8th/4th graders participate in extracurricular activities for arts? | 1 = Yes; 2 = No; Others = Missing
C039505 | 8, 4 | Do 8th/4th graders not participate in extracurricular activities for any of the above? | 1 = Yes; 2 = No; Others = Missing
C039601 | 8, 4 | Do 8th/4th graders participate in summer programs for math? | 1 = Yes; 2 = No; Others = Missing
C039602 | 8, 4 | Do 8th/4th graders participate in summer programs for science? | 1 = Yes; 2 = No; Others = Missing
C039603 | 8, 4 | Do 8th/4th graders participate in summer programs for English language/reading? | 1 = Yes; 2 = No; Others = Missing
C039604 | 8, 4 | Do 8th/4th graders participate in summer programs for arts? | 1 = Yes; 2 = No; Others = Missing
C039605 | 8, 4 | Do 8th/4th graders not participate in summer programs for any of the above? | 1 = Yes; 2 = No; Others = Missing
C036601 | 8, 4 | What is the primary way library is staffed? | 1 = No library in school; 2 = Library – no staff or volunteer staff; 3 = Part-time staff; 4 = Full-time staff; Others = Missing
C032207 | 8, 4 | Do you involve parents as aides in classroom? | 1 = Yes, routine; 2 = Yes, occasional; 3 = No; Others = Missing
C032209 | 8, 4 | Do you have parents review/sign homework? | 1 = Yes, routine; 2 = Yes, occasional; 3 = No; Others = Missing
C032210 | 8, 4 | Do you assign homework for students to do with parents? | 1 = Yes, routine; 2 = Yes, occasional; 3 = No; Others = Missing
C032211 | 8, 4 | Do you have a parent volunteer program? | 1 = Yes, routine; 2 = Yes, occasional; 3 = No; Others = Missing
C037701 | 8, 4 | What is the percentage of parents in parent–teacher organization? | 1 = 0–25%; 2 = 26–50%; 3 = 51–75%; 4 = 76–100%; Others = Missing
C037702 | 4 | What is the percentage of parents in open house/back to school night? | 1 = 0–25%; 2 = 26–50%; 3 = 51–75%; 4 = 76–100%; Others = Missing
C037703 | 8, 4 | What is the percentage of parents in parent–teacher conferences? | 1 = 0–25%; 2 = 26–50%; 3 = 51–75%; 4 = 76–100%; Others = Missing
C037704 | 8, 4 | What is the percentage of parents involved in curriculum decisions? | 1 = 0–25%; 2 = 26–50%; 3 = 51–75%; 4 = 76–100%; Others = Missing
C037705 | 8, 4 | What is the percentage of parents in volunteer programs? | 1 = 0–25%; 2 = 26–50%; 3 = 51–75%; 4 = 76–100%; Others = Missing
C032402 | 8, 4 | Is student absenteeism a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032401 | 8, 4 | Is student tardiness a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032404 | 8, 4 | Are physical conflicts a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032406 | 8, 4 | Is teacher absenteeism a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032407 | 8, 4 | Are racial/cultural conflicts a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032408 | 8, 4 | Is student health a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032409 | 8, 4 | Is lack of parent involvement a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032410 | 8, 4 | Is student alcohol use a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032411 | 8, 4 | Is student tobacco use a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032412 | 8, 4 | Is student drug use a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032413 | 8, 4 | Are gang activities a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032414 | 8, 4 | Is student misbehavior a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032415 | 8, 4 | Is student cheating a problem in your school? | 1 = Serious; 2 = Moderate; 3 = Minor; 4 = Not a problem; Others = Missing
C032502 | 8, 4 | How is the teacher morale? | 1 = Very positive; 2 = Somewhat positive; 3 = Somewhat negative; 4 = Very negative; Others = Missing
C032503 | 8, 4 | How are student attitudes toward academic achievement? | 1 = Very positive; 2 = Somewhat positive; 3 = Somewhat negative; 4 = Very negative; Others = Missing
C032505 | 8, 4 | How is the parent support for student achievement? | 1 = Very positive; 2 = Somewhat positive; 3 = Somewhat negative; 4 = Very negative; Others = Missing
C032506 | 8, 4 | How is the regard for school property? | 1 = Very positive; 2 = Somewhat positive; 3 = Somewhat negative; 4 = Very negative; Others = Missing
C033601 | 8, 4 | What is the percentage of students absent on an average day? | 1 = 0–2%; 2 = 3–5%; 3 = 6–10%; 4 = More than 10%; Others = Missing
C036501 | 8, 4 | What is the percentage of teachers absent on an average day? | 1 = 0–2%; 2 = 3–5%; 3 = 6–10%; 4 = More than 10%; Others = Missing
C037801 | 8, 4 | What is the percentage of students still enrolled at the end of year? | 1 = 98–100%; 2 = 95–97%; 3 = 90–94%; 4 = 80–89%; 5 = 70–79%; 6 = 60–69%; 7 = 50–59%; 8 = Less than 50%; Others = Missing
C037901 | 8, 4 | What is the percentage of 8th/4th graders held back and repeating 8th/4th grade? | 1 = 0%; 2 = 1–2%; 3 = 3–5%; 4 = 6–10%; 5 = More than 10%; Others = Missing
C038001 | 8, 4 | What is the percentage of full-time teachers who left before end of school year? | 1 = 0%; 2 = 1–2%; 3 = 3–5%; 4 = 6–10%; 5 = 11–15%; 6 = 16–20%; 7 = More than 20%; Others = Missing
C038301 | 8, 4 | Is the school in the national school lunch program? | 1 = Yes; 2 = No; Others = Missing
C038801 | 8, 4 | Does the school receive chapter 1/Title I funding? | 1 = Yes; 2 = No; Others = Missing
Appendix D
Accuracy Measures of Competence Models
A number of measures describe the accuracy of a model. The most commonly used measures are Recall, Precision, F-measure, and Kappa. To compute these measures, a machine learning algorithm first defines a 2 × 2 contingency table representing the possible outcomes of the classification: true positive (TP), true negative (TN), false positive (FP), and false negative (FN). TP is when an instance belonging to a class is classified into that class; TN is when an instance not belonging to a class is classified into other classes. TP and TN are correct classifications; the higher the rates, the better. FP is when an instance not belonging to a class is classified into that class; FN is when an instance belonging to a class is classified into other classes. FP and FN are errors; the lower the rates, the better. Recall, also known as sensitivity, is defined as the proportion of instances of the class that are correctly identified by the algorithm. Using the contingency table, recall is computed as follows:

Recall = TP / (TP + FN)
Precision, also known as the positive predictive value, is the proportion of instances classified into a class by the algorithm that actually belong to that class. Using the contingency table, precision is computed as follows:

Precision = TP / (TP + FP)
A derived measure, called F-measure, is used to measure the overall accuracy of classification. The F-measure corresponds to the harmonic mean of recall and precision. The F-measure is defined as:

F-measure = (2 × Recall × Precision) / (Recall + Precision) = 2 × TP / (2 × TP + FP + FN)
The F-measure ranges from 0 to 1; the higher the value of the F-measure, the better. Another statistic, called Kappa, is also used to measure the agreement between predicted and observed classification by correcting for agreement by chance. Kappa is defined as:

Kappa = (Pr(agreement) − Pr(chance)) / (1 − Pr(chance))
where Pr(agreement) represents the probability of agreement, and Pr(chance) is the probability that the agreement occurs by chance. The values of Kappa range from −1 to 1, with 1 representing perfect agreement between the model and the data. In addition to the above statistics for describing classification accuracy, a confusion matrix indicates the number of instances that are correctly classified and the number of instances that are incorrectly classified. While the F-measure gives an overall accuracy rate, the confusion matrix shows exactly how many and which instances are correctly classified, and how many and which instances are incorrectly classified.
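As an illustration of how these measures are computed (this sketch is not part of the original analyses), the following short Java program applies the formulas above to the cell counts of the Grade 4 classroom-teaching-practices confusion matrix reported below, treating class U as the positive class; the class name and output format are arbitrary choices.

// Minimal sketch: computing Recall, Precision, F-measure, and Kappa from a
// 2 x 2 contingency table. The counts below are taken from the Grade 4
// classroom teaching practices confusion matrix (class U treated as positive).
public class AccuracyMeasures {
    public static void main(String[] args) {
        double tp = 5094, fn = 223, fp = 1608, tn = 377;
        double total = tp + fn + fp + tn;

        double recall = tp / (tp + fn);                    // sensitivity
        double precision = tp / (tp + fp);                 // positive predictive value
        double fMeasure = 2 * recall * precision / (recall + precision);

        // Chance agreement uses the row (actual) and column (predicted) marginals.
        double prAgreement = (tp + tn) / total;
        double prChance = ((tp + fn) * (tp + fp) + (fn + tn) * (fp + tn)) / (total * total);
        double kappa = (prAgreement - prChance) / (1 - prChance);

        System.out.printf("Recall=%.2f Precision=%.2f F-measure=%.2f Kappa=%.2f%n",
                recall, precision, fMeasure, kappa);
    }
}

Run on these counts, the sketch reproduces the values in the table that follows: Recall 0.96, Precision 0.76, F-measure 0.85, and Kappa of approximately 0.2.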
Accuracy Measures: Grade 4 Model Based on Classroom Teaching Practices

Correctly classified instances      5,471 (74.9%)
Incorrectly classified instances    1,831 (25.1%)
Kappa statistic                     0.2
Mean absolute error                 0.4
Root mean squared error             0.4
Relative absolute error             92.3%
Root relative squared error         96.4%
Total number of instances           7,302
Ignored class unknown instances     759

Detailed accuracy by class
TP rate   FP rate   Precision   Recall   F-measure   Class
0.96      0.81      0.76        0.96     0.85        U
0.19      0.04      0.63        0.19     0.30        S

Confusion matrix
    a       b    <-- classified as
 5,094     223   | a = U
 1,608     377   | b = S
Accuracy Measures: Grade 8 Model Based on Classroom Teaching Practices

Correctly classified instances      5,957 (76.7%)
Incorrectly classified instances    1,814 (23.3%)
Kappa statistic                     0.2
Mean absolute error                 0.4
Root mean squared error             0.4
Relative absolute error             94.2%
Root relative squared error         97.0%
Total number of instances           7,771
Ignored class unknown instances     429

Detailed accuracy by class
TP rate   FP rate   Precision   Recall   F-measure   Class
0.98      0.88      0.77        0.98     0.86        U
0.12      0.018     0.70        0.12     0.21        S

Confusion matrix
    a       b    <-- classified as
 5,714     105   | a = U
 1,709     243   | b = S
Accuracy Measures: Grade 4 Model Based on Home Environment Variables

Correctly classified instances      5,402 (74.0%)
Incorrectly classified instances    1,900 (26.0%)
Kappa statistic                     0.3
Mean absolute error                 0.3
Root mean squared error             0.4
Relative absolute error             82.1%
Root relative squared error         94.9%
Total number of instances           7,302
Ignored class unknown instances     759

Detailed accuracy by class
TP rate   FP rate   Precision   Recall   F-measure   Class
0.88      0.63      0.79        0.88     0.83        U
0.38      0.12      0.53        0.37     0.44        S

Confusion matrix
    a       b    <-- classified as
 4,662     655   | a = U
 1,245     740   | b = S
Accuracy Measures: Grade 8 Model Based on Home Environment Variables

Correctly classified instances      5,877 (75.6%)
Incorrectly classified instances    1,897 (24.4%)
Kappa statistic                     0.27
Mean absolute error                 0.32
Root mean squared error             0.42
Relative absolute error             84.6%
Root relative squared error         96.1%
Total number of instances           7,774
Ignored class unknown instances     426

Detailed accuracy by class
TP rate   FP rate   Precision   Recall   F-measure   Class
0.90      0.67      0.80        0.90     0.85        U
0.33      0.097     0.54        0.33     0.41        S

Confusion matrix
    a       b    <-- classified as
 5,215     560   | a = U
 1,337     662   | b = S
Accuracy Measures: Grade 4 Model Based on School Variables

Correctly classified instances      5,472 (74.9%)
Incorrectly classified instances    1,830 (25.1%)
Kappa statistic                     0.23
Mean absolute error                 0.36
Root mean squared error             0.42
Relative absolute error             89.6%
Root relative squared error         95.1%
Total number of instances           7,302
Ignored class unknown instances     759

Detailed accuracy by class
TP rate   FP rate   Precision   Recall   F-measure   Class
0.94      0.75      0.77        0.94     0.85        U
0.25      0.065     0.59        0.25     0.35        S

Confusion matrix
    a       b    <-- classified as
 4,973     344   | a = U
 1,486     499   | b = S
Accuracy Measures: Grade 8 Model Based on School Variables

Correctly classified instances      5,956 (76.7%)
Incorrectly classified instances    1,815 (23.4%)
Kappa statistic                     0.19
Mean absolute error                 0.35
Root mean squared error             0.42
Relative absolute error             92.6%
Root relative squared error         96.4%
Total number of instances           7,771
Ignored class unknown instances     429

Detailed accuracy by class
TP rate   FP rate   Precision   Recall   F-measure   Class
0.96      0.82      0.78        0.96     0.86        U
0.18      0.036     0.62        0.18     0.28        S

Confusion matrix
    a       b    <-- classified as
 5,611     208   | a = U
 1,607     345   | b = S
Appendix E
Tutorial on the Weka Machine Learning Workbench
Weka stands for Waikato Environment for Knowledge Analysis. It was developed at the University of Waikato in New Zealand. Written in Java and distributed under the terms of the GNU General Public License, Weka is a collection of state-of-the-art machine learning algorithms and data preprocessing and representation tools. One of the most popular machine learning programs freely available, Weka runs on virtually all computer platforms, including Mac, PC, and Linux. The current version of Weka is Weka 3.4.
Installing Weka

Go to the Web site at http://www.cs.waikato.ac.nz/~ml/weka/index.html. In the left-hand frame, locate and click the Download link. In the main frame, locate the section corresponding to your computer platform (i.e., Windows, Mac OS X, or other platforms). Click the appropriate link to begin downloading Weka 3.4 alone (if Java VM 1.4 is already installed on your computer) or Weka 3.4 together with Java VM 1.4. Choose Save File to download the program to your computer's hard drive. Run the downloaded installer file and follow the instructions to install Weka 3.4 on your computer.
Starting Weka

After you have successfully installed Weka on your computer, you should see a shortcut created on your computer's desktop (if you chose to create one) and in a program folder titled Weka 3.4.11. Select Weka 3.4 to start the program. You should now see the Weka main graphical user interface as follows. Within the main graphical user interface, you have choices for four different ways of machine learning. Simple CLI allows you to enter Java commands in appropriate text fields to perform specific data mining tasks. Explorer gives you access to all machine learning algorithms using menu selection and form filling. Experimenter allows you to find out interactively which methods and parameter values work best for a given machine learning task. Finally, Knowledge Flow allows you to design configurations for streamed data processing by dragging boxes representing learning algorithms and data sources and joining them to best fit your machine learning task. For novices, Explorer is preferred because it makes all the machine learning algorithms available and allows easy menu selection. The disadvantage of Explorer is its inefficient use of computer memory, because it loads all program code, much of which may not be needed. Computer memory becomes a concern only when your data set is very large. For most machine learning tasks in education, even with data sets as large as the National Assessment of Educational Progress (NAEP), computer memory is not a problem. Click the Explorer button; you should now see the following screen.
Uploading Data File

The initial Explorer interface is Preprocess, which contains three tabs for you to load or identify your data source. If your data are in a table format saved as a file, which is the most common data source for machine learning, you will click Open File. Similarly, if your data are from a Web site, you will click Open URL; and if your data are from a database, you will click Open DB. Let us assume that your data are in a table and saved in a file. Before you upload your data file into Explorer, you will need to ensure that your data are represented in a particular format. The standard format Weka accepts is the Attribute-Relation File Format (ARFF). In an ARFF data file, lines beginning with a % sign are comments. Lines beginning with a @ sign are attribute and relation definitions. The actual data are comma-separated numeric values and strings, with each line representing one data record. One example of data records in the ARFF file format is as follows:

% ARFF file for the energy competence data mining task

@relation "energy competence"
@attribute item_code string
@attribute cognitive_level {Understanding, Applying}
@attribute percentage_correct numeric
@attribute competence {Mastery, Unmastery}
@data
ASMSA09, Understanding, 65.8, Mastery
ASMSA09, Understanding, 72.9, Mastery
ASMSE07, Applying, 50.8, Unmastery

In the above example, the first line describes the data mining task. The blank line is ignored by the computer. The third line gives a title to be used in all outputs. The next four lines define the structure of the data records, with each record consisting of four data fields, namely item_code, cognitive_level, percentage_correct, and competence. The lines following @data are the actual data records. Because data files are commonly in a spreadsheet format or can easily be converted into one, Weka also accepts a data file format called Comma-Separated Values (CSV). A spreadsheet file with the .xls extension can be saved directly as a CSV file with the extension .csv. When Weka uploads a CSV file, it automatically identifies the data attributes and creates an internal ARFF file to process. Click the Open File tab, locate your data file in CSV format, and open the file into the Explorer interface. You will now see a screen as follows.
The example data file is from a study published in the Journal of Research in Science Teaching (Liu, X., and Ruiz, M. (2008). Using data mining to predict K–12 students' performance on large-scale assessment items related to energy. Journal of Research in Science Teaching, 45(5), 554–573). In this data file, 76 items related to energy came from various large-scale science assessments, including NAEP and the Trends in International Mathematics and Science Study (TIMSS). Each item was classified by three attributes: type of content, cognitive level, and context. The types of content were in the following hierarchical categories: energy's ability to do work or being responsible for various activities (activity/work), energy sources and forms (form/source), energy transfer (transfer), energy degradation (degradation), and energy conservation (conservation). The identified items were also classified as involving an everyday context, i.e., if an item deals with a phenomenon related to typical American students' everyday experiences, or a non-everyday context, i.e., if an item deals with a phenomenon beyond typical American students' everyday experiences. The cognitive levels were conceptual understanding and reasoning/investigating. In addition, each item had an attribute of percentage correct (weighted percentage correct for the entire population), an attribute of student grade level, and an attribute of satisfactory (S) or unsatisfactory (U) performance using a cutoff value of 55% correct. The objective of the data mining task for this study was to identify significant predictors of students' performance on energy-related items in terms of percentage correct and competence level (S or U).

After the data file has been loaded, a number of functions are available to allow you to examine the structure and accuracy of your data. First, the number of instances, i.e., data records, and the number of attributes are displayed. For each attribute, its type (numerical or string), number of missing values, number of distinct values, and percentage of unique values are displayed. A histogram is also displayed to help examine the data structure and accuracy. If an unusual data structure or abnormal data value is found, the original data file may contain errors. Changes can be made to the data file using MS Excel or MS Word, or the Edit function within the Explorer interface. When using the Edit function within the Explorer interface, right-click over an attribute name. You may also delete the attributes you do not need for a particular machine learning task by selecting those attributes and then clicking the Remove button. Another way to edit and modify your data is to use Filters, which are available by clicking the Choose button. After changes are made, you should save your data file. The Visualize tab at the top of the interface allows you to view scatter plots between attributes. After you have examined and ensured that your data are accurate, you are ready to perform machine learning.
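For readers who prefer to work outside the Explorer interface, the same kind of data file can also be loaded through Weka's Java API. The sketch below is a minimal illustration only, not code from the book's analyses; the file name energy_competence.arff is hypothetical, and the last attribute is assumed to be the class (competence) attribute.

import java.io.BufferedReader;
import java.io.FileReader;
import weka.core.Instances;

// Minimal sketch: load an ARFF data file with the Weka API.
// The file name is hypothetical; adjust it to your own data file.
public class LoadData {
    public static void main(String[] args) throws Exception {
        BufferedReader reader = new BufferedReader(new FileReader("energy_competence.arff"));
        Instances data = new Instances(reader);          // parses the ARFF header and data records
        reader.close();
        data.setClassIndex(data.numAttributes() - 1);    // assume the class attribute is last
        System.out.println(data.numInstances() + " instances, "
                + data.numAttributes() + " attributes");
    }
}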
Performing Machine Learning

There are over 100 machine learning algorithms implemented in Weka. These algorithms are grouped into the following categories: Classify, Cluster, Associate, and Select Attributes. A brief description of these algorithms is available in Appendix F.
Next, we will demonstrate two machine learning algorithms: a decision tree using J48 and a linear function using linear regression; both are available within the Classify interface. The purpose of building a decision tree is to predict students' status of competence, i.e., satisfactory (>55%) or unsatisfactory (≤55%), using item attributes (e.g., content type, cognitive level, student grade level). Before performing this machine learning task, the data file was modified to keep only the following attributes: item content type (content), item cognitive level (cognitive), performance grade level (population), item context (context), and competence status (standard). Click Classify, and choose the J48 function as shown below.
After selecting J48 with a double click, you will see the following interface with the Start button visible. Make sure that the box above the Start button shows the target to be predicted, i.e., standard in this example. Click the Start button to activate the machine learning process. In a few moments, the status at the bottom will show OK, indicating the completion of the task. The screen should look like this.
In the Classifier Output panel, you will see basic information about the process. Scroll down the window to find the accuracy measures of the constructed decision tree (as follows).
To visualize the built decision tree, move your mouse over the highlighted task within the Result List panel, right-click, and choose Visualize tree; a new pop-up window opens showing the decision tree. Maximize the window to full screen, right-click within the new window, and use Fit to Screen and Auto Scale to obtain the best visualization effect. The decision tree is as follows.
In order to better represent your decision tree, particularly for publication purposes, you may use separate graphics software to redraw the tree in a more easy-to-read form, as in the figure that follows.

[Decision tree figure. The tree splits first on Cognitive Level (Understanding vs. Reasoning and Investigating), then on Context (everyday vs. non-everyday), Population (elementary vs. middle and high school), and Content (work vs. form/transfer/degradation/conservation). The leaves are labeled satisfactory (4/1), unsatisfactory (13/0), satisfactory (33/11), unsatisfactory (12/3), and unsatisfactory (14/1).]
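The same J48 analysis can also be run programmatically rather than through the Explorer menus. The following is a minimal sketch, not the study's actual code; the file name is hypothetical, the class attribute is assumed to be the last one, and the tenfold cross-validation mirrors the Explorer default described later in this appendix.

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;

// Minimal sketch: build and evaluate a J48 decision tree with the Weka API.
public class DecisionTreeDemo {
    public static void main(String[] args) throws Exception {
        BufferedReader reader = new BufferedReader(new FileReader("energy_competence.arff"));
        Instances data = new Instances(reader);
        reader.close();
        data.setClassIndex(data.numAttributes() - 1);      // competence status as the class

        J48 tree = new J48();                              // the C4.5 decision tree learner
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(tree, data, 10, new Random(1));   // tenfold cross-validation

        tree.buildClassifier(data);                        // final tree built on all instances
        System.out.println(tree);                          // textual form of the decision tree
        System.out.println(eval.toSummaryString());        // accuracy measures
        System.out.println(eval.toMatrixString());         // confusion matrix
    }
}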
To perform another machine learning task, it is often necessary to reopen your data file and modify it again, because different machine learning tasks may make use of different data attributes. For example, the above decision tree task uses a nominal attribute (satisfactory or unsatisfactory) for competence. If you are going to perform a linear regression, you will need to use a numerical attribute for student performance, i.e., percentage correct. Let us perform another machine learning task: linear function/linear regression. Reload your data file, and modify it by keeping only the following attributes: content, cognitive, context, population, and percentage. Choose LinearRegression within Functions. Again, make sure that the predicted attribute, percentage, is shown in the box above the Start button. Click the Start button, and let Weka complete its process. The following screen will then appear.
The Classifier Output window shows the regression model, as well as its accuracy measures.
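As with the decision tree, this regression task can be reproduced through the Java API. The sketch below is illustrative only; the file name is hypothetical, and the numeric attribute percentage correct is assumed to be the last (class) attribute.

import java.io.BufferedReader;
import java.io.FileReader;
import weka.classifiers.functions.LinearRegression;
import weka.core.Instances;

// Minimal sketch: fit a linear regression model to a numeric target with the Weka API.
public class RegressionDemo {
    public static void main(String[] args) throws Exception {
        BufferedReader reader = new BufferedReader(new FileReader("energy_percentage.arff"));
        Instances data = new Instances(reader);
        reader.close();
        data.setClassIndex(data.numAttributes() - 1);   // percentage correct as the target

        LinearRegression model = new LinearRegression();
        model.buildClassifier(data);                    // fits the least squares regression
        System.out.println(model);                      // prints the fitted regression equation
    }
}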
Refining Procedures to Obtain Best Results

On the Test Options panel, there are a few options available for performing machine learning and calculating accuracy measures. Machine learning typically performs two processes: (a) learning or discovering rules and (b) testing rules. The two processes use two different data sets, with the first process using a training data set and the second process using a testing data set. Accuracy measures are estimated based on the testing data set. Both the training and testing data sets are randomly created from the original data set. The Weka default is tenfold cross-validation. Tenfold cross-validation refers to the strategy by which the original data set is randomly split into ten roughly equal data sets. One data set is then withheld as the testing data set and the remaining nine data sets are used as the training data set. The process repeats ten times, with each data set being used once as the testing data set, and the accuracy measures are calculated as an average over the ten testing runs. Tenfold cross-validation is used as the default because research has shown that this strategy produces the best classification rules and the most accurate estimates of classification precision. Obviously, this strategy assumes a large data set. Sometimes a strategy other than the standard tenfold cross-validation may produce better results. The option Use training set refers to using the same original data set as both the training data set and the testing data set. It is obvious that this strategy will result in much higher values of the accuracy measures; the accuracy measures under the option Use training set may be considered the upper bound or most optimistic scenario of machine learning. Users may also supply an external data set as the test data set, or specify a particular percentage for splitting the original data set for training and testing. For more information on how to use Weka, please refer to Witten, I. H., and Frank, E. (2005). Data Mining: Practical Machine Learning Tools and Techniques. San Francisco, CA: Morgan Kaufmann Publishers.
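These test options can also be reproduced in code. The following minimal sketch shows a simple percentage split rather than cross-validation; the 66%/34% split ratio, random seed, and file name are arbitrary illustrative choices, not values used in the book's analyses.

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.Random;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;

// Minimal sketch: evaluate a classifier with a percentage split (66% train, 34% test).
public class SplitEvaluation {
    public static void main(String[] args) throws Exception {
        BufferedReader reader = new BufferedReader(new FileReader("energy_competence.arff"));
        Instances data = new Instances(reader);
        reader.close();
        data.setClassIndex(data.numAttributes() - 1);

        data.randomize(new Random(1));                      // shuffle before splitting
        int trainSize = (int) Math.round(data.numInstances() * 0.66);
        int testSize = data.numInstances() - trainSize;
        Instances train = new Instances(data, 0, trainSize);
        Instances test = new Instances(data, trainSize, testSize);

        J48 tree = new J48();
        tree.buildClassifier(train);                        // learn rules on the training set

        Evaluation eval = new Evaluation(train);
        eval.evaluateModel(tree, test);                     // test rules on the held-out set
        System.out.println(eval.toSummaryString());
    }
}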
Appendix F
Machine Learning Algorithms Implemented in Weka
Classifier: Bayesian Classifiers

Averaged One-Dependence Estimators (AODE): Averages over a space of alternative Bayesian models that have weaker independence assumptions.
BayesNet: Learns Bayesian networks under the assumptions that attributes are nominal and there are no missing values.
ComplementNaiveBayes: Builds a Complement Naive Bayes classifier.
NaiveBayes: Builds a standard probabilistic Naive Bayes classifier.
NaiveBayesMultinomial: Implements the multinomial version of Naive Bayes.
NaiveBayesSimple: Simple implementation of Naive Bayes.
NaiveBayesUpdateable: Incremental Naive Bayes classifier that learns one instance at a time.
Classifier: Trees

ADTree: Builds alternating decision trees using boosting and is optimized for two-class problems.
DecisionStump: Builds one-level decision trees.
Id3: Implements the basic divide-and-conquer decision tree algorithm.
J48: Implements the C4.5 decision tree learner.
LMT: Builds logistic model trees for binary and multiclass target variables, numerical and nominal attributes, and missing values.
M5P: Implements the M5' model tree learner.
NBTree: Builds a decision tree with Naive Bayes classifier at the leaves.
RandomForest: Constructs random forests.
RandomTree: Constructs a tree that considers a given number of random features at each node.
REPTree: Implements a fast tree learner that uses reduced-error pruning.
UserClassifier: Allows users to build their own decision tree.
Classifier: Rules

ConjunctiveRule: Learns a single rule that predicts either a numerical or nominal class value.
DecisionTable: Builds a decision table majority classifier.
JRip: Implements the RIPPER algorithm for fast, effective rule induction.
M5Rules: Obtains regression rules from model trees built using M5'.
Nnge: Implements the nearest-neighbor method for generating rules using non-nested generalized exemplars.
OneR: Generates a one-level decision tree expressed in the form of a set of rules that all test one particular attribute.
Part: Obtains rules from partial decision trees.
Prism: Implements the elementary covering algorithm for rules.
Ridor: Learns rules with exceptions by generating the default rule and using incremental reduced-error pruning to find exceptions with the smallest error rate.
ZeroR: Predicts the test data's majority class (if nominal) or average value (if numeric).
Classifier: Functions

LeastMedSq: Implements the robust linear regression method that minimizes the median (rather than the mean) of the squares of divergences from the regression line.
LinearRegression: Performs the standard least squares linear regression and can optionally perform attribute selection using backward elimination or a full model.
Logistic: Builds linear logistic regression models.
MultilayerPerceptron: Builds a neural network that trains using back-propagation.
PaceRegression: Builds linear regression models using the Pace regression technique.
RBFNetwork: Implements a Gaussian radial basis function network.
SimpleLinearRegression: Builds a linear regression model based on a single attribute.
SimpleLogistic: Builds linear logistic regression models with built-in attribute selection.
SMO: Implements the sequential minimal optimization algorithm for training a support vector classifier.
SMOreg: Implements the sequential minimal optimization algorithm for regression problems.
VotedPerceptron: Implements the voted perceptron algorithm.
Winnow: Implements a mistake-driven perceptron with multiplicative updates.
Classifier: Lazy

IB1: Implements the basic instance-based learner that finds the training instance closest in Euclidean distance to the given test instance and predicts the same class as this training instance.
IBk: Implements the k-nearest-neighbor classifier that uses the same distance metric.
KStar: Implements the nearest-neighbor method with a generalized distance function based on transformations.
Lazy Bayesian Rules (LBR): Bayesian classifier that defers all processing to classification time.
Locally Weighted Learning (LWL): Implements a general algorithm for locally weighted learning.
Classifier: Miscellaneous

Hyperpipes: Records the range of values observed in the training data for each attribute and works out which ranges contain the attribute values of a test instance.
VFI: Constructs intervals around each class by discretizing numeric attributes and using point intervals for nominal ones, records class counts for each interval on each attribute, and classifies test instances by voting.
Classifier: Metalearning

AdaBoostM1: Boosts using the AdaBoostM1 method.
AdditiveRegression: Enhances the performance of a regression method by iteratively fitting the residues.
AttributeSelectedClassifier: Reduces dimensionality of data by attribute selection.
Bagging: Bags a classifier to reduce variance; works for regression too.
ClassificationViaRegression: Performs classification using a regression method.
CostSensitiveClassifier: Makes its base classifier cost-sensitive.
CVParameterSelection: Performs parameter selection by cross-validation.
Decorate: Builds ensembles of classifiers by using specially constructed artificial training examples.
FilteredClassifier: Runs a classifier on filtered data.
Grading: Metalearner whose inputs are base-level predictions that have been marked as correct or incorrect.
LogitBoost: Performs additive logistic regression.
MetaCost: Makes a classifier cost-sensitive.
MultiBoostAB: Combines boosting and bagging using the MultiBoosting method.
MultiClassClassifier: Uses a two-class classifier for multiclass datasets.
MultiScheme: Uses cross-validation to select a classifier from several candidates.
OrdinalClassClassifier: Applies standard classification algorithms to problems with an ordinal class value.
RacedIncrementalLogitBoost: Performs batch-based incremental learning by racing logit-boosted committees.
RandomCommittee: Builds an ensemble of randomizable base classifiers.
RegressionByDiscretization: Discretizes the class attribute and employs a classifier.
Stacking: Combines several classifiers using the stacking method.
StackingC: More efficient version of stacking.
ThresholdSelector: Optimizes the F-measure for a probabilistic classifier.
Vote: Combines classifiers using an average of probability estimates or numeric predictions.
Clustering

EM: Clusters using expectation maximization.
Cobweb: Implements both the Cobweb algorithm for nominal attributes and the Classit algorithm for numeric attributes.
FarthestFirst: Implements the farthest-first traversal algorithm.
MakeDensityBasedClusterer: A metaclusterer that wraps a clustering algorithm to make it return a probability distribution and density.
SimpleKMeans: Clusters data using k-means.
Association

Apriori: Finds association rules using the Apriori algorithm.
PredictiveApriori: Finds association rules sorted by predictive accuracy.
Tertius: Finds rules according to a confirmation measure.
Attribute Selection

Attribute selection is done by searching the space of attribute subsets and evaluating each one (a brief API sketch follows the two lists below).
Attribute Selection: Evaluation

CfsSubsetEval: Considers the predictive value of each attribute individually, along with the degree of redundancy among them.
ClassifierSubsetEval: Uses a classifier to evaluate the attribute set.
ConsistencySubsetEval: Projects the training set onto an attribute set and measures consistency in class values.
WrapperSubsetEval: Uses a classifier plus cross-validation.
ChiSquaredAttributeEval: Computes the chi-squared statistic of each attribute with respect to the class.
GainRatioAttributeEval: Evaluates attributes based on the gain ratio.
InfoGainAttributeEval: Evaluates attributes based on information gain.
OneRAttributeEval: Uses OneR's methodology to evaluate attributes.
PrincipalComponents: Performs principal components analysis and transformation.
ReliefFAttributeEval: Implements an instance-based attribute evaluator.
SVMAttributeEval: Uses a linear support vector machine to determine the value of attributes.
SymmetricalUncertAttributeEval: Evaluates attributes based on symmetric uncertainty.
Attribute Selection: Search

BestFirst: Greedy hill-climbing with backtracking.
ExhaustiveSearch: Searches exhaustively.
GeneticSearch: Searches using a simple genetic algorithm.
GreedyStepwise: Greedy hill-climbing without backtracking.
RaceSearch: Uses race search methodology.
RandomSearch: Searches randomly.
RankSearch: Sorts the attributes and ranks promising subsets using an attribute subset evaluator.
Ranker: Ranks individual attributes (not subsets) according to their evaluation.
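As a minimal illustration of how an evaluator and a search method from the lists above are combined, the following sketch pairs CfsSubsetEval with BestFirst through the Weka API; the file name is hypothetical, and the class attribute is assumed to be the last one.

import java.io.BufferedReader;
import java.io.FileReader;
import weka.attributeSelection.AttributeSelection;
import weka.attributeSelection.BestFirst;
import weka.attributeSelection.CfsSubsetEval;
import weka.core.Instances;

// Minimal sketch: select attributes with CfsSubsetEval and a BestFirst search.
public class SelectAttributesDemo {
    public static void main(String[] args) throws Exception {
        BufferedReader reader = new BufferedReader(new FileReader("energy_competence.arff"));
        Instances data = new Instances(reader);
        reader.close();
        data.setClassIndex(data.numAttributes() - 1);

        AttributeSelection selector = new AttributeSelection();
        selector.setEvaluator(new CfsSubsetEval());      // evaluates attribute subsets
        selector.setSearch(new BestFirst());             // greedy hill-climbing with backtracking
        selector.SelectAttributes(data);                 // note the capitalized method name in Weka

        int[] selected = selector.selectedAttributes();  // indices of the retained attributes
        System.out.println(selector.toResultsString());
        System.out.println("Selected attribute indices: " + java.util.Arrays.toString(selected));
    }
}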
Author Index
A Alspaugh, J., 66 Alvarez, B., 46 Anderson, L.W., 8 Atwater, M.M., 3 Averett, A., 67
B Barbour, C., 65 Barbour, N.H., 65 Barrow, L.H., 66 Bates, P.E., 22 Ben-Zvi, N., 8 Bidwell, C.E., 66 Black, P., 23 Blank, R.K., 10 Blank, W.E., 3 Bloom, B.S., 46 Blosser, P.E., 22 Boone, W., 5 Bourdieu, P., 45 Boutonné, S., 23 Bowen, N.K., 46, 67 Bransford, J.D., 3, 19, 20, 43 Brown, A.L., 3, 19, 20, 51 Buckley, B.C., 22 Bunch, M.B., 8 Burstein, L., 65 Bybee, R.W., 8
C Campbell, E.Q., 44 Carroll, J., 8 Cavanagh, S., 47 Chiappetta, E.L., 22 Cizek, G.J., 8
Coatsworth, J.D., 5 Cocking, R.R., 3, 19, 20, 51 Coleman, J.S., 44–46 Comer, J.P., 3, 9 Conway, K.S., 67 Cornbleth, C., 65
D Dewey, J., 4, 9
E Edelson, D.C., 22 Elliott, M., 10 Epstein, J., 67
F Falk, J.H., 43 Fargo, J.D., 21 Fischer, K.B., 65 Frank, E., 14, 16 Fraser, B., 21 Fuhrman, S., 3, 9
G Gallagher, J.J., 23, 24 Garmezy, N., 5 Gee, J.P., 10 Gerlits, B., 22 Gest, S.D., 5 Gobert, J.D., 22 Gorman, S., 47 Guiton, G., 9 Gunstone, R., 23
H Haertel, E.H., 10 Haile, G.A., 45, 46 Han, J., 14 Harrison, C., 23 Hausafus, C.O., 46 Hill, S.T., 44 Hilton, J.K., 66 Hobson, C.J., 44 Horvat, E.M., 46 Horwitz, P., 22 Houtenville, A.J., 67 Huang, T.-Y., 24 Husén, T., 9
J Johnson, C.C., 21 Jones, T., 22 Jordan, C., 67
K Kahle, J.B., 21 Kamber, M., 14 Kellaghan, T., 46 Kindfield, A.C.H., 22 Kingston, P., 46 Kinzie, J., 45 Koballa, T.R., 22 Krathwohl, D.R., 8 Kress, G., 23
L Ladson-Billings, G., 2 LaNasa, S.M., 24 Lareau, A., 46 Lazarowitz, R., 22 Lazerson, M., 3, 9 Lee, C., 23 Lee, H.S., 22 Lee, J.-S., 45, 46, 67 Lee, O., 2, 3 Lee, Y.-H., 24 Levine, D.I., 65, 66 Lissitz, R.W., 21 Liu, H., 14 Liu, X., 5, 122 Lucas, K.B., 23 Luykx, A., 2, 3 Lynch, S.J., 3
M Malina, E., 22 Mardis, M.A., 66 Marjoribanks, K., 45, 46 Martin, I., 23 Masten, A.S., 5 McDonnell, L.M., 8 McGillicuddy, K., 23 McNeal, R.B., 46 McPartland, J., 44 McRobbie, C.J., 23 Metcalf, S.J., 22 Miller, M.D., 65 Miller-Whitehead, M., 67 Milne, C., 23 Mood, A.M., 44 Moss, P.A., 10 Muller, P.A., 45
N Nakhleh, M.B., 22 Neemann, J., 5 Nguyen, A.N., 45, 46 Norberg-Schonfeldt, M., 47
O Oakes, J., 9 Odom, A.L., 24 Ogborn, J., 23 Orozo, E., 67 Otieno, T., 23
P Painter, G., 65, 66 Parr, C.S., 22 Peng, S.S., 44 Petrovich, J., 10 Phillips, M., 44 Polles, J., 22 Pong, S.L., 47 Porter, A., 9 Powell, J.C., 22 Pullin, D.C., 10
R Ravitch, D., 4 Rodriquez, A., 44 Roth, W.-M, 23
S Schroeder, C.M., 24 Scott, P., 24 Shao, Z., 14 Sloane, K., 46 Smith, F.M., 46 Smithson, J.L., 10 Songer, N.B., 22 Spady, W.G., 3 Stage, F.K., 45 Stevens, F.I., 10 Stoddard, E.R., 24
T Tamir, P., 22, 47 Tate, W., 9 Tellegen, A., 5 Tinker, R.F., 22 Tolson, H., 24 Treagust, D.F., 22, 23 Trowbridge, L.W., 22
U Uekawa, K., 66
V van Voorhis, F.L., 67 Von Secker, C.E., 21
W Wang, J., 10, 11 Weinfeld, A.D., 44 Wells, A.S., 10 Wen, M.L., 66 White, R., 23 Wildman, L., 47 William, D., 23 Wilson, M., 6 Witten, I.H., 14, 16
X Xin, D., 14
Y Yasumoto, J.Y., 66 York, R.L., 44 Young, L.J., 10 Yu, C.C., 47
Z Zady, M.F., 46
Subject Index
A Academic aspiration, 46 Accuracy measures confusion matrix, 114–117 contingency table, 113 F-measure, 113, 114 false negative (FN), 113 false positive (FP), 113 Kappa, 113, 114 precision, 113 recall, 113 true negative (TN), 113 true positive (TP), 113 Achievement achievement gap, 2, 3, 21, 43, 45, 46, 87 achievement standard, 4 Advance organizer, 22 Assessment formative assessment, 20, 23, 24 summative assessment, 20
C Capital cultural capital, 45, 46 economic capital, 45 family capital, 45–46 human capital, 46 social capital, 45 Causal comparative, 13 Class size, 10, 67 Competence level, 5, 17, 26, 28, 33, 34, 37, 56, 58, 64, 83, 123 measure, 5–7 model, 4, 13–18, 19–41, 43–64, 65–81, 83, 84, 113–117 scale, 6
Competence-based education, 2, 3 Computer modeling, 22 Conceptual change, 19 Cross-validation, 15, 16, 128, 131–133 Culture, 3, 4, 49, 65, 71, 76, 85, 86 Curriculum orientation of, 65 explicit, 65 hidden, 65 implicit, 65
D Data mining/machine learning, 4, 11, 13–18, 113, 120–123, 127, 128 Discriminant analysis, 13 Diversity, 3, 21
E Educational attainment, 46, 47 Equality democratic conception of, 9 liberal conception of, 9 libertarian conception of, 9 Equity, 1–4, 10, 87 Ethnicity, 3, 21, 43, 45, 49, 50, 57, 59, 64, 87 Experimental design quasi-experimental, 13 randomized experimental, 13 Extended project, 27
F Family background, 43–47, 66, 101–103 Field trip, 25–28, 40, 65, 84, 86, 87 Free and reduced lunch, 67 Free-choice learning, 43
G Guest speaker, 26–28, 40, 84, 86
O Opportunity-to-learn (OTL), 4, 87
H Hands-on, 22, 26–29, 39–41, 84, 85 Home environment, 43, 47, 67, 83, 101–103
P Parent education, 46 employment, 43, 46 involvement, 67–70, 74, 78–81, 85, 86 occupation, 46, 47 volunteer program, 80, 81, 86, 87 Parent–child interaction, 46 Profile analysis, 13
I Inquiry approach, 19, 20
K Knowledge discovery, 14
L Laboratory skills, 21, 22, 33, 35, 37, 40, 41 Language, 3, 9, 43, 46, 51, 52, 56, 60, 63, 64, 86 Learning environment assessment-centered, 20 community-centered, 20, 26 knowledge-centered, 20 student-centered, 20, 24
M Measurement, 5–8
N National Assessment of Educational Progress (NAEP), 5, 6, 7, 9, 17, 19, 20, 24–39, 43, 44, 49–57, 59–62, 65, 66, 68–74, 76–80, 83, 95–100, 101–103, 105–112, 120, 123 National Center for Education Statistics (NCES), 9, 17, 44 National Council on Education Standards and Testing (NCEST), 9 National Research Council (NRC), 8, 19, 20, 21, 23
R Race and ethnicity Asian, 44, 49, 50, 56, 57, 59, 60, 64 Black, 43–45, 49, 50, 57, 59, 60 Hispanic, 44, 45, 49, 50, 57, 59, 60, 85, 87 White, 44, 45, 49, 50, 57, 58, 59, 60 Regression linear, 124, 127, 130 logistic, 130, 131
S School resources, 66–68 Science demonstration, 22, 23, 35, 37, 40 Social class, 46 Social Economic Status (SES), 9 Standard-based education (SBE), 1–4, 8 Standardized testing, 2 Standard setting, 8
T Technology, 14, 22, 23, 24, 66, 85
W Weka, 14, 16, 119–128, 129–133