The Effectiveness of One-to-One Laptop Initiatives in Increasing Student Achievement
In this report, the Hanover Research Council provides a review of seven major studies designed to measure the impact of one-to-one laptop initiatives on student achievement, with particular emphasis placed on the areas of reading and writing. Throughout the report, we pay particular attention to evidence suggesting that participation in such programs also increases achievement for students of low socioeconomic backgrounds.
Executive Summary

In response to the increased importance placed on technology in today's world, schools have sought ways to better prepare students for the 21st-century workforce. Furthermore, in a time of increased accountability and the need to meet baseline standards set forth by state policies and federal requirements such as No Child Left Behind, it has become increasingly important for school administrators to explore innovative strategies that may help boost student achievement. A number of early studies have provided strong evidence of a correlation between increased exposure to technology and improved academic achievement, and one approach that has grown in popularity is the one-to-one computing initiative, in which each child has access to a laptop computer. However, despite increased attention from leaders in K-12 education, there is still a lack of large-scale research on teaching and learning in ubiquitous computing environments. Many of the laptop initiatives launched in the early 2000s have only recently reached a point at which there is sufficient data for study, and only in the past few years have studies begun to examine the relationship between student achievement and participation in a one-to-one laptop program.[1]

In this report, we provide a literature review on the effect of one-to-one, or ubiquitous, computing environments on student achievement, with particular focus on achievement in reading and writing. Subsequently, we offer a short discussion of laptop initiatives and their potential impact on learning for students from economically disadvantaged backgrounds. While there are a number of small anecdotal cases concerning the relationship between laptops and the academic achievement of students in poverty, our research did not uncover any large-scale studies published to date.

This report presents case studies of seven one-to-one laptop initiatives implemented across the country in an effort to determine whether such programs typically result in increased student achievement, particularly in reading and writing. Below, we present a list of the programs profiled in this report and a brief summary of each.

Stillwater Independent School District, Stillwater, MN

- The district does not have a significant low-income population. The control group for comparison was a school with a 3:1 computer program in place. As such, the results compare students in the 1:1 computing program with students who also have a high level of technology access.

[1] Bebell, Damian, and Rachel Kay. "One to One Computing: A Summary of the Quantitative Results from the Berkshire Wireless Learning Initiative." Journal of Technology, Learning, and Assessment, Vol. 9, No. 2 (Jan 2010): p. 6. http://escholarship.bc.edu/cgi/viewcontent.cgi?article=1222&context=jtla
- The authors hesitate to draw any significant conclusions from the study, as too many variables could have influenced student achievement. They observed only minimal increases in standardized-test reading scores for participants in the laptop program.
- Students who scored in the bottom quartile on standardized tests before the laptop program was implemented saw the greatest gains in reading scores after two years in the program.

Henrico School District, Richmond, VA

- Approximately one-quarter of students qualify for federally subsidized lunch programs. The study did not track student achievement, but rather student attitudes toward the program, with an emphasis on minorities and students of low socioeconomic status.
- The study found that students' computer usage varied by ethnicity. Students of Hispanic, African-American, and "other" backgrounds tended to use their computers less often than Asian and White students.
- Students' fidelity to the laptop initiative also varied by socioeconomic background. Eighty-eight percent of students receiving free or reduced-price lunches wanted the laptop program to continue, compared with only 77 percent of students not receiving free or reduced-price lunches.

Berkshire Wireless Learning Initiative, Western Massachusetts

- The control group for the study consisted of neighboring middle schools with similar demographics but no laptop program. The program was implemented specifically to increase student achievement.
- To analyze results, ten years of state standardized testing scores were procured, providing a strong historical baseline for determining whether year-to-year changes were significant.
- After participating in the laptop program, students answering a mock standardized-test free-response question wrote longer responses and scored higher than their peers taking a pen-and-paper exam. The study concluded that the laptop program positively enhanced student achievement.
Technology Immersion Pilot, Texas Public Schools

- Approximately 75 percent of students in the program were classified as economically disadvantaged. The study focused on how different levels of program implementation affected student achievement on the Texas Assessment of Knowledge and Skills (TAKS).
- The level of student access and usage was the strongest and most consistent predictor of reading achievement: students who reported higher levels of use at school and at home fared better on assessments than peers with low access and use scores.
- According to the authors, the most important conclusion of the study is that ubiquitous computing environments that allow students to take laptops home help equalize out-of-school learning opportunities for students in disadvantaged family situations and, in turn, increase academic achievement.

Estrella School District, Southern California

- Approximately 40 percent of Estrella School District students are classified as economically disadvantaged. The study and control groups were small, each consisting of only 54 students.
- Participation in the laptop program consistently had positive effects on students' reading and writing scores on state standardized tests.
- The data indicate that a longer-term study may reveal more striking achievement results for the laptop group; the current study lasted only two years.

Harvest Park Middle School, Pleasanton, CA

- Very few students are classified as economically disadvantaged: only 1 percent in the laptop group and 4 percent in the school as a whole are eligible for free or reduced-price lunch. Parents were asked to provide laptops for their children, though parents who could not afford a computer could appeal to the district for assistance.
- Laptop program participation was found to have a significant influence on students' GPAs, end-of-course grades for language arts, performance on district writing assessments, and state standardized test scores.
The State of Maine

- Despite five years of study, there have been no appreciable changes in students' standardized test scores since the beginning of Maine's laptop program.
- Turning to writing abilities, researchers found that participation in the laptop program greatly improved students' skills in this area. Students who self-reported a higher level of engagement with technology over the course of the writing process tended to score higher on state writing assessments.
- Student writing was found to have improved regardless of whether students were tested on a computer or on paper.
Case Studies of One-to-One Laptop Initiatives and the Resulting Impacts on Student Achievement in Reading and Writing

Case 1: Stillwater Independent School District, Stillwater, MN

The Stillwater Independent School District enacted a technology-intensive program in its two junior high schools (grades 7-9) in the fall of 2004, aiming to increase students' access to laptop computers. The two schools enrolled 1,016 and 1,084 students in grades 7-9 in the fall of 2007. Of these students, only 11 percent at one school and 12 percent at the other qualified for free or reduced-price lunches.[2] These figures suggest that the district's student body is relatively affluent and that the district may be able to rely on a higher degree of parental and community involvement in its initiatives. Because of the composition of the student body, the district is notably less concerned than other districts with the laptop program's effect on minority or low-income students.

Stillwater initially began its laptop program in November of 2003, but made significant modifications for the fall of 2004. For the sake of comparison, the district opted to permit students at Oak-Land Junior High School (OLJHS) to take their computers home throughout the school year, while students attending Stillwater Junior High School (SJHS) could only access laptops via mobile carts. While the OLJHS program was a true one-to-one initiative, the SJHS program maintained a student-to-computer ratio of only 3:1, making it much less technology-intensive and allowing it to serve as a control for the more developed program at OLJHS.[3] The five-year one-to-one pilot program at OLJHS has largely been considered a success, and the current installment of the District Technology Plan indicates that SJHS is in the early stages of upgrading its technology initiative from a 3:1 cart-based system to a one-to-one computing program similar to that in place at its sister school.[4]

After five years, the Stillwater laptop initiative has yielded a significant amount of data for analyzing effects on student achievement. The University of Minnesota's Center for Applied Research and Educational Improvement (CAREI) assessed the project's first three years of operation, publishing its final report in November of 2008. The overall goal of the study, according to its authors, was to "collect information about the impact on teaching and learning as a result of implementing the laptop initiative at OLJHS and SJHS."[5]

[2] "Laptop Initiative Evaluation Report." Center for Applied Research and Educational Improvement (5 Nov 2008), p. 3. http://www.stillwater.k12.mn.us/sites/363874ed-8822-4032-b432-366a02d38aa1/uploads/Stillwater_Technology_Report_2.pdf
[3] Ibid., p. 10.
[4] "District Technology Plan 2008-2011." Stillwater Area Public Schools, p. 119. http://www.stillwater.k12.mn.us/sites/363874ed-8822-4032-b432-366a02d38aa1/uploads/Stillwater2008-11_TechPlan.pdf
While the CAREI study is substantial and notable in its comprehensiveness, its authors did not affirm a direct connection between improvements in student performance and the computing initiative, since it would have been impossible to eliminate all other factors that could have affected student performance and thereby prove causality.[6] In fact, the study's statistical analyses indicate that there were no statistically significant differences between OLJHS and SJHS students' standardized test scores in reading and mathematics. Differences in scores between the two groups did exist, but they were minimal. CAREI's conclusion instead is that "the results suggest that neither the one-to-one model nor the cart model of laptop access detract from students' performance on standardized assessment measures."[7]

While, according to CAREI, there were no significant differences between student achievement in the 1:1 environment and the 3:1 environment, there are potential differences in achievement over time under both programs: the longer students had been exposed to the laptop initiatives, the higher they tended to score on standardized assessments. The tables below present reading scores on the Measures of Academic Progress (MAP) assessments for each cohort of students affected by the laptop initiative.

Table 1.1 presents the average percentile and average score on the MAP Reading tests administered to three cohorts of students. For Cohorts I and II, there is an alarming drop in average reading scores as students reach the 9th grade (see Fall 2006 for Cohort I and Fall 2007 for Cohort II). Aside from that anomaly, scores do trend upward to some degree over time. However, it is impossible to say with certainty whether this is attributable to the laptop program or to normal student progress.

Table 1.1: Average Reading Scores by Cohort[8]

                                OLJHS                          SJHS
                     Mean Percentile  Mean Score    Mean Percentile  Mean Score
Cohort I
  Fall 2004               67.89         222.74           66.78         222.59
  Spring 2005             66.14         226.47           66.06         226.62
  Fall 2005               69.78         227.15           70.93         227.95
  Spring 2006             69.48         230.28           70.63         230.91
  Fall 2006               19.87         205.78           26.45         211.48
  Spring 2007             18.83         208.45           34.27         217.83
Cohort II
  Fall 2005               74.31         225.63           76.04         226.61
  Spring 2006             76.03         229.70           76.79         230.62
  Fall 2006               65.17         225.06           67.67         226.28
  Spring 2007             22.30         209.33           45.00         218.97
  Fall 2007               22.33         205.62           15.31         203.04
Cohort III
  Fall 2006               64.73         221.46           64.84         221.68
  Spring 2007               NA              NA             NA              NA
  Fall 2007               64.06         224.16           67.45         226.25
Source: Center for Applied Research and Educational Improvement, University of Minnesota

Table 1.2 summarizes performance on the MAP Reading tests by students in the bottom quartile of all test-takers. As the data demonstrate, students who initially performed the poorest saw consistently increasing scores over the study period. Indeed, these lowest performers appear to have benefited the most from the introduction of the laptop initiatives. Note that only data for Cohort I is offered, as data for the other cohorts is incomplete for this group.

Table 1.2: Reading Test Performance by Cohort I Students in the Lowest Quartile[9]

                       OLJHS                          SJHS
              Median Score  Mean Score      Median Score  Mean Score
Fall 2004        202.5        198.95           203.0        201.59
Spring 2005      208.0        208.05           210.5        211.36
Fall 2005        209.0        209.76           212.0        211.00
Spring 2006      217.0        215.47           219.0        217.15
Source: Center for Applied Research and Educational Improvement, University of Minnesota

Case 2: Henrico County Public Schools, Richmond, VA

The Henrico County Public School District is located just outside the city of Richmond, Virginia. It was the largest school district to initiate one-to-one computing when it began its program in 2001. At present, the program is ongoing, and the district reports distributing over 24,000 Dell and Apple laptops to its middle and high school students each year through the one-to-one program, with another 3,700 laptops provided for the entire instructional and administrative staff.[10] Henrico County permits students to take their laptops home throughout the school year and provides a dedicated staff to conduct student and faculty training and to maintain the laptops.[11]

[5] "Laptop Initiative Evaluation Report." Op. cit., p. i.
[6] Ibid., p. 15.
[7] Ibid., p. vi.
[8] Ibid., p. 97.
[9] Ibid., p. 102.
[10] "Your Administration." Henrico County Public Schools. http://henrico.k12.va.us/administration/instruction/technology/technology.html
[11] Ibid.
Overall, approximately 25 percent of the district's 45,000 students qualify for free or reduced-price lunches, with the proportion much higher in the comparatively poor eastern portion of the district.[12]

Development Associates produced a report in 2005 reviewing the first three years of the Henrico County laptop initiative. The report draws on extensive data obtained from student, teacher, administrator, and parent surveys. Its primary purpose was to capture overall opinion of the program and its effect on student learning habits, tracking the program's effect on individual demographic segments of the student body. The report focused heavily on the program's impact on minority students and students of low socioeconomic backgrounds. In the Henrico County schools reviewed in the study, 53.6 percent of students are non-white and 24.7 percent are eligible for free or reduced-price lunch.[13]

The study's survey asked students to report on their laptop usage both at home and at school. The report found that the extent of reported use varies by race/ethnicity, but not by free/reduced lunch status. To determine average usage by such characteristics, the study assigned each student a composite score based on questionnaire responses, with possible scores ranging from 0 (lowest use) to 46 (highest use). Table 2.1 presents the composite scores by race/ethnicity.

Table 2.1: Composite Score for Reported Computer Usage by Race/Ethnicity[14]

Race/Ethnicity        Composite Score
Asian                      25.0
White                      24.2
Hispanic                   24.0
African-American           23.2
Other                      23.2
Source: Development Associates

The differences in usage across racial and ethnic groups are notable, though not staggering.

[12] Zucker, Andrew, et al. "A Study of One-to-One Computer Use in Mathematics and Science Instruction at the Secondary Level in Henrico County Public Schools." SRI International (Feb 2005), p. 1. http://ubiqcomputing.org/FinalReport.pdf
[13] Davis, Diana, et al. "Henrico County Public Schools iBook Survey Report." Development Associates (10 Feb 2005), p. 13. http://www.docstoc.com/docs/19557270/HENRICO-COUNTY-PUBLIC-SCHOOLS-BOOK-SURVEY-REPORT
[14] Ibid., p. 17.
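The report does not publish the survey items or weights behind this 0-46 composite. As a rough illustration only, the sketch below (Python) shows one common way such a usage composite is formed, by summing equal-weighted Likert-style frequency items; the item names, item count, and 0-4 response scale are all assumptions, not the report's instrument.

```python
# Hypothetical sketch of a survey-based usage composite like the 0-46
# scale described above. The items, their count, and the 0-4 response
# scale are illustrative assumptions; a 0-46 maximum would imply a
# slightly different item mix than the ten items shown here.

USAGE_ITEMS = [
    "write_papers", "internet_research", "email_teacher", "take_notes",
    "homework_at_home", "practice_tests", "create_presentations",
    "read_online_texts", "organize_schoolwork", "collaborate_online",
]

def composite_usage_score(responses: dict) -> int:
    """Sum equal-weighted frequency responses (0 = never ... 4 = daily);
    items a student skipped count as zero use."""
    return sum(responses.get(item, 0) for item in USAGE_ITEMS)

# A student reporting moderate use (2) on every item scores 20 of 40.
print(composite_usage_score({item: 2 for item in USAGE_ITEMS}))
```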
Another interesting aspect of the Development Associates report lies in its study of students' perceptions of the usefulness of the laptops, where there were meaningful differences among groups based on race/ethnicity and free/reduced lunch status. When asked if iBooks should be offered in the following year, 88 percent of students receiving free/reduced lunch indicated that they wanted them to return, while only 77 percent of students who did not receive free/reduced lunch were similarly enthusiastic about continuing the laptop program. Concerning race and ethnicity, African-American students were most enthusiastic about the program being continued (89 percent), while White students were least enthusiastic (75 percent). Furthermore, 85 percent of Hispanic students, 83 percent of Asian students, and 82 percent of students of other ethnicities wanted the laptop program to extend to the following year.[15]

While the Development Associates report did not track and measure student achievement under the new laptop initiative, students' qualitative perceptions of the program are useful to note: if student attitudes toward a new technology are negative, that technology is less likely to have a positive impact on scholastic achievement.

Case 3: The Berkshire Wireless Learning Initiative, Western Massachusetts

The Berkshire Wireless Learning Initiative (BWLI) was implemented across five western Massachusetts middle schools over three years beginning in 2005. The program's overall aim was to determine the extent to which a one-to-one computing environment would affect teaching and learning in an otherwise traditional setting. The primary goal of the program was to enhance student achievement, while other goals had more qualitative bases, such as enhancing students' capabilities to conduct independent research and improving student engagement.[16]

In order to study the change in student achievement once the laptop program was implemented, the researchers compiled ten years' worth of student performance results on the Massachusetts Comprehensive Assessment System (MCAS). Other student-level data was provided directly by the participating schools for the period 2005 to 2008. For a control group, the researchers gathered comparison data from two nearby middle schools that had similar demographics but had not implemented a laptop program.[17]

The study addressed two points with regard to its primary aim of measuring student academic success under the laptop program. Specifically, the investigation addressed:

1) trends in the schools' overall MCAS performance over time compared to the comparison schools and statewide trends during this same period, and

2) which, if any, of students' technology uses in school or at home are related to student-level performance on various MCAS measures (while statistically controlling for students' pre-BWLI academic performance using prior MCAS performance).[18]

The study's overall conclusion regarding student achievement may be summarized as follows:

    After three years of 1:1 implementation there was evidence that student achievement had been positively enhanced through the types of educational access and opportunities afforded by the 1:1 pilot program.[19]

The authors support this conclusion through a discussion of teacher and administrator observations and beliefs, achievement trends in MCAS performance, and the results of a computer-writing study. The latter two aspects are discussed in the subsections below.

MCAS Performance

Table 3.1 displays the percent of students passing the eighth grade MCAS in mathematics from 1998 to 2008 at the state level, for the comparison group, and for the group of schools participating in the BWLI initiative.[20]

Table 3.1: Mathematics MCAS Pass Rates for BWLI, Comparison, and State Schools

             1998  1999  2000  2001  2002  2003  2004  2005  2006  2007  2008
BWLI          50%   50%   47%   57%   54%   58%   60%   55%   59%   65%   70%
Comparison    50%   53%   58%   64%   64%   64%   68%   67%   74%   74%   76%
State         58%   60%   61%   69%   67%   67%   71%   70%   71%   75%   76%
Source: Journal of Technology, Learning, and Assessment

Both the BWLI group and the comparison group lag behind the state passing average from the outset of the study data. However, students in the comparison group made steady gains on the MCAS, closing the gap with the state by 2006. The BWLI group did not fare as well: its figures rose on average, but without gains as dramatic as the comparison group's, and by 2005 and 2006 BWLI passing rates lagged significantly behind both the state and comparison groups.

[15] Ibid., p. 19.
[16] Bebell, Damian, and Rachel Kay. "One to One Computing: A Summary of the Quantitative Results from the Berkshire Wireless Learning Initiative." Journal of Technology, Learning, and Assessment, Vol. 9, No. 2 (Jan 2010): p. 7. http://escholarship.bc.edu/cgi/viewcontent.cgi?article=1222&context=jtla
[17] Ibid., p. 8.
[18] Ibid., p. 13.
[19] Ibid., p. 25.
[20] Ibid., p. 32.
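The trend described above, and the recovery discussed in the next paragraph, can be checked directly against Table 3.1. A minimal sketch (Python, with the pass rates transcribed from the table) computes the BWLI-to-state gap by year:

```python
# Pass rates transcribed from Table 3.1 (eighth grade mathematics MCAS).
years = [1998, 1999, 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008]
bwli  = [50, 50, 47, 57, 54, 58, 60, 55, 59, 65, 70]
state = [58, 60, 61, 69, 67, 67, 71, 70, 71, 75, 76]

# Gap between the state and BWLI pass rates, in percentage points.
for year, b, s in zip(years, bwli, state):
    print(f"{year}: state - BWLI = {s - b} points")
# The gap peaks at 15 points in 2005, then narrows to 12 (2006),
# 10 (2007), and 6 (2008), the years discussed in the next paragraph.
```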
The Spring 2007 MCAS was the first administration on which the eighth graders taking the test had had 1:1 laptop access throughout their eighth grade year as well as for at least half of their seventh grade year in BWLI schools. As Table 3.1 demonstrates, this cohort saw promising improvement in math MCAS performance in both 2007 and 2008, improving roughly 5 percentage points each year and beginning to close the achievement gap. As the authors succinctly state, "in other words, this unprecedented two-year improvement in eighth grade math pass rates across BWLI settings corresponded with the years students participated in the 1:1 laptop program."[21] Although the design of the study precludes the authors from saying for certain that participation in the laptop initiative improved test scores, one plausible explanation for the jump in test scores during the program years is that 1:1 participation fostered performance improvements.

Writing Assessment Results

A further point of interest for the researchers was the laptop program's impact on students' writing abilities. The MCAS only tests pencil-and-paper writing responses, which some studies suggest may not appropriately evaluate the writing abilities of students who have grown accustomed to writing and editing on a computer.[22] In order to test this hypothesis, the researchers randomly assigned students to one of two groups: one group completed a mock MCAS writing assessment in the traditional format, and the other was given the prompt on a computer with all editing and formatting tools disabled. To score the tests, all responses were transcribed electronically to avoid scorer bias based on format. Table 3.2 presents a summary of student performance under both the computer and paper testing conditions.

Table 3.2: 7th Grade Results for Students Completing the Mock MCAS Writing Assessment

           No. of Students   Topic Score   Conventions Score   Word Count
Computer         310             7.2             5.6              388
Paper            141             6.6             5.3              302
Source: Journal of Technology, Learning, and Assessment

The seventh graders selected for this mock MCAS writing assessment had participated in the laptop program for two years. After two years of technology-intensive learning, students using the laptops both wrote longer responses and scored higher on their open responses than students responding with paper and pencil. Bebell and Kay conclude that these results are strong indicators of the laptop program's positive influence on students' writing abilities:

    Given these results, it is clear that pilot students, after using a laptop across their sixth and seventh grade years of middle school, performed better across both writing scales when allowed to complete the writing assessment using their BWLI computers.[23]

[21] Ibid., p. 33.
[22] Ibid., p. 14.
[23] Ibid., p. 45.
Case 4: Technology Immersion Pilot, Texas Public Schools

In 2003, the Texas Legislature decided that immersing schools in technology would be more effective at increasing technology usage in teaching and learning than introducing it cyclically over time.[24] The Texas Education Agency allocated $20 million to fund technology immersion projects at high-need middle schools, dubbed the Technology Immersion Pilot (TIP). At the same time, a four-year research study evaluated the program's effect on teaching and learning. The study's design included comparisons between 21 control schools and 21 schools participating in TIP.

Student achievement was evaluated by performance on the statewide Texas Assessment of Knowledge and Skills (TAKS), implemented in 2003 to test students' mastery of the state's content standards.[25] The test is criterion-referenced, and evidence supports its content and construct validity. The test material was developed by Harcourt Assessment and Pearson, with input from educators and the general public.

Students affected by the pilot program were a largely diverse group and primarily economically disadvantaged. Table 4.1 displays the demographics of each of the three study cohorts.

Table 4.1: Demographic Characteristics of Technology Immersion Students by Cohort[26]

                                  Cohort 1       Cohort 2       Cohort 3
                                 8th Graders    8th Graders    7th Graders
                                   2006-07        2007-08        2007-08
Number of students                  2,586          2,578          2,547
% Economically Disadvantaged         75.8           75.5           76.7
% African American                    5.8            5.1            4.3
% Hispanic                           72.7           75.1           75.9
% White                              20.4           18.8           19.2
% Limited English Proficient         48.6           49.7           48.4
% Female                             48.6           49.7           48.4
% Male                               51.4           50.3           51.6
Source: Journal of Technology, Learning and Assessment

The study focused primarily on the level of technology implementation at the student, teacher, administrative, and home levels when schools adopted a technologically immersive environment. Student achievement was then framed by these levels of implementation, to see whether outcomes varied by the amount of instructional and other supports students received. Analyses used standardized implementation indicators (z scores) and predictors measuring school supports (Immersion Support Index), the extent of teachers' classroom immersion (Classroom Immersion Index), and the extent of students' technology access and use (Student Access and Use Index).[27] These were analyzed against students' TAKS scores, which the authors standardized so that the median score was 50 with a standard deviation of 10.

[24] Shapley, Kelly, Daniel Sheehan, Catherine Maloney, and Fanny Caranikas-Walker. "Evaluating the Implementation Fidelity of Technology Immersion and its Relationship with Student Achievement." Journal of Technology, Learning, and Assessment, Vol. 9, No. 4 (Jan 2010). http://escholarship.bc.edu/cgi/viewcontent.cgi?article=1204&context=jtla
[25] Ibid., p. 17.
[26] Ibid., p. 14.
[27] Ibid., p. 34.
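The paper describes this rescaling only at the level quoted above. A minimal sketch of one way to produce such scores (Python, assuming a simple linear rescaling centered on the median and using the sample standard deviation; both are assumptions, as the authors do not detail their procedure):

```python
import statistics

def standardize_taks(raw_scores: list) -> list:
    """Linearly rescale raw scores so the median maps to 50 and one
    standard deviation spans 10 points (a T-score-like transformation,
    here centered on the median as the study describes)."""
    med = statistics.median(raw_scores)
    sd = statistics.stdev(raw_scores)
    return [50 + 10 * (x - med) / sd for x in raw_scores]

# Example with made-up raw TAKS scale scores:
print(standardize_taks([2100, 2200, 2250, 2300, 2400]))
```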
Results of the study indicate that, after controlling for prior student achievement, demographic characteristics, school poverty, and classroom immersion, only Immersion Support was a positive predictor of Cohort 1 eighth graders' reading achievement. Cohort 2 students whose language arts teachers had average levels of Classroom Immersion had slightly higher TAKS reading scores than students whose teachers had below-average Classroom Immersion scores.[28] The level of student access and technology use was a stronger and more consistent predictor of reading achievement: higher levels of student access and use had a consistently positive effect on TAKS reading scores for all three cohorts. Additionally, the study found that home learning, the amount of time a student spent completing school-related tasks on their laptop at home, was "the strongest implementation predictor of reading achievement."[29]

Overall, the study shows that technology immersion through one-to-one computing initiatives has a positive relationship with student achievement in reading. The most important point from the study for the purposes of this report is the finding regarding the importance of home learning. Students in the study were largely economically disadvantaged and members of minority groups (see Table 4.1). Thus, the finding underscores the role that individual student laptops play in promoting ubiquitous learning and closing the "digital divide" by equalizing out-of-school learning opportunities for students in disadvantaged family situations. Indeed, "individual student laptops, in contrast to laptops on carts or computers available in libraries, labs, and classrooms, expanded where and how learning occurred."[30]

Case 5: Estrella School District, Southern California

Estrella School District (ESD), a pseudonym for a school district in southern California, is a moderately sized suburban district with approximately 14,000 K-8 students.[31] The district features a diverse student population: 47 percent Hispanic, 28 percent White, 20 percent Asian, and 5 percent in the "other," multi-ethnic, or unstated category. The district is also economically diverse, with 40 percent of students participating in the free or reduced-price lunch program. ESD implemented its 1:1 laptop initiative in 2004, choosing two middle schools and two elementary schools to participate. School selection reflected district administrators' hope to test the program at both ends of the economic spectrum.

The subsequent study sought to measure the effects of the 1:1 laptop program on student achievement on the English Language Arts section of the California Standards Test (CST). The CST is a criterion-referenced test designed to allow students to demonstrate mastery of California's academic standards for their grade level. CST scores are scaled to a normal distribution in the range of 150 to 600 points, with 350 as the threshold for "passing" or adequate performance, and are stable from year to year, facilitating comparison over multiple years.[32]

The study group consisted of 54 fourth grade students participating in the 1:1 laptop program, with 54 students in a non-laptop control group. The authors analyzed English Language Arts (ELA) total and subtest scores to identify the effects of the laptop program, as well as a number of background characteristics, including parent education level, ethnicity, and gender. It should be noted that 12 students in the treatment group also participated in a gifted and talented program, whereas none of the control group students were identified as such.[33]

Table 5.1 summarizes the study's findings regarding the laptop and non-laptop groups' performance on the CST ELA section and on each ELA subtest.

[28] Ibid., p. 37.
[29] Ibid., p. 40.
[30] Ibid., p. 49.
[31] Suhr, Kurt, David Hernandez, Douglas Grimes, and Mark Warschauer. "Laptops and Fourth-Grade Literacy: Assisting the Jump Over the Fourth Grade Slump." Journal of Technology, Learning, and Assessment, Vol. 9, No. 5 (Jan 2010): p. 12. http://escholarship.bc.edu/cgi/viewcontent.cgi?article=1207&context=jtla
[32] Ibid., p. 13.
[33] Ibid., p. 14.
Table 5.1: Changes in ELA Scores: Laptop and Non-Laptop Groups

                                     Year 1     Year 2    Combined
Total ELA
  Laptop                             19.56%      2.19%     21.74%
  Non-laptop                         26.67%    -16.83%      9.83%
  Difference                         -7.11%     19.02%     11.91%
ELA Subtests
  Literary Response and Analysis
    Laptop                           -0.05%      3.76%      3.70%
    Non-laptop                       -0.04%      2.76%      2.72%
    Difference                       -0.01%      1.00%      0.98%
  Writing Strategies
    Laptop                            4.37%      1.89%      6.26%
    Non-laptop                        4.57%      0.19%      4.76%
    Difference                       -0.20%      1.70%      1.50%
Source: Journal of Technology, Learning and Assessment

In the first year of the program, both laptop and non-laptop groups made significant progress on their total ELA scores, with non-laptop students actually seeing a larger gain than their laptop counterparts. In Year 2, however, the non-laptop students lost most of the previous year's gain, falling almost 17 percent, while the laptop group's progress slowed to a statistically insignificant 2 percent. Overall, both groups saw notable gains in ELA achievement, and the difference between groups was ultimately statistically insignificant.[34] Achievement on the subtests presents a similar picture, with very little difference in scores between the laptop and non-laptop students.

The authors further analyzed the test results using multiple regression. Interestingly, none of the independent variables (parent education level, a proxy for socioeconomic status; gifted and talented designation; or laptop participation) was a significant predictor of improved achievement on the CST. However, participation in the laptop program consistently had positive effects on changes in ELA total scores, literary response and analysis scores, and writing strategies scores.[35] In fact, analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA) tests showed that after Year 2, "laptop students significantly outperformed non-laptop students in their change scores for literary response and analysis and writing strategies."[36]

[34] Ibid., p. 28.
[35] Ibid., p. 32.
[36] Ibid., p. 36.
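The report names the predictors but not the exact model specification. A minimal sketch of a comparable analysis (Python with pandas and statsmodels; the column names and data file are illustrative assumptions, not the study's materials) might look like this:

```python
# Hypothetical sketch of the kind of multiple regression described above:
# predicting the two-year change in CST ELA score from laptop
# participation, parent education level, and gifted/talented status.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("estrella_students.csv")  # hypothetical data file
model = smf.ols(
    "ela_change ~ laptop + parent_education + gifted",
    data=df,
).fit()
print(model.summary())  # coefficient on `laptop` estimates the program effect
```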
Table 5.2 presents the means and standard deviations of changes in laptop and non-laptop students' CST ELA total scores and subtest scores to illustrate this finding.

Table 5.2: Changes in Laptop and Non-Laptop Students' ELA Scores, Means and Standard Deviations

                                            Year 1           Year 2          Combined
ELA test or subtest                       M       SD       M       SD       M       SD
Total scale score
  Non-laptop                            26.67   29.64   -16.83   28.35    9.83   40.41
  Laptop                                19.56   29.35     2.19   34.33   21.74   32.43
Word analysis and vocabulary development
  Non-laptop                            -1.11    2.23    -3.70    2.59   -4.81    2.84
  Laptop                                -1.83    1.89    -3.30    1.55   -5.13    1.92
Reading comprehension
  Non-laptop                             1.19    2.49    -1.39    2.51   -0.20    2.74
  Laptop                                 0.87    1.66    -0.44    2.38    0.43    2.54
Literary response and analysis
  Non-laptop                            -0.04    1.48     2.76    2.29    2.72    2.30
  Laptop                                -0.06    1.65     3.76    1.62    3.70    1.95
Written and oral language conventions
  Non-laptop                             3.85    2.74     0.20    2.18    4.06    2.58
  Laptop                                 4.17    2.08    -0.35    2.28    3.81    1.96
Writing strategies
  Non-laptop                             4.57    2.20     0.19    2.47    4.76    2.90
  Laptop                                 4.37    2.32     1.89    2.57    6.26    2.44
Source: Journal of Technology, Learning and Assessment

The authors note that the generalizability of their study is limited by a number of factors. First, they could not control for some variables and school characteristics that may have influenced students' achievement, such as funding, school size, teacher education level and experience, and pedagogy. Additionally, there was insufficient representation from ethnic groups other than White and Asian to check for differences in performance attributable to race or ethnicity. Finally, and perhaps most importantly, the length of the study was insufficient to support long-term conclusions. Because student performance was better over the two years of the study than over the first year alone, a longer study might have uncovered even more positive outcomes as teachers and students continued to learn to make better use of the laptops. Nevertheless, "the study adds to an emerging body of literature suggesting that laptop use over multiple years may have a small positive effect on literacy test score outcomes."[37]

[37] Ibid., p. 39.
Case 6: Harvest Park Middle School, Pleasanton, CA

Harvest Park Middle School in Pleasanton, California serves a suburban, increasingly diverse population in a highly educated, high-income community. The school's laptop program, established in 2001, was implemented as the school strove to maintain or improve its already-high standards in the face of rapid enrollment growth. The program was the result of a partnership between the school and a number of local high-tech businesses.[38]

Unlike a number of other laptop programs, participants in Harvest Park's initiative were self-selected: parents purchased laptops for their children to use. Parents who were unable to do so for financial reasons but whose children wished to participate could appeal to a Laptop Advisory Committee (LAC) for assistance; to date, the LAC has not denied an application for a loaner laptop. The total enrollment and demographic characteristics of students in the school are presented in Tables 6.1 and 6.2, respectively.

Table 6.1: Laptop Immersion Program and Total Enrollment by Grade[39]

Grade    Laptop Program Enrollment    Total School Enrollment
6                   91                         353
7                   93                         361
8                   75                         371
Total              259                       1,085
Source: Journal of Technology, Learning and Assessment

Table 6.2: Student Demographics: Laptop Immersion Program and School-Wide[40]

Student Demographics           Laptop    School-wide
Ethnicity
  Asian                          14%         16%
  Filipino                        1%          2%
  Hispanic/Latino                 6%          7%
  African American                0%          1%
  White                          79%         74%
Gender
  Female                         44%         49%
  Male                           56%         51%
Gifted and Talented              27%         24%
Special Education                 5%         10%
Economically Disadvantaged        1%          4%
English Language Learner          1%          3%
Parent Education Level
  Graduate School                42%         37%
  College Graduate               46%         44%
  Some College                   10%         12%
  High School Graduate            2%          6%
  Not High School Graduate        0%          1%
Source: Journal of Technology, Learning and Assessment

Demographic indicators show no more than a five percentage point difference between laptop and non-laptop students, indicating that the demographic profiles of the two groups are similar and appropriate for comparison.

The assessment of Harvest Park's laptop immersion program sought to answer four outcomes-based questions regarding student achievement:

- Does the laptop program have an impact on students' grade point averages?
- Does the laptop program have an impact on students' end-of-course grades?
- Does the laptop program have an impact on students' essay writing skills?
- Does the laptop program have an impact on students' standardized test scores?

To address the first question, the authors gathered students' grade point averages for the 2003-2004 academic year, three years after the laptop program's implementation. They found an average difference between laptop and non-laptop students' GPAs of 0.29, with the greatest difference occurring among sixth grade students (a difference of 0.37).[41]

To evaluate the second research objective, the authors accessed end-of-course grades in English Language Arts and Mathematics for all students at the middle school for the 2003-2004 academic year and separated them into laptop and non-laptop groups for comparison. The results for English Language Arts are presented in Table 6.3.

[38] Gulek, James Cengiz, and Hakan Demirtas. "Learning With Technology: The Impact of Laptop Use on Student Achievement." Journal of Technology, Learning, and Assessment, Vol. 3, No. 2 (Jan 2005). http://escholarship.bc.edu/cgi/viewcontent.cgi?article=1052&context=jtla
[39] Ibid., p. 9.
[40] Ibid., p. 10.
[41] Ibid., p. 13.
Table 6.3: End-of-Course Grades by Grade Level and Program for English Language Arts[42]

End-of-Course         Grade 6               Grade 7               Grade 8
Grade             Laptop  Non-Laptop    Laptop  Non-Laptop    Laptop  Non-Laptop
A                  50%       38%         39%       23%         36%       39%
B                  42%       32%         45%       33%         54%       40%
C                   7%       21%         11%       28%         10%       17%
D                   1%        6%          3%        9%          0%        3%
F                   0%        3%          2%        7%          0%        1%
Source: Journal of Technology, Learning and Assessment

There is a significant difference in achievement between the laptop and non-laptop groups, as demonstrated by the grades above. Among sixth graders, 92 percent of laptop students earned an A or B in ELA, compared with only 70 percent of non-laptop students. The difference is even greater among seventh graders, where 84 percent of laptop students earned an A or B, compared with 56 percent of non-laptop students. The gap closes considerably by eighth grade, though a large discrepancy remains: 90 percent of laptop students earned an A or B, compared with 79 percent of non-laptop students. Across all grades, there were fewer F grades among laptop students than among non-laptop students.

The study's third research objective, determining whether participation in the laptop program improved students' writing abilities, was measured by assessing the results of a 2004 district writing assessment administered to all sixth and eighth grade students. Table 6.4 presents the distribution of scores on the writing assessment for laptop students, Harvest Park as a whole, and the district average.

Table 6.4: 2004 District Writing Assessment Results by Grade Level and Program Enrollment[43]

                           Score of 4    Score of 3    Score of 2    Score of 1
                           (Advanced)     (Solid)      (Limited)     (Minimal)
Grade 6
  Laptop Program              17%           78%            5%            0%
  Harvest Park                16%           68%           16%            1%
  District Average             9%           72%           19%            2%
Grade 8
  Laptop Program              15%           76%            9%            0%
  Harvest Park                17%           66%           17%            2%
  District Average            16%           68%           16%            2%
Source: Journal of Technology, Learning and Assessment

[42] Ibid., p. 14.
[43] Ibid., p. 15.
Fewer eighth grade laptop students scored a "4" on the writing assessment than their school and district peers, though more laptop students scored in the "solid" range (a score of 3), which ultimately left a smaller proportion of laptop students in the lowest scoring categories (scores of 1 and 2). Among sixth graders, the highest-scoring laptop group closely matched its school peers, and laptop students as a whole far outperformed their district peers. Overall, laptop students outperformed both their school and district peers: 95 percent of laptop sixth graders achieved a 3 or 4, compared with 84 percent of school peers and 81 percent of district peers, while 91 percent of laptop eighth graders earned a 3 or 4, compared with 83 percent of school peers and 84 percent of district peers. The laptop program may not consistently push performance to its highest levels, but it does appear to have some bearing on overall student achievement.

The final research question concerned students' performance on state standardized tests. The authors examined scores on the California Standards Tests in Mathematics and English Language Arts, which are given to public school students in grades two through eleven. As demonstrated in Table 6.5, the scores reveal notably higher achievement levels for students in the laptop program than for those outside it.

Table 6.5: 2004 CST English Language Arts Results: Percent of Students Scoring Proficient or Advanced[44]

                           English Language Arts
Grade 6    Laptop                   80%
           Non-Laptop               68%
Grade 7    Laptop                   83%
           Non-Laptop               64%
Grade 8    Laptop                   76%
           Non-Laptop               56%
Source: Journal of Technology, Learning and Assessment

To provide some perspective, the authors accessed baseline assessment results for 2000-2001, the year before the first cohort of laptop students entered the program. When the 2000-2001 data is compared with the following year's data, after students had participated in the laptop program for one year, there were only minor differences between laptop and non-laptop students across most areas; with the exception of the CST ELA test, laptop students saw gains in test performance across the study year.[45] When the same analysis was performed for Cohort 2, students saw their strongest gains on the CST ELA test. For Cohort 3, students saw declines in average performance except on the District Writing Test, on which both groups of students performed notably better.[46]

[44] Ibid., p. 17.
[45] Ibid., p. 19.
[46] Ibid., p. 24.
This study has a number of shortcomings. First, the authors examine only one year of data for all objectives except achievement on state assessments. They do not present historical data to show that student achievement actually improved over time, drawing conclusions only by comparing one group of students with another across a single year. No other control groups were created, and no analysis was performed based on economic status, parents' level of education, or race/ethnicity to determine whether variables other than participation in the laptop program had any bearing on student achievement. For the writing analysis, the authors do not state whether the district assessments were administered on a computer, with pencil and paper, or as a mixture of both. As other studies have noted, students who learn to improve and hone their writing and editing skills on a computer may be at a disadvantage when suddenly asked to perform with pencil and paper on a standardized test.

Case 7: The State of Maine

Beginning in the fall of 2002, all Maine seventh and eighth graders were provided laptop computers as part of a statewide technology initiative aimed at improving performance in the state's middle schools.[47] The program has reached over 100,000 Maine middle school students and their teachers. Researchers at the Maine Education Policy Research Institute subsequently prepared a research brief to determine the laptop program's impact on students' writing abilities.

Given the unprecedented scope of Maine's laptop initiative, many expected to see significant improvements in student achievement, particularly on standardized tests. However, student achievement on the eighth grade Maine Educational Assessment (MEA) "has not changed appreciably since the inception of the laptop program."[48] The authors note three possible reasons for this lack of expected improvement. First, it takes time to see the results of educational reforms, so the program may not have been in place long enough for students and teachers to use the technology effectively and efficiently. Second, the method of implementation may have affected the results, as there was no central control over how the program was implemented at each school. Third, and most importantly according to the authors, the MEA is ill-equipped to measure the 21st-century skills developed in ubiquitous computing environments.[49]

Silvernail and Gritter chose to focus on the laptop program's effect on students' writing abilities, as numerous prior researchers had pointed toward a significant correlation between technology adoption and improved writing skills and processes.

[47] Silvernail, David, and Aaron Gritter. "Maine's Middle School Laptop Program: Creating Better Writers." Maine Education Policy Research Institute, University of Southern Maine (2008). http://www.usm.maine.edu/cepare/Impact_on_Student_Writing_Brief.pdf
[48] Ibid., p. 4.
[49] Ibid., p. 4.
The authors compared MEA writing scores in 2000, the year before the program's implementation, with scores in 2005. Table 7.1 presents the results.

Table 7.1: MEA Writing Scores, Average and Standard Deviation, 2000 and 2005

        Students    Score     S.D.    Effect Size
2000     16,557    534.11    10.61       0.32
2005     16,251    537.55     9.17
Source: Maine Education Policy Research Institute

The authors then calculated the Effect Size, a measure designed to quantify the magnitude of the difference between the average scores. The Effect Size in this case was 0.32, or approximately one-third of a standard deviation: "put another way, an average student in 2005 scored better than approximately two-thirds of all students in 2000."[50]

[50] Ibid., p. 6.
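The brief reports the 0.32 figure without showing the arithmetic. Assuming a standard Cohen's-d-style effect size computed against the baseline (2000) standard deviation, which the table's figures support even though the brief does not state its exact denominator, the figure reproduces as:

$$\mathrm{ES} \;=\; \frac{\bar{x}_{2005} - \bar{x}_{2000}}{s_{2000}} \;=\; \frac{537.55 - 534.11}{10.61} \;\approx\; 0.32$$

The "two-thirds" phrasing then follows from the normal curve: Phi(0.32) is approximately 0.63, so an average 2005 student outscores roughly 63 percent of the 2000 distribution.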
Additionally, the study revealed a 12.3 percent increase between 2000 and 2005 in the number of students meeting or exceeding writing proficiency.

A particularly interesting finding of the Maine study is that students' performance on the writing section of the MEA correlated with self-reported laptop-related writing activities. Table 7.2 shows how writing scores relate to four levels of student engagement with writing on the laptop, demonstrating that higher levels of laptop use in the writing process are associated with statistically significant increases in writing scores.

Table 7.2: Type of Laptop Use in Writing and MEA Scores[51]

Survey question stem: How do you use your laptop for writing?

Response                  Number of Students    Mean Scale Score    S.D.
Drafts and final copy           11,593               538.8          8.97
Final copy only                  3,413               537.7          8.89
Drafts only                        233               533.0          9.74
Not at all                         642               532.0          9.63
Source: Maine Education Policy Research Institute

Another key finding of the study is that students' writing abilities improved not only when tested on a computer, but also on traditional paper tests. This counters earlier findings that paper testing puts laptop students at a disadvantage because they have learned to compose and edit on a computer. In 2005, some students completed the MEA writing assessment online, while others took the traditional paper test. Table 7.3 presents each group's average scores.

Table 7.3: MEA 2005 Writing Scale Scores by Mode of Writing[52]

Writing Sample    No. of Students    Average Score    S.D.
Online                 3,251            537.68        10.52
Longhand              13,000            537.52         8.80
Source: Maine Education Policy Research Institute

As is evident, the two groups' scores are nearly identical, with no statistically significant difference between them. In other words, the authors state, "writing improved regardless of the writing test medium."[53]

[51] Ibid., p. 7.
[52] Ibid., p. 9.
[53] Ibid.
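Table 7.3 reports enough summary statistics to check that conclusion directly. A minimal sketch (Python with SciPy) runs a two-sample t-test from the published means, standard deviations, and group sizes; the use of Welch's unequal-variance test is an assumption, as the brief does not say which test the authors ran:

```python
# Two-sample t-test computed from the summary statistics in Table 7.3.
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(
    mean1=537.68, std1=10.52, nobs1=3251,   # online
    mean2=537.52, std2=8.80, nobs2=13000,   # longhand
    equal_var=False,                         # Welch's test (assumption)
)
print(f"t = {t:.2f}, p = {p:.2f}")  # p well above 0.05: no significant difference
```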
Five years after the initial implementation of the laptop program, it was clear that the program had had a positive impact on students' writing abilities. Students' scores on Maine's statewide standardized writing test improved significantly, and students' writing improved the more extensively they used their laptops in developing and producing written work. It was evident as well that the initiative produced better writers in general, not merely students who could write better using a computer.

Improving Achievement for Students of Low Socioeconomic Status

The term "digital divide" refers to the disparity between students who have easy access to computers and use them often and those who lack such access. While the term does not specifically refer to students of minority backgrounds or students of low socioeconomic status, these groups often suffer the most from limited access to technology. A 2005 study reported by eSchool News found that 3 million young people remain without Internet access, many of them from financially disadvantaged backgrounds and a disproportionate number of them African-American.[54]

One of the main issues in studying laptop programs and their effect on student achievement is that these initiatives largely occur in private schools or upper-income public schools. As one researcher noted, "there has been very little research done to study [1:1 laptop programs'] effectiveness for low-income students."[55]

Widespread use of technology in the classroom may help close the gap between students of varying economic backgrounds. Starting early is key, as students who reach high school without such exposure may feel left behind. As one teacher observes, "at-risk kids aren't able to use technology every day and haven't had exposure to it at home and have to play catch up to learn the technology as well as the lessons. When they're concentrating so much on the tool rather than the lesson, it costs them time and presents a steep learning curve."[56] The earlier children are exposed to technology and learn to operate laptops efficiently, the less of a concern this learning curve becomes as they progress in school.

Students from low socioeconomic backgrounds are at a particular disadvantage when laptop initiatives ask parents to purchase or lease students' computers. This may lead to lower participation rates among low-SES students in the resulting studies on academic achievement, meaning that their potential as a group in an immersive technology environment has not been fully explored. At Community School District Six in New York City, for instance, parents are asked to pay a monthly fee equivalent to half the cost of the student's annual laptop lease, despite the fact that 94 percent of the district's 30,000 students live at or below the poverty level. At Clovis Unified School District in Fresno, California, students whose parents provide them with laptops are placed in immersive environments in which all of their peers also have laptops, which may leave economically disadvantaged students at a further educational disadvantage by separating them from a large group of their peers.[57]

[54] "Critical Issue: Using Technology to Improve Student Achievement." North Central Regional Educational Laboratory, Learning Point Associates. http://www.ncrel.org/sdrs/areas/issues/methods/technlgy/te800.htm
[55] Hadfield, Nicholas. "Laptop Programs: Rapid Change and the Search to Justify the Money." p. 8. http://www.scribd.com/doc/18074/Laptop-Programs-The-failure-of-success
[56] Long, Cindy. "Mind the Gap." National Education Association (2008). http://www.nea.org/home/9302.htm
[57] Carter, Kim. "Laptop Lessons: Exploring the Promise of One-to-One Computing." Tech & Learning (15 May 2001). http://www.techlearning.com/article/18520