In September 1997, the Virginia Board of Education adopted new regulations that make individual schools accountable for their performance on achievement tests based on the rigorous Standards of Learning (SOLs) adopted in 1995.
Both the standards and the stakes of statewide testing in Virginia will soon rise. Yet, most Virginia students are performing no better on standardized tests today than in 1991. Taken together, these two realities underscore the formidable task facing education policy makers and school officials who want all schools to attain the higher performance levels.
Although Virginia’s average test scores were relatively stable from 1991 to 1996, there were large differences in scores from one local school division to another. The highest and lowest scoring school divisions were 50 to 60 percentage points apart, and nearly a third of the school divisions had scores that were more than ten points above or below the average.
Believing that clues to improving overall test performance could be found by studying the characteristics of school divisions that produce different scores, the Thomas Jefferson Institute for Public Policy set two goals for this research project:
(1) To find out why test scores vary so much from one locality to another, and
(2) To develop policy recommendations based on the research findings.
Why Test Scores Vary: A Summary of Findings
Applying statistical methods to data collected by Virginia’s Department of Education, this study contradicts those who believe that nothing can make a difference in our public schools. As much as one-third of the variation in test scores across Virginia has resulted from education policies that distinguish some school divisions from others. The figure could be as high as fifty percent, but we will not know without access to school-level data that would enable researchers to dig deeper than is possible with aggregate division-level data.
Nevertheless, whatever a community’s socio-economic profile may be, its schools can achieve higher test scores when:
– academic standards are raised and social promotion is reduced,
– more students are taking challenging courses (such as 8th graders taking algebra),
– excessive absenteeism is curtailed,
– teachers are encouraged to earn advanced degrees, and
– pupil-teacher ratios in the elementary grades are lowered.
While half of the explanation for different test scores is rooted in the social and economic setting of the school division, such demographic factors are beyond the direct control of school officials. The concerns that prompted this study are very practical, and provoke a key question:
What influence can local school officials and policy makers in Richmond have on the performance of our public schools? Answering that question requires us to focus on how test scores are likely to respond to changes in policy.
The focus of the study was on understanding differences in standardized test results among Virginia’s 132 local school divisions. One step in the research process was the calculation of a composite test score for each division, using weighted average results of tests administered to students at four grade levels over a six-year period from 1991 to 1996. Typically, the test results had been published in a form that identified the percentage of students achieving a certain level of performance, such as the percentage of 8th graders scoring above the national average. Thus, the “test score” in this study is best understood as a composite indicator of the performance level of students rather than a “test grade” in the traditional sense.
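The composite score described above can be sketched as a weighted average across the grade levels tested. The weights and division results below are invented placeholders for illustration; the report does not publish the actual weighting scheme.

```python
# Hypothetical sketch of the composite "test score": a weighted average of
# percent-above-national-average results across the grade levels tested.
# All numbers here are invented for illustration.

def composite_score(results, weights):
    """results: {grade: pct above national average}, weights: {grade: weight}."""
    total_weight = sum(weights[g] for g in results)
    return sum(results[g] * weights[g] for g in results) / total_weight

# One fictional division's averaged 1991-96 results by grade tested,
# with equal weights as a placeholder assumption.
results = {1: 60.0, 4: 58.0, 8: 54.0, 11: 51.0}
weights = {1: 1.0, 4: 1.0, 8: 1.0, 11: 1.0}
print(round(composite_score(results, weights), 2))  # 55.75
```

A real weighting would likely reflect the number of test-takers at each grade level, but that detail is not given in the report.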
The analysis stage of the study used statistical regression procedures to identify relationships between the test scores and dozens of indicators of school division education policies and local socio-economic conditions. The final product of the research was a set of quantitative estimates of the “effects” of each factor on test scores. In the table below, the five education policy factors that were found to have statistically significant relationships with test scores are listed, along with estimates of their effects on test scores that resulted from the analysis. Each factor and its relationship to test scores is explained in detail in the body of this report. The table provides a quick overview of the significance of the research findings.
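The kind of division-level regression described above can be illustrated with a minimal sketch. The data below are randomly generated stand-ins, not the study’s data, and the "true" coefficients are assumptions chosen only to show how least-squares recovers per-unit effects.

```python
# Minimal sketch of a division-level regression: composite test scores
# regressed on policy indicators.  Data are synthetic, not the study's.
import numpy as np

rng = np.random.default_rng(0)
n = 132                                   # number of school divisions
attendance = rng.uniform(60, 85, n)       # % students absent no more than 10 days
masters = rng.uniform(20, 45, n)          # % teachers with master's degrees
noise = rng.normal(0, 2, n)
score = 20 + 0.35 * attendance + 0.15 * masters + noise  # assumed "true" model

# Ordinary least squares: column of ones gives the intercept.
X = np.column_stack([np.ones(n), attendance, masters])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
print(coef.round(2))  # intercept, then per-unit "effects" on the test score
```

The fitted coefficients play the role of the "effects" estimates in the table: each one says how much the composite score moves per one-unit change in that factor, holding the others constant.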
Each of these factors is controllable or subject to influence by state and local officials. The table shows the estimated effects on test scores and students if each factor could be changed from its 1991-96 average to the level indicated in the second column. For example, it is estimated that raising the percentage of students who miss no more than 10 days of school from 72 to 78 percent would raise the percentage of students scoring above the 50th percentile on national standardized tests by 2.09 points. In this example, therefore, almost 22,400 additional students in Virginia would exceed “the national average.”
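The two numbers in the attendance example are consistent with each other, as a quick back-of-the-envelope check shows: dividing the additional students by the percentage-point gain implies roughly 1.07 million test-takers statewide, a derived figure rather than one stated in the report.

```python
# Back-of-the-envelope consistency check on the attendance example:
# 22,367 additional students at a 2.09 percentage-point gain implies
# the number of test-takers the estimate was applied to.
additional_students = 22367
point_gain = 2.09  # percentage points
implied_test_takers = additional_students / (point_gain / 100)
print(round(implied_test_takers))  # about 1.07 million students
```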
Each of the changes in the five education policy factors is equal to one standard deviation, a statistical measure of typical variation across divisions; a change of this size is not unusual. So the targets for improvement are realistic.
The relative impact of these five factors is shown in the pie chart in a way that can help policy makers set priorities on the basis of cost-effectiveness.
The next two pages contain the policy recommendations that flow from the research findings.
[Pie chart: relative impact of the five factors. Legible shares: master’s degree teachers 14%, pupil-teacher ratios 13%; the shares for 8th grade algebra and the remaining factors are not recoverable from the source.]
| Factor | Change from 91-96 avg. | Additional pct. of students scoring > 50th pctile. | Additional students scoring > 50th pctile. |
| --- | --- | --- | --- |
| academic standards index | 81 to 88 | 2.37 | 25,397 |
| % students absent no more than 10 days | 72 to 78 | 2.09 | 22,367 |
| % teachers w/ master’s degrees | 31 to 39 | 1.08 | 11,558 |
| 1st grade pupil-teacher ratios | 18 to 16 | 0.97 | 10,383 |
| % 8th graders taking algebra | 24 to 37 | 0.94 | 10,094 |
How difficult it will be for Virginia students to pass the Standards of Learning (SOL) achievement tests at each grade level will not be known until the first results are seen this school year. Without aggressive implementation of the SOLs, the challenge may be comparable to the difficulty that 6th graders have in passing the Literacy Passport Tests (LPT).
By the end of school year 2002-03, a 70 percent passing rate on the SOL tests will have to be achieved in order for schools to attain “Fully Accredited” status. Currently, fewer than one-third of the school divisions have passing rates above the 70 percent level on all three portions of the LPT, and the average passing rate is only 65 percent.
To improve the LPT test scores to the point where 95 percent of the school divisions have at least a 70 percent passing rate would require an average passing rate of 80 percent and a 50 percent reduction in the variation in test scores from one locality to another. That would require a 25 percent improvement for the average school division, and those at the lower end would have to do even better than that. Thus, aiming to raise Literacy Passport Test scores to the point where almost all schools have at least a 70 percent passing rate within five years might seem unrealistic.
With teachers at each grade level fully aware of what students are expected to learn under the new standards, however, performance on the SOL tests should be much better than on the sixth grade LPT. With aggressive implementation of the SOLs in every classroom, there is reason to be optimistic about reaching the 70 percent goal on the Standards of Learning achievement tests.
1. To achieve an average school division passing rate on SOL tests of 80 percent by the end of school year 2002-03.
2. To achieve a 70 percent passing rate on SOL tests by at least 95 percent of the school divisions by the end of school year 2002-03.
The Strategic Plan
First Steps: Progress can begin immediately at the local school division level
• by making scholastic achievement the sole criterion for promotion from one grade to the next,
• by increasing emphasis on successful implementation of the new SOLs in all grades, and
• by taking the initiative for improving student attendance.
Next Steps: Meanwhile, at the state level, incentives should be provided
• to teachers and schools for reducing excessive student absenteeism,
• to school divisions for reducing elementary school pupil-teacher ratios,
• to teachers for earning master’s degrees.
1. The Department of Education should monitor implementation of the Standards of Learning, and distribute frequent reports on innovative lesson plans and teaching methods to all schools.
2. The Board of Education should make promotion from grades 3, 5, 8, and 11 contingent on passing the SOL achievement tests in those grades. “Second-chance” tests should be provided, and a stringent exceptions process should govern extraordinary cases. However, promotion for reasons other than scholastic achievement should come to an abrupt halt.
3. The Department of Education should hold mandatory workshops for all school principals and at least one school division administrator to identify effective tactics for reducing absenteeism.
4. The Board of Education should amend the basic state aid formula so that, beginning in the fall of 1999, the per-pupil basic aid would be adjusted up or down 1/10 of one percent for each one percentage point change in the percentage of students absent no more than 10 days.
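The proposed aid adjustment in recommendation 4 works out as follows; the $2,500 per-pupil base amount is a hypothetical figure chosen only to make the arithmetic concrete.

```python
# Sketch of the recommendation-4 adjustment: basic aid moves 0.1 percent
# for each percentage-point change in the share of students absent no
# more than 10 days.  The $2,500 base is a hypothetical example figure.
def adjusted_aid(per_pupil_aid, baseline_pct, current_pct):
    change_points = current_pct - baseline_pct
    return per_pupil_aid * (1 + 0.001 * change_points)

# A division whose attendance share rises from 72 to 78 percent gains
# 0.6 percent on the hypothetical base:
print(round(adjusted_aid(2500.0, 72.0, 78.0), 2))  # 2515.0
```

A division whose attendance share fell would see the same formula reduce its per-pupil aid symmetrically.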
5. The General Assembly should fund a Teacher Teamwork Performance Bonus Program for all instructional personnel in schools that meet the student attendance goals.
Master’s Degree Incentives
6. To increase the number of teachers with master’s degrees by 1400 (5 percent) within five years, the General Assembly should establish a Master Teacher Incentive Program that provides:
(1) a $1000 tax rebate to teachers earning new master’s degrees within five years, and
(2) up to a $1000 matching salary supplement for all teachers with master’s degrees.
7. The Board of Education should lower the maximum permissible school division pupil-teacher ratio to 17:1 in grades 1-3 and 19:1 in grades 4-6, effective in school year 2000-01. This would require assigning a total of 1600 new teachers to the 65 school divisions that have pupil-teacher ratios at the elementary level in excess of the current average.
8. The Board should also put a limit on the pupil-teacher ratio disparity between schools within a local school division.
9. The General Assembly should fully fund the additional 1600 teacher positions during a three year phase-in period. After that, the 65 local school divisions that gain additional teachers would share the costs according to their Standards of Quality composite index (“ability to pay”).
10. The Board of Education should establish a citizen review panel to evaluate data published in state documents and the reporting burden on local school divisions, and to make recommendations for streamlining the data reporting and publication process and for discarding, retaining, or adding specific data items.
11. Local school divisions should improve management of individual school performance in all areas by developing database systems that enable more timely assessment and corrective action.
Projected Costs of Major Budget Items*
| Actions | State budget, goals met by 50% | 75% | 100% | Local budgets | Comments |
| --- | --- | --- | --- | --- | --- |
| performance bonuses (divisions) | $14.0 | $21.0 | $28.0 | | annually, beginning 1999 |
| performance bonuses (teachers) | $20.0 | $30.0 | $40.0 | | 1-time, max. $500/teacher |
| new teachers (during start-up) | $155.6 | $155.6 | $155.6 | $36.4** | 5-year start-up total |
| new teachers (after start-up) | $29.8 | $29.8 | $29.8 | $18.2** | annually, after 5th year |
| perf. bonuses (master’s degrees) | $0.7 | $1.0 | $1.4 | | 5-year total |
| master’s degree supplements | $27.7 | $28.0 | $28.4 | | annually, beginning 1998 |
| Total, first 5 years | $260.0 | $298.7 | $337.4 | $36.4** | |
| ANNUAL average, first 5 years | $52.0 | $59.7 | $67.5 | $7.3** | |
| ANNUALLY, after 5th year | $71.5 | $78.8 | $86.2 | $18.2** | |

\*Dollar amounts are in millions, 1997 dollars.
\*\*Only applies to school divisions with current K-6 pupil-teacher ratios greater than 18:1.