
Is It Time for a Graduate Warranty Program?


Background and Purpose of this Report

For many college admissions officers and private employers, high school diplomas no longer adequately signal achievement and competence. When too many diplomas are deficient—that is, they fail to convey the educational attainment of the recipients—new methods of certification are needed.

One indicator of deficient diplomas in Virginia is the finding by the State Council on Higher Education (SCHEV) that about one-fourth of the first-year college students from Virginia’s public school system must take remedial courses due to their failure to demonstrate minimal competency in reading, English, or mathematics. According to SCHEV, the remediation costs amount to almost $40 million annually, of which $15 million is borne by college students and their parents in the form of tuition for unexpected mandatory courses that do not count toward degree requirements. That leaves $25 million to be paid by the rest of Virginia’s taxpayers. A review of national statistics confirms that the Commonwealth is not alone in issuing deficient high school diplomas, and the national remediation rate appears to be even higher than Virginia’s.
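The arithmetic behind that split can be laid out in a few lines. The sketch below (in Python) simply restates the SCHEV figures cited above; nothing in it beyond those three numbers comes from the report.

```python
# Virginia's annual remediation bill, restating the SCHEV figures
# cited above. All amounts are in millions of dollars.
total_cost = 40.0     # SCHEV estimate of total annual remediation cost
family_share = 15.0   # tuition paid by students and their parents
taxpayer_share = total_cost - family_share

print(f"Borne by Virginia taxpayers: ${taxpayer_share:.0f} million")  # -> $25 million
```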

In some states, legislators have proposed that the cost of remedial instruction be paid by the K-12 school district that produced the unprepared high school graduates in the first place. One school division in Virginia—Hanover County—has voluntarily developed a Graduate Warranty Program, and the county pays for any remedial courses required of warranted students. A joint initiative by SCHEV and Virginia’s Board of Education is being designed to encourage other school divisions to adopt similar plans.

Each annual release of SCHEV’s remediation statistics provokes the kind of media attention that generates criticism and reaction at the local school division level. Believing that adequate preparation—for both college and the workforce—is critical for Virginia’s high school graduates and that much of what passes for “debate” on this issue produces more heat than light, the Thomas Jefferson Institute for Public Policy commissioned the research for this report.

The report evaluates the controversial remediation statistics for the period 1991-96 in the context of other relevant data, and identifies reasons that some school divisions seem to do a better job of preparing their graduates for college. In addition, the report considers policy options that emerged from the analysis, including ways to upgrade high school students’ preparation for college-level courses, methods to improve the monitoring of college students’ performance, and designs for graduate warranty programs.

Indicators of Effectiveness

The statewide average percentage of former public school graduates requiring remediation in college has been consistently close to 25 percent for several years. However, students from some school divisions have been much more likely to require remediation than their college classmates from other parts of the state, as the map on the cover of this report illustrates. The SCHEV data indicated a range from a low near 10 percent of first-year students needing remediation to a high near 50 percent. One goal of this study has been to identify factors that account for such differences among school divisions.

Valid accounting for differences in measures, however, requires valid measures in the first place. SCHEV reports only on students at Virginia’s public colleges and universities and, thus, excludes those attending private colleges within the Commonwealth or out-of-state institutions. If one-third of Virginia’s college-bound graduates are excluded from the monitoring system, as our estimates suggest, the SCHEV data alone cannot reveal how well schools are preparing students for college. The bias inherent in such an omission is that the SCHEV data may show two school divisions with similar remediation rates, while considerable differences in college preparatory effectiveness may exist if one of the divisions has a higher percentage of its graduates enrolled in college, albeit on private and/or out-of-state campuses. Criticism of the SCHEV data must be tempered, however, by the observation that the out-of-state and private college enrollment data needed to correct for such omissions are not within SCHEV’s jurisdiction.

The closest approximation to college enrollment data for public school graduates is collected by Virginia’s Department of Education (DOE) in the form of graduates’ “plans” to attend college. However, the National Center for Education Statistics (NCES) has estimated the overall percentage of Virginia graduates attending public and private institutions, both in- and out-of-state, and has compared the remediation rates at public and private colleges. Therefore, using a combination of SCHEV, DOE, and NCES data, we calculated a more valid indicator of graduates’ readiness for college.

For each school division, the percentage of graduates “bound and prepared” for college was defined as the estimated percentage of high school graduates who met two criteria: (a) they went to college, and (b) they were not required to take remedial courses upon arrival. The map below displays the geographic variation in the percentage of graduates “bound and prepared” for college, and Exhibit 4 (page 14) lists each school division’s percentage.

[Map: Percentage of Each Division’s Graduating Class Bound and Prepared for College. Source: Estimated from SCHEV, DOE, and NCES data, 1991-96.]
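For readers who want the definition in operational terms, here is a minimal sketch of how such an indicator could be computed for a single division. The function and the sample numbers are hypothetical; the report’s actual estimates blend SCHEV, DOE, and NCES data as described above.

```python
# Illustrative computation of the "bound and prepared" indicator for
# one school division. All inputs are hypothetical sample values.

def bound_and_prepared_pct(graduates: int,
                           enrolled_in_college: int,
                           needing_remediation: int) -> float:
    """Percent of a division's graduating class who (a) went to college
    and (b) were not required to take remedial courses on arrival."""
    prepared = enrolled_in_college - needing_remediation
    return 100.0 * prepared / graduates

# Example: 1,000 graduates, 600 enroll in college, 150 need remediation.
print(f"{bound_and_prepared_pct(1000, 600, 150):.1f}%")  # -> 45.0%
```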

College Bound and Prepared: Why School Divisions Vary

[Pie chart: 74% of the variation in school divisions’ graduates being bound and prepared was accounted for by two factors: (1) standardized test results and (2) master’s degree teachers.]

Two factors—students’ test results and teacher education levels—were highly accurate predictors of being “bound and prepared for college.” Graduates were more likely to both (a) attend college, and (b) not need remediation, if they came from school divisions where more teachers had master’s degrees and more juniors scored above the 75th percentile on national standardized tests. Those two factors accounted for 74 percent of the variation from one school division to another, and, as the pie chart above indicates, the test results contributed most (68 percent) of that overall statistical explanation.
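The report does not reproduce its model specification, but a two-predictor analysis of this kind is conventionally done with ordinary least squares regression. The sketch below (Python with pandas and scikit-learn) shows the general shape of such a computation; the file name and column names are hypothetical, and the 0.74 figure is the report’s finding, not a guaranteed output of this code.

```python
# Sketch of a two-predictor OLS regression of the kind described above.
import pandas as pd
from sklearn.linear_model import LinearRegression

divisions = pd.read_csv("divisions.csv")  # hypothetical: one row per school division
X = divisions[["pct_juniors_above_75th_percentile",  # standardized test results
               "pct_teachers_with_masters"]]         # teacher education level
y = divisions["pct_bound_and_prepared"]

model = LinearRegression().fit(X, y)
print(f"R^2 = {model.score(X, y):.2f}")  # report finds roughly 0.74
```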

With those findings in hand, the analysis moved to a second stage to determine what accounted for the differences in school divisions’ test results. The factors examined included those within school officials’ control (education policies and practices) and also the community and school demographic characteristics that had to be statistically isolated prior to making valid inter-division comparisons.

Overall, 86 percent of the variation in school division test results was accounted for by five education factors (listed below) and two demographic factors (the percentage of the community’s adults with high school diplomas and the percentage of Black Americans in the community). The pie chart below makes it clear that, while the demographic factors accounted for almost half of the school division differences in test results, nearly 40 percent of the variation was attributable to education policies and practices that are within the control of education policy makers and administrators.

[Pie chart: 86% of the variation in school divisions’ 11th grade test results was accounted for by seven factors. Demographic factors (48%): percentage of adults with a high school diploma; percentage of Black Americans. Education factors (38%): percentage receiving the advanced studies diploma; percentage taking advanced placement courses; percentage passing advanced placement exams; percentage absent more than 10 days in grades 9-12; pupil-teacher ratio in grades K-6. The percentage shown for each factor indicates how much it contributed to the test results, and +/- signs indicate the direction of the influence.]

The common thread for three of the five education factors was advanced, usually rigorous, course work. When comparing school divisions with similar demographic features, juniors’ test results were better in those divisions where more students enrolled in courses required for the advanced studies diploma, where more students took college-level advanced placement (AP) courses, and where more students passed at least one national AP exam. Test scores were also higher—again, other things being equal—in school divisions where excessive absenteeism (missing more than 10 days) among high school students was lower, and where the pupil-teacher ratio in grades K-6 was lower.
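One conventional way to produce the kind of demographic/education split reported above is incremental R-squared: fit the demographic factors first, then measure how much additional variation the education factors explain. The sketch below assumes that approach; the report may have used a different decomposition, and the file name and column names are hypothetical.

```python
# Incremental R^2 decomposition: demographics first, then education
# factors. Comments cite the report's figures; this code's output
# depends entirely on the (hypothetical) input data.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("divisions.csv")
y = df["grade11_test_score"]

demographic = ["pct_adults_hs_diploma", "pct_black_americans"]
education = ["pct_adv_studies_diploma", "pct_taking_ap_courses",
             "pct_passing_ap_exam", "pct_absent_over_10_days",
             "pupil_teacher_ratio_k6"]

r2_demo = LinearRegression().fit(df[demographic], y).score(df[demographic], y)
X_full = df[demographic + education]
r2_full = LinearRegression().fit(X_full, y).score(X_full, y)

print(f"Demographics alone:    {r2_demo:.2f}")            # report: about 0.48
print(f"Education factors add: {r2_full - r2_demo:.2f}")  # report: about 0.38
print(f"Total explained:       {r2_full:.2f}")            # report: about 0.86
```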

Policy Options

It is clear that high school graduates’ chances of being “bound and prepared for college” are lower in some school divisions than others, and policy considerations should focus on what is necessary to increase those chances.

Improving Test Results

Improving graduates’ chances of going to college and being prepared for college-level work upon arrival is a realistic objective for school divisions in Virginia. Many of the factors that affect the chances for graduates to achieve that status are within the control of education policy makers.

In our earlier, much broader study of school division test scores at all grade levels (Understanding Virginia’s Report Card, Thomas Jefferson Institute for Public Policy, November 1997), many of the policy recommendations were incentive-based; that is, school divisions would have financial incentives for vigorously pursuing measurable improvements in critical areas that would improve test results. The findings in this report, which relate just to 11th grade test results, reaffirm the importance of those earlier recommendations, which included:

•    Raising Academic Standards through vigorous implementation of the new Standards of Learning (SOL), elimination of social promotion, and making promotion contingent on passing the SOL achievement tests;

•    Reducing Excessive Absenteeism through financial incentive programs at both the division and individual school level;

•    Providing Incentives for Master’s Degrees, including tax credits and state matching salary supplements; and

•    Reducing Pupil-Teacher Ratios in grades K-3 to 17:1 and in grades 4-6 to 19:1, by joint state and local funding for 1600 new teachers in the 65 school divisions with ratios above the statewide average.

If these earlier recommendations were implemented, along with concentrated efforts to increase student enrollment in challenging courses (such as advanced placement courses and those required for the advanced studies diploma), improvements in 11th grade test scores would occur, and more graduates would be prepared for college-level course work. Consideration should also be given to financial incentives for extraordinary academic achievements, such as a monetary award to high schools for each student passing an advanced placement exam.

Improving Information Needed for Policy Making and Evaluation

In that earlier study, we also recommended improvements in data collection and reporting to enhance strategic planning and program evaluation at both the state and local levels. The current study reiterates the weaknesses in DOE’s data system, and also points to limitations inherent in the SCHEV student monitoring system.

Assessing the effectiveness of the public school system—an absolute prerequisite for major changes in policies affecting that system—requires at least the following additions to the current database maintained by DOE and SCHEV:

•    demographic data for individual public and private high schools,

•    high school courses, grades, and SAT scores for public and private high school graduates enrolled in college (at public and private colleges, in- and out-of-state),

•    college enrollment data for public and private high schools (at public and private colleges, in- and out-of-state), and

•    remedial courses taken by all public and private high school graduates at all colleges (public and private, in- and out-of-state).

School Division Warranty Programs

What schools should do to improve the preparation of their students for college is clear. How to get them to do it is less obvious. Several of the earlier policy recommendations contain financial incentives. In addition, success or failure in the implementation of the Standards of Learning will have consequences in terms of school accreditation by spring, 2003. However, it is time for a new policy that provides an incentive structure arching over all the others.

Whether designed as a “carrot” or a “stick” program, a mandatory Graduate Warranty Program that builds on the best features of the Hanover County innovation and the initiative undertaken by SCHEV and the Board of Education deserves serious attention at the highest levels of Virginia state government. There are two essential components of such a program: (1) Graduates who met certain academic requirements would be certified by their school divisions as being “ready for college,” and (2) the school divisions would promise to reimburse any certified graduates who had to take remedial courses during their first year in college. SCHEV and Board of Education officials are hoping that private donors will provide sufficient “matching fund” incentives to entice some school divisions to participate in the Graduate Guarantee Program, but that voluntary approach has yet to work. The Governor and General Assembly should give serious consideration to a mandatory Graduate Warranty Program.
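The two components translate into a simple claims rule: reimbursement is owed only to certified graduates, and only for first-year remedial tuition. The sketch below is purely illustrative; the fields and the dollar figure are assumptions, not features of the Hanover program or the SCHEV initiative.

```python
# Illustrative claims rule for the two components described above:
# (1) certification at graduation, (2) reimbursement of first-year
# remedial tuition. All values here are hypothetical.
from dataclasses import dataclass

@dataclass
class Graduate:
    certified_ready: bool        # division certified "ready for college"
    first_year_in_college: bool  # warranty covers the first year only
    remedial_tuition_paid: float

def warranty_reimbursement(g: Graduate) -> float:
    """Reimburse remedial tuition only for certified first-year students."""
    if g.certified_ready and g.first_year_in_college:
        return g.remedial_tuition_paid
    return 0.0

claim = Graduate(certified_ready=True, first_year_in_college=True,
                 remedial_tuition_paid=450.0)
print(warranty_reimbursement(claim))  # -> 450.0
```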

A new state policy creating another state mandate for local schools, however, would necessitate at least temporary partial state funding through matching grants. Ultimately, to have the desired incentive effect—namely, to get school divisions to accept responsibility for the quality of their graduates and reduce the need for remedial courses—a Graduate Warranty Program would have to “bite” financially closest to home. If school administrators had to appear before local school boards and county supervisors to request funding for former graduates’ remedial course tuition, they would have to be ready with answers to tough questions from taxpayers and their elected representatives. After an initial start-up period of 3 to 5 years, therefore, localities should have to pay for warranty claims from local budgets alone.

Creating a remedial course tuition insurance program should not be the intent of a graduate warranty program. If such a program merely compensated students and parents for the $15 million in annual remedial tuition without affecting the education of students in the public schools, it would just shift $15 million more to the $25 million deficient diploma burden already borne by Virginia taxpayers, and the Commonwealth’s public school graduates would be no better prepared for college.

Instead, the goal should be to reduce the need for remedial courses in college. Our strategy should be to develop an incentive structure within which local school officials would see both the opportunity costs of diverting new or reprogrammed funds to warranty claims and the benefits of producing more graduates who are bound and prepared for college.

About the Author of Education Policy Research Commissioned by the Jefferson Institute for Public Policy

I. David Wheat, Jr. is a strategic planning consultant retained by the Thomas Jefferson Institute for Public Policy to examine timely education policy issues. Previous reports include Understanding Virginia’s Report Card: Why Standardized Test Scores Vary from One Community to Another (November 1997), 2000 New Teachers: Where Are They Needed Most? (February 1998), Car Tax Cuts: How Should Localities be Reimbursed? (February 1998), Raising Student Attendance: Some Low Cost Strategies (March 1998), Local Perspective in a State Office: The Legislator’s Dilemma (March 1998), and Deficient Diplomas: Is It Time for A Graduate Warranty Program? (September 1998).¹

He is president of Wheat Resources Inc., a consulting firm established in 1981 that specializes in helping both private and public sector clients organize and analyze data they use in making strategic decisions. He received his Master’s Degree in Public Policy from Harvard University’s Kennedy School of Government in 1972, and then served three years as a White House staff assistant specializing in economic and energy issues. Later, at the University of Houston, he served as Director of Federal Relations and taught a graduate course on public policy implementation.

His education policy consulting work is enhanced by several years of nationally recognized classroom instruction experience in Virginia public schools, as well as by service on the Governor’s Commission on Champion Schools, where he participated in the upgrading of the history and social science Standards of Learning for Virginia’s students. He also teaches political science at Virginia Western Community College.

¹ Copies are available from the Jefferson Institute for Public Policy (voice: 703-440-9447; fax: 703-455-1531) or from the author (voice: 540-966-5939; fax: 540-966-5167). In addition, downloadable versions are available at the www.wheatresources.com web site.

