Evidence of Impact

Multiple Measures for Student Placement:
A Review of the Research and Evidence from the Field

A Review of the Research

The use of multiple measures in college placement is relatively new. Moreover, postsecondary institutions and systems are experimenting with a wide range of measures, including high school GPA, high school standardized tests, the SAT, essays, and non-cognitive assessments. The picture is further complicated by recent changes in high school graduation requirements, and by important variations in state and system policy structure. As a result, there is a dearth of robust research examining which types of assessments effectively place which types of students and under what conditions.

Yet as states and institutions enact the use of multiple measures and begin to track their effect on students, evidence of promising practices is beginning to emerge. This document first provides a brief overview of rigorous research that has examined alternatives to traditional placement exams. Next, it highlights emerging best practices in several states and institutions and describes research that tracks the effects of these initiatives. When taken together, the information provided in this document can inform policymakers and practitioners as they consider whether to adopt or refine the use of multiple measures in college placement.

I.  Evidence-Based Multiple Measures Practice

For all the reasons described above, little research exists on the efficacy of using multiple measures. One measure that has been shown to be effective, however, is high school GPA. Below, we summarize the available evidence for this measure.

There is a growing body of research that clearly shows the following:

  • Traditional placement tests are not strong predictors of college readiness; and
  • High school grade point average (HSGPA) is a more accurate predictor of college readiness than the standardized tests traditionally used for course placement.

Placement test scores are not good predictors of course grades, college GPA, credit accumulation, or success in gatekeeper English and math classes. In fact, between a quarter and a third of students have been found to be misplaced on the basis of placement tests, and therefore required to complete developmental education courses that hinder postsecondary completion.1

The use of high school achievement data (i.e. HSGPA and the number of completed units in math and English) has been shown to result in fewer misplacements (both into college-level courses and into developmental education) and higher success rates in college-level courses than traditional placement tests.2 In fact, college and university cumulative grade point averages have a strong association with high school grade point averages, regardless of standardized test scores. Students with strong high school GPAs perform similarly in college, even if their test scores are low; similarly, students with weak high school GPAs do not do well in college, even if they have high test scores.3

Research conducted at the University of Alaska included a larger set of standardized tests and compared their predictive value with that of HSGPA. Among students who enrolled directly in college courses despite being recommended for developmental education, HSGPA was a stronger predictor of performance in college English and math courses than were SAT, ACT, or ACCUPLACER scores. In English, HSGPA explained between 6% and 19% of the differences in students’ grades, whereas SAT, ACT, or ACCUPLACER scores explained less than 3% of the differences across all student groups. In math, HSGPA explained 10–16% of the differences in students’ grades, whereas SAT, ACT, or ACCUPLACER scores explained 5% or less of the differences across all student groups.5

II. Selected Evidence from the Field: Promising Practices

States and systems across the country have begun to implement varied multiple measures reforms. While data from the field is limited and policies vary widely, emerging research from a number of multiple measures initiatives provides additional evidence of their potential.

California Community Colleges

California’s community colleges have a long history of requiring multiple measures for placement, allowing researchers to study the impact of the reform, with promising results. Title 5 of the California Code of Regulations has required the state’s community colleges to use multiple measures to place students in developmental courses for over two decades; however, individual colleges have wide discretion in selecting these measures. With this important caveat about variation, the following studies explored the impact of multiple measures policies on student placement and completion.

  1. The Early Assessment Program (EAP)

Overview: The Early Assessment Program (EAP) is a collaborative effort among the State Board of Education (SBE), the California Department of Education (CDE), and the California State University (CSU). The program was established to measure students’ readiness for college-level English and mathematics in their junior year of high school. It provides academic preparation for students to improve their skills during their senior year and, if still not college ready, their first year of college. Throughout these three years, the program provides multiple opportunities for students to demonstrate college readiness.

Using the Smarter Balanced assessment administered in 11th grade, the EAP provides students with their first opportunity to place into college-level courses. California State University (CSU) developed the program to gauge the college readiness of students while still in high school; state community colleges are able to participate in the program as well, and many do. Students who are deemed college ready after taking the Smarter Balanced assessment are exempt from taking placement exams when they enroll in college. Students who are not college ready by the end of their senior year continue to be supported by the EAP through summer Early Start programs and continued testing throughout the first year of developmental coursework taken at a community college or CSU campus. This provides students multiple opportunities to demonstrate college readiness.

Research: A study of the program followed high school juniors who participated in EAP English and math and those who were exempted based on EAP scores or course completion. These students were matched to data from California Community Colleges to investigate the extent to which participation in the EAP predicts college course placement and influences students’ future academic performance.

Results: While EAP participation in English increased from 46% in 2005, the first year of the program, to 86% by the fifth year of the program (2009), very few students were exempted from providing additional measures of college readiness (such as the ACT, SAT, or placement exams) based on their EAP exam scores. However, EAP course participation during senior year was associated with better first-year college outcomes, indicating that providing early and alternative ways to demonstrate college readiness may lead to more timely degree progress.6

  2. Math Placement in Los Angeles Community College District (LACCD)

Overview: Title 5 of the California Code of Regulations requires California Community Colleges to use more than one measure to place students in English and math courses. These measures can include any the campus sees fit, including non-cognitive measures such as goals and motivation. Los Angeles Community College District (LACCD) asks students to provide information about their educational background and college plans to determine whether they receive additional points on top of their test scores. The test score combined with this point “boost” determines math placement.
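The boost mechanism described above can be sketched in a few lines of code. This is a minimal illustration only, not LACCD's actual formula: the course levels, cut scores, point values, and background questions below are all invented for the example, since each college chooses its own measures and weights.

```python
# Illustrative sketch of a point-"boost" placement scheme. All values here
# (levels, cut scores, bonus points) are hypothetical, not LACCD's.

DEV_MATH_LEVELS = ["arithmetic", "pre-algebra", "elementary algebra",
                   "intermediate algebra"]

# Invented cut scores: minimum combined score required for each level.
CUT_SCORES = [0, 30, 50, 70]

def multiple_measure_points(hs_gpa, plans_to_transfer):
    """Award bonus points from background questions (illustrative values)."""
    points = 0
    if hs_gpa is not None and hs_gpa >= 3.0:
        points += 5  # strong high school record
    if plans_to_transfer:
        points += 3  # stated educational goal
    return points

def place_student(test_score, hs_gpa=None, plans_to_transfer=False):
    """Return the highest math level the boosted score qualifies for."""
    combined = test_score + multiple_measure_points(hs_gpa, plans_to_transfer)
    level = DEV_MATH_LEVELS[0]
    for cut, name in zip(CUT_SCORES, DEV_MATH_LEVELS):
        if combined >= cut:
            level = name
    return level
```

Under these made-up numbers, a student scoring 48 on the test alone places into pre-algebra, while the same score plus a 3.4 high school GPA is boosted past the next cut score into elementary algebra, which is the pattern the LACCD study examines.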

Research: The LACCD examined whether the use of multiple measures in math placement affected the type of course students were placed in and whether they succeeded. The study tracked 32,958 students who took placement tests and enrolled in a math course between 2005 and 2008, examining changes in placement as well as pass rates while controlling for a range of student demographics.

Results:

  • Increased Placement in Higher-Level Developmental Education Courses. Overall, the use of multiple measures increased the number of students placed into higher-level LACCD developmental education math courses by an average of 4.4%. The percentage of “boosted” students varied by college, however, ranging from 0% to about 14%. This variation is likely due to the fact that each college uses different multiple measures (e.g., high school grade point average, prior math courses, college plans, motivation).
  • Comparable Pass Rates. Students placed into higher-level courses with multiple measures had similar pass rates to those who were enrolled in the courses via traditional means. Results were from two community colleges:
    • College “A” awarded multiple measure points based on a student’s prior math background; and
    • College “H” awarded multiple measure points based on a student’s high school GPA.

Holding all else constant, students who received lower scores on the traditional placement test but qualified for a higher course due to a boost from an additional indicator had similar pass rates to other students in these classes. These students also showed no difference in the total number of degree-applicable and college-level credits they completed through spring 2012.7

  3. Long Beach City College (LBCC)

Overview: LBCC became a system leader in multiple measures implementation and used the following steps to place students during the 2013-14 academic year:

  • The Early Assessment Program (EAP) provided students their first opportunity to place into college-level courses. All students who did not take the EAP assessment, or who failed to meet the cut score for college-level placement, were required to take the ACCUPLACER in both English and math. Those near the lower cut score on the English component of ACCUPLACER were asked to write an essay which was scored to determine placement.
  • However, students in Promise Pathways (a program that provides multiple measures placement and first-semester success plans with registration priority) who were still ineligible for college-level math and English courses based on the results of the assessments above were placed through the Alternative Assessment System. This system used an algorithm based on a variety of academic indicators to determine an index score to judge college readiness.
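The Alternative Assessment System combines several academic indicators into a single index score and compares it to a readiness cut. LBCC's actual algorithm, indicators, weights, and threshold are not disclosed in this document, so the sketch below invents all of them purely to show the general shape of an index-based placement rule.

```python
# Hypothetical index-based placement rule in the spirit of LBCC's
# Alternative Assessment System. Indicators, weights, and the threshold
# are invented; only the overall approach comes from the text.

# Invented weights for indicators, each normalized to the 0-1 range.
WEIGHTS = {
    "hs_gpa": 0.5,          # overall high school GPA / 4.0
    "english_gpa": 0.3,     # GPA in high school English courses / 4.0
    "senior_english": 0.2,  # 1.0 if the student took English senior year
}

COLLEGE_READY_CUT = 0.65  # invented readiness threshold

def readiness_index(indicators):
    """Weighted sum of normalized indicators; missing values count as 0."""
    return sum(WEIGHTS[k] * indicators.get(k, 0.0) for k in WEIGHTS)

def college_level_eligible(indicators):
    """True if the index score clears the college-ready cut."""
    return readiness_index(indicators) >= COLLEGE_READY_CUT
```

For example, a student with a 3.2 overall GPA, a 3.0 English GPA, and senior-year English clears this invented threshold, while a student with 2.0 GPAs and no senior English does not. The design point is that no single low indicator, such as one weak test score, can by itself block access to college-level courses.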

Research: LBCC studied the impact of the Promise Pathways placement algorithm by comparing their 2011 cohort (N=1654) to their 2012 cohort (N=933) and following them both for two years.

Results: LBCC found significantly higher rates of placement in college-level courses; see Figure 1.

Figure 1. Placement of Promise Pathways vs. non-Promise Pathways students in college-level courses

Source: Long Beach City College at http://lbcc.edu/promisepathways/

The Alternative Assessment System algorithm allowed nearly one-third of participating math students, and nearly 60% of participating English students, to access college-level courses. LBCC also tracked success rates in college-level math and English courses in fall 2012 and found almost identical rates of successful course completion between Promise Pathways and non-Promise Pathways students.

Massachusetts

Overview: A program to improve outcomes for math students was piloted during the 2014-15 academic year and has shown some early signs of success. The policy calls for the use of multiple measures including high school GPA, courses taken in high school, and ACCUPLACER scores.

Research: The voluntary nature of institutional participation allowed researchers to compare colleges and universities that implemented the new policy to those that did not.

Results:

  • Increased Placement in College-Level Math. According to data from the Massachusetts Department of Higher Education, the percentage of students placed in college-level courses increased between fall 2013 and fall 2014, the first year of the pilot. College-level course placement increased by 16% at community colleges piloting the program for all students, 6% at community colleges piloting the program for some of their students, 53% at state universities piloting the program for all students, and 5% for state universities piloting the program for some students; see Figure 2.
  • Comparable Pass Rates. Students placed in credit-bearing math courses were completing those courses at rates comparable to, or higher than, those of students whose placements were based on ACCUPLACER alone.

Figure 2. Percentage of students placing in credit-bearing courses in Pilot vs. Non-Pilot institutions, by sector

Source: Massachusetts Department of Higher Education: The Vision Project at http://www.mass.edu/visionproject/whatsnew.asp#devmath

Ohio

Overview: As of 2013, high school students earning ACT or SAT scores that meet a standard set by the Ohio Board of Regents have been considered “remediation-free” without needing to take an additional placement test.

Research: The Ohio Department of Education tracked all first-time college and university students from 2010 to 2015 to determine if remediation rates had dropped.

Results: This “remediation-free guarantee” resulted in decreases in the percentage of students placed in developmental math and English courses; see Figure 3.

Figure 3. Percent of first-time Ohio public college/university students requiring remediation by subject area

Source: 2016 Ohio Remediation Report. Ohio Board of Regents and Ohio Department of Education at https://www.ohiohighered.org/sites/ohiohighered.org/files/uploads/Link/2016-Remediation-Report.PDF

North Carolina: Davidson County Community College

Overview: The North Carolina Community College System (NCCCS) developed a hierarchy of placement measures for institutions to use. First, students may be placed directly into college-level courses if they have an un-weighted high school GPA of 2.6 or above. Students who do not meet the GPA cutoff can submit their ACT/SAT scores to demonstrate readiness for college-level courses. Students unable to be placed in college-level coursework based on those measures or who graduated from high school more than five years ago must take a placement test.
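The NCCCS hierarchy is a simple fall-through decision rule, which can be sketched as follows. The 2.6 unweighted GPA cutoff and the five-year transcript window come from the description above; the ACT and SAT readiness cutoffs are invented placeholders, since NCCCS sets its own benchmark scores.

```python
# Sketch of the NCCCS placement hierarchy described in the text.
# The GPA cutoff (2.6) and five-year window are from the source; the
# ACT/SAT benchmarks below are placeholder values, not NCCCS's actual ones.

ACT_READY = 22    # placeholder benchmark
SAT_READY = 1100  # placeholder benchmark

def nc_placement(hs_gpa, years_since_graduation, act=None, sat=None):
    """Return how a student is placed under the NCCCS hierarchy."""
    if years_since_graduation > 5:
        return "placement test"           # transcript too old to use
    if hs_gpa is not None and hs_gpa >= 2.6:
        return "college-level (GPA)"      # first measure: unweighted HSGPA
    if (act is not None and act >= ACT_READY) or \
       (sat is not None and sat >= SAT_READY):
        return "college-level (ACT/SAT)"  # second measure: test scores
    return "placement test"               # fall back to placement testing
```

The ordering matters: GPA is checked before test scores, so a recent graduate with a 3.0 GPA places into college-level work regardless of test results, and the placement test is reached only when every earlier measure fails.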

Research: NCCCS examined data for the 2013–14 and 2014–15 academic years to determine how students placed using high school transcript data (i.e. HSGPA and math courses completed) fared in English and math gateway courses when compared to those placed based on other criteria.

Results: Students placed using multiple measures succeeded at higher rates than did comparable students placed without multiple measures. Specifically:

  • Seventy-six percent of students placed using high school transcript data successfully completed the English gateway course, compared with 59% of students placed based on other criteria.
  • Sixty-five percent of students placed using high school transcript data successfully completed the math gateway course, compared with 48% of students placed based on other criteria. This same trend holds when these data are disaggregated by race/ethnicity.8

University of Wisconsin-Marathon County (UWMC)

Overview: Success in using multiple measures for placement on one campus in the University of Wisconsin (UW) system has led to adoption elsewhere. The UW system does not yet have a formal multiple measures policy. Despite this, between 2007 and 2010, two UWMC English professors piloted a new process for using multiple assessment measures to determine student placement. The process was first used with “at-risk” students and then expanded to all incoming freshmen.

The campus utilizes the Wisconsin English Proficiency Test, academic measures including high school GPA, high school English grades and courses taken, and SAT/ACT scores. Writing prompts used for multiple measures assessments were also developed, and student responses were scored based on the First-Year Composition Learning Outcomes. Non-cognitive measures include student home language selection, TRIO eligibility, and a student questionnaire.

Research: During 2009 and 2010, Giordano and Hassel conducted a campus assessment on the impact of multiple measures placement at UWMC. They followed the academic performance of students placed in English 098 and English 101 (N=208 in 2006) from 2006 to 2009.

Results: The number of at-risk students who remained in good standing at the end of their first-year fall semester increased from 59% in 2006 to 73% in 2009.9

Based on the success of the initiative, other UWC campuses were encouraged to become involved and have determined, campus by campus, whether to use multiple measures for English placement.

University of Alaska

Overview: A subset of students at the University of Alaska took college-level courses instead of the recommended developmental education courses.

Research: Regional Educational Laboratory Northwest followed associate and bachelor’s degree seeking students entering the University of Alaska in 2008. Students who were placed in developmental English but enrolled in college-level English (associate’s n=67, bachelor’s n=161), and students who were placed in developmental math but took college-level math (associate’s n=35, bachelor’s n=92) were tracked until spring 2012.

Results: Of associate degree seeking students who were placed in developmental education but enrolled in college-level English, 70% passed their college-level English course, while 49% did so in math. Among bachelor’s degree students who were placed in developmental education but enrolled in college-level English, 79% passed their college-level English course, while 62% did so in math.10

  1. Belfield, C., & Crosta, P. (2012). Predicting success in college: The importance of placement tests and high school transcripts (CCRC Working Paper No. 42). New York, NY: Columbia University, Teachers College, Community College Research Center; Scott-Clayton, J. (2012). Do high-stakes placement exams predict college success? (CCRC Working Paper No. 41). New York, NY: Columbia University, Teachers College, Community College Research Center.
  2. Armstrong, W. B. (2000). The association among student success in courses, placement test scores, student background data, and instructor grading practices. Community College Journal of Research and Practice, 24(8), 681–695; Scott-Clayton, J., & Stacey, G. W. (2015). Improving the accuracy of remedial placement. New York, NY: Columbia University, Teachers College, Community College Research Center.
  3. Hiss, W. C., & Franks, V. W. (2014). Defining promise: Optional standardized testing policies in American college and university admissions. National Association for College Admission Counseling; Belfield, C., & Crosta, P. (2012). Predicting success in college: The importance of placement tests and high school transcripts (CCRC Working Paper No. 42). New York, NY: Columbia University, Teachers College, Community College Research Center.
  4. Geiser, S., & Santelices, V. (2007). Validity of high school grades in predicting student success beyond the freshman year: High school records vs. standardized tests as indicators of four-year college outcomes (Research & Occasional Paper Series: CSHE.6.07). Berkeley, CA: University of California at Berkeley.
  5. Hodara, M., & Cox, M. (2016). Developmental education and college readiness at the University of Alaska (REL 2016–123). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northwest.
  6. Kurlaender, M. (2014). Assessing the promise of California’s Early Assessment Program for community colleges. The ANNALS of the American Academy of Political and Social Science, 44-46. At the time that this study was conducted, the Smarter Balanced assessment was not yet being used for the EAP.
  7. Ngo, F., Kwon, W., Melguizo, T., Prather, G., & Bos, J. M. (2014). Course placement in developmental mathematics: Do multiple measures work? USC: Rossier School of Education at http://www.uscrossier.org/pullias/wp-content/uploads/2013/10/Multiple_Measures_Brief.pdf
  8. Center for Community College Student Engagement. (2016). Expectations meet reality: The underprepared student and community colleges. Austin, TX: The University of Texas at Austin, College of Education, Department of Educational Administration, Program in Higher Education Leadership.
  9. Hassel, H., & Giordano, J. B. (2011). First-year composition placement at open-admission, two-year campuses: Changing campus culture, institutional practice and student success. Open Words: Access and English Studies, 5(2).
  10. Hodara, M., & Cox, M. (2016). Developmental education and college readiness at the University of Alaska (REL 2016–123). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northwest.