
College of Coastal Georgia: Piloting Student Placement Indexes

The College of Coastal Georgia uses placement indices in both math and English that combine high school GPA, SAT and ACT scores, and placement test results. The development and implementation of the policy have been a joint effort between University System of Georgia (USG) institutions and the USG system office, offering an example of how collaboration between the system office and institutional leadership on data-based decision-making can produce strong multiple measures placement reform with positive student outcomes. The indices were being piloted at the time of the site visit, and systemwide implementation was expected in spring 2017.1

Methodology and Data Sources

Research for Action conducted campus site visits in the spring of 2016, so this case study provides an implementation snapshot from that point in time. Sites were selected based on state or system recommendations on leading institutions in multiple measures reform. Field work at the College of Coastal Georgia included interviews with seven administrators and six faculty members, as well as two focus groups with students recently placed into coursework. Institutional documents and online resources were reviewed prior to field work. In addition, the institution provided internal analyses it had conducted on the impact of the multiple measures policies on student outcomes. Once drafted, this case study was provided to the primary institutional contact for review and verification.

State or System Policy: Mathematics and English Placement Indices

The University System of Georgia (USG), in collaboration with institutional leaders, developed a Mathematics Placement Index (MPI) and an English Placement Index (EPI) for student placement. Institutions were able to choose whether or not to adopt the placement indices during the piloting phase, which began in the 2014-15 academic year. The indices are composed of the following:

  • High school grade point average (HSGPA) and SAT/ACT, when both are available
  • HSGPA and Compass/Accuplacer, when SAT/ACT are not available
  • SAT/ACT and Compass/Accuplacer, when HSGPA is not available
  • Compass/Accuplacer, when neither HSGPA nor SAT/ACT is available.

Students are encouraged to submit test scores that can be used to waive placement testing. To be exempt, a student must meet the cut score on the SAT or ACT and have taken the required high school curriculum. If a student's scores are not high enough to exempt them from placement testing, a placement test is administered. The MPI and EPI scores are calculated based on high school grade point average (HSGPA), SAT or ACT scores, and, when needed, placement test results.
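The sketch below illustrates how an index of this kind can branch on which measures a student has, following the combinations in the list above. The case study does not publish the actual USG formula (it is distributed to institutions through system spreadsheets), so the scaling and weights here are placeholder assumptions, not the real index.

```python
# Illustrative sketch of a placement index built from whichever measures are
# available. The normalization and weights are hypothetical, NOT the USG formula.
from typing import Optional


def placement_index(hsgpa: Optional[float],
                    sat_section: Optional[int],
                    act_section: Optional[int],
                    placement_test: Optional[int]) -> float:
    """Combine the available measures into a single index score (illustrative)."""
    # Normalize each measure to a 0-1 scale (hypothetical scaling).
    gpa = hsgpa / 4.0 if hsgpa is not None else None
    sat = sat_section / 800.0 if sat_section is not None else None
    act = act_section / 36.0 if act_section is not None else None
    test = placement_test / 120.0 if placement_test is not None else None

    admission_scores = [x for x in (sat, act) if x is not None]
    admission = max(admission_scores) if admission_scores else None

    if gpa is not None and admission is not None:    # HSGPA and SAT/ACT
        return 0.6 * gpa + 0.4 * admission
    if gpa is not None and test is not None:         # HSGPA and Compass/Accuplacer
        return 0.6 * gpa + 0.4 * test
    if admission is not None and test is not None:   # SAT/ACT and Compass/Accuplacer
        return 0.5 * admission + 0.5 * test
    if test is not None:                              # Compass/Accuplacer alone
        return test
    raise ValueError("Insufficient measures; the student takes a placement test first.")
```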

With ACT's decision to discontinue the Compass placement test at the end of 2016, the USG decided to replace Compass with Accuplacer for instances in which placement testing is still used. Institutions were able to use Accuplacer voluntarily at the time of the site visit, but all institutions with students requiring placement testing are expected to convert to Accuplacer in 2017.

In order to develop the EPI and MPI and determine the minimum index cut scores for placement directly into college-level courses, the USG office conducted data analyses in consultation with system stakeholders in math and other disciplines. Using data collected from institutions across the system, the USG estimated the probability of success for students in college-level courses based on HSGPA, SAT/ACT results, and placement test results. Institutions that offer both college-level courses with co-requisites and developmental education courses were responsible for establishing the minimum cut score for placement into college-level courses with co-requisite support; students below that cut score are placed into developmental education courses. Although institutions were given discretion in where to set this cut score, the USG specified that it was to be set so that the majority of students not directly eligible for college-level courses would be placed into co-requisite courses. In addition, both USG and CCGA regularly use student-level outcomes data to assess their cut scores and make changes to ensure that students are appropriately placed.
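A probability-of-success analysis of this kind is often run as a regression on pooled student records. The case study does not name the statistical method the USG used, so the sketch below assumes a logistic regression; the file name, column names, and the 0.60 success threshold are all illustrative.

```python
# Hypothetical probability-of-success analysis on pooled, system-wide records.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# One row per student: the measures that feed the index plus a pass/fail outcome
# in the first college-level math course (columns are assumed, not the USG's).
df = pd.read_csv("systemwide_outcomes.csv")
features = ["hsgpa", "sat_math", "placement_test"]
df = df.dropna(subset=features + ["passed_college_math"])

model = LogisticRegression().fit(df[features], df["passed_college_math"])

# Predicted probability of success in the college-level course for each student.
df["p_success"] = model.predict_proba(df[features])[:, 1]

# An institution might then inspect the lowest measure profiles at which predicted
# success reaches a level it considers acceptable when setting its cut score.
print(df.loc[df["p_success"] >= 0.60, features].min())
```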

The system has provided institutions with considerable technical assistance, including guidance on implementing the indices and on advising students in the placement process based on their math pathway. To facilitate the use of the indices, the USG also developed spreadsheets that institutional staff could use to calculate the EPI and MPI. The probability-of-success analyses conducted by the USG were also available to institutions to help them determine appropriate minimum cut scores for co-requisite courses.

Institutional Context

Background: Transitioning from an Associate-Level to a Baccalaureate Institution

The College of Coastal Georgia (CCGA) was founded in 1961. The main CCGA campus is located in Brunswick on the southeast coast of Georgia between Savannah and Jacksonville, Florida. During the 2008-09 academic year, the College of Coastal Georgia began transitioning from an associate-level to a baccalaureate institution as part of its ten-year strategic plan, but it continues to offer associate degree programs as well.2 The characteristics of students at CCGA as of fall 2015 are outlined in Table 1.

Table 1. Student Characteristics at the College of Coastal Georgia3

Impetus for Change: Building on Co-Requisite Reform

Complete College America (CCA) has been an impetus for change in Georgia, calling on postsecondary systems to improve outcomes for students placed in developmental education programs. In Georgia, these students are called Learning Support students. Prior to this placement reform, CCGA received a grant from CCA to develop a co-requisite course model that supports students while they take their first college-level courses. Co-requisite courses were designed to provide “just-in-time” support and can use a tutorial format, standard classroom lecture, or online instruction, among other options. The USG recommended that the faculty teaching these courses also teach the connected college-level course, but institutions made the final decision.

In alignment with this initiative, task forces and ad hoc steering committees coordinated by the USG recommended using multiple measures for placement, which led to the development of the MPI and EPI policies at the system level in July 2013. German Vargas, the Assistant Vice President for Academic Student Engagement and an associate professor of mathematics at CCGA, has been a leader in this process, serving as the chair of the ad hoc steering committee on reforming math placement and instruction. At CCGA, he was instrumental in guiding the piloting of the reform and in providing feedback to the USG office. Piloting began in the fall of 2014 at five institutions in the system, including CCGA.

Placement Process: Combining High School GPA and Assessment Scores for Placement

The placement process at CCGA begins with freshman admissions. To be accepted to the college, students who have graduated from high school in the last five years or have completed their GED must have:

  • A score of 350 or above on both the math and reading sections of the SAT or a score of 14 or higher on both the math and English sections of the ACT; and
  • A Freshman Index score of 1850, which is calculated using students’ high school GPA and SAT or ACT scores.

Because of these admissions requirements, first-time students at CCGA are expected to submit their high school transcript and SAT or ACT scores. However, non-traditional students and students who have been home schooled often do not have test scores or an official high school transcript. Recognizing this, the USG developed a placement process, piloted at CCGA, that entails a number of steps based on the academic records students provide to the college, as outlined in Figure 1.

Figure 1. CCGA Placement Process

During the spring 2016 semester, the placement process proceeded as follows:

  • Students scoring a minimum of 430 on the reading section of the SAT and 400 on the math section of the SAT, or a minimum of 17 on the ACT in both English and math, were exempt from any developmental education or co-requisite coursework and were placed directly into college-level courses.
  • For students below these thresholds, a Mathematics Placement Index (MPI) and an English Placement Index (EPI) were computed, and students were then placed into foundations-level courses (developmental education) or college-level courses with or without co-requisite support. EPI and MPI scores were calculated based on high school grade point average (HSGPA), SAT and/or ACT scores and, when needed, placement test scores, depending on which academic records were available for the student (a condensed sketch of this decision flow follows the list). For instance:
    • HSGPA and SAT/ACT: Students who provided CCGA with both SAT or ACT scores and a high school transcript, but did not meet the exemption scores on the SAT or ACT, were placed based on EPI and MPI scores calculated using their high school GPA and SAT or ACT results. Based on the indices, students were placed into foundations-level courses (developmental education) or college-level courses with or without co-requisite support. However, if their index score was at or below the cut score for college-level coursework, placement test scores were used to recalculate the EPI and MPI for the final placement decision.
    • HSGPA and Compass/Accuplacer: If a student did not have SAT or ACT results, the student’s placement test score was used along with their HSGPA to calculate the EPI and MPI and determine placement.
    • SAT/ACT and Compass/Accuplacer: If a student did not have a high school transcript, the student’s placement test score was used along with SAT or ACT results to calculate the EPI and MPI and determine placement.
    • Compass/Accuplacer: If a student did not have SAT or ACT results or a high school transcript, the EPI and MPI calculations and placement were based on placement test scores alone.
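The condensed sketch below walks through this spring 2016 flow. The SAT and ACT exemption thresholds come from the case study; the two cut-score constants and the placement_index() helper (from the earlier sketch) are illustrative assumptions rather than CCGA's actual values.

```python
# Illustrative spring 2016 placement flow; cut scores below are hypothetical.
COLLEGE_LEVEL_CUT = 0.70  # assumed minimum index for direct college-level placement
COREQUISITE_CUT = 0.50    # assumed minimum index for college-level with co-requisite


def is_exempt(sat_reading=None, sat_math=None, act_english=None, act_math=None):
    """Spring 2016 exemption check: SAT 430 reading and 400 math, or ACT 17 in both."""
    sat_ok = (sat_reading is not None and sat_math is not None
              and sat_reading >= 430 and sat_math >= 400)
    act_ok = (act_english is not None and act_math is not None
              and act_english >= 17 and act_math >= 17)
    return sat_ok or act_ok


def place_student(subject, sat_reading=None, sat_math=None,
                  act_english=None, act_math=None,
                  hsgpa=None, placement_test=None):
    """Return an illustrative placement decision for 'english' or 'math'."""
    if is_exempt(sat_reading, sat_math, act_english, act_math):
        return "college-level (exempt)"

    sat_section = sat_reading if subject == "english" else sat_math
    act_section = act_english if subject == "english" else act_math
    # Reuses the placement_index() helper sketched earlier in this case study.
    index = placement_index(hsgpa, sat_section, act_section, placement_test)

    # If the first calculation falls at or below the college-level cut and no
    # placement test was used, the student tests and the index is recalculated
    # (that second pass is not shown here).
    if index >= COLLEGE_LEVEL_CUT:
        return "college-level"
    if index >= COREQUISITE_CUT:
        return "college-level with co-requisite support"
    return "foundations (developmental education)"
```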

Placement decisions in math were also based on whether a student was planning to major in an area that falls into the “Algebra-Calculus” pathway or the “Non-Algebra” pathway; advisors discussed this with students as they determined their courses and built their class schedule.

The college was also working on additional options for students without SAT or ACT scores. As a pilot project, CCGA was having these students take the SAT during orientation and initially placing students with a high school GPA of 3.0 or higher in college-level courses with co-requisite supports.

Implementation and Impact: Technical Support from the System, Strong Support at the Institution, and Promising Early Outcomes

Based on interviews with student focus groups and over a dozen faculty and administrators, the following section discusses the successes and challenges of implementing the new placement model. The findings include the following:

  • The highly systematized index formula and technical support from the USG allowed the placement process to be primarily housed in a single department and required little additional campus capacity. Staff in the admissions office used a spreadsheet developed by the USG to compute index scores; this allowed for the easy input of placement results into the college system and did not require much additional staff capacity.4
  • Overall, both faculty and administrators believed the placement indices to be both valuable and accurate—and an improvement over placement tests. They based their opinions on anecdotal and/or personal experience, rather than data. For example, one faculty member said that in a class of 30 students, perhaps one would be misplaced. Another said that “faculty complain about everything, but they do not complain about placement” since the implementation of the new process. An advisor also agreed that placement indices had improved placement accuracy. When asked how the placement process could be improved, respondents had little to say; the system seemed to work and, as a result, faculty concerns were limited.5
  • The availability of co-requisite courses that supplement college-level course instruction has dovetailed with the placement reform. The development of the co-requisite model provides a third placement option in addition to developmental and college-level courses, and this additional option has been central to implementing the placement reform. CCGA has placed 66% (English) and 55% (mathematics) of students requiring remediation into college-level courses with co-requisite support.
  • The college has placed increasing numbers of students in college-level courses. More students are being placed in college-level courses (with or without co-requisite support) than before placement and developmental education reforms, and the number of foundational (developmental) courses has been reduced. However, because the reforms have been implemented together, institutional researchers cannot say whether these outcomes are a result of the placement process or the co-requisite support courses.
  • Overall, students understood why they were placed in their courses and agreed with their placement. At CCGA, placement policy is communicated to students through the college website and the scheduling and advising process. Students do not select their own courses. Rather, course schedules are “built” for them by an advisor based on a survey that asks for their pathway of study, scheduling preferences and course interests. However, students may simply be provided with their schedule at orientation based on their survey and placement results, and if they are satisfied with their courses the placement process may not be formally explained. If students have concerns about their placement and course schedule, they can then meet with an advisor.
  • Placement of non-traditional students remains a challenge. Students without SAT or ACT scores or high school transcripts must still take a placement test, which has often placed students inaccurately. The college is considering other, more individualized processes for placement, including having students take courses other than math or English for a semester and then placing them based on that initial experience at the college.
  • Helping staff and faculty across offices and departments understand how the new placement process works has been a challenge. Despite the centralized nature of this reform, admissions counselors, academic advisors, the registrar, and other administrative officials need to clearly understand the placement process and have access to student placement results. Therefore, it has still been essential to provide background to staff across the campus, which has been a challenge due to the complex nature of the formula.

Lessons for the Field: Support from the System and Input from the Institution Can Facilitate Campus Implementation

Based on the placement process at CCGA and the larger context of USG placement policy, the following lessons may be applied in other states and institutions:

  • Institutional involvement in placement policy development facilitates campus buy-in and implementation. Georgia developed its multiple measures policy using considerable feedback from institutions through local and regional advisory committees. Because institutional administrators and faculty were involved in developing the policy, they were, in turn, able to clearly communicate it to key institutional stakeholders. As a result, the campus seemed to have a high level of buy-in.
  • Transcript submission requirements and software systems support the placement process. CCGA requires that students submit their high school transcripts, which allows students to be placed according to their high school GPA, SAT and ACT scores, and placement testing, as needed. In addition, about half of transcripts are submitted electronically; electronic transcripts can be reviewed using a software system that filters out non-academic courses and reports a GPA based only on academic high school courses (a minimal sketch of this filtering follows the list). Paper transcripts continue to be reviewed manually.
  • System offices can support reform by providing technical assistance in placement process design and implementation. The USG used data from across institutions to determine cut scores and test the formula to improve its prediction of college readiness. The system office also developed and disseminated a spreadsheet that facilitated the process of inputting and calculating placement data. As a result, staff in the admissions office did not find the process of inputting data from placement measures burdensome, and only one additional part-time staff member was assigned to the task.
  • Determine cut scores based on student outcomes data and shift them when necessary. Georgia’s multiple measures policy involves multiple cut scores, and each was determined based on data indicating a student’s probability of success. While the state determined the cut score for college readiness, institutions have the authority to determine the cut score for a student needing additional supports, like a co-requisite course. Both USG and CCGA used student-level outcomes data to regularly assess their cut scores and make changes to ensure that students were appropriately placed in the necessary supports.
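The sketch below shows the kind of GPA filtering described for electronic transcripts: keep only academic core courses and compute GPA from those grades. The record format, subject list, and grade-point scale are assumptions for illustration, not CCGA's actual software.

```python
# Minimal sketch of academic-core GPA filtering; data layout here is hypothetical.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
ACADEMIC_SUBJECTS = {"English", "Mathematics", "Science", "Social Studies", "Foreign Language"}


def academic_core_gpa(transcript):
    """Compute GPA from academic core courses only.

    `transcript` is a list of dicts like {"subject": "Mathematics", "grade": "B"}.
    """
    core = [c for c in transcript if c["subject"] in ACADEMIC_SUBJECTS]
    if not core:
        return None  # fall back to manual review, as with paper transcripts
    points = [GRADE_POINTS[c["grade"]] for c in core]
    return sum(points) / len(points)


print(academic_core_gpa([
    {"subject": "Mathematics", "grade": "A"},
    {"subject": "Physical Education", "grade": "A"},  # excluded from the calculation
    {"subject": "English", "grade": "B"},
]))  # -> 3.5
```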
  1. College of Coastal Georgia Profile
  2. College of Coastal Georgia Profile
  3. http://nces.ed.gov/collegenavigator/?q=college+of+coastal+georgia&s=all&id=139250
  4. One additional part-time staff member was hired to assist with data entry.
  5. Some English faculty reported that they would still like to review essays or portfolios as a part of the placement process.