College Completion

Completing College: A National View of Student Attainment Rates – Fall 2007 Cohort (Part Two)

March 7, 2014
The National Student Clearinghouse Research Center has released its second annual report on attainment rates for students who first enrolled in postsecondary education in the fall of 2007. The report tracks these students for six years, through the spring of 2013.

Among the report’s findings:
  • Overall, 56.1% of first-time, degree-seeking students who enrolled in postsecondary education in the fall of 2007 completed a degree or certificate within six years. Completion rates, however, varied greatly by enrollment intensity.
    • Students who were enrolled exclusively full-time had a 77.7% completion rate.
    • Students who were enrolled exclusively part-time had a 21.9% completion rate.
    • Students who were enrolled on a mixed basis (periods of both full-time and part-time enrollment) had a 43.2% completion rate.
  • Students who began college at a more traditional age – age 20 or younger – had higher completion rates than students who were older when they first enrolled. By age group, six-year outcomes for students who enrolled in postsecondary education in the fall of 2007 were as follows:
    • Among students ages 20 or younger, 59.7% had completed a degree within 6 years, 15.9% were still enrolled in school, and 24.3% were no longer enrolled.
    • Among students between ages 21 and 24, 40.8% had completed a degree within 6 years, 16.3% were still enrolled in school, and 42.8% were no longer enrolled.
    • Among students over age 24, 43.5% had completed a degree within 6 years, 12.5% were still enrolled in school, and 43.9% were no longer enrolled.
  • Student completion outcomes also varied based on the type of institution in which a student first enrolled.
    • Within six years of starting their postsecondary education at a 4-year private nonprofit institution, 70.2% had completed at a four-year institution, 2.6% had completed at a 2-year institution, 9.7% were still enrolled in school, and 17.5% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 4-year public institution, 59.9% had completed at a four-year institution, 3.6% had completed at a 2-year institution, 15.0% were still enrolled in school, and 21.6% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 4-year private for-profit institution, 40.1% had completed at a four-year institution, 2.3% had completed at a 2-year institution, 13.4% were still enrolled in school, and 44.3% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 2-year public institution, 29.9% had completed at a two-year institution, 10.0% had completed at a 4-year institution, 18.9% were still enrolled in school, and 41.2% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 2-year private for-profit institution, 60.4% had completed at a two-year institution, 3.0% had completed at a 4-year institution, 7.8% were still enrolled in school, and 29.8% were no longer enrolled in school.

How Full-Time are “Full-Time” Students

February 28, 2014
Complete College America has released a policy brief reporting the results of a survey of the enrollment patterns of “full-time” and “part-time” students. Specifically, the brief looks at the number of credits “full-time” students are taking. The report finds that most students classified as “full-time” are not taking enough credits to graduate on time.

Among the survey results:
  • 69% of college students are not enrolled in a sufficient number of credits to graduate on time.
  • 52% of “full-time” students are taking fewer than 15 credits, the standard number of credits needed to graduate on time.
    • Among 4-year institutions, only 50% of “full-time” students were taking 15 or more credit hours.
    • Among 2-year institutions, only 29% of “full-time” students were taking 15 or more credit hours.

STEM Attrition: College Students’ Paths into and out of STEM Fields

February 21, 2014
The National Center for Education Statistics has released a study of undergraduate attrition in the STEM fields (science, technology, engineering, and mathematics). Specifically, the study looked at student movement into and out of the STEM fields between 2003 and 2009. Attrition was defined as declaring a STEM major and subsequently either switching to a non-STEM field or leaving postsecondary education without earning a degree or certificate. The study explored how STEM attrition rates compare with those in other fields, which fields departing STEM majors move into, and the characteristics of students who leave STEM versus those who persist.

Among the report’s findings:
  • Among 2003-2004 beginning bachelor’s degree students, 28 percent entered a STEM major at some point between 2003 and 2009. Among beginning associate’s degree students, 20 percent entered a STEM field in that time period.
  • Among beginning bachelor’s degree students who entered a STEM field between 2003 and 2009, 48 percent had left STEM by 2009. Among beginning associate’s degree students who entered a STEM field in the same period, the attrition rate was higher, at 69 percent.
  • Attrition rates were similarly high in non-STEM fields. Among beginning bachelor’s degree students who entered the education field between 2003 and 2009, 62 percent had left the education field by 2009. Among beginning bachelor’s degree students who entered a health sciences field or humanities field between 2003 and 2009, by 2009 58 percent had left health sciences and 56 percent had left the humanities. Similarly high attrition at the associate’s degree level was also seen in education (70%) and humanities (72%).
  • Among students who left the STEM field for another field, 22 percent of bachelor’s students and 16 percent of associate’s degree students ended up pursuing a business degree, and 12 percent of bachelor’s students and 20 percent of associate’s degree students ended up pursuing a health sciences degree. 

Completing College: A National View of Student Attainment Rates – Fall 2007 Cohort

February 7, 2014
The National Student Clearinghouse has issued a research brief looking at the six-year outcomes of students who first enrolled in postsecondary education in the fall of 2007. The brief provides data broken down by type of postsecondary institution and by student enrollment intensity (e.g., full-time, part-time).

Among the findings:
  • Within six years of starting their postsecondary education at a 4-year private nonprofit institution, 69.1% had completed at a four-year institution, 2.6% had completed at a 2-year institution, 9.8% were still enrolled in school, and 18.4% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 4-year public institution, 58.4% had completed at a four-year institution, 3.5% had completed at a 2-year institution, 15.2% were still enrolled in school, and 22.9% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 4-year private for-profit institution, 40% had completed at a four-year institution, 2.3% had completed at a 2-year institution, 13.3% were still enrolled in school, and 44.5% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 2-year public institution, 28.6% had completed at a two-year institution, 8.8% had completed at a 4-year institution, 19.1% were still enrolled in school, and 43.5% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 2-year private for-profit institution, 60.5% had completed at a two-year institution, 2.1% had completed at a 4-year institution, 7.4% were still enrolled in school, and 30% were no longer enrolled in school.
  • Students enrolled on an exclusively full-time basis had significantly higher completion rates within 6 years of first enrolling.
    • Of students enrolled exclusively full-time, 76.2% had completed a degree within six years, 3.5% were still enrolled in school, and 20.3% were no longer enrolled.
    • Of students enrolled exclusively part-time, 21.9% had completed a degree within six years, 10.9% were still enrolled in school, and 67.1% were no longer enrolled.
    • Of students enrolled on a mixed basis (i.e., periods of full-time and part-time), 41.4% had completed a degree within six years, 25.4% were still enrolled in school, and 33.1% were no longer enrolled. 

A Profile of 2012 ACT-Tested High School Graduates: College Choice Report Part 2 – Enrollment Patterns

January 24, 2014
ACT has recently released data on 2012 high school graduates who took the ACT and enrolled in college. The report focuses on student enrollment patterns. In general, and in keeping with past research, higher ACT scores, higher levels of parental education, and higher degree aspirations are associated with a higher likelihood of enrolling in a four-year college, attending a private four-year college, attending school out of state, and enrolling in colleges farther from a student’s home.

Lingerers in the Community College

December 20, 2013
A recent report from the Community College Research Center examines data on first-time college students at nine community colleges, including demographics, course enrollment and performance, credential completion, and transfer rates. The report’s analysis focuses on “lingerers,” defined as students who had completed 30 or more college-level credits and were still enrolled in their fifth year but had not yet earned a credential.

Among the report’s findings:
  • Lingerers tended to be roughly similar to credit students (students who were enrolled in developmental education or college-level coursework) and to completers in terms of ethnicity, age, and median household income.
  • The vast majority of lingerers intended to earn a credential or transfer to a four-year institution, similar to the intentions of completers.
  • Lingerers appeared to be less prepared for college than completers.
    • 83% of lingerers were referred to remedial education, compared to 76% of completers.
    • 10% of lingerers tested as “college-ready,” compared to 17% of completers.
    • Only half of lingerers enrolled as full-time students, compared to 60% of completers.
    • Lingerers failed about 25% of courses in which they enrolled, while completers failed 10% of courses.
  • While lingerers attempted a similar number of credits as completers, they earned far fewer credits.
    • Lingerers earned, on average, 57 credits, while completers earned about 82 credits.

Snapshot Report: Degree Attainment

December 13, 2013
According to a recent report from the National Student Clearinghouse Research Center, attainment of science and engineering (S&E) degrees has increased by 19 percent among traditional-age students (26 years old and younger) and by 25 percent among older students (over 26 years old).

Among the report’s other findings:
  • Degrees in S&E accounted for 32% of all bachelor’s degrees in 2013, an increase from 30% in 2009.
  • There has been a 19% growth in S&E bachelor’s degrees over the last five years, compared to a 9% degree growth in non-science and non-engineering disciplines.
  • In 2013, students over the age of 26 accounted for 26% of all bachelor’s degrees and 18% of all S&E bachelor’s degrees earned.
  • Over one-third of degrees earned by traditional-age students were in S&E disciplines, compared to one-fifth of degrees earned by students over 26.

Colleges Are Supposed to Report Pell Graduation Rates – Here's How to Make Them Actually Do It

October 30, 2013
Since 2008, the federal government has spent nearly $200 billion on the Pell Grant program. We know that this sizeable investment has bought a 50 percent increase in the number of people receiving these awards. But how many graduates did those funds produce? What percentage of recipients graduate? And which schools are doing the best with the lowest-income students?

Congress wanted to know the answers to all of these questions. That’s why it included a requirement in the 2008 reauthorization of the Higher Education Act (HEA) that colleges disclose the graduation rates of Pell Grant recipients, of students who did not receive a Pell Grant but got a subsidized Stafford loan, and of students who got neither type of aid. But Congress only asked institutions to disclose this information, either on their websites or potentially only upon request, rather than report it proactively to the Department of Education. The results have gone over about as well as a voluntary broccoli-eating contest with toddlers. A 2011 survey of 100 schools by Kevin Carey and Andrew Kelly found that only 38 percent even complied with the requirement to provide these completion rates, in many cases only after repeated phone calls and messages.

Absent institutional rates, the only information of any sort we have about Pell success comes about as often as the Olympics, when the National Center for Education Statistics (NCES) within the Department updates its large national surveys. These data are great for broad, sweeping statements, but they cannot report results for individual institutions, something that’s especially important given the variety of outcomes different schools achieve. Instead, the surveys can only provide results by sector or by Carnegie classification. And they are too costly to run more frequently.

Fortunately, there’s a chance to fix this problem and get colleges to report these completion data. The Department is currently accepting comments on its plans for data collection under the Integrated Postsecondary Education Data System (IPEDS) for the next several years (see here to submit a comment, here for the notice announcing the comment request, and here for the backup documentation of what the Department wants to do). This means the public has an opportunity to suggest what additional information IPEDS should include before the comment period closes on November 14.

To be clear, a lot of what the Department is already proposing to add to IPEDS through this collection will give us a significantly better understanding of student outcomes in postsecondary education. First, it would implement some recommendations from the Committee on Measures of Two-Year Student Success, which Congress called for in the 2008 HEA reauthorization to capture students who are missed by the federal graduation rate because they are not full-time students attending college for the first time. The committee’s recommendations, which are being implemented here, aim to capture those missing students by requiring colleges to report on the success rates of three additional groups: (1) students who are enrolled part-time and attending for the first time, (2) students who are enrolled full-time and have attended college elsewhere, and (3) students who are enrolled part-time and have attended college elsewhere. Colleges would then report how many of these students received an award, are still enrolled, transferred, or dropped out after six and eight years. And the reporting would start retroactively, so the public won’t have to wait until 2023 to see the first results.

Other proposed changes to IPEDS are smaller in scale but also important. Colleges would be asked to provide information on the use of veterans’ benefits on their campuses. And the way for-profit colleges report their finance data would be better aligned with the way public and private nonprofit colleges provide this information.

But these changes still leave us without one obvious set of completion information: rates disaggregated by socioeconomic status. Sure, attending full-time can serve as a proxy for a student’s financial circumstances, but not as definitively as receiving a Pell Grant.

The Institute for College Access and Success and others have already argued that the Department should add these data to IPEDS. In response, NCES has noted that improvements to the federal student aid database may make it possible to calculate completion rates for Pell students. But that’s an incomplete solution. That database is legally prohibited from collecting information on students who don’t get federal student aid, so there’s no way to produce the HEA-mandated graduation rate for students who received neither Pell Grants nor subsidized Stafford loans.

Of course, you can’t bring up any discussion of data reporting without running into the “B” word: burden. But remember, this isn’t new burden; colleges are already legally required by an act of Congress to provide these graduation rates. Whatever encumbrance the requirement represents (and I’d argue it’s probably not much, since colleges would simply be taking a subset of an existing cohort with easy-to-identify characteristics based on student aid receipt) has already been incurred. In fact, U.S. News and World Report is already getting some schools to provide this information, but it won’t share the raw data.

In an ideal world, we would not have to beg and plead with colleges to tell us whether they are successfully using the more than $30 billion they receive each year to educate low-income students. Instead, we would have a student unit record system capable of processing all this information without adding burden to colleges or forcing them to rely on costly alternatives like the National Student Clearinghouse. But thanks to Virginia Foxx (R-N.C.) and the college lobby (primarily the private institutions), we don’t live in that world. Instead, we’re left with IPEDS, which is where these data should go.

Obama Administration Should Stop Punting on For-Profit College Job Placement Rates

October 17, 2013

[This post is largely adapted from a previous post that ran on Higher Ed Watch in October 2011.]

Last week I argued that the U.S. Department of Education needs to develop a single, national standard that for-profit colleges would be required to use when calculating job placement rates. Department officials could go a long way toward achieving this by revisiting a proposal they offered in the summer of 2010 that would have established a standard methodology for determining these rates.

Currently, the federal government leaves it up to accrediting agencies and states to set the standards that for-profit schools must use to calculate the rates, and to monitor them. The only exception is for extremely short-term job training programs, which must have employment rates of at least 70 percent to remain eligible to participate in the federal student loan program.

In June 2010, as part of a package of draft regulations aimed at improving the integrity of the federal student aid programs, the administration proposed extending the standards that short-term programs are required to use to all for-profit college programs and vocational programs subject to the Gainful Employment rules. The proposal was met with a firestorm of protest from for-profit college officials, as the federal methodology is much stricter than those used by accreditors and state agencies.

For example, under the Education Department’s requirements, students are considered to be successfully placed only if they have been employed in their field or a related one for at least 13 weeks within the first six months after graduating. By comparison, some accreditors and state agencies apparently allow schools to count a graduate as successfully placed if he or she works in the field for as little as a single day.
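
To make that criterion concrete, here is a minimal, hypothetical sketch (written in Python) of how a 13-weeks-within-six-months test could be expressed. It is purely illustrative: the function name, the 182-day approximation of “six months,” and the assumption that employment spells do not overlap are my own simplifications, not the Department’s actual methodology.

    from datetime import date, timedelta

    def counts_as_placed(graduation, in_field_jobs):
        """Return True if the graduate worked in the field of training for at
        least 13 weeks within six months of graduating.
        in_field_jobs: list of (start, end) date pairs, assumed non-overlapping."""
        window_end = graduation + timedelta(days=182)  # roughly six months
        days_in_window = 0
        for start, end in in_field_jobs:
            # Count only the portion of each job that falls inside the window.
            overlap_start = max(start, graduation)
            overlap_end = min(end, window_end)
            if overlap_end >= overlap_start:
                days_in_window += (overlap_end - overlap_start).days + 1
        return days_in_window >= 13 * 7

    # Example: ten weeks of in-field work would not count as a placement.
    print(counts_as_placed(date(2013, 6, 1), [(date(2013, 7, 1), date(2013, 9, 8))]))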

Meanwhile, the Education Department has established a strict regulatory regime to make sure the rates are not rigged (the extent to which the agency actually holds short-term programs to these standards is unclear). Institutions are required to provide documentation proving that each of the graduates included in their rates is employed in the field in which he or she trained. According to the Department’s rules, acceptable documents “include, but are not limited to, (i) a written statement from the student’s employer; (ii) signed copies of State or Federal income tax forms; and (iii) written evidence of payments of Social Security taxes.” 

To be fair, for-profit colleges were not the only institutions that objected to the proposal. Community colleges and state universities that have training programs that fall under the Gainful Employment requirements also complained that the plan was too stringent. These institutions may have found these requirements to be especially daunting since they generally have not had to track job placements before.

A Recipe for Failure

How did the Education Department’s political leaders respond to this criticism? They punted. Instead of sticking to their guns or devising an alternative proposal, they kicked the issue to the National Center for Education Statistics (NCES). Under the final program integrity regulations, which were released in October 2010, the Department directed the NCES to convene a Technical Review Panel “to develop a placement rate methodology and the processes necessary for determining and documenting student placement” that schools would be required to use to fulfill this mandate.

But putting NCES in charge of developing a federal standard for calculating these rates turned out to be a major blunder. First, this was not an assignment that NCES had sought out or had typically been asked to do. After all, the Department was not just asking the center to provide technical assistance in devising a new methodology but to take the reins in setting new federal policy in this highly contentious and controversial area. Second, the Technical Review Panel that the Department chose to carry out this assignment included a number of representatives from schools that were opposed to the effort.

All of this was a recipe for failure. So it was hardly a surprise that, after two days of discussions on this topic in March, the review committee was not able to reach an agreement. The panel suggested in a final report on its deliberations that “the topic be explored in greater detail by the Department of Education.” Translation: This is a job for the Department, and not NCES.

The Education Department’s hands have been tied ever since, because the final regulations explicitly require schools to use “a methodology developed by the National Center for Education Statistics, when that rate is available.” In the meantime, the job placement rates that for-profit colleges are required to disclose under the new rules are the same ones they report to accreditors and state regulatory agencies. As I’ve written previously, the methodologies that for-profit schools use to calculate these rates vary state by state and accreditor by accreditor, making them impossible to compare. And because neither accreditors nor state regulators have historically put much effort into verifying these rates, the schools don’t seem to have any qualms about gaming them.

As Department officials rewrite the Gainful Employment rules, they need to revisit this issue. Otherwise, prospective students will have to continue relying on faulty information when choosing whether to attend a for-profit college.

Lack of Standard Definition for Job Placement Rates Fuels Abuses

October 15, 2013

Last Thursday, California Attorney General Kamala D. Harris filed a lawsuit against Corinthian Colleges accusing the company of deliberately deceiving prospective students and investors about its record of placing graduates in jobs. The California AG’s action comes just two months after New York Attorney General Eric T. Schneiderman reached a $10.25 million settlement with Career Education Corporation over similar charges.

The two cases together underscore the need for policymakers to develop a single, national standard that for-profit colleges would be required to use when calculating their job placement rates and to establish a strict regulatory regime to make sure that the rates are not rigged. U.S. Department of Education officials have the opportunity to establish such standards when they rewrite the Gainful Employment regulations.

Currently, the federal government leaves it up to accreditation agencies and states to set the standards that for-profit schools must use to calculate the rates and to monitor them. The only exception is for extremely short-term job training programs, which must have employment rates of at least 70 percent to remain eligible to participate in the federal student loan program.

As a result, the methodologies that for-profit colleges use to calculate these rates vary state by state and accreditor by accreditor, making them impossible to compare. And without a single standard in place, the schools can easily game the system.

Take Career Education Corporation, for example. According to the NY AG’s findings, officials at the company’s health education schools counted graduates as being employed if they worked for a single day at community health fairs. In some cases, school officials allegedly arranged for these fairs to be held so that they could pump up their institutions’ job placement rates.

These practices were not devised and carried out by “rogue” employees. The investigation found that “high-level Career Services managers” at the company’s headquarters “not only knew about the practice of counting employment at single one-day health fairs as ‘placements,’ but explicitly condoned and even encouraged the practice of recording such employment as ‘placements.’”

Meanwhile, the California AG found that in large part the placement rates that Corinthian Colleges (CCI) has disclosed cannot be substantiated. “The data in the disclosures published on or about July 1, 2012 for all campuses in California and online campuses does not match or agree with the data in CCI’s own database system and/or in student files,” the lawsuit states. “In numerous cases, the placement rate data in CCI’s files shows that the placement rate is lower than the advertised rate.”

For example, Corinthian “advertised job placement rates as high as 100% for specific programs when, in some cases, there is no evidence that a single student obtained a job during the specified time frame,” the AG’s office wrote in a press release announcing the lawsuit.

Some of Corinthian’s Everest College campuses went as far as paying temp agencies to place graduates in short-term jobs to help the schools meet the minimum job placement rates required by their accreditors, the lawsuit states. Others appear to have fabricated the data. In many cases, the documentation needed to verify placements was just plain missing.

The AG’s complaint includes excerpts from internal company e-mails showing that top Corinthian executives were fully aware of the problems with the data but did little about them. “Corinthian Colleges, Inc.’s CEO and/or senior management were, at all relevant times, aware of the falsity, inaccuracy, and unreliability of job placement data and the statements they made concerning the data yet they did not disclose that fact to consumers or investors, or take any action to make consumer disclosures and statements to investors accurate,” the lawsuit says.

These cases show that the Education Department needs to create a single national job placement rate standard that makes clear exactly what types of practices are allowed and which are not. And it needs to develop a strict regulatory regime that will hold for-profit colleges accountable for these types of abuses.

Students rely heavily on job placement rates when deciding which career college program to attend. The least we can do is make sure that schools are not cooking the books on the rates they disclose.
