
College Completion

Snapshot Report: Postsecondary Student Mobility Rate: 2011-2013

June 27, 2014
By: Nicholas Brock
 

A new report from the National Student Clearinghouse Research Center examines student mobility rates from 2011 to 2013. The student mobility rate is the percentage of students, across all levels of study, who enrolled in more than one postsecondary institution during an academic year. The report shows that students 20 years old or younger had the highest mobility rates, followed by students aged 21 to 24. Mobility rates were also slightly higher among women than among men.
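
For readers who want to see the arithmetic behind that definition, here is a minimal sketch in Python (the enrollment records and field names below are hypothetical, not taken from the Clearinghouse data): the mobility rate is simply the share of enrolled students who appear at more than one institution within the same academic year.

```python
# Minimal illustration of the student-mobility rate described above.
# The records are made up; the Clearinghouse computes this measure
# from its national enrollment files.

# Each record: (student_id, institution_id) for one academic year.
enrollments = [
    ("s1", "inst_A"),
    ("s2", "inst_A"),
    ("s2", "inst_B"),   # s2 enrolled at two institutions -> counted as mobile
    ("s3", "inst_C"),
    ("s4", "inst_C"),
    ("s4", "inst_D"),
]

# Count distinct institutions attended per student.
institutions_per_student = {}
for student_id, institution_id in enrollments:
    institutions_per_student.setdefault(student_id, set()).add(institution_id)

mobile = sum(1 for insts in institutions_per_student.values() if len(insts) > 1)
total = len(institutions_per_student)

print(f"Mobility rate: {100 * mobile / total:.1f}%")  # 2 of 4 students -> 50.0%
```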

Among the report’s other findings:

  • Student mobility rates increased from 8.8% in 2010-11 to 9.4% in 2011-12, before settling at 9.2% in 2012-13.

  • Just over 9% of all students attended more than one institution during the 2012-13 academic year.

  • Among students whose first 2012-13 enrollment occurred at a community college, 11.5% had also enrolled somewhere else by the end of the academic year.

  • Of all students who attended multiple institutions in 2012-13, nearly 40% moved between 2-year public institutions and 4-year public institutions (in either direction).

Freshman Year Financial Aid Nudges: An Experiment to Increase FAFSA Renewal and College Persistence

June 20, 2014
A recent paper from EdPolicyWorks at the University of Virginia reports on the results of low-touch interventions to increase FAFSA renewal. Researchers sent college freshmen personalized text message reminders with information about FAFSA filing deadlines and about how to get help with re-filing and with financial aid more generally. Not only was the intervention low-cost (an estimated $5 per student), it had a large impact on community college students. Community college students who received the text message reminders were 12% more likely than students who did not receive this outreach to continue on to their sophomore year. The report argues that strategies such as this one, personalized text messages with useful and relevant information on financial aid, hold great promise for supporting an increasingly diverse population of students as they decide whether and where to apply to college, file the FAFSA for the first time, and select courses after they enroll.

Among the paper’s other findings:
  • While much effort has been put into helping families initially complete their FAFSA, less has been done to ensure families re-file the FAFSA each year (required for students to maintain their federal financial aid).
    • Research demonstrates that nearly 20% of freshman Pell Grant recipients in good standing do not successfully re-file their FAFSA, with rates particularly low among community college students and students enrolled in certificate programs.
    • Both of these findings point to FAFSA re-filing as an important gateway to persistence in college.
  • Community college students, in particular, stand to benefit the most from this type of outreach, as they are three times more likely than freshmen at four-year institutions to fail to re-file their FAFSA.
    • In addition, community college students usually receive less individualized financial aid advising, are more likely to be first-generation students, and typically work longer hours while enrolled in school.
  • The messages had no impact on freshmen at four-year institutions, who traditionally have high persistence and re-filing rates.
    • It is worth noting that the authors believe interventions of this type still hold potential for students at four-year colleges and universities, as the participants in this study all attended Massachusetts institutions, which on average have much higher retention rates than the national average.

Completing College: A National View of Student Attainment Rates – Fall 2007 Cohort PART TWO

March 7, 2014
The National Student Clearinghouse Research Center has released its second annual report on student attainment rates for students who first enrolled in postsecondary education in the fall of 2007. The report tracks students for six years, through the spring of 2013.

Among the report’s findings:
 
  • Overall, 56.1% of first-time, degree-seeking students who enrolled in postsecondary education in the fall of 2007 completed a degree or certificate within six years. Student completion rates, however, varied greatly by enrollment intensity.
    • Students who were enrolled exclusively full-time had a 77.7% completion rate.
    • Students who were enrolled exclusively part-time had a 21.9% completion rate.
    • Students who were enrolled on a mixed basis (periods of both full-time and part-time enrollment) had a 43.2% completion rate.
       
  • Students who began college at a more traditional age (20 or younger) had higher completion rates than students who first enrolled in postsecondary education at older ages. By age group, students who enrolled in the fall of 2007 completed a degree or certificate within six years as follows:
    • Among students ages 20 or younger, 59.7% had completed a degree within 6 years, 15.9% were still enrolled in school, and 24.3% were no longer enrolled.
    • Among students between ages 21 and 24, 40.8% had completed a degree within 6 years, 16.3% were still enrolled in school, and 42.8% were no longer enrolled.
    • Among students over age 24, 43.5% had completed a degree within 6 years, 12.5% were still enrolled in school, and 43.9% were no longer enrolled.
  • Student completion outcomes also varied based on the type of institution in which a student first enrolled.
    • Within six years of starting their postsecondary education at a 4-year private nonprofit institution, 70.2% had completed at a four-year institution, 2.6% had completed at a 2-year institution, 9.7% were still enrolled in school, and 17.5% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 4-year public institution, 59.9% had completed at a four-year institution, 3.6% had completed at a 2-year institution, 15.0% were still enrolled in school, and 21.6% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 4-year private for-profit institution, 40.1% had completed at a four-year institution, 2.3% had completed at a 2-year institution, 13.4% were still enrolled in school, and 44.3% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 2-year public institution, 29.9% had completed at a two-year institution, 10.0% had completed at a 4-year institution, 18.9% were still enrolled in school, and 41.2% were no longer enrolled in school.
    • Within six years of starting their postsecondary education at a 2-year private for-profit institution, 60.4% had completed at a two-year institution, 3.0% had completed at a 4-year institution, 7.8% were still enrolled in school, and 29.8% were no longer enrolled in school.

How Full-Time Are “Full-Time” Students?

February 28, 2014
Complete College America has released a policy brief reporting the results of a survey looking at enrollment patterns of “full-time” and “part-time” students. Specifically, the brief looks at the number of credits “full-time” students are taking. The report finds that most students classified as “full-time” are not taking enough credits to graduate on time.

Among the survey results:
 
  • 69% of college students are not enrolled in a sufficient number of credits to graduate on time.
  • 52% of “full-time” students are taking fewer than 15 credits, the standard number needed each term to graduate on time.
    • Among 4-year institutions, only 50% of “full-time” students were taking 15 or more credit hours.
    • Among 2-year institutions, only 29% of “full-time” students were taking 15 or more credit hours.

STEM Attrition: College Students’ Paths into and out of STEM Fields

February 21, 2014
The National Center for Education Statistics has released a study looking at undergraduate student attrition in the STEM fields (science, technology, engineering, and mathematics). Specifically, the study looked at student movements into and out of STEM fields between 2003 and 2009. Student attrition was defined as undergraduate students declaring a STEM major and subsequently either switching to a non-STEM field or leaving postsecondary education without earning a degree or certificate. The study aimed to compare STEM attrition rates with those of other fields, identify the fields into which former STEM majors move, and examine the characteristics of students who leave and who persist in STEM.

Among the report’s findings:
  • Among 2003-2004 beginning bachelor’s degree students, 28 percent entered a STEM major at some point between 2003 and 2009. Among beginning associate’s degree students, 20 percent entered a STEM field in that time period.
  • Among beginning bachelor’s degree students who entered a STEM field between 2003 and 2009, 48 percent had left the STEM fields by 2009. Among beginning associate’s degree students who entered a STEM field in the same period, the attrition rate was even higher, at 69 percent.
  • Attrition rates were similarly high in non-STEM fields. Among beginning bachelor’s degree students who entered the education field between 2003 and 2009, 62 percent had left the education field by 2009. Among beginning bachelor’s degree students who entered a health sciences field or humanities field between 2003 and 2009, by 2009 58 percent had left health sciences and 56 percent had left the humanities. Similarly high attrition at the associate’s degree level was also seen in education (70%) and humanities (72%).
  • Among students who left the STEM field for another field, 22 percent of bachelor’s students and 16 percent of associate’s degree students ended up pursuing a business degree, and 12 percent of bachelor’s students and 20 percent of associate’s degree students ended up pursuing a health sciences degree. 

Completing College: A National View of Student Attainment Rates – Fall 2007 Cohort

February 7, 2014
The National Student Clearinghouse has issued a research brief looking at the six-year outcomes of students who first enrolled in postsecondary education in the fall of 2007. The brief provides data broken down by type of postsecondary institution and by student enrollment intensity (e.g., full-time, part-time).

Among the findings:
  • Within six years of starting their postsecondary education at a 4-year private nonprofit institution, 69.1% had completed at a four-year institution, 2.6% had completed at a 2-year institution, 9.8% were still enrolled in school, and 18.4% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 4-year public institution, 58.4% had completed at a four-year institution, 3.5% had completed at a 2-year institution, 15.2% were still enrolled in school, and 22.9% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 4-year private for-profit institution, 40% had completed at a four-year institution, 2.3% had completed at a 2-year institution, 13.3% were still enrolled in school, and 44.5% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 2-year public institution, 28.6% had completed at a two-year institution, 8.8% had completed at a 4-year institution, 19.1% were still enrolled in school, and 43.5% were no longer enrolled in school.
  • Within six years of starting their postsecondary education at a 2-year private for-profit institution, 60.5% had completed at a two-year institution, 2.1% had completed at a 4-year institution, 7.4% were still enrolled in school, and 30% were no longer enrolled in school.
  • Students enrolled on an exclusively full-time basis had significantly higher completion rates within 6 years of first enrolling.
    • Of students enrolled exclusively full-time, 76.2% had completed a degree within six years, 3.5% were still enrolled in school, and 20.3% were no longer enrolled.
    • Of students enrolled exclusively part-time, 21.9% had completed a degree within six years, 10.9% were still enrolled in school, and 67.1% were no longer enrolled.
    • Of students enrolled on a mixed basis (i.e., periods of full-time and part-time), 41.4% had completed a degree within six years, 25.4% were still enrolled in school, and 33.1% were no longer enrolled. 

A Profile of 2012 ACT-Tested High School Graduates: College Choice Report Part 2: Enrollment Patterns

January 24, 2014
ACT has recently released data on 2012 high school graduates who took the ACT and enrolled in college. The report focuses on student enrollment patterns. In general and in keeping with past research, higher ACT scores, higher levels of parental education, and higher degree aspirations are associated with a higher likelihood of enrolling in a four-year college, attending a private four-year college, attending school out-of-state, and enrolling in colleges farther away from a student’s home. 

Lingerers in the Community College

December 20, 2013
A recent report from the Community College Research Center examines data on first-time college students at nine community colleges, including demographics, course enrollment and performance, credential completion, and transfer rates. The report’s analysis focuses on “lingerers,” defined as students who had completed 30 or more college-level credits and were still enrolled in their fifth year but had not yet earned a credential.
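
As a rough illustration of how a definition like this could be applied to student records, here is a small sketch in Python (the record fields and example values are hypothetical; the report’s actual cohort construction is more involved):

```python
# Hypothetical sketch of the "lingerer" definition summarized above:
# 30 or more college-level credits earned, still enrolled in the fifth
# year, and no credential earned. Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    college_level_credits: float
    enrolled_in_year_five: bool
    earned_credential: bool

def is_lingerer(s: StudentRecord) -> bool:
    return (
        s.college_level_credits >= 30
        and s.enrolled_in_year_five
        and not s.earned_credential
    )

# Example: a student with 57 earned credits, still enrolled, no credential.
print(is_lingerer(StudentRecord(57, True, False)))  # True
```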

Among the report’s findings:
 
  • Lingerers tended to be roughly similar to credit students (students who were enrolled in developmental education or college-level coursework) and to completers in terms of ethnicity, age, and median household income.
  • The vast majority of lingerers intended to earn a credential or transfer to a four-year institution, similar to the intentions of completers.
  • Lingerers appeared to be less prepared for college than completers.
    • 83% of lingerers were referred to remedial education, compared to 76% of completers.
    • 10% of lingerers tested as “college-ready,” compared to 17% of completers.
    • Only half of lingerers enrolled as full-time students, compared to 60% of completers.
    • Lingerers failed about 25% of courses in which they enrolled, while completers failed 10% of courses.
  • While lingerers attempted a similar number of credits as completers, they earned far fewer credits.
    • Lingerers earned, on average, 57 credits, while completers earned about 82 credits.

Snapshot Report: Degree Attainment

December 13, 2013
According to a recent report from the National Student Clearinghouse Research Center, attainment of science and engineering (S&E) degrees has increased by 19 percent among traditional-age students (26 years old and younger) and by 25 percent among older students (over 26 years old).

Among the report’s other findings:
 
  • Degrees in S&E accounted for 32% of all bachelor’s degrees in 2013, an increase from 30% in 2009.
  • There has been a 19% growth in S&E bachelor’s degrees over the last five years, compared to a 9% degree growth in non-science and non-engineering disciplines.
  • In 2013, students over the age of 26 accounted for 26% of all bachelor’s degrees and 18% of all S&E bachelor’s degrees earned.
  • Over one-third of degrees earned by traditional-age students were in S&E disciplines, compared to one-fifth of degrees earned by students over 26.

Colleges Are Supposed to Report Pell Graduation Rates -- Here's How to Make Them Actually Do It

October 30, 2013
Since 2008, the federal government has spent nearly $200 billion on the Pell Grant program. We know that this sizeable investment has bought a 50 percent increase in the number of people getting these awards. But how many graduates did these funds produce? What percentage of these individuals graduated? And which schools are doing the best with the lowest-income students?

Congress wanted to know the answers to all of these questions. That’s why the 2008 reauthorization of the Higher Education Act (HEA) required colleges to disclose the graduation rates of Pell Grant recipients, of students who did not receive Pell but got a Subsidized Stafford Loan, and of individuals who got neither type of aid. But it only asked institutions to disclose this information, either on their websites or potentially only if asked for it, rather than proactively report it to the Department of Education. The results have gone over about as well as a voluntary broccoli-eating contest with toddlers. A 2011 survey of 100 schools by Kevin Carey and Andrew Kelly found that only 38 percent even complied with the requirement to provide these completion rates, in many cases only after repeated phone calls and messages.

Absent institutional rates, the only information of any sort we have about Pell success comes about as often as the Olympics, when the National Center for Education Statistics (NCES) within the Department updates its large national surveys. These data are great for broad, sweeping statements, but they cannot report results for individual institutions, something that’s especially important given the variety of outcomes different schools achieve. Instead, these surveys can only provide information about results by sector or by Carnegie type of institution. And the surveys are too costly to run more frequently.

Fortunately, there’s a chance to fix this problem and get colleges to report these completion data. The Department is currently accepting comments on its plans for data collection under the Integrated Postsecondary Education Data System (IPEDS) for the next several years (see here to submit a comment, here for the notice announcing the comment request, and here for the backup documentation of what the Department wants to do). This means there’s an opportunity for the public, before the comment period closes on November 14, to suggest what additional information IPEDS should include.
To be clear, much of what the Department is already proposing to add to IPEDS through this collection will give us a significantly better understanding of student outcomes in postsecondary education. First, it would implement some recommendations from the Committee on Measures of Two-Year Student Success, which Congress called for in the 2008 HEA reauthorization to address students who are not currently captured in the federal graduation rate because they are not full-time students attending college for the first time. The committee’s recommendations, which are being implemented here, aim to capture those missing students by requiring colleges to report on the success rates of three additional groups: (1) students who are enrolled part-time and attending for the first time, (2) those who are enrolled full-time and have attended college elsewhere, and (3) those who are enrolled part-time and have attended college elsewhere. Colleges would then report how many of these students received an award, are still enrolled, transferred, or dropped out after six and eight years. And this reporting would begin retroactively, so the public won’t have to wait until 2023 to find out the first results.

Other proposed changes to IPEDS are smaller in scale but also important. Colleges would be asked to provide information on the use of veterans benefits on their campuses. And the way for-profit colleges report their financial data would be better aligned with the way public and private nonprofit colleges provide this information.

But these changes still leave us without one obvious set of completion information: rates disaggregated by socioeconomic status. Sure, attending full-time can be a proxy for a student’s financial circumstances, but not as definitively as getting a Pell Grant.

The Institute for College Access and Success and others have already argued that the Department should add these data to IPEDS. In response, NCES has noted that improvements to the federal student aid database may make it possible to calculate completion rates for Pell students. But that’s an incomplete solution. That database is legally prohibited from collecting information on students who don’t get federal student aid, so there’s no way to produce the HEA-mandated graduation rate for students who received neither Pell Grants nor subsidized Stafford loans.

Of course, you can’t bring up any discussion of data reporting without running into the “B” word: burden. But remember, this isn’t new burden; colleges are already legally required by an act of Congress to provide these graduation rates. Any huge encumbrance these requirements represent (and I’d argue it’s probably not much, since you would just be taking a subset of an existing cohort with easy-to-identify characteristics based on student aid receipt) has already occurred. In fact, U.S. News and World Report is already getting some schools to provide this information, but it won’t share the raw data.

In an ideal world, we would not have to beg and plead with colleges to tell us whether they are successfully using the more than $30 billion they receive each year to educate low-income students. Instead, we would have a student unit record system capable of processing all this information without adding burden to colleges or forcing them to rely on costly alternatives like the National Student Clearinghouse. But thanks to Virginia Foxx (R-N.C.) and the college lobby (primarily the private institutions), we don’t live in that world. Instead, we’re left with IPEDS, where these data should be.