(shutterstock)

Attention College Shoppers

Graduates' employment data may not tell the whole truth

Jul 1, 2014 By Steven Yoder

Workers increasingly need a college degree to survive in today’s complex economy, so as college costs and student loan loads rise, parents and prospective students are asking tougher questions about the results they can expect from a baccalaureate. But the answers they’re getting are often inadequate — schools’ data about graduates’ employment and salary levels can be misleading and inconsistent, relying on a patchwork of data collection methods and sometimes on very small samples.

College admissions consultant Regan Ronayne has seen it firsthand. A few weeks ago, she was researching a private for-profit career college in Southern California for a student who was comparing schools. The college’s statistics about graduate outcomes wowed her: 80 percent of students had a job within six months, according to its survey of graduates. Then she dug deeper into the numbers. The survey sample size was tiny, and almost all of the jobs landed were part-time.

The Post-College Crunch

Students contemplating college are running up against some daunting numbers. In the past 10 years, tuition has spiked 75 percent at California’s four-year public colleges and more than 20 percent at its private colleges, even after adjusting for inflation.

In 2010, those costs forced almost half of California freshmen to seek student loans; 10 years earlier, only one-third did so. The amounts they’re borrowing have grown as well, totaling $8,000 for the average California freshman in 2010, an inflation-adjusted increase of 36 percent from five years prior.

The tenuous economy is treating degree-earners less kindly than in the past, too. Nationally, today’s grads are more often unemployed and are earning less than their counterparts from the early 2000s, according to a spring report by the U.S. Department of Education. Many of those who do find employment are doing jobs unrelated to their field of study, and as a result, they earn less than those employed in their chosen profession. All of this dampens income and employment prospects years into the future, as research shows that wages and career advancement are slower for those who graduate into a weak job market, even after the economy recovers.

So today’s parents and students are asking harder questions about their return on investment.

“Is a diploma from this university that costs an additional $10,000 or $20,000 really going to be worth it? These are conversations I wasn’t having as often 10 years ago,” says Scott Hamilton, an educational consultant from Future Stars College Counseling Center in Sacramento. 

Wide Divergence in Data Collection

Consumer demand or no, federal law already requires colleges to report data on their graduates. The 2008 Higher Education Opportunity Act mandates that schools collect information on graduates’ levels and types of employment and make it available to the public.

Most schools survey graduates to get that information, but the law doesn’t mandate which data to collect or how to get it. Colleges are free to use their own methods, and those can vary widely between schools.

Take the timing of the canvassing done at Sacramento-area colleges and universities. UC Davis surveys its graduates every three years. The University of the Pacific does so annually, nine months after graduation. Sacramento State issues two surveys: one a few weeks before graduation and another between one and five years after.

The level of specificity (and therefore the usefulness) of the three schools’ employment data also runs the gamut. UC Davis and Sacramento State both ask whether the job that former students are working in is tied to their degree, but UOP doesn’t. Of the three, only UC Davis collects information on graduates’ salary levels.

As for how students with specific majors fare in the market, all three offer different information. Sacramento State provides employment data sorted by degree program, such as the percentage of anthropology graduates who have a job. UC Davis provides similar data, but only for general categories of study. The school reports, for example, what proportion of those with a social science major have jobs in their preferred field. UOP offers no data on employment rates broken down by degree field.

Graduates’ response rates to each school’s survey also diverge. At UOP, 20 percent of students contacted last year took the survey, and at UC Davis it was 27 percent. Sacramento State came in at 39 percent for its annual survey, though the fact that the school administers it in April, before students leave campus, probably accounts for the higher number. All three schools make the point in their reports that those response percentages are generally considered acceptable in survey methodology.

But some national groups want more. Small sample sizes could make schools’ numbers suspect — those who respond to surveys might be doing so because they’ve had better luck than their classmates. UOP’s response rate, for example, means its data rest on the experiences of the 114 students who filled out the survey. Since 53 of them aren’t working but are pursuing further education, the school’s employment data represent only the remaining 61 graduates. The National Association of Colleges and Employers, a collaboration between about 2,000 schools and 3,000 employers, thinks schools should get data on at least 65 percent of graduating classes. The organization says school researchers can go beyond traditional surveys by contacting faculty, employers and fellow graduates and by checking social media sites.

Take St. Olaf College, a small liberal arts school in Minnesota. It collected data on 630 of its 693 graduates in 2013. It starts surveying students about one month before graduation, following up twice by email with those who don’t respond. For those who still haven’t replied, St. Olaf spends the next five months garnering additional information by contacting faculty and staff, checking students’ LinkedIn profiles or entries in the school’s online alumni database, sending more email and snail mail reminders and calling students directly.

St. Olaf doesn’t gather salary data through that process or identify graduates’ names. But it does offer a searchable database on its website that gives fine-grained detail about the first place students land after graduating. For example, of the school’s nine 2013 American Studies graduates, four are pursuing master’s or law degrees, two are interning, one is volunteering in Israel, one is a full-time carpenter and the other is a full-time technical writer for a health care software company. That’s background information that could sway college-shopping students and parents curious about what the market might hold if they pick St. Olaf and choose that major.

The Quest for New Standards

The Obama administration may soon tighten the requirements for schools to collect high-quality outcome data. Last August, it announced a plan for a new college ratings system that the U.S. Department of Education will publish before the 2015 school year. The administration says outcome measures, including graduate earnings, are likely to be a key part of the ratings.

The National Association of Colleges and Employers thinks schools should get out ahead of more federal mandates. In January, the organization released proposed voluntary minimum standards for data collection. They call for more consistency in colleges’ survey methodologies — having all schools collect data annually within six months after students graduate, for example. They also urge colleges to make their employment data sortable by major and to include graduates’ salaries.

But others question whether any set of data can ever capture the value of a college education. Too many other outcomes are important but not easily measured, such as the personal connections made in school, says Jon Reider, director of college counseling at San Francisco University High School and co-author of Admission Matters: What Students and Parents Need to Know About Getting into College. Besides, the government doesn’t collect information about one metric he finds critical: the quality of teaching. “To try to measure [schools] on some single- or even 2- or 3-factor scale about outcomes strikes me as wishful thinking,” he says.

NACE’s research director Ed Koc agrees that college outcome data don’t measure the full value of the college experience. Still, launching students into fulfilling careers has been a key part of colleges’ missions going back to the days when Oxford and Cambridge were established to train students in divinity, law and medicine, he says. “[Training students for careers] isn’t the only aspect of a college education, but it is a critical component that gets translated into a meaningful professional outcome,” he says.

No matter the debate about college’s purpose, the fact remains that the average college graduate still does far better than those with only a high school diploma, earning a salary almost $18,000 a year higher. And that’s likely to keep students coming back.
