A Cure for Declining Enrollment: Launching the Right New Programs

Six in 10 respondents to Inside Higher Ed's 2017 Survey of Community College Presidents reported declining enrollment, with 21 percent saying enrollment losses were 10 percent or more. In addition, many state governments are reducing funding. The combination is forcing cuts in college budgets that were already tight.

Can new programs restore growth and relieve the strain on budgets? If so, how can colleges fund these new programs when budgets are so tight? One answer lies in data-driven program portfolio analysis. Well-chosen new programs can significantly increase enrollment and tuition. Cutting a few failing programs can free up the money needed to fund new programs. However, making the right choices about which programs to launch and which to cut requires a wide array of market data, a formal rubric for evaluating it and a sound process that uses institutional knowledge and builds consensus on challenging program decisions.

This work starts with defining the market or markets the school serves. Analyzing students' home addresses can help schools define the geographic market(s) from which they actually draw students.
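
To make this concrete, here is a minimal sketch of one way to draw a primary market from enrollment records; the ZIP codes and the 80 percent coverage threshold are purely illustrative assumptions, not a prescribed standard:

```python
from collections import Counter

# Hypothetical list of enrolled students' home ZIP codes (illustrative data).
student_zips = ["01510", "01510", "01453", "01453", "01453", "01440",
                "01420", "01420", "01473", "01605"]

# Rank ZIP codes by enrollment and keep the smallest set that covers
# roughly 80 percent of students -- one simple way to draw a primary market.
counts = Counter(student_zips).most_common()
total = len(student_zips)
covered, primary_market = 0, []
for zip_code, n in counts:
    primary_market.append(zip_code)
    covered += n
    if covered / total >= 0.80:
        break

print("Primary market ZIP codes:", primary_market)
```

Once the market is defined, there are four broad types of external data schools should gather: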

Student Demand: There are several sources of data on student interest in academic programs, each with strengths and weaknesses. The most common and most comprehensive is IPEDS, the Integrated Postsecondary Education Data System, which collects data on all colleges that accept Title IV financial aid. IPEDS also provides a taxonomy covering almost all programs: the Classification of Instructional Programs (CIP), with more than 1,400 CIP codes. Unfortunately, IPEDS is a very dated measure of demand: it reports completions, which occur years after students make their college decisions. For a more current signal, Google search data can be pulled for specific programs, although it is difficult to obtain for a large number of programs.
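
As a rough illustration, the sketch below aggregates completions by CIP code once an IPEDS extract has been downloaded; the column names and figures are invented for the example and do not match the actual IPEDS field names:

```python
import pandas as pd

# Illustrative rows in the shape of an IPEDS completions extract;
# real IPEDS files use different field names (e.g., UNITID, CIPCODE).
data = pd.DataFrame({
    "institution": ["A", "A", "B", "B", "C"],
    "cip_code":    ["51.3801", "11.0201", "51.3801", "43.0107", "51.3801"],
    "year":        [2016, 2016, 2016, 2016, 2016],
    "completions": [120, 35, 80, 60, 45],
})

# Total completions per program (CIP code) across all institutions
# in the market -- a rough proxy for student demand.
demand = (data.groupby("cip_code")["completions"]
              .sum()
              .sort_values(ascending=False))
print(demand)
```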

Employment Trends: What is the job market for graduates of a program? How large is it? How fast is it likely to grow? What kind of salaries can people entering the field expect? Much of this data is available through the U.S. Department of Labor's Occupational Outlook Handbook. This information is also a little dated, since it uses survey data collected in the year before the jobs data is released. More important, the forecasts of employment growth are not accurate: 85 percent of these estimates miss the actual growth rate by 50 percent or more. To get more current and detailed data, colleges can use information on actual job postings from several sources, including Burning Glass Technologies. Many schools are also required to post disclosures on their websites of actual placement rates by program; these disclosures provide a direct observation of a college's ability to place graduates in a specific market and program. We recommend using all three sources to get a complete picture of employment trends and opportunities for your graduates.
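
One way to triangulate the three sources is to normalize each indicator and blend them into a single employment score. The sketch below is only illustrative: the data are invented, and the equal weighting and min-max scaling are assumptions, not a prescribed method:

```python
import pandas as pd

# Illustrative employment indicators for three programs; in practice these
# would come from BLS projections, job-postings counts and the college's
# own placement disclosures.
df = pd.DataFrame({
    "program":        ["Nursing", "Culinary", "Paralegal"],
    "bls_growth_pct": [15.0, 6.0, 4.0],      # projected 10-year growth
    "job_postings":   [2400, 300, 450],      # postings in the market
    "placement_rate": [0.92, 0.75, 0.68],    # graduates placed in field
}).set_index("program")

# Min-max normalize each indicator to 0-1, then average them so no
# single source dominates; the equal weighting here is an assumption.
normalized = (df - df.min()) / (df.max() - df.min())
df["employment_score"] = normalized.mean(axis=1)
print(df["employment_score"].sort_values(ascending=False))
```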

Competition: How many schools in a market offer a particular program? How large are their programs? How recently were they launched? Is the marketplace saturated? Answers to these questions can help an institution decide whether there is an opportunity to launch a new program and what to do with existing programs that have limited appeal. IPEDS completions provide a nearly complete list of competitors and their size (number of completions). To judge saturation, you may wish to look at completions per capita, trends in completions and Google's competitive index.
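
For example, a simple saturation yardstick can be computed from market population and completions; the figures below are invented for illustration:

```python
import pandas as pd

# Illustrative market-level figures for a few programs (CIP codes);
# population and completions would come from Census and IPEDS data.
market = pd.DataFrame({
    "cip_code":    ["51.3801", "11.0201", "43.0107"],
    "competitors": [6, 3, 9],        # schools completing the program
    "completions": [245, 95, 410],   # annual completions in the market
})
market_population = 750_000  # adults in the defined market (assumed)

# Completions per 100,000 residents -- a simple saturation yardstick.
# Higher values suggest a more crowded market for that program.
market["completions_per_100k"] = (
    market["completions"] / market_population * 100_000
)
print(market.sort_values("completions_per_100k", ascending=False))
```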

Strategic fit: How well does the level of the degree offered match employers' requirements? It makes little sense to offer an associate degree in a field of study where entry-level jobs call for a master's degree. Two sources cover this topic: the Bureau of Labor Statistics (BLS) and IPEDS. BLS tracks the education level of people in the workforce, so you can see what degrees employers expect. IPEDS gives good data on the degree level students usually achieve in each field. This combination allows community colleges to choose programs that align with the degree levels they offer.
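
A minimal sketch of this alignment check, assuming a simple ranking of degree levels and invented entry-level requirements in place of actual BLS data:

```python
# Illustrative mapping of BLS "typical entry-level education" to a rank,
# compared with the highest degree level a community college can offer.
degree_rank = {"certificate": 1, "associate": 2, "bachelor's": 3, "master's": 4}

programs = {                       # assumed entry-level requirements
    "Registered Nursing": "associate",
    "Clinical Psychology": "master's",
    "Culinary Arts": "certificate",
}
college_max = degree_rank["associate"]  # highest degree the college awards

for program, required in programs.items():
    fit = "good fit" if degree_rank[required] <= college_max else "poor fit"
    print(f"{program}: entry-level requirement is {required} -> {fit}")
```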

Once this data is assembled, you will have dozens of metrics to consider for each program and market. Now you will need a scoring rubric to evaluate the metrics and rank programs. This rubric should reflect your institution’s strategic goals. For example, an institution looking to differentiate itself might give greater weighting to competitive factors. It is also important that faculty (usually deans) and other constituencies review and refine the rubric, so they support and use the results in their decision-making.
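
In code, such a rubric often reduces to a weighted sum of normalized metric scores. The sketch below assumes 0-10 metric scores and example weights; both the metrics and the weights would be tailored to an institution's actual strategy:

```python
import pandas as pd

# Illustrative 0-10 metric scores for each program; the metrics and
# weights are assumptions a real rubric would tailor to strategic goals.
scores = pd.DataFrame({
    "program":        ["Nursing", "Culinary", "Paralegal"],
    "student_demand": [9, 6, 4],
    "employment":     [8, 5, 6],
    "competition":    [4, 7, 3],   # higher = less saturated market
    "strategic_fit":  [9, 8, 7],
}).set_index("program")

# Weights should reflect institutional strategy -- e.g., a school seeking
# differentiation might raise the weight on competition.
weights = {"student_demand": 0.35, "employment": 0.30,
           "competition": 0.20, "strategic_fit": 0.15}

scores["rubric_score"] = sum(scores[m] * w for m, w in weights.items())
print(scores["rubric_score"].sort_values(ascending=False))
```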

Once the program portfolio has been assessed and ranked, an institution can make strategic decisions about each program's future. These decisions fall into four categories: Start, Stop, Sustain or Grow. (A simple decision sketch follows the four descriptions below.)

Start: If a program has strong student interest, good employment prospects and limited competition, a college should evaluate launching the program. This evaluation should address the costs of the launch, the educational requirements of the program and its fit with the school’s curriculum.

Stop: If a program has few students in a market that is very small or saturated, or if it has poor student outcomes, the college should consider teaching it out. These are tough and contentious decisions. In our experience, the tension is reduced when everyone can see the data and recognize that fair and consistent criteria have been used to make the cuts.

Sustain: If an institution offers a program in a field where the market is saturated but there is strong demand, sustain the program at its current level.

Grow: If an institution already has a program of study in a field with growing demand and the market has little competition, the program warrants additional investment to increase enrollment. This is often the best option for short-term growth.
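
To illustrate how scored metrics might map to these four decisions, here is a simple classification sketch; the thresholds and 0-10 score scales are assumptions for the example, not part of any formal methodology:

```python
def classify(offered: bool, demand: float, saturation: float,
             low_threshold: float = 4.0, high_threshold: float = 7.0) -> str:
    """Map 0-10 demand and saturation scores to a portfolio decision.

    The thresholds are illustrative, not a fixed standard.
    """
    if not offered:
        # Launch only where demand is strong and competition is limited.
        if demand >= high_threshold and saturation < low_threshold:
            return "Start"
        return "No action"
    if demand < low_threshold:
        return "Stop"       # small or saturated market, weak outcomes
    if saturation >= high_threshold:
        return "Sustain"    # strong demand, but a crowded market
    return "Grow"           # growing demand, limited competition

print(classify(offered=True, demand=8.5, saturation=2.0))   # Grow
print(classify(offered=False, demand=9.0, saturation=3.0))  # Start
```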

Importantly, the assessment process should include relevant stakeholders, such as institutional leaders, deans and selected faculty, admissions, marketing, career services and labor representatives (if faculty and/or staff are unionized). In a workshop setting, participants can analyze the data and its strengths and weaknesses, review and, if necessary, revise the scoring, and use the data and scoring to make informed recommendations. The work can be completed in a two-day workshop. A well-run program workshop not only produces good program choices; it also speeds up the decision-making process. As important, in our experience, it greatly reduces the friction that often surrounds program decisions, especially the decision to stop or teach out a program.

Stopping failing programs does not save huge amounts of money. In our view, adding one successful new program will improve budgets far more than cutting several small programs. Nonetheless, these small cuts are important, since they free up funding for new programs that can make a big difference.

Community colleges have many small programs to consider cutting. Gray Associates recently conducted an analysis that revealed 53 percent of community college programs have nine or fewer completions; these programs yielded just 9 percent of total completions. Cutting a few of these would have little effect on students and would generate savings to reinvest in growth.
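
The long-tail arithmetic behind this kind of finding is straightforward to reproduce on a college's own data; the completions figures below are invented for illustration:

```python
import pandas as pd

# Illustrative completions by program for one college; the analysis
# cited above covered community college programs nationally.
completions = pd.Series(
    [220, 140, 95, 60, 40, 9, 7, 5, 4, 3, 2, 2, 1],
    name="completions",
)

small = completions[completions <= 9]   # the "long tail" of tiny programs
share_of_programs = len(small) / len(completions)
share_of_completions = small.sum() / completions.sum()

print(f"{share_of_programs:.0%} of programs have nine or fewer completions")
print(f"they account for {share_of_completions:.0%} of all completions")
```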


Case Study
A community college in the Northeast serving 15,000 students was facing budget shortfalls as a result of declining enrollment and reduced state support. The president concluded the institution needed to “right-size” its spending priorities and program offerings. Although the college regularly reviewed programs to make sure they were meeting workplace needs and were cost-efficient, it needed a better model that would work across the institution. Because of its public service mission, it could not eliminate programs that weren’t profitable but were meeting essential community needs.

Gray Associates was brought in to conduct a program portfolio strategy workshop for 25 members of the institution’s senior team, faculty as well as administrators. It also performed a multidimensional analysis of the institution’s 70 existing programs plus 1,400 other IPEDS programs. As recommended above, this analysis examined data on student demand, employer needs, competition and strategic fit. It also examined employment data at the state, regional and local levels, showing what the job markets for different fields looked like in the cities and towns the institution served. Different scoring systems were used to evaluate transfer programs and career programs.

As a result of the findings, the school launched a hospitality and culinary arts institute and a new psychology degree. It also enhanced its computer and information sciences program and, based on the analysis, made changes to its distance education offerings, class sizes and the supplemental services associated with its program portfolio. Workshop participants identified programs in eight areas to discontinue: six certificates and two degree programs.

“Without the right tools and without the right facilitation…we could have fallen into arguing with one another and getting highly defensive about the decisions that were being made,” the school’s president noted. “While anybody can argue with data a little bit, because it came from so many different perspectives and more than one source…even if you disagreed with one or two of the data points, it was hard to disagree with all of them.”

There was unanimous support among workshop participants for the programs the school decided to launch or enhance. Surprisingly, there was also unanimous agreement on programs to Stop. Faculty proved able to make tough choices and agree to cuts in their own departments. The structure of the workshop helped people gain confidence in the process, the president said.

As important, the cuts freed up $200,000, which was invested in faculty for new programs. Even with these investments, the school still had a net savings of $140,000, a multiple of what it spent on the research and workshop.

Essentially, community college leaders have three strategies for managing their program portfolios:

  1. Grow at least one existing program that has potential to become larger.
  2. Start at least one new program to increase enrollment and tuition revenue.
  3. Trim the long tail of smaller programs to reduce costs.

Not only will this strengthen an institution financially by bringing in more revenue while reducing the expense of unproductive programs, but student outcomes should also improve as programs become more closely aligned with local employer needs.

Community colleges can reverse negative enrollment trends by selectively adding new programs. However, to do this successfully, they will need sound data on the market for the programs they plan to offer, analytical techniques and a custom scoring rubric to rank and prioritize programs, and faculty confidence in and support for the system and process.