Measuring Success in Your Outsourced Services

Illustration by Brian Isham

In 2010, the Center for College Affordability released an article titled, “25 Ways to Reduce the Cost of College.” Number five on that list is to outsource more services.

The concept is justified this way: “Rather than expend vast sums from limited resources in an effort to perform functions in-house for which they do not have any particular expertise, colleges should focus on improving the value of their core functions, for which they do possess a comparative advantage.”

Indeed, the press is increasingly filled with reports of colleges and universities choosing to outsource functions in the categories of student services, business, and education. The news cuts both ways: some reports tout projected savings and gained efficiencies, while others describe staff and students opposing outsourcing in favor of keeping jobs in-house and in the local community.

Because outsourcing is a reality, the question becomes: once the decision is made and the transition is complete, how do administrators establish and track performance and customer satisfaction? It's a critical question, given that outsourcing can deliver a number of benefits, such as enhanced organizational flexibility, economies of scale, risk avoidance, and access to capital investment, as well as potential drawbacks, such as reduced collaboration, loss of in-house expertise, and loss of continuity.

Have Systems in Place

Administrators at Walsh College in Troy, MI, outsource their technology to Troy-based CareTech. Their system for tracking performance and customer satisfaction has served the private college's 4,500 students well enough that they recently renewed a five-year contract.

“We have both formal mechanisms and informal mechanisms in place to track satisfaction,” says Helen Kieba-Tolksdorf, CPA, Walsh vice president, CFO, and treasurer. “To begin, we track to the service level agreement. Both teams meet at least once a year to review the contract and service. In addition, we have an academic technology committee and an administrative technology committee, and they meet periodically to review issues. Finally, I have a formal weekly meeting with Joseph Esdale, our CareTech client executive, who has the Walsh title of executive director of Information Technology.

“Informally,” Kieba-Tolksdorf continues, “both Joe and I maintain an open-door policy and are happy to talk with anyone at any time, should a need arise.”

Esdale agrees, noting more specifics on tracking performance. “We generate reports out of the ticketing/call system to track the average speed of answering a call and resolution rates. In addition, we randomly survey customers after a ticket closes. I review the surveys monthly and share them with Helen periodically. If there is ever negative feedback, I look at the ticket and the issue and talk with technicians about it. I estimate that only once every two to three months is there a response that is less than average.”

Esdale’s monitoring of service level, specifically first call resolution, is critical for businesses that provide call centers and service desks. However, in Top Five Customer Service Metrics (http://managementhelp.org/customers/service.htm), Barb Lyon recommends taking care in measuring the number of complaints you are receiving, as not everyone takes the time to tell you about a poor experience. On the other hand, if you ask your customers if they are satisfied with the service they received, “you are telling them that their satisfaction matters.”
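As a rough illustration of the kind of reporting Esdale describes, the two core service-desk measures (average speed of answer and first-call resolution rate) can be computed from ticket records. This is a minimal sketch; the `Ticket` structure and field names are invented for the example, not CareTech's actual system:

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    wait_seconds: int         # time before the call was answered
    contacts_to_resolve: int  # 1 means resolved on the first call

def average_speed_of_answer(tickets):
    """Mean wait time, in seconds, across all tickets."""
    return sum(t.wait_seconds for t in tickets) / len(tickets)

def first_call_resolution_rate(tickets):
    """Share of tickets resolved without a follow-up contact."""
    resolved_first = sum(1 for t in tickets if t.contacts_to_resolve == 1)
    return resolved_first / len(tickets)

tickets = [Ticket(30, 1), Ticket(90, 2), Ticket(45, 1), Ticket(15, 1)]
print(average_speed_of_answer(tickets))     # 45.0 seconds
print(first_call_resolution_rate(tickets))  # 0.75
```

Tracking both numbers together matters: a fast answer that fails to resolve the issue shows up as a good speed-of-answer figure but a poor first-call resolution rate.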

The Customer Is Always Right

Lyon describes four other metrics for measuring performance and customer satisfaction. One is customer retention. She suggests focusing your retention strategies on customers who buy across your offerings. Another is response time. In our fast-paced world, response time is one way “we can communicate our sense of urgency and concern for our customers and their experience with our product or service,” says Lyon. To that end, set a response goal, and achieve it.

The third metric is time with the customer. Here, benchmarks for call duration and overall time with the customer should be set in relation to the ultimate goal of first-time resolution, since a satisfied customer who needs no follow-up is preferable to a speedy but unresolved interaction. The last metric is “churn.” Lyon says: “If you don’t know how much business you are losing, you won’t be able to understand how much new business you will require to stay out of the red.” She recommends using follow-up surveys, phone calls, and personalized emails for understanding lost business.
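Lyon's churn point reduces to simple arithmetic: the business you lose determines how much new business you need just to stand still, let alone grow. The figures below are invented purely for illustration:

```python
def churn_rate(customers_at_start, customers_lost):
    """Fraction of the starting customer base lost over the period."""
    return customers_lost / customers_at_start

def replacement_needed(customers_at_start, customers_lost, growth_target):
    """New customers required to end the period at the growth target."""
    target_size = customers_at_start * (1 + growth_target)
    return target_size - (customers_at_start - customers_lost)

# A hypothetical dining program that starts the year with 5,000
# meal-plan holders and loses 400 of them (8% churn) needs 650 new
# sign-ups to hit a 5% growth target.
print(churn_rate(5000, 400))                # 0.08
print(replacement_needed(5000, 400, 0.05))  # 650.0
```

The point of tracking churn is the second function: without knowing how many customers left, the required volume of new business is invisible.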

Administrators at George Mason University (GMU) in Fairfax, VA, conduct an annual survey to measure customer satisfaction. The public university, which boasts 20,000 undergraduate students, outsources dining services to Sodexo and various franchises. For a small fee, the survey is conducted and analyzed by the National Association of College & University Food Services (NACUFS). “I’m a big believer in measuring student customer satisfaction on our own vs. working with an independent survey provided by our contractor,” says Mark Kraner, GMU’s executive director for Campus Retail Operations.

The survey, which includes faculty and staff opinions as a subset, allows administrators to see from year to year how well they’re meeting their customers’ expectations. Plus, the survey has the ability to aggregate the scores against other schools to see how GMU compares to its peer schools and other schools in the area.

Take Action

But the most useful part of the survey is that it allows Kraner to put plans in place to address customers’ issues. “Every April we create action plans based on the survey results,” he says. “We implement the plans in the fall.”

Lyon notes that the metrics chosen to measure performance and customer satisfaction depend on the type of business and your customer base, as observed in the differences between Walsh and GMU. So, if you’re considering strengthening your measurement metrics, experiment with different ideas until you learn what works best. Then pick one or two, and measure well. “Just do it,” she concludes.

As the Center for College Affordability’s article points out, outsourcing services and functions allows institutions of higher education to better focus on education. But outsourcing also requires periodically measuring performance and customer satisfaction. The five metrics above can help you start the process.

This article originally appeared in the College Planning & Management June 2013 issue of Spaces4Learning.
