The Overall Efficiency Rankings on the front page are an amalgam of the rankings in seven “Key Indicator” areas, all of which are based on the latest year of published data (2018-19). These rankings favour universities that prioritize instruction and student support, and keep other costs down.
The Efficiency Ranking for each Key Indicator is shown on the applicable page. It is based on current performance, measured from four different angles. (Other measurement angles are provided but do not contribute to the Indicator ranking.)
The “working papers” for the Efficiency Ranking can be seen in the Expenditure Rankings, accessible from that page. The analysis shows the following information for each university, as well as the average for the Top 25 and the U-15 (“a collective of some of Canada’s most research-intensive universities”):
- Expenditure levels for 2001 (the base year for these analyses), the latest year, and the previous year
- The Dollar Expenditure in the latest year
- The change in (inflation-adjusted) Dollar Expenditure since 2001
- The current Financial Impact of the change in Expenditure level – assessed from three angles
Financial Impact is the most insightful measure. It shows the dollar amount by which each university is over-spending (or under-spending) compared with its 2001 spending level, and compared with its current peer-group average spending level. This measure brings reality to efficiency issues in a way that percentages can’t. It should be emphasized that these are apparent over-spends and under-spends. In some instances there may be a plausible explanation, but the “vigilance value” of the measure lies in the fact that it quantifies the issue, and underscores the need for either a satisfactory explanation or corrective action.
The Financial Impact numbers reinforce the point that relatively small percentage variances can have large dollar consequences. Using an actual example, it may not seem too significant that one school spends 3.4% of its General Operating budget on Travel, while the average Top 25 school spends 1.7%. However, that difference represents an apparent over-spend of more than $12 million at the 3.4% school – $12 million that can’t be spent in the classroom, or that must be covered by student fees.
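The arithmetic behind such a figure is simple. The sketch below uses a hypothetical $750 million operating budget (an illustrative figure, not a published number) with the percentage shares from the Travel example:

```python
def apparent_overspend(budget, school_share, peer_share):
    """Dollar amount by which a school's spending on a category exceeds
    what it would spend at the peer-group average share of budget."""
    return budget * (school_share - peer_share)

# Hypothetical $750M General Operating budget; Travel at 3.4% of budget
# versus a 1.7% peer average implies an apparent over-spend of roughly
# $12.75 million.
overspend = apparent_overspend(750_000_000, 0.034, 0.017)
print(f"${overspend:,.0f}")
```

The same function returns a negative number for a school spending below the peer average, i.e. an apparent under-spend.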
Two situations create cause for concern:
- If the university exhibits a decline in spending efficiency compared with 2001.
The comparisons with 2001 define change over the longer term, some of which is a natural result of changing times. In other instances, however, it is the product of “issue creep” – small, annual changes that have turned into larger and more damaging shifts over time.
- If the university’s current level of spending efficiency is below that of its peers.
The peer group comparisons reveal particular issues at some universities, where a school appears to be under-performing its peers.
Needless to say, the concern would escalate if a school exhibits both characteristics.
For the most part, this analysis revolves around clear comparables:
- Comparisons with the past are always insightful, because they look at the same school across time.
- Comparisons with other universities are also valid; while there may be differences between individual schools, they exist to do similar things, and must deal with similar changes and challenges.
- The data is collected annually through processes honed over more than eighty years by organizations created by the universities for that purpose. The Canadian Association of University Business Officers (CAUBO) and Universities Canada, which the universities continue to fund and govern, work with Statistics Canada in compiling the data. See DATA SOURCES.
- The submission of financial data by the universities is required to follow detailed reporting guidelines prescribed by CAUBO – see the latest guidelines HERE. (While per-student measures would be insightful in a number of areas, the reporting requirements regarding enrollment are not as well defined, making such measures less reliable. For that reason, analysis on a Per Student basis is infrequently used.)
- This analysis doesn’t consist of raw numbers but inflation-adjusted interrelationships, measuring a variable in the context of another relevant variable (e.g., Instruction expenditure per dollar of Tuition Fee income, or Academic Salaries as a percentage of Total Salaries & Wages).
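A minimal sketch of these two building blocks, assuming hypothetical dollar figures and a generic CPI-style price index (none of these numbers come from the published data):

```python
def to_current_dollars(nominal, index_then, index_now):
    """Inflation-adjust a past nominal amount into current dollars
    using a price index (e.g. a CPI-style deflator)."""
    return nominal * (index_now / index_then)

def interrelationship(numerator, denominator):
    """Measure one variable in the context of another, e.g.
    Instruction expenditure per dollar of Tuition Fee income."""
    return numerator / denominator

# Hypothetical figures: compare the 2001 ratio with the latest year's,
# and restate the 2001 dollar level in current dollars so the dollar
# comparisons are on a like-for-like basis.
ratio_2001 = interrelationship(180_000_000, 120_000_000)    # 1.50 per $
ratio_latest = interrelationship(210_000_000, 170_000_000)
spend_2001_current = to_current_dollars(180_000_000, 97.8, 136.0)
```

Note that a same-year ratio is unaffected by the deflator (both numerator and denominator scale together); the inflation adjustment matters when dollar levels are compared across years.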
- The ranking for each Indicator is based on four different viewing angles, which helps to ensure the validity of the results.
If a school shows well it’s because it is performing well in comparison to its peers on a range of fronts. The converse is also true.
The important thing is that it IS analysis – analysis of something that has long been opaque to those it serves, and to those who pay the bills. The only way to monitor the financial efficiency of our universities is to analyze the data they publish and compare the universities with one another.
This isn’t just about focusing on those that seem to be performing poorly, but also those that seem to be performing well; they demonstrate that higher standards are attainable, and offer learning opportunities to those that need them.
The rankings are also shown at the provincial level, because systemic responsibility for overall fiscal vigilance rests on both university boards of governors and provincial ministries. The provinces must respect the academic autonomy of their PSE institutions, but they still have a duty to ensure the effective use of public funds, and to safeguard PSE and students; student debt is not just a student problem, it’s a debilitating societal problem. The failure of most provinces to adequately fulfill that duty of vigilance has contributed to the current situation, and has grown more problematic since they turned down the funding tap in 2009.
LINKS TO KEY INDICATOR PAGES
Indicators used in the Overall Rankings:
Staff Budget Deployment
Operational Support Costs
Central and Decentral Staff Costs
Support for Students
Other Efficiency Measures
Other Indicators – NOT used in the Overall Rankings:
Central Administration Cost
Other Salaries & Wages Analysis