Few Details Yet About Drastic New Funding Format


NEWS & ANALYSIS

The administrations of colleges and universities throughout Ontario are anxiously awaiting the “nuts and bolts” of the Conservative provincial government’s plan to tie their operating grants to “measurements of their performance outcomes”.

After a somewhat vague announcement about the plan when the provincial budget was unveiled in April, then-Minister of Training, Colleges and Universities Merrilee Fullerton* explained that funding would be tied to statistical reporting about graduation rates, graduate employment rates, graduate earnings, work-integrated learning opportunities (work-placements as part of curriculums), “community/local impact” (of the institution and its students), and other demonstrable gauges of “skills and competencies”. [*In June, as a result of a Cabinet shuffle by Premier Doug Ford, Fullerton was replaced by Ross Romano, MPP for Sault Ste. Marie.]

As part of the drive to foster employability in the Science, Technology (and Trades), Engineering and Math-related fields – so-called STEM occupations – there has also been an indication that universities’ funding may be tied to their involvement in research and development, and colleges’ to their training of apprentices.

In total, according to a subsequent CBC News report, the government intends to tie approximately $3 billion (60 percent of the total) of its annual funding of colleges and universities to measurement-based outcomes.

The most remarkable aspect of all of this is that the new funding system is supposed to be phased in beginning in the 2020-21 fiscal year (starting next April) ...

... Yet colleges and universities have not yet been told what metrics will be used, or how the “statistical proof” is supposed to be gathered or demonstrated to the ministry.

For a couple of decades, the two dozen colleges have been “graded” by the provincially dictated Key Performance Indicator (KPI) survey system. Every year, current students fill out questionnaires gauging their satisfaction with their educational experience, and with the services and facilities provided at their schools. KPIs also track each college’s graduation rate (how many people actually complete their multi-year programs, as opposed to dropping out or flunking out midway). A polling company engaged by the government also tracks the grad employment rate (six months after graduation), and tries to interview grads and their employers to gauge their satisfaction levels.

If the provincial government now intends to use grad-related information as a major factor in the new funding system, it will have to significantly improve the data-gathering method. Critics have long pointed to the grad- and employer-related KPI results as based upon insufficient surveying – and, thus, easily “skewable”.

Under the KPI system, the interviewing of recent grads is purely voluntary, and many don’t agree to participate. The grads are also asked to voluntarily provide the contact information for their employers – and to give their permission for the surveyors to contact their new bosses. Many grads refuse, because they don’t want their employers to be pestered. Even when they do provide such information, the employer is not obligated to respond to the survey – and many choose not to. As a result, the sample size of both surveyed grads and employers is actually quite small … and that, in turn, can severely skew the results.

For instance, Colleges Ontario, the organization representing the administrations of the two dozen schools, noted that a recent survey-year featured a total graduate population of 100,752 students. Of those, less than half (47,200) completed their “Graduate Satisfaction” surveys. And, of those, less than a tenth agreed to have their bosses contacted, so only 3,200 employers completed the Employer Satisfaction survey. Translation: Only three percent of grads are having their “employment satisfaction traits” evaluated by this KPI.
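The cascade in the Colleges Ontario figures can be checked with a few lines of arithmetic (the counts below are the ones cited above; nothing else is assumed):

```python
# Response-rate cascade from the Colleges Ontario figures cited above.
graduates = 100_752        # total graduate population that survey-year
grad_surveys = 47_200      # completed "Graduate Satisfaction" surveys
employer_surveys = 3_200   # completed "Employer Satisfaction" surveys

grad_response_rate = grad_surveys / graduates      # share of grads surveyed
employer_coverage = employer_surveys / graduates   # share of grads whose
                                                   # employers responded

print(f"Graduate response rate: {grad_response_rate:.1%}")
print(f"Employer-evaluated share of all grads: {employer_coverage:.1%}")
```

The second figure works out to roughly 3.2 percent – the “only three percent” the article refers to.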

Think of what that may mean in a microcosmic example … A tiny school awards diplomas to 100 graduates in a particular year. At the end of the surveying process, only three of their employers have commented on their job performance. One of them isn’t thrilled. That one-out-of-three result means it can be portrayed – and, under this KPI, will be – as if only 66 percent of employers are satisfied with the entire graduating class.
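The tiny-school scenario also shows how volatile such a small sample is – a single response swings the reported figure by a third:

```python
# Hypothetical tiny-school scenario from the text:
# 100 grads, but only 3 employer responses, one of them negative.
graduating_class = 100
employer_responses = 3
satisfied = 2  # one of the three employers "isn't thrilled"

reported_satisfaction = satisfied / employer_responses  # what the KPI reports
swing_per_response = 1 / employer_responses             # weight of one answer

print(f"Reported employer satisfaction: {reported_satisfaction:.1%}")
print(f"Each single response moves the figure by {swing_per_response:.0%}")
```

Two satisfied employers out of three becomes a headline number of roughly 66 percent for a class of 100 – driven entirely by one opinion.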

Aside from the quantitative issue, there’s also a qualitative question of “What is this particular KPI actually evaluating?”

Let’s say a student graduates from a program in a very complicated discipline; but, for one reason or another, secures employment in a totally unrelated and comparatively menial field. Let’s say he has graduated as a brain surgeon, but is now working as a ditch-digger (no offence to ditch-diggers).

The boss of the ditch-digging crew completes the KPI survey. Maybe he is totally satisfied with his new employee, maybe he is completely dissatisfied with him.

In either case, what does that evaluation really mean? The employer is, obviously, not evaluating whether the employee was educated to be a competent brain surgeon. He, for the most part, is only providing an opinion about the graduate’s innate work-ethic, not about the specifics of his academically-provided skill-set …

… So, from the college’s perspective (and the government’s too), what constructive, informative value – if any at all – does such an individual KPI have? It doesn’t address the efficiency or efficacy of curriculum content or the educational delivery method. And, because it is so off-topic, it doesn’t really reflect on the potential-for-employer-satisfaction that would have been more realistically gauged if the graduate had actually been employed in his academic discipline. A response of “I’m satisfied/dissatisfied with him as a ditch-digger” says nothing about the school’s brain-surgery-training program, nor about his/her fellow grads, nor about the overall quality of the school as a whole. Such data is of no constructive use to the school in terms of improving itself, and should hold no weight with prospective students who are thinking about enrolling there to become brain surgeons.

For a very detailed look at this problematic KPI factor, see http://stclair-src.org/news/need-know-news/dissatisfaction-satisfaction-surveying

Aside from KPIs and the role they may play in the new funding system, the other acronym that will be involved is SMA. Those are the Strategic Mandate Agreements that each college develops and furnishes to the ministry annually, outlining its specific goals for the coming academic – and fiscal – year.

Presumably, part of the new funding system will evaluate if a school achieved or exceeded its ministry-ratified goals (in which case it would “earn” its full grant funding), or failed to reach its objectives (in which case it would be “penalized” by reduced funding).
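To make the presumption concrete: if the tied portion of a grant were paid out in proportion to SMA targets met, the mechanics might look something like the sketch below. This is purely hypothetical – the ministry has published no formula, and the function, its parameters, and the sample numbers (other than the 60-percent tied share reported by CBC News) are all illustrative assumptions:

```python
# Purely hypothetical sketch of a metric-attainment funding model.
# The ministry has NOT published any such formula; this only illustrates
# the "earn the full grant / be penalized" mechanism described above.
def performance_grant(base_grant: float, tied_share: float,
                      targets_met: int, targets_total: int) -> float:
    """Total funding if the tied share is paid per SMA target achieved."""
    fixed = base_grant * (1 - tied_share)     # unconditional portion
    at_risk = base_grant * tied_share         # performance-tied portion
    return fixed + at_risk * (targets_met / targets_total)

# Illustrative only: a school with a $50M grant, 60% tied to outcomes
# (the share cited in the CBC News report), meeting 8 of 10 targets.
print(performance_grant(50_000_000, 0.60, 8, 10))
```

Under those invented numbers, missing two of ten targets would cost the school $6 million of a $50 million grant – which is why the still-unanswered question of what counts as “proof” of attainment matters so much.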

At this stage in the process, however, words like “may”, “might”, “perhaps” and “presumably” are still dominating the discussion. While it has announced its intentions, and set a very rapid schedule to implement those intentions, the government has still not provided full details about how the bureaucratic system will actually work – especially in terms of what will constitute “proof” that SMA-promised goals have been met (or not).

It is the latest example of this Conservative government’s modus operandi of announcing dramatic, even drastic, policies, without simultaneously providing extensive details about their practical implementation.

Stay tuned.