Producers, like any businesspeople worth their salt, demand an acceptable return on the investments they make.
It should be no different for Agriculture Canada program spending.
But once again, the department has caught the eye of the federal auditor general for falling short of expectations when it comes to evaluating the effectiveness of its spending.
Auditor general Michael Ferguson referenced Agriculture Canada and two other departments — human resources and skills development, and fisheries and oceans — as those most behind in their program evaluations in a report to Parliament April 30.
In the report, Ferguson said five of 18 direct grant and contribution programs at Agriculture Canada have not been properly evaluated.
That ranked the department at the bottom, with only 72 percent of its program spending having undergone the necessary review for need and effectiveness.
Agriculture Canada quickly responded by promising to do better, as well it should. The next five-year cycle for program evaluation started April 1, and the department needs to improve its track record well before the cycle’s completion in 2018.
It is not the first time.
In 2009, then auditor general Sheila Fraser took Agriculture Canada to task for failing to conduct assessments on much of its program spending.
The department admitted it had completed about half of its planned program evaluations between 2004 and 2008, and only 11 percent of spending was evaluated for effectiveness.
Fraser cited the Prairie Grain Roads Program, which spent $175 million to strengthen roads that farmers were pressed into using more often to truck grain as railways and elevator companies centralized their operations. She stopped short of calling it a bad program. The problem was, she said, who knew?
Part of the problem then, she found, was that information on the effectiveness of spending was not regularly collected.
Just as it did last month following the 2013 report, Agriculture Canada agreed in 2009 that it must do better. Sometimes the song really does remain the same.
It’s true that attempts to evaluate complex programs face many potential pitfalls, but it is important that proper protocols and formulas are constructed and adhered to. Taxpayers deserve to know their dollars are being spent effectively in areas where they serve the most need.
Farmers deserve no less.
One cautionary note: safeguards must be built into accountability tests to ensure that evaluations properly assess long-term needs. Programs that might seem to be operating ineffectively at the time they are reviewed might actually be delivering needed goods over the longer term and providing increased stability to the industry. Evaluations should also account for regional differences in spending needs.
An overly simplistic, one-size-fits-all evaluation is not going to work.
But if properly done, evaluating spending for need and effectiveness could pay off for taxpayers, governments and farmers. It could point the way to greater efficiency gains and service improvements.
How often have we heard farmers complain of slow program payments? Are there, perhaps, opportunities to shift money around within the department?
Proper evaluations could potentially divert funds more quickly to areas of need and unearth new areas where needs have been left unaddressed.
Such program performance audits, based on efficiency and effectiveness, seem a no-brainer.
We trust that during the next five-year cycle, Agriculture Canada will find a way to reach its own performance targets and that effectiveness tests become routine for all spending.