Long time readers of this blog might remember a video SBNation produced that would show up under our stories. It showed a dramatic burning flame background and carried on about how the Canadian teams were all bad. A narrative that sorts the NHL teams based in Canada into a group and claims they all have some characteristic in common seems to appeal to sportswriters on both sides of the border. It’s easy to frame a story that way. But even when those teams were all out of the playoffs in the same year, the differences between them were greater than the similarities.
We know that the various teams that happen to be in Canada are vastly different, not just in quality on the ice, but in market size, the financial health of the franchise and the quality of the management staff. So with that understanding of how things work, this CBC story making the rounds needs a skeptical look:
Pro sports foundations get poor marks from charity watchdog - CBC News
Canada's pro sports teams may have strong track records, but when it comes to their charitable foundations, a national charity watchdog says fans may want to consider giving money elsewhere.
The framing device of lumping a group of organizations together by type and applying a single and dramatic characterization sure looks familiar. The report the story is based on takes things further:
Advice from the Coach’s Corner – Keep your stick on the ice and your wallet in your pocket when it comes to giving to professional sports foundations.
Be a fan, cheer your favourite team. Yet be careful not to confuse supporting your team with donating to your team. The way professional sports foundations raise money, giving to them should be thought of as primarily entertainment, not intelligent giving.
That’s a very bold generalization that relies heavily on the definition of “intelligent giving”. It also generalizes from the foundations associated with seven NHL teams, the Toronto Blue Jays and, via MLSE, the Toronto Raptors and other MLSE holdings, while ignoring all other professional sports including the CFL.
How did Charity Intelligence Canada (Ci) arrive at this conclusion, which the CBC took and ran with?
Our approach to charitable giving is different. Just as a financial analyst researches potential stocks to find the best investment opportunities, Ci uses similar research methods to find exceptional charities for donors. Ci also helps donors create a balanced giving portfolio to best reflect their giving interests and the change they hope to achieve. As a result, donors have the tools they need to give better and get higher returns on their donations - donations based on evidence.
This means that more of what donors can afford to give goes toward making a difference. Charity Intelligence’s analysis goes beyond plain subjectivity or narrow financial analysis to dig deeper to arrive at those charities proven to be the best in their field: not just ‘do gooders’ but ‘good doers’, too.
Ci created a weighted ranking system that combined four basic measures into one result which is styled as a star rating from one to four. They’ve applied this ranking to approximately 700 Canadian charities.
This ranking is a very counter-intuitive thing for a hockey fan who is used to seeing three stars handed out at every game on quality of play. Some of Ci’s ranking methodology is based on performance metrics, but the bulk of it is assigned on the basis of reporting transparency. What they are explicitly not doing, and they take pains to remind people of this, is making a value judgement on the quality of the work the charity carries out.
Charities are rated based on:
40% Donor Accountability - a charity’s grade on the quality of its reports, its social results reporting
20% Financial Transparency - audited financial statements provided
20% Needs Funding - cash and investments relative to what it cost to run charity programs for 1 year, called “program cost coverage”
20% Cost-efficiency - 15% for fundraising costs and 5% for administrative overhead within a reasonable range of 5%-35%
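Expressed as arithmetic, the model above is just a weighted sum collapsed into star bands. Here's a minimal sketch of that idea, assuming each category is scored 0 to 100; the weights are the ones Ci publishes, but the star-band cut-offs are invented for illustration, since Ci doesn't publish theirs:

```python
# Illustrative sketch of a weighted star-rating model like Ci's.
# The 40/20/20/20 weights come from the report; the star-band
# cut-offs below are assumptions made up for this example.

WEIGHTS = {
    "donor_accountability": 0.40,    # quality of social results reporting
    "financial_transparency": 0.20,  # audited statements provided
    "needs_funding": 0.20,           # program cost coverage
    "cost_efficiency": 0.20,         # fundraising + admin overhead
}

def weighted_score(scores: dict) -> float:
    """Combine the four category scores (each 0-100) into one weighted result."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

def star_rating(score: float) -> int:
    """Collapse the continuous score into a 1-4 star band (hypothetical cut-offs)."""
    if score >= 90:
        return 4
    if score >= 80:
        return 3
    if score >= 55:
        return 2
    return 1
```

With these assumed cut-offs, a foundation that scores perfectly on the three financial measures but only 40 on Donor Accountability combines to 76 and lands in the two-star band, which shows how a weak reporting grade can cap an otherwise strong charity under a weighting like this.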
The cost efficiency and needs funding categories are based on hard numbers provided in the audited financial statements or in the public CRA tax filings. Cost efficiency is just a rating on a sliding scale that assumes less spent on fundraising costs or administration is automatically better. Needs funding is actually designed simply to down-rate charities with more reserve cash, not measure the overall sustainability of the organization.
The other two categories are about how and what the charity reports. Financial Transparency is a sliding scale of how and how quickly charities provide audited statements. The publicly available tax filings are not audited, and financial statement data is more reliable.
The tricky part of Ci’s ranking model is the big one: Donor Accountability. It is an assessment on the quality of reporting each charity does about how and on what they spend the donations they’ve collected. This is not a rating of the quality of programs. The in-depth explanation of how Ci scores this shows they have a methodology that is tailored to the type of charity, and they report this back to the charity in confidence.
For anyone even passingly familiar with model-based rankings or measuring of outcomes using systematic means, this all sounds very good. This is exactly the way a WAR model for players is created, by establishing criteria and weighting multiple factors into a single result. The difference here is that the results can be only one of four ratings, or five, if they ever give out zero-star ratings.
Every model for ranking hockey players begins with some measurable aspects of their results that are proven to be significant, but every model also includes some assumptions. They wouldn’t exist without them.
In this model to rate charities, Ci is rating, in part, on the very thing Ci needs to produce its reports. They have made the assumption that donors’ access to the information necessary to make good decisions is of high importance. They are an advocacy group promoting the point of view that donors should be making their decisions about how to give their money based on maximum bang for the buck or “intelligent giving”. Their expressed goal is to increase the amount of reporting charities do, and while donors might actually want that, Ci needs that to function, so they aren’t advocating from a neutral position.
The “Social Reporting” grade is the Donor Accountability measure worth 40% of the model’s output. It is supposed to be a guide for curious donors to tell them if enough information is disclosed, not a quality judgement on programs. However, it is calculated relative to the other charities analysed, not relative to a benchmark that says yes, this is enough.
Ci’s data shows that two stars is the most common rating, given to just under 40% of charities; the three- and four-star charities combined account for slightly more than that, but still less than half of everything they analyze.
The way I read their rating methodology, the only way a charity can get to three stars is if their Social Reporting and Financial Transparency score is high. Even a charity that just funnels money in and out with no overhead and little fundraising costs — as some foundations do — would need to be meeting Ci’s standards on reporting to score over two stars.
In Ci’s report on sports foundations, none of the eight foundations analysed gets over two stars. MLSE Foundation comes the closest, since the report details show that they are the only one to get full marks in the financial disclosure category and their cost efficiency and program funding is good as well.
The Social Reporting grade for MLSE Foundation did them in:
When it comes to the foundation’s bottom line results – the work they do, the grants they make, and the results achieved in helping kids or sports – little information is provided to donors. Only Jays Care Foundation and Ottawa Sens Foundation donor reports earn a B+ grade by Charity Intelligence donor accountability scoring. Other team foundations earn between B and C-. Poor donor reporting on its results holds MLSE Foundation at a 2-star “average” charity rating, rather than a 3-star “good” rating.
In contrast, the Calgary Flames Foundation takes in around $4 million, with 65% of that coming from the 50/50 draw held at games. And yet, somehow this organization manages only 30 cents spent on programs for every dollar raised, the lowest of the charities covered in the report, and under the threshold that CRA considers cause for concern about fundraising practices. MLSE Foundation’s was at 79 cents, over the average for a Canadian charity and the highest in the report.
The Calgary Flames Foundation received a better grade for their reporting on how they spend their money, but they don’t publish their financial data publicly. If they had, they might well have achieved a two-star rating from Ci’s model, the same as MLSE Foundation.
For a rebuttal to this report from John Bean of the Calgary Flames Foundation, listen to this interview:
Related
John Bean discusses the questions surrounding the Flames Foundation - Sportsnet.ca
In making Social Reporting the biggest single factor in their model, Ci is placing a heavy emphasis on their system of analysis, which is complex and also confidential as to the way each charity is graded. It is here that Ci is making a value judgement — not about the charity, but about donors. They are promoting the idea that giving to a charity should be motivated not by your own enjoyment of an event, but by your informed opinion on the efficiency and value of how the charity operates. That sounds good on the surface, but it doesn’t quite tally with how people operate in the world.
The truth is, people give to charity for a lot of reasons and one of them is to feel good about themselves. Ci wants you the donor to care mostly about how much information the charity provides. They also want you to care about how much is spent to raise the funds. This is a grey area in judging charities, but this point from the CBC article is worth considering:
“When you’re talking about a charity that has access to sports celebrities and they have access to all sorts of resources, you would expect that they are in a better position than your average charity to fundraise. And an argument could be made that they should be able to fundraise at an even lower cost,” says Mark Blumberg, a Toronto-based lawyer who works closely with charities across the country.
Blumberg also addressed an issue affecting some of the foundations profiled, namely that they are holding onto a lot of the money they have raised:
“I think that it’s questionable that an organization is doing active, vigorous fundraising when they are potentially sitting on very large amounts of money… So I would question why do they have such a large reserve,” said Blumberg.
He said there can be good reasons why a charity would build up reserves, for, say, a capital project or to ensure they can at least satisfy their commitments to other groups, but in his view this needs to be justified.
One thing to bear in mind here is that Ci, who include three years of data on their website, made this report from one year of activity. No allowance is made for an organization having an unusual year of savings or spending.
Another expert quoted by the CBC thinks you should look at outcomes to judge a charity:
Bruce McDonald, president of Imagine Canada, a charitable sector research and accreditation group, says ultimately donors really need to look for tangible outcomes in the community.
His group encourages people to “not only look at costs but as well really look and say ‘Are the programs that are being operated, are they working? Do they make sense?’
”And then, as a donor, am I comfortable with the investment that’s going to be required to have that kind of success?”
What McDonald is talking about is exactly what Ci is concerned with in their social reporting grade, but Ci might well be setting standards so high, well beyond what we as donors really require, that the end result flattens the majority of charities into a single not-good-enough two-star rating. It’s a little like making a WAR model that ranks all the players below the superstars as equally mediocre. You need distinctions between results finer than a binary good/bad.
Ci is going so far as to tell us fans in their report that we should not give our money to sports foundations as a class of charity for any reason other than entertainment. This goes too far. By grouping charities that have good financial results but don’t report information the way Ci likes in with organizations that have legitimate red flags over their fundraising and spending activities, the conclusion that they’re all equally bad choices for donation looks made for a catchy headline. It isn’t supported by the report’s information unless you buy in on their model’s assumptions.
Sometimes a model says more about its makers than about the subjects it models.
MLSE is a hybrid charity, one that raises funds and distributes them to others, but also one that runs its own programs. Their website contains extensive information about their programs, and is very direct that they are all about sports and fitness. They are aiming their programs at racialized or economically disadvantaged groups, primarily children and youth, in Toronto. They are, in other words, taking the money they raise, a lot of it from attendees at games who shelled out big for tickets, and directing it at the part of Toronto that will never come to a Leafs game.
If you go to the MLSE Foundation website, you can read through the multiple pages about what MLSE Foundation does, and then you can move on to the MLSE Launchpad site and find a lot of information on their programs and services.
A look through the Calgary Flames Foundation shows a very similar looking focus and program support on a much smaller scale.
Ci doesn’t clearly articulate why one charity’s reporting is better than another’s, or ever tell me what it is they think MLSE isn’t disclosing that I need in order to judge; meanwhile, what they sweep aside as “too little” is more than I have time to read. What I have read fits in with my beliefs that professional sport in Canada owes an obligation to the young people least likely to ever grow up to play on their teams, and that they should be levelling the playing field for the benefit of those kids and society as a whole.
I researched MLSE Foundation when PPP chose to promote the MOTB initiative. If ever there was an example of charitable giving that’s about entertainment and sharing fandom as well as doing good, it’s that one.
Related
Leafs Money on the Board has Maple Leafs fans coming together for a good cause
Ci starts out with the premise that efficiency trumps emotion in who you should choose to give to. But they seem to be missing the point that emotion makes people give in the first place.
MLSE Foundation, just by what they do, is helping us all to confront who, in the city that our team plays in, gets to dream of being on the Leafs one day and who doesn’t. Donors are free to give that aspect of their work value. I do. I’m really struggling with this report and the CBC headline that equated MLSE Foundation with the Flames Foundation and everything in between, particularly given how much MLSE Foundation spends and how much the Calgary Flames Foundation banked in this one year.
I don’t want to put too much weight on the proportion of donations that gets spent on programs. At scale, spending a lot on a fundraising event can return millions in donations even if the proportion looks bad. Ci does not make that sort of nuanced judgement, but they do seem to be prioritizing access to the data they need to run their website over financial integrity.
I agree with the criticisms levelled at many of these charities over their financial reporting. I assume that is bureaucratic inertia, though Ci speculates in their report about league influence and darker motives. I’d go further and say some of these foundations look like they are PR exercises first and charities a distant second, but the levelling of all of them into one category and telling you to keep your wallet in your pocket seems unwarranted in the extreme. The further generalization of that conclusion to all charities of this type that haven’t been analysed is particularly unsettling given that Ci lays claim to a scientific methodology underlying their results.
This isn’t a sports media site chortling over a few teams’ place in the standings. This is an organization advocating against you giving money to these charities or any others of this sort. If you google Calgary Flames Foundation, the CBC story is a top result.
I seriously question the value of Ci’s modelling and the way they draw their conclusions. More than one of these foundations is not like the others by the way I weigh and measure things.