Have you ever wondered how to use all the data your organization collects to measure your success and report to your Board? How do you show whether the organization is doing a good job?
Organizations and their Boards define what a ‘good job’ looks like with a series of objectives. These objectives, known as strategic directions or goals, are included in an organization’s strategic plan. One way to measure these strategic directions is to examine how successfully the organization’s services are being delivered using the data your organization collects.
Volunteer Alberta has five strategic directions. One of our strategic directions is to ‘Facilitate knowledge exchange and access to learning opportunities to strengthen organizations’.
Using this strategic direction as an example, we’ll look at two foundational considerations for reporting on progress toward this strategic direction using the data we collect. The key is to ensure the data tells a meaningful story to the Board.
Selecting a performance indicator
Do you want to tell your Board the number of participants at a training session? Or do you want to tell your Board about whether your clients are more skilled or confident following a training session?
The answer is… it all depends.
The rule of thumb is: eventually, both. Report outputs when your initiative is new and you are just beginning to gather data. Report both outputs and outcomes once your program or tactic has been in place for a reasonable period of time.
Outputs: the scale or number of actual activities that your organization undertook (ex. number of participants at the training session, or the number of training sessions). Outputs answer the question ‘What happened?’
Outcomes: the value or impact of your program (ex. what people got out of the training session). Outcomes answer the question ‘Why does it matter?’
When a program or initiative (ex. a training session) is new, the number of participants and sessions is meaningful to the Board. Once a year or two of the training has passed, outcome-based measures become more relevant. By year two and onwards, the Board wants to know, for example, whether participants are more confident or can apply something new to their jobs as a result of the training. Regardless, outputs (the numbers) are always required for context, as they show the scale of the service (and any growth).
Reporting the performance indicator
Using our data, outputs, and outcomes, how do we report to the Board on our progress toward and achievement of our strategic direction: ‘Facilitate knowledge exchange and access to learning opportunities to strengthen organizations’?
There are multiple programs and initiatives Volunteer Alberta works on to contribute to this strategic direction, and we report on several different performance indicators to share our progress with the Board. One performance indicator might be ‘% of participants who feel they can apply something they learnt at the training session to their job’.
Data over several years is especially powerful because it shows trends. If this percentage falls, it may indicate that the training is not as useful as it once was, or alert us that it is time to review and update the training material.
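For readers who like to see the arithmetic, here is a minimal sketch (in Python, using entirely hypothetical years and survey counts, not Volunteer Alberta data) of how an indicator like this could be calculated and tracked year over year.

```python
# Hypothetical post-training survey results: for each year, how many
# participants said they can apply something they learnt at the training
# session to their job, and how many people responded in total.
survey_results = {
    2016: {"yes": 41, "respondents": 52},
    2017: {"yes": 47, "respondents": 55},
    2018: {"yes": 39, "respondents": 58},
}

for year, counts in sorted(survey_results.items()):
    # The performance indicator: % of participants who feel they can
    # apply something new to their job.
    indicator = 100 * counts["yes"] / counts["respondents"]
    print(f"{year}: {indicator:.0f}% of {counts['respondents']} participants")
```

A year where the percentage drops noticeably is the signal, described above, to review and update the training material.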
In addition to numbers, data includes context and stories. Ex. Did the facilitator change? Is there a particularly inspiring story from a participant that we can share? How is the organization’s communication plan affecting this particular training opportunity?
With the data your organization is already collecting, you likely have a good set of outputs and outcomes, along with additional context, that you can share with your Board to truly measure the success of your work against your organization’s strategic directions.
Have more questions about reporting data to your Board? Ask in the comment section!
Susan Gulko
Volunteer Alberta Board of Directors