
Why Do an Annual Report? The Discipline of Distilling Learning

A year after our shift from a more field-facing to a donor-facing model, we review our highlights, lowlights, and learnings. It’s not a laundry list of activities, we promise!


It might be a surprise (or simply foolhardy) for TAI to release an annual report within weeks of our Chair suggesting such reports can be a waste of time. We are all too aware that annual reports tend to quickly end up in waste bins (physical and virtual). So why do a report and why read it?

First, preparing the report was a useful exercise for the TAI team and members to reflect on what is working and what is not. 2017 was the first year of implementing our 2017-2019 strategy, and the process forced us to sit together and distill highlights, lowlights, and, above all, our learnings. These informed our early 2018 retreat and work planning. We hope the structure also makes this an easy and worthwhile read.

Second, for the first time, TAI’s annual report will double as our grant report. As part of a strengthened commitment to being smarter funders, including making life easier for grantees, our members are initiating a shared reporting pilot. It makes sense that TAI – which relies on a grant structure to support the collaborative – be a test case. The donor members have agreed that TAI’s annual report and a half-yearly update will serve as the grant report for all. This further reinforces the need to structure the report around learnings, so that it offers useful insights while also meeting fiduciary and compliance reporting requirements.

So, what are examples of our learnings? Read our 2017 annual report for those specific to shared member priorities on effective data use, tax governance, closing civic space, and learning. Aggregating up, here are a few starters:

  • Be on the same page – Despite agreeing on these priorities, our members did not have a shared understanding of what each meant by “grant-making practices”, “member collaboration”, “data for accountability”, and “limited civic space”. It was also not always obvious how each member’s individual strategies connect to the collaborative’s shared priorities and interests, so we focused on foundational work to identify common goals and problems across members’ portfolios and facilitated shared understanding of how the dots connect. For example, we mapped our members’ approaches and developed collective pathways on accountable governance and tax issues (see related blog). To better understand the conditions necessary for transparency to lead to accountability, we developed a new framework with the Open Data Charter. We commissioned a study and survey on how grantees working on transparency and accountability experience closing civic space. We identified what different types of member collaboration could look like.
  • Collaborative learning requires an effective learning infrastructure – Learning together is critical to our strategy. We developed a plan for Collaborative Learning, Monitoring, and Evaluation to support and document instances of member collaboration, as well as adaptations in grantmaking practice. We started to build and review evidence, and explored models for our staff and donor members – alongside our key stakeholders – to contribute to and learn from evidence and experience in the transparency, accountability, and participation field.
  • We need to bust silos – Collaboration requires effective communication and breaking down silos within organizations and across sectors. Some work streams spill over into other priority areas (e.g. data and tax; data and civic space on privacy), which was both an opportunity and a challenge. We found that deepened engagement with member organizations, beyond the Steering Committee and principal points of contact, was valuable. This was no easy task, and we continue to iterate and explore models that could effectively reach Project Officers with our activities and learning products. While we are mainly inward-facing, we connect with other stakeholders and engage on relevant issues (e.g. open contracting, natural resource governance).
  • Be flexible without losing focus – We continue to seek the right balance between fixed and clear target outcomes (our “north stars”) and a degree of flexibility that lets us respond immediately to needs in the TAP field. For instance, we scaled down our work on data privacy, in part due to bandwidth issues, but mostly as a reflection of varying levels of member engagement on the agenda. Engaging with other stakeholders and areas in a “light touch” way allows us to keep abreast of opportunities and challenges in the TAP space and better identify where collective TAI member action may have the greatest impact.
  • Find a governance structure that works – Experiences this past year affirmed the importance of having an assigned lead from each member for each work stream. Streamlining decision-making and providing consistent and timely feedback among the Steering Committee, member project officers, and TAI staff was also critical, and we continue to explore more effective mechanisms.

There were operational hiccups – staff felt “stretched thin”, there was a backlog of learning products, and calendar coordination and paperwork were a drain on resources. For how we will adapt, see next week’s blog introducing our 2018 work plan.