We Want to Learn, Now Let’s Talk About How

Like many organizations, Reboot works hard to learn from what we do. To us, this means continuously reflecting on our methods and processes. But I was inspired to pause and reflect more deeply about what “learning” means during a recent convening of actors who specialize in governance issues. This governance community is eager to understand its impact, and as a result, “learning” is a hot topic of conversation. However, the term is fast becoming a buzzword and runs the risk of turning into a fuzzword. Our strategy to mitigate this risk is to be precise about how we learn.

The convening, held in Rio de Janeiro, was for TALearn: a community of practitioners, researchers, and donors working together to improve learning in transparency, accountability, and participation efforts around the globe. (Read Brendan Halloran’s summary of the workshop here). Led by the Transparency and Accountability Initiative, this community unites around a shared understanding of what “learning” is, and what it is not. As a community, we believe “learning” has been sequestered in the “monitoring and evaluation” phase of international development projects, which traditionally rely on quantitative exercises, and are conducted either after the completion of interventions or at formal, often far-apart, milestones. Learning should instead be continuous and adaptive to relevant needs, contextual structures, and processes. For TALearners, it is a path away from rigid results-based frameworks that only measure outputs, and toward mechanisms that produce observable governance outcomes.

How TALearners Learn

During the four days in Rio, participants shared inspiring examples of how they learn. For example, one donor described a “pre-mortem” process used when they consider support for an organization. This exercise involves mapping possible red flags across the prospective project cycle. It gives the grantmaker an opportunity to troubleshoot challenges before committing formal support, as well as metrics they can use to assess progress over the long term if they move forward.

Another example came from a representative of a non-profit in sub-Saharan Africa, who explained how some donors approach them with problem statements that do not resonate with community needs. To align on a more relevant starting point, the non-profit invites these donors into a co-creative problem definition process. This is an important but difficult action for a grant recipient, as it requires challenging a funder’s assumptions. Yet in their experience, this open dialogue creates vital space for both the nonprofit and the donors to learn, and sets the stage for effective long-term collaboration.

We saw several good practices of learning-in-action during a site visit to Meu Rio, a Brazilian grassroots advocacy organization. Meu Rio constantly tests new ways of mobilizing citizens through online and offline engagement. They run experimental sit-ins, rallies, and automated phone calls to policymakers. They also recruit new team members from the favelas where many of their citizen-driven campaigns arise, and analyze how this helps them successfully influence government to win those campaigns.

These examples motivated me to reflect on our own learning methods at Reboot. I am sharing them in the hope that others will do the same. Some of our peers have already begun to offer their thoughts. Sam Polk and April Knox of Results for Development wrote an enlightening report on learning methods based on their organization’s experiences. Marine Perron and Janet Eng of Fundar share their insights on learning around technology platforms in this post.

If we can collectively take stock of how we are operationalizing “learning” as a community, we can also leverage this information to transform how we assess the impact of transparency and accountability initiatives.

How Reboot Learns

One promising step toward being concrete about “learning” is an emerging dialogue that asks us to define “learning for what and for whom.” Methods that are purpose-driven (i.e., learning for what) and user-centric (i.e., learning for whom and by whom) are critical to answering what we are learning and why. Alan Hudson, the Executive Director of Global Integrity, developed a framework, shared in this post, for his organization’s new learning strategy. It is a useful prompt for the transparency and accountability community to get specific about learning efforts.

I’m using a similar framework here to organize Reboot’s methods of organizational learning, with a focus on our internal “users.” This focus deserves one important caveat: Our client and partner counterparts are an equally critical element of our internal learning environment. However, since we do not direct their learning, I’m focusing here on the four levels of ‘users’ that we can most influence. They are:

  1. The individual
  2. Project teams
  3. Those responsible for learning across projects
  4. Organization-wide directors and strategists

At each of these levels, we operationalize our learning in different ways, depending on our priority “purposes.” For example, some efforts target inclusive, efficient, or effective collaboration. Others aim to improve project execution, such as by strengthening communications with partners. Finally, certain learning activities are designed purely to support core professional development.

Across the board, we aim to develop processes and systems to surface, share, and apply learning in everything we do—whether we are developing an SMS-based feedback system, guiding an organization’s strategy, recruiting colleagues, or communicating across our internal units. Here are just a few examples of our specific learning objectives and methods for the four internal user groups:


User: Individual Rebooter
  • Using Slack, we share knowledge to grow knowledge. We endlessly share articles and reports in our “Reboot Reads” Slack channel to identify and then discuss lessons from our field. Since Reboot works in multiple domains, this is especially useful for learning from sources curated through one another’s expertise.
User: Project teams
  • We develop inclusive processes to ensure all Rebooters are well-versed in ongoing work. Project managers facilitate an inception workshop at the start of every project. Everyone across our operations, design, and programs teams is invited to join these workshops to understand project basics and to offer feedback. Through continual process-focused debriefs and sessions to synthesize observations and formulate insights, we make time to intentionally reflect before, during, and after project cycles.
User: Strategy, Design, and Communications teams
  • We continually refine our internal best practices to make them easier to use. We developed a “library” of past and present projects and core approaches. The library serves as a digital archive of much of our institutional knowledge, so that all team members have access to language and lessons learned as we consider how to tackle a new initiative.
User: Operations and Programs teams
  • We trial new recruitment processes to best meet the evolving needs of our team and our applicants. Previously, we asked candidates to complete an exercise in the final stages of the interview process to understand their problem-solving approaches. We are currently testing an update to this process by asking candidates to complete part of the exercise earlier, and monitoring whether it streamlines hiring.


Opportunities for Further Learning

Reflecting on how we learn at Reboot, I saw two opportunities for growth. One is learning across projects, or program learning. We have improved our ability to learn from individual projects, but we can be more intentional about “connecting the dots.” What do our engagements have in common? How do we tell the stories that tie various aspects of our project portfolio together? How can we use shared project insights to iterate on how we are operationalizing our theory of change? To help us answer these questions, in 2016 we will pilot a recurring organization-wide strategy workshop focused on sussing out common elements across our engagements.

We also have room for growth in individual learning. We are busy people, so we do not always take advantage of all the opportunities to learn. We are not likely to get less busy, but we can discover ways to encourage individual learning within existing constraints. To this end, I am planning a round of internal user research to better understand our own behaviors. Where and when do Rebooters gain knowledge: on a plane? On their daily walk during lunch? On a work-from-home day in the middle of the week? Based on these answers, we can design more appropriate spaces for individual reflection and innovation.

This is only a peek into how Reboot learns, and I hope it is a starting point to a larger conversation. If you have further examples, please share them with us on social media or in your own blog post. We are excited to develop a robust case for how we learn (and strive to learn) in the transparency and accountability community.
