From Refugee Camps to Research Labs: A Journey in Evidence-Based Education

Evidence-Based Programming · By Nora Marketos · Published on February 7

Evidence-based education is an approach that is becoming increasingly prominent across the broader education sector. In my view, it holds a lot of promise and should become the norm. I've been working on this from both a funder's perspective and an implementer's perspective, so I am well aware of the challenges and misaligned incentives that stand in the way of making it a reality.

In today's blog I have the pleasure of sharing with you a conversation I held with Jenny Perlman Robinson, the Senior Lead of Growth and Innovation at Youth Impact. I met her during my time at the Jacobs Foundation, when we collaborated in the context of the "Real-Time Scaling Labs" that she developed and led in her previous role at The Brookings Institution. Ever since, I have been impressed by her passion, her genuine interest, and her sheer depth of expertise in all things education, innovation, evidence, and scaling. She is also the right person to talk about what it means, from a career perspective, to work on these topics.


Dear Jenny, thank you so much for taking the time to share your reflections with me. Let's start by looking back at your career: your path shows a progression from direct humanitarian work with the Women's Refugee Commission to policy research at Brookings. How did your field experience in education in emergencies inform your later policy work and research on scaling education initiatives?

I have to say it was really fun reflecting on my career path. I'd love to say it was all so deliberate and well thought out, but honestly, it was more serendipitous – though quite connected in retrospect. At the Women's Refugee Commission, at the time part of the International Rescue Committee, I had the incredible opportunity to spend time with displaced communities across Afghanistan, Pakistan, South Sudan, Darfur, northern Uganda, and so many other places. Our role wasn't direct delivery but advocacy – we were really trying to document the experiences and proposed solutions of women and children in refugee camps and bring those to the various powers that be.

During those seven or eight years, I think I was struck by a few key things that really informed my work going forward:

First, what was really brought home was this very short-term project mindset in the humanitarian field. I mean, by nature of the work, a lot of it was frankly putting out fires, literally and figuratively. And when you consider that people were displaced for an average of 17 years – that's an entire education cycle for children – it really begs the question of what responses are actually needed to make it sustainable, right? This isn't just about a quick catch-up program but an entire generation missing out on an education.

Second, there was this really stark disconnect between the humanitarian side of things and the longer-term development needs. The humanitarian responses were food, water, and shelter, while the development needs were longer-term things like education and livelihoods. And you know, I'm certainly not the first to talk about this humanitarian-development divide, but it was definitely very profound and apparent to me at the time.

Third, and I'd say this really informed my work going forward, there was such limited evidence available to inform these humanitarian actions. Research was, I think for many valid reasons, seen as a luxury or a nice-to-have. When you're operating in an emergency crisis situation, responding rapidly with limited funding on these projectized cycles, the idea of crowding in data and evidence just wasn't really happening.

So, I think all of this really challenged me to think differently about my future work, you know? How can we think less about making short-term projects more effective and more about what sustainable change actually requires? How do we look at not just these discrete moments of crisis, but this whole continuum of crisis, post-crisis, rebuilding, and stability? And how do we bring research and practice closer together, making evidence not just a nice-to-have, but really a must-have in service of improving programs?


Thank you for sharing those highly valuable and insightful reflections on your trajectory and guiding questions! Having worked both in implementation and research roles, what do you see as the key disconnects between academic research on education development and on-the-ground realities? And how might these gaps be bridged?

You know, when we look at education development, there's traditionally been this divide between researchers and implementers operating quite independently – sometimes people talk about these as two artificial camps of "thinkers" and "doers." But that's clearly a false dichotomy, right? These two groups need to be operating much closer together.

The challenge is that researchers and implementers often have different motivations and incentives pulling them in different directions. Researchers are typically rewarded for getting published in peer-reviewed journals and producing novel work, while implementers are held accountable for increasing outputs – historically things like numbers of children reached or books distributed, though this is increasingly shifting toward impact and outcomes.

Add to this the timing mismatch: implementers face shorter-term project cycles, while research often requires longer timeframes – just think of a traditional Randomized Controlled Trial (RCT), which can take more than two years. Though I should say, as I'm learning in my new role at Youth Impact, there are ways to do experimental research more rapidly – it doesn't always have to be two-year-plus RCTs.

The funding landscape doesn't help either. While some funders like the Jacobs Foundation have shown real appreciation for combining research and action, funding still remains quite siloed. You have implementation funders focused on delivery, and research funders who are less willing to support implementation.

But you know, I'm actually quite optimistic about how we're bridging these gaps. We're seeing more efforts to bring these worlds together – like implementation science focusing on closing the gap between what we know and what we do, the work we did together at Brookings on Millions Learning and the Real-time Scaling Labs, the Abdul Latif Jameel Poverty Action Lab (J-PAL) and Innovations for Poverty Action embedding evidence labs in government structures, and the What Works Hub at Oxford University focusing on evidence uptake by governments. And at Youth Impact, where I am now, we're really founded on this principle of bringing evidence and action together. We're first and foremost an implementing organization, but with a robust research team of PhD researchers working to improve programs. It's like a well-oiled machine where incentives are aligned and the structures are all pulling in the same direction. I'd love to see more of this kind of integration in the field.


Thank you for sharing those fascinating insights about bridging research and implementation, especially through examples like the Brookings scaling lab and Youth Impact's approach. Speaking of Youth Impact, I'm particularly interested in your in-house team of PhD researchers working in service of implementation. What have you learned about making this collaboration successful, and what advice would you give to academics looking to transition into implementation roles?

I'm still in my first year at Youth Impact, so these thoughts might evolve – we should circle back sometime and hopefully I'll have an even better response!

I think a lot of it starts with the organization's DNA, particularly our founders. Noam Angrist is this brilliant researcher and academic scholar, while Moitshepi Matsheng is all heart and action, with this incredible way of engaging government bureaucracy and people. That combination created a powerful foundation for bringing these worlds together. It attracts people who are serious about evidence but see it in the service of action on the ground. I mean, sure, our work gets published in prestigious journals, but that's really a byproduct – it's not what drives us.

There's also this very deliberate intention to have research and delivery working hand in glove. When I'm on a call with our partners in Ethiopia or India, you'll have both a research person and a content delivery person there. What's really struck me is how all our meetings start with looking at the data – it's become this natural muscle, asking questions like: What are we seeing in the numbers? Are we reaching our targets? If not, why not? It drives our implementation decisions.

And finally, we've been incredibly fortunate with our funders. They really get what we're trying to do and understand its iterative nature. They know that what we're proposing probably won't be right at first, and they expect us to learn and adapt. They're funding experimentation and rapid learning, which I hope won't remain so rare – we need more funders like this out there.


Throughout your work on education in developing countries, how have you approached being a white Western researcher and practitioner working on African education issues, particularly in the context of the current localization trend? How has your perspective on this evolved over your career?

This is in many ways my existential crisis, to be honest, and maybe that's good – I hope I never stop asking myself "what is my role in all of this?" Whether I was visiting women in refugee camps in Afghanistan or now working with Youth Impact, a Botswana-based organization, I'm constantly thinking: What am I doing here? What value am I adding? Asking local counterparts for participation, for example in research, taps into precious resources, and I want to be incredibly clear about that.

I've always been uncomfortable with terms like "expert" for those in international or northern-based organizations. At the risk of sounding trite, I believe those living the day-to-day realities are the true experts. Those of us coming from the outside are here to support whatever they want to achieve, full stop. But I'm also aware that even listening can be extractive – time and information are valuable resources. So I try to be very thoughtful about what I do with that knowledge, with something someone has entrusted to me, and how to make myself most useful.

For me, the role looks different in different instances. Sometimes it's bringing in perspectives from elsewhere, crowding in global evidence and experience, or helping connect dots when people are focused deeply in one particular area. Sometimes it's about bringing in resources or providing neutral, structured space for exchange. But fundamentally, it comes down to listening and then being very deliberate about what I do with what I've heard to best support what folks are trying to achieve.

And you know, while I fully support the localization agenda and pushing more resources to the local level, I do think there's still a role for global actors – but the allocation needs to shift. The lion's share going to international players while local people doing the majority of the work get a trickle? That obviously needs to change.


Having worked on scaling education initiatives for many years, what's your perspective on the current "scaling trend" in international development? There seems to be a push to categorize projects as either innovation or scaling, as these labels are currently in vogue, despite the important spectrum of work in between. How do you see this evolution in the field, and what are its implications?

This is a fun question since we started this work together! When we began at Brookings 10-15 years ago with the Scaling Labs, there was much less attention on scaling in global education. You had work on scaling in development broadly, and in specific sectors like health and food security, but education was different. Over the past two decades, I think it's net positive that we're seeing more interest, attention, and funding going towards this.

Of course, there are risks – it could become just another fad that doesn't last, or turns into empty jargon that means different things to everyone or nothing at all. I think we still often think about scaling simply as increasing numbers, and there's more work we can do as a community to not lose sight of impact, sustainability, and equity. That's become the second-generation issue we need to unpack further.

I do think donors can play an important role in this growing momentum around scaling. Rather than setting up a pie for everyone to compete over, how can we create more opportunities to collaborate, learn, and share? While competition can be healthy, in this case, it might be counterproductive.

The innovation versus scaling divide is interesting. Previously, innovation was the big fad – the shiny, sexy new thing, whether technology or whatever else. However, I have come to appreciate that innovation isn't necessarily about creating something new – it can also be about bringing a tried-and-tested approach to a new location or to more people – in other words, a form of scaling. So, in this sense, innovation, adaptation, and scaling go hand in hand. I see this at Youth Impact now, where we maintain that innovative spirit of experimenting and testing while thinking about how to achieve larger-scale, sustainable change. It's really about bringing the two approaches together rather than seeing them as two ends of a spectrum.


As a data-driven organization, how is Youth Impact thinking about and planning to leverage AI in your work? What opportunities and potential impacts do you see for using AI to advance your mission?

Youth Impact is experimenting quite a bit with AI, and my colleagues in India are probably furthest along in this thinking. They’re doing fantastic work with the government in Karnataka, implementing a version of Teaching at the Right Level – a low-tech, one-on-one tutoring program called ConnectEd.

Specifically, the team is looking at AI in several ways. First, for cost efficiencies, particularly helping teachers who gather and submit student learning data for analysis. We're exploring ways to reduce their administrative burden and friction points. Then there's the potential for AI to help deliver the program – whether through nudging parents and caregivers about upcoming phone sessions with students, or delivering follow-up tutoring sessions directly. Through this experimentation, the team approaches AI as a tool to support educators rather than replace them.


As you reflect on your career trajectory and your recent move to Youth Impact, what excites you most about the next chapter? And what advice would you give to those just starting out in evidence-based education programming?

Oh goodness, there's so much there! But at the end of the day, for me, it's really about interacting with people. Having that opportunity for human exchange has been profound for me. The idea of being a lifelong learner, constantly learning from others – whether it's researchers with their rigorous methods, teachers I just met in the Philippines delivering Teaching at the Right Level in their classrooms, or policymakers trying to disrupt education systems – that opportunity to learn from folks is just such a gift and privilege.

And what really motivates and drives me is trying to make whatever small difference or impact I can in the world. That’s been my North Star. It comes back to the essence of what guides me: Where do I feel like I can make the biggest difference in my own small way? How do I keep pushing and challenging myself? And certainly, learning and enjoying myself while working with phenomenal people along the way doesn't hurt.


Thank you so much, dear Jenny, for sharing those reflections with me. I wish you all the best in your exciting new role at Youth Impact and look forward to staying in touch!