Friday Notes, July 11, 2025
Dear Friends —
People who believe that data, analysis, and evidence make essential contributions to repairing the world have often seen a pathway to impact through policy. Document a problem, apply natural and social science concepts to arrive at possible solutions, measure or model the relative costs and benefits of alternative approaches, and then see all that work pay off in a law, regulation, budgetary appropriation, or government program design that makes life better for a lot of people.
In some policy environments and for some types of problems, there’s been a lot of success — witness the salience of evidence use in macroeconomic policy, global public health programs, and cash transfer initiatives in low-income countries.
When there has been success, we’ve been able to learn something about what makes evidence uptake more likely:
Engage the intended users of the data and evidence early and often, taking as a starting point their conceptualization of a problem and an objective.
Deploy methods and researchers that are seen as trustworthy: reliable, valid, and transparent.
Choose your moment of greatest influence, with attention to the policy windows that open soon after elections, in the run-up to a planning cycle, in the midst of a crisis, or when a State of the Union is being drafted.
Communicate in the language that is native to the policy audience, not the peer-reviewed journal.
I have seen with my own eyes that this recipe can work. I also have seen that it often does not.
In policy environments that are hostile toward expert knowledge, and/or when the analysis addresses problems with a contested moral valence, the track record of evidence use is disappointing. In fact, there are probably more visible examples of policies that defy sound evidence than of those that affirm its value. As On Think Tank’s Enrique Mendizabal has written, “Even when the evidence is robust, systemic resistance, political interests, and operational challenges often prevent its full integration into policy.”
Pessimism about the potential for analysis and evidence to make a positive difference through the policy pathway is on the rise as contestation over truth and the critique of the expert class are intensifying in many quarters. But maybe, just maybe, there is another pathway to explore: data and evidence to support the success of social movements.
In many ways, social movement leaders face challenges that can be well served by empirical methods. They have to figure out how to characterize and bound the problem they are working on; they have to understand the types of social changes that will make a real and sustained difference; and they have to discern the messages that change hearts and minds. In theory, these are all opportunities for contribution by people who know how to structure research questions, how to think about and generate representative data, and how to make sense of the text and numbers.
Off the top of my head, here are a few examples of social movements that have benefited in intentional and major ways from analytic work: the movement in India to improve basic education quality and access, culminating in the Right to Education Act; the anti-tobacco movement that led to greater restrictions on tobacco production, advertising, and sales; the awareness raising around the school-to-prison pipeline and the need for prison reform; the protection of the rights of informal workers; and more. In each case, activists, practitioners, and researchers worked together, each bringing unique capabilities.
If and when evidence mavens see that a policy door is shut and turn their attention to working with social movements, I bet they can apply those earlier lessons. They’ll need to learn how to foster early and deep engagement in a spirit of service; make sure the methods and the people using them are trustworthy and trusted; find the right timing for joint work; and focus on clear communication. I also bet that some significant shifts in mindset will be needed, including recognizing and respecting multiple forms of knowing and reorienting toward goals that are harder to conceptualize than a policy “win.” This is not work for the faint of heart, but it may be an extraordinarily powerful pathway.
[Useful work on this topic: Chicago Beyond’s “Why Am I Always Being Researched?” and IDinsight’s “A Framework for Partnerships between Research Organizations and Social Movements.”]
Here is an unsurprising fact: A considerable number of U.S. and European high school and university students submit work that is not their own — application essays, term papers, chapters in their master’s theses or doctoral dissertations.
Here is a surprising fact: Many of the paid but uncredited academic writers are from Kenya.
Oxford professor Dr. Patricia Kingori is one of the researchers who has studied the industry, and writes:
[T]ens of thousands of young and highly-educated Kenyans . . . . so-called “Shadow Scholars” are part of a vast global online marketplace, an invisible knowledge production economy, where students and academics in the global North solicit and pay for their services in exchange for confidential and plagiarism-free essays, theses, dissertations, qualifications and publications.
A new documentary about this is making the film festival rounds. Here’s the trailer:
I had been to the U.S. Open tennis tournament twice before I admitted to my tennis-playing life partner that I had absolutely no idea how the scoring worked. It’s an embarrassment that is seared into memory. So in case you’re in that situation as you watch this year’s Wimbledon finals, here is absolutely everything a person needs to know about the history of tennis’s quirky scoring.
Have a good weekend,
-Ruth