What makes great evaluation
In this section we talk about evaluation specifically, not just tracking or assessing impact. That’s because an evaluation goes beyond the question: what happened? Instead, an evaluation plan presents a formal approach for determining the extent to which certain objectives have been achieved and how the film project contributed. Think of evaluation as learning about how well your Strategic Plan is working.
As with all other stages of this process, your evaluation will be unique to your film. That’s why we emphasise learning and tracking.
Learning
Great evaluation is not only about understanding whether the objectives of a film project were achieved, but also about learning throughout the project.
One of the most common mistakes is to think of evaluation as a final product, rather than a process. It’s an easy mistake to make, because you’ll probably pull everything together into an evaluation report at some point. But, while that’s an important part of the process, the journey and what is learned along the way is often just as important (and sometimes more).
The right way to think about evaluation is as an iterative process of learning, which starts as soon as the film is put out into the world, or even before that, as you work with potential partners and stakeholders to prep for broadcast or theatrical release. And, because it is ongoing, it means you are in a position to react to what you are learning and refine your impact strategy - and even your evaluation approach - along the way, making the most of every opportunity.
What you track and measure for your evaluation - and the results - will ideally tie back to the unique goals and objectives you identified in your vision and Strategic Plan, while also accounting for any strategic midcourse shifts. In other words, your evaluation plan should align with your impact strategy.
A strong evaluation plan will look at both the influence of the film and the contribution and impact of the campaign around the film. For example, the film itself might deepen an audience member’s understanding, while the event built around a screening is what prompts them to take a specific action. A strong evaluation plan will take stock of, and account for, these distinct but intersecting components of the campaign strategy, so that the evaluation can uncover which aspects of the campaign are working as intended and which may need to be adjusted.
A strong evaluation plan is proactive and inclusive. Share it with funders and partners. In fact, create it with them. Ask what matters to them, why they funded the project, and consider their goals as well as your own. Done right, evaluation will reveal information that multiple and varied stakeholders are looking for whilst ensuring - and informing - a strategic and powerful path forward.
A strong evaluation is also honest and attentive to the role of a film within larger contexts. In other words, it keeps the film in its place rather than centring it more than necessary.
A strong evaluation knows its limitations and states them clearly. In other words, it knows what it can say and what it can't say. It’s clear on what it can back up with concrete evidence and sound analysis versus what it suspects or assumes. Correlations are fine in evaluation - and they’re often the only thing that is possible, since it can be so difficult to accurately measure causation. Just be clear and upfront about your process and analysis so that your reader doesn’t interpret such a finding as causation.
Yes, time and resources are limited. But if you applied for and received funding, then some kind of reporting will be necessary. You can use those requirements as a starting point for what kind of evaluation to pursue, even if you do nothing else.
This is why we were so careful to emphasise in the Planning chapter that the Strategic Plan you came up with was only version 0.1. Because the unexpected will happen, and evaluation will help you to respond to it faster.
Pro tip:
Start early, start small... just start
The worst thing you can do is build evaluation up into such a big deal in your head that you don’t start collecting some data, and learning something, right from the beginning. Even if you just start by asking one question of everyone at a rough-cut screening – that’s a start. You can always improve it next time.
Here are 4 easy steps to get started:
- First, lay out your goals for the project (hint: you did that at the start of this guide).
- Next, brainstorm the kinds of outcomes your activities could prompt on the way to your goals (hint: better understanding, new commitments or collaborations, new narratives, etc.)
- Then, identify all the kinds of observable and collectable data you could gather to indicate that those outcomes are happening and determine how you will collect data (hint: the indicator matrix in 6.5 can help).
- Start collecting and tracking your data! (Eventually you will analyse it and it will go into your final report - see the simple tracking sketch below.)
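If it helps to make the idea concrete, here is a minimal, purely illustrative sketch in Python of what a tracking log built around goals, outcomes and indicators could look like. All of the goal, outcome and indicator names below are hypothetical, and a shared spreadsheet with the same columns works just as well.

```python
# A minimal, hypothetical sketch of an evaluation tracking log.
# The structure is the point: goal -> expected outcome -> observable indicator
# -> data points collected over time. A spreadsheet does the same job.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class Indicator:
    goal: str        # the Impact Goal this rolls up to
    outcome: str     # the change you expect to see on the way to that goal
    indicator: str   # something observable you can actually collect
    records: list = field(default_factory=list)  # (date, note) pairs

    def log(self, when: date, note: str) -> None:
        self.records.append((when, note))

# Hypothetical example entries - substitute your own goals and indicators.
tracker = [
    Indicator(
        goal="Audiences deepen their understanding of the issue",
        outcome="Viewers report a change in understanding after screenings",
        indicator="Exit-survey question: 'Did the film change how you see the issue?'",
    ),
    Indicator(
        goal="Decision-makers engage with the issue",
        outcome="Policymakers reference the film publicly",
        indicator="Count of speeches or statements that cite the film",
    ),
]

tracker[0].log(date(2024, 3, 1), "Rough-cut screening: 18 of 25 answered 'yes'")

for item in tracker:
    print(f"{item.goal}: {len(item.records)} data point(s) so far")
```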
We’ll dig into this in more detail in the next sections. But first up, a case study to bring it all to life.
Case study: The End of the Line
The sample Impact Plan from The End of the Line that you’ll find in the planning chapter is a long way from where the team thought they were going when they finished the film. Initially, their focus was firmly on changing political structures – and specifically on achieving a ban on bluefin tuna fishing. In their working theory of change, public awareness served mainly as a means to gather signatures on a petition to that effect.
But entry and exit surveys carried out at two London launch screenings of the film highlighted a different role for the public. The film achieved the major increase in awareness and perceived urgency that the team had hoped for, with an 85% rise. What they didn’t necessarily predict was the commitment to personal behaviour change: across the whole audience, commitment to buying sustainable fish almost doubled, from 43% to 84%, after a single screening of the film. Although both remained important, this refocused the team’s approach more on personal consumption than on petition signing.
Another shift came soon after. Changing the procurement policy of restaurants and food retailers was part of the strategy – but, relative to the focus on political change, not the main part. But then British supermarket Waitrose stopped selling swordfish and joined as a distribution partner for the film. And Pret A Manger founder Julian Metcalfe, after attending a preview screening, announced a total change in fish sourcing policy on the day the film launched. Then another major UK grocery chain, Marks & Spencer, announced a change to their tuna sourcing in the week after the film opened.
It became clear to the team that corporate policy change could be directly triggered by the film, and that this could make a massive impact - both directly on overfishing and, through corporate communication to customers, on public awareness. They realised this would be an alternative way to build pressure for political change. So they shifted strategy and began to deliberately seek this kind of change – resulting in policy changes from major food brands like Sodexo and Compass, and from restaurants and celebrity chefs including Jamie Oliver, Tom Aikens, Antonio Carluccio and Raymond Blanc.
The film team gathered data, thought through the implications of early successes, reoriented their strategy, and moved. That’s great evaluation.
If you’re curious, open, and take an evaluative approach to your campaign development, then you’re likely to make changes to your strategic planning. We see this over and over again. Take Blackfish. The team didn’t set out to campaign with their film; initially their main goal was to raise mass public understanding of a previously unrecognised issue. But, as public sentiment swelled, the film came to the attention of animal advocacy groups, who went on to draw attention to the issues the film raises through their own campaign work - a campaign initiated and driven by those groups rather than by the film team. It was an unexpected turn in the film’s trajectory, but it ultimately produced an extraordinary, organic and unprecedented impact, known as ‘The Blackfish Effect’.
Tracking
Great evaluation tracks more than numbers.
Perhaps the greatest fear around evaluation stems from the idea of reducing everything to numbers. Damn right. Great art cannot be reduced to a formula, and neither should your evaluation be. You also need qualitative data - data that helps you understand how and why your film is having impact.
Just as you would gather material for a film - where you collect lots of small interviews, stills and B-roll footage together in a file - collect every article about the campaign, list every community screening (ideally with audience estimates), file every 'your film changed my life' email and make a note of every local politician who referenced your film in a speech. Everything.
Like the film material, this all adds up. Over a period of months and years, you’ll start to create a rich picture of your film’s journey through the world and how it has affected people and institutions and contributed to movements.
Yes, this takes up time, but it needn’t be overwhelming. Put regular evaluation time in as a team, ideally every week - because it’s most seamless and works best as a little-and-often process. Setting aside time in your diary and sticking to it is a super-simple but very effective way of making sure you keep on top of it.
Great evaluation tracks consistently
Lindsay Green-Barber of Impact Architects suggests film teams systematise their data collection. One of the most valuable and fundamental principles of evaluation is consistency. In other words, if you ask all your partners the same three questions, you have a basis for making meaningful statements about what is happening across partners.
But don’t be deterred from asking more tailored and customised questions as well. While having a set of core questions that are asked consistently across partners is important, it may also be valuable to ask different questions of different types of partners or audience members to shed light on their unique backgrounds, contexts, and/or participation.
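As a purely illustrative sketch of that ‘consistent core plus tailored extras’ idea - the question wording and partner types below are hypothetical, not a recommended instrument:

```python
# Illustrative only: every partner gets the same core questions, so answers can
# be compared across partners; tailored add-ons vary by partner type.
# All question wording and partner types here are hypothetical.

CORE_QUESTIONS = [
    "How did you use the film in your work this quarter?",
    "What changed for your audience or members as a result?",
    "What support do you need from the film team next?",
]

TAILORED_QUESTIONS = {
    "educator": ["How many classes screened the film, and at what level?"],
    "advocacy_group": ["Did the film feature in any of your campaign actions or briefings?"],
    "broadcaster": ["What audience feedback did you receive after transmission?"],
}

def build_survey(partner_type: str) -> list[str]:
    """Core questions first (kept identical for everyone), then any extras."""
    return CORE_QUESTIONS + TAILORED_QUESTIONS.get(partner_type, [])

print(build_survey("educator"))
```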
Great evaluation tracks only what you need to know
While numbers are often quicker to come by, they are not necessarily the data you need. Twitter followers and audience statistics, while they are easily available and might well be important if your primary Impact Dynamic is about Changing Minds, are far less relevant if your work is more focused on Changing Structures.
In the worst-case scenario, you’ll spend time and energy gathering data, yet once gathered, it won’t tell you anything material about whether your strategy is working or what you might need to change. There’ll be noise, not signal.
Pro tip:
Don’t confuse reach with impact.
Audience viewing figures, whether offline (cinema, DVD and digital, community screenings, schools) or basic online metrics (views on YouTube, Vimeo or Facebook, or numbers from Google Analytics and Twitter), have been the standard way the media industry has demonstrated or defined success: the highest rated show, the number one box office hit… But these tend only to define marketing and broad distribution success.
In the business of impact evaluation, these figures are useful in so far as they demonstrate reach or perhaps penetration into a target audience - but they do not prove impact. We don’t know what happened to the viewer, what it meant to them, or what if anything changed.
Having said that, in contexts where film teams are contending with intense censorship, simply getting a film seen can be very important. The team behind An Insignificant Man, for example, experienced a protracted legal battle with the Central Board of Film Certification in India, which sought to stay the film’s release. The team eventually won a landmark judgement, allowing them to release the film. Given that this was a film about corruption and democracy, getting the widest possible reach for the film was in and of itself a boon. But how does this context influence the shape and focus of an evaluation?
Remember that a strong evaluation plan tracks progress with respect to intended outcomes. In other words, the fact of this judgement doesn’t tell us much about the film’s impact on viewers broadly speaking or how it shaped thinking related to corruption and democracy in India. It does, however, tell us something about the film’s impact on some viewers (the Board and related government representatives).
But context is also important to understanding the life of a film and its impact, and an evaluation must be attentive to it - including unintended consequences like this legal battle and its implications. The film and the case together clearly influenced popular conversations in India. Through an evaluative lens, it’s useful to look at this whole picture, but also to separate the component parts, i.e. the film’s impact as distinct from the impact of the case.
Case study: Budrus
A critical aim for the Budrus team was to shift the media narrative to give more recognition and validation to the role of nonviolent protest in a conflict that is too often lazily depicted as universally aggressive.
Neither audience numbers nor social media followers nor press column inches, nor even surveys would help them understand whether they were having this impact. No numbers would help. But nor, for that matter, would focus groups or other traditional qualitative means of gathering data.
So instead, knowing they needed a different way to assess whether they were having the desired impact, the team partnered with the PR firm Strategy One and undertook a content analysis of all the media relating to the village of Budrus before and after the film’s release, looking at both the quantity and quality of coverage – both whether coverage increased, and how the story was told.
The results showed that, while there had been some limited media coverage of events in Budrus prior to the release of the film, nearly all of that coverage was conducted through a law-and-order lens, treating the protests in Budrus as disturbances of the peace. On the other hand, after the release of the film, most of the media items incorporated the key message that the team had laid out early in the production process: that the people of Budrus were engaged in a nonviolent struggle to save their lands and olive trees. The study showed conclusively that, beyond putting Budrus on the map, the film successfully shifted the media narrative about events in the village - from one about chaotic riots to one about a strategic, nonviolent campaign.
The Budrus team recognised the difficulty of gathering data against one of their key Impact Goals, asked the experts, and found a way to get the data they needed to assess and inform their unique strategy. That’s great evaluation.
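For readers who want to see the mechanics, here is a toy sketch of a before-and-after media content analysis. It is not the Budrus team’s actual method - they worked with a research partner and a proper coding framework - and the headlines, keywords and dates below are invented purely for illustration.

```python
# A toy before/after content analysis. A real study would use a coding
# framework and trained coders, not keyword matching; everything below is
# hypothetical and purely illustrative.

from datetime import date

RELEASE_DATE = date(2010, 1, 1)  # hypothetical release date

# Each item: (publication date, headline or summary text) - invented examples.
coverage = [
    (date(2009, 6, 10), "Clashes disrupt order in West Bank village"),
    (date(2010, 5, 2), "Villagers wage nonviolent struggle to save olive trees"),
    (date(2010, 9, 15), "Peaceful protest movement gains international attention"),
]

# Crude framing categories defined by keywords.
FRAMES = {
    "law_and_order": ["clash", "riot", "disrupt", "disturbance"],
    "nonviolent_struggle": ["nonviolent", "peaceful", "olive trees"],
}

def tally(items):
    """Count how many items match each frame's keywords."""
    counts = {frame: 0 for frame in FRAMES}
    for _, text in items:
        for frame, keywords in FRAMES.items():
            if any(keyword in text.lower() for keyword in keywords):
                counts[frame] += 1
    return counts

before = tally([c for c in coverage if c[0] < RELEASE_DATE])
after = tally([c for c in coverage if c[0] >= RELEASE_DATE])
print("Before release:", before)
print("After release:", after)
```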
Geek Out: Ideas for further reading
The report “‘No Fracking Way!’ Documentary Film, Discursive Opportunity, and Local Opposition against Hydraulic Fracturing in the United States, 2010 to 2013” is an in-depth study of the film Gasland. In it the authors, using content analysis and other sociological research methods, find that Gasland had a significant impact on public discourse and activity. They also find that the film contributed to the success of anti-fracking mobilisations and municipal bans on fracking. This retrospective look at what happened is a powerful testament to what’s possible with film.
But let’s be honest: not every film team can or even wants to hire a social science research firm or university researchers to do a deep dive evaluation of their impact campaigns, nor should they. There are benefits and limitations to different evaluation methods.
This Active Voice Lab report offers valuable food for thought about different approaches to assessing impact, as it compares and contrasts a Doc Society Impact Report and a Harmony Labs social science evaluation. It finds there are benefits and limitations to each.
So what’s the right choice for you?
Impact Assessment: This often takes a lighter touch and can be more focused or limited to a particular component of a project. Often, it offers a case study with anecdotes and some qualitative and quantitative measures that you’ve gathered over the course of your campaign. It can be a valuable and relatively accessible way to tell a story about what happened. Just take care not to make grand statements that you can’t back up with strong evidence.
Formal Evaluation: This tends to be a more robust and comprehensive examination of the impact of a project. If you decide to partner with social science researchers to help you evaluate the impact of your efforts, you’ve made the decision to go deep. Just be prepared for what you get. A true evaluation may bring back answers you don’t like, so be honest about whether or not that’s the direction you’d like to go in.
The truth is, it’s rarely so cut and dried, and hybrids are possible too. The intent here is simply to invite you and your team to consider where on the spectrum you lie. Discuss together your needs and the purpose of the evaluation, as well as the capacity and resources you have, before settling on your approach to measuring impact.