What makes great evaluation
As with all other stages of this process, evaluation will be unique to your film. That’s why we emphasise learning and tracking. Evaluation methods and conclusions will tie back to the unique goals and objectives you identified in your vision and Strategic Plan. Think of evaluation as learning about how your plan is working and tracking the things that are actually happening.
Great evaluation is about learning throughout the project
One of the most basic mistakes is to think of evaluation as a final product rather than a process. It’s an easy mistake to make, because you’ll probably pull everything together into an evaluation report at some point. But while that’s an important part of the process, it’s just that: a part.
The right way to think about evaluation is as a process of learning, which starts as soon as the film is put out into the world, or even before that, as you work with potential partners and stakeholders to prep for broadcast or theatrical release. And because it is ongoing, it means you are in a position to react and refine your impact strategy, making the most of every opportunity.
Yes, time and resources are limited. But if you applied for and received funding, then some kind of reporting will be necessary - you can use those requirements as a starting point for what kind of evaluation to pursue, even if you do nothing else.
This is why we were so careful to emphasise in the Planning chapter that the Strategic Plan you came up with was only version 0.1. Because the unexpected will happen, and evaluation will help you to respond to it faster.
Start early, start small... just start
The worst thing you can do is build evaluation up into such a big deal in your head that you never start collecting some data, and learning something, right from the beginning. Even if you just start by asking one question of everyone at a rough cut screening – that’s a start. You can always improve it next time.
'The country needs and, unless I mistake its temper, the country demands bold, persistent experimentation. It is common sense to take a method and try it. If it fails, admit it frankly and try another. But above all, try something' – Franklin D. Roosevelt
Put regular evaluation time in as a team, ideally every week. Because evaluation works best as a little-and-often process, setting aside time in your diary and sticking to it is a super-simple but very effective way of keeping on top of it.
Just as you gather material for your film – collecting lots of small interviews, stills and B-roll footage together in a file – collect every article about the campaign, list every community screening (ideally with an audience estimate), file every 'your film changed my life' email and make a note of every local politician who references your film in a speech. Everything.
Like the film material, this all adds up. Over a period of months and years, you’ll start to create a rich picture of your film’s journey through the world – how it has affected people and institutions, and how it has contributed to movements.
Yes, this takes up time – time you might rather be spending elsewhere. But taking on an impact project does mean committing to this process to some degree. You may decide it’s not for you, but if it is, this is what we found works.
The Norman Lear Center, in collaboration with The Center for Investigative Reporting, has been testing an Offline Impact Tracker for journalists in the newsroom. The aim is to make the daily capture of evidence simpler. First results are good - and the team are now looking to adapt it for documentary filmmakers.
Case study: The End of the Line
The sample Impact Plan from The End of the Line that you’ll find in the Planning chapter is a long way from where the team thought they were going when they finished the film. Initially, their focus was firmly on changing political structures – and specifically on achieving a ban on bluefin tuna fishing. In their working theory of change, the role of public awareness was as a conduit to this.
But entry and exit surveys carried out at two London launch screenings of the film highlighted a different role for the public. The film achieved the major increase in awareness and perceived urgency that the team had hoped for, with 85% of the audience reporting an increased sense of urgency about the issue of overfishing. What they didn’t necessarily predict was the commitment to personal behaviour change: across the whole audience, commitment to buying sustainable fish almost doubled, from 43% to 84%, after one screening of the film. Although both remained important, this refocused the team’s approach more on personal consumption than on petition signing.
Another shift came soon after. The British supermarket Waitrose came on board as a distribution partner for the film before launch (and stopped selling swordfish in order to do so). Changing the procurement policy of restaurants and food retailers was part of the strategy – but, relative to the focus on political change, not the main part. Then Pret A Manger founder Julian Metcalfe attended a preview screening, arranged a screening for senior management, and announced a total change in the company’s fish sourcing policy on the day the film launched. Another major UK grocery chain, Marks & Spencer, announced a change to its tuna sourcing in the week after opening.
It became clear to the team that the film could directly trigger corporate policy change – with a massive impact both directly on overfishing and, through corporate communication with customers, on public awareness – and that this in turn could build pressure for political change. They shifted strategy and began deliberately seeking this kind of change, resulting in policy changes from major food brands like Sodexo and Compass, and from restaurants and celebrity chefs including Jamie Oliver, Tom Aikens, Antonio Carluccio and Raymond Blanc.
The film team gathered data, thought through the implications of early successes, reoriented their strategy, and moved. That’s great evaluation.
Great evaluation tracks whatever you need to know
Perhaps the greatest fear around evaluation stems from the idea of reducing everything to numbers. Damn right: great impact in this field depends on the power of great art, and great art cannot be reduced to a formula. When we think of evaluation as a process of learning about the impact we’re having, it seems obvious that numbers alone aren’t going to do it. You also need qualitative data: data that helps you understand how and why your film is having impact.
'Different projects require different methodological approaches to impact assessment. Moreover, a comprehensive approach to impact assessment typically requires the application of multiple methodological approaches that address different levels of analysis that reflect the different spheres of potential impact (e.g., on individual attitudes/behaviors, on media debate/discussion, on public policy)'
So while numbers are often quicker to come by, they are not necessarily the data you need. Twitter followers and audience statistics, while they are easily available and might well be important if your primary Impact Dynamic is about Changing Minds, are far less relevant if your work is more focused on Changing Structures.
In the worst-case scenario, you’ll spend time and energy gathering data; yet once gathered, they won’t tell you anything material about whether your strategy is working, or what you might need to change. They’ll be noise, not signal.
So really the headline for this page should be: 'Great evaluation tracks whatever you need to know – and only what you need to know'.
Don’t confuse reach with impact.
Audience viewing figures – whether offline from cinema, DVD & digital, community screenings or schools – are relatively easy to come by. Ditto the basic online metrics: web views on YouTube and Vimeo, and numbers from Facebook, Google Analytics and Twitter. These stats have been the standard way the media industry has demonstrated or defined success: the highest-rating show… the number-one box office hit…
In the business of impact evaluation, these figures are useful insofar as they demonstrate reach, or perhaps penetration into a target audience – but they do not prove impact. We don’t know what happened to the viewer, what it meant to them, or what, if anything, changed.
'With Budrus, we measured success both by soliciting qualitative feedback from the range of audiences we came into contact with (hearing, for example, from a Palestinian village that was so galvanized after watching the film that they had one of their most organized and spirited protests ever the following day, or from an Israeli-American for whom the film served as a central catalyst to become more involved as an activist living in Jerusalem), and by allowing a major public relations firm to conduct an independent audit on the media impact of the film'
Case study: Budrus
A critical aim for the Budrus team was to shift the media narrative to give more recognition and validation to the role of nonviolent protest in a conflict that is too often lazily depicted as universally aggressive.
Neither audience numbers, nor social media followers, nor press column inches, nor even surveys would help them understand whether they were having this impact. No numbers would help – but nor, for that matter, would focus groups or other traditional qualitative means of gathering data.
So, knowing they needed a different way to assess whether they were having the desired impact, the team partnered with the PR firm Strategy One and commissioned a content analysis of all the media relating to the village of Budrus before and after the film’s release, looking at both the quantity and quality of coverage – both whether coverage increased, and how the story was told.
The results showed that while there had been some limited media coverage of events in Budrus prior to the release of the film, nearly all of that coverage was conducted through a law-and-order lens, treating the protests in Budrus as disturbances of the peace. After the release of the film, by contrast, most of the media items incorporated the key message the team had laid out early in the production process: that the people of Budrus were engaged in a nonviolent struggle to save their lands and olive trees. The study showed conclusively that beyond putting Budrus on the map, the film successfully shifted the media narrative about events in the village from one about chaotic riots to one about a strategic nonviolent campaign.
The Budrus team recognised the difficulty of gathering data against one of their key Impact Goals, asked the experts, and found a way to get the data they needed to assess and inform their unique strategy. That’s great evaluation.