One of the things that clients with little expertise in evaluation ask me is where to get started in this line of work. They know they should be evaluating their project, programme or whether they’ve achieved their organisational goals for the year, but they’re unsure where to begin. Here’s a checklist of questions and signposting links which I wrote for the Engaging Libraries toolkit, with a few additions to bring it up to date. These are equally relevant for other types of cultural organisation.

It’s important to think about what to measure and how you’re going to measure it before you begin your project. There are lots of good reasons for taking the time out to do this, but primarily it means that:

  • You’ll have a clear roadmap of what you want to achieve and with whom, so you and your team can work from the same plan and towards the same goal
  • You can design and pilot data collection tools before activity begins, saving you time and the stress of doing both at the same time
  • You can involve your target audience in helping to co-create your evaluation plan, especially when it comes to intended outcomes
  • You can collect data immediately from those you want to have an impact on, rather than retrospectively attempting to do this at a later date (risking missing data and/or recording experiences when beneficiaries may not give you an accurate reflection).

Ultimately your evaluation plan needs to measure what matters; it needs to be realistic; and it needs to work for you and your project. Don’t be scared to flex the models and approaches that are recommended in the signposting links below.

Getting started

Questions to ask yourself:

  • Why are you evaluating your project in the first place? What do you hope to achieve through doing evaluation?
  • Who’s the evaluation for? What will you do with the findings?
  • What are your professional principles of evaluation? How are you embedding these in your evaluation activities?
  • What type of evaluation will you need to do? (Front-end, formative, summative?) How will you use the findings from different stages of a project to change what you do?
  • Which staff or volunteers will need to be involved in the evaluation process? Do you have the resources and skills in-house, or will you need to bring in some freelance expertise?
  • Do you want to evaluate the process of the project as well as the experiences of the public involved?

Useful links:

Centre for Cultural Value Evaluation Principles

Libraries Connected Evaluation Toolkit

Suffolk Libraries Impact and Engagement Toolkit

Guidelines for Good Practice in Evaluation – UK Evaluation Society

Creative People and Places example evaluation of projects

SHARE Evaluation Toolkit for Museums

SHARE Data Driven Museums

Measuring what matters: the planning process

  • How can you present and communicate your evaluation plan to colleagues? Will you use a logic model or an alternative framework that makes more sense to you? What kind of plan is going to be most effective and user-friendly for you and your team to use (so that it doesn’t sit on a shelf)?
  • What are your overarching project aims, inputs, activities, outputs, and outcomes? Do you know the difference? Are you consistently applying the same definitions?
  • Can you involve your target beneficiaries in helping develop your evaluation plan? How could they help you to define the outcomes they hope to experience through your project?
  • What does success look like? What are your indicators: what evidence will demonstrate that you’ve achieved your intended outputs and outcomes?
  • Who are the direct beneficiaries of your project? Are there any indirect beneficiaries that you also need to consider?
  • Which data collection methods are going to be most appropriate to capture the evidence you need? How will you measure ‘distance-travelled’? What baseline information will you need? Do you have this data already or will you need to collect it?
  • How will you evidence attribution? i.e. How will you demonstrate that something has happened as a direct result of your work or intervention, as opposed to something else?
  • How will you evidence unintended outcomes?
  • What process evaluation mechanisms should you put in place? Will you embed reflective practice for your staff, volunteers, artists?

Useful links:

Example logic model

Story of Change explainer

Gibbs Reflective Cycle

A guide to social return on investment evaluation (SROI) including downloadable ‘how to’ guide.

Arts in Health evaluation guide

NCCPE guide to evaluating public engagement projects

ILFA Generic Learning Outcomes (GLOs) and Generic Social Outcomes (GSOs)

Data collection and ethics

  • When will you collect the data? Who from? Where? How will you make sure the sample is representative, robust, and free from bias (including unconscious bias)?
  • Which are the most appropriate data collection tools that will help you evidence you’ve achieved your outputs and outcomes? Which will help you measure what really matters?
  • What are the right questions to ask that will give you the evidence you need as outlined in your plan?
  • What might you need to consider in order to work ethically and legally when you’re collecting and storing the data?
  • Do you need to consider any specific ethical or inclusive approaches, e.g. if you’re working with vulnerable groups? How will this impact on your methodology/approach? Who else might you need support from?
  • Can all your intended beneficiaries access the methods you’re using? Think about potential barriers, both physical and intellectual.

Useful links:

Creative Research Methods: A Practical Guide (Kara, H; 2020)

Family Evaluation and Audience Research

Collecting public engagement data online

NEF Measuring Health and Wellbeing

Happy Museum project evaluation tools

Market Research Society Code of Conduct

Analysing and interpreting data

  • How will you systematically collect and record the evidence? Where will you store it so that it’s safe? In what format? Do you have the right data controller and/or data processor systems and processes in place? How will you stay on the right side of UK GDPR?
  • Which method will you use to analyse quantitative data versus qualitative data? Do you have the systems and skills in place to do this effectively and robustly?
  • Will you need to buy any particular software or kit to help you record data e.g. digital recorders?
  • What is the data telling you? Be cautious of unconscious bias and look for insight rather than simply reporting information.

Useful links:

Information Commissioner’s Office

Association for Qualitative Research

Better Evaluation

NCVO guide to analysing qualitative data

Sharing your findings

  • How will you tell the story in a succinct and interesting-to-read way, presenting insight rather than simply information?
  • Thinking about the audience for your evaluation findings, what is the best format for sharing and disseminating them? Will you present your findings online or in a report? In another creative way? Will you need any support, e.g. from a graphics team or copyeditor?
  • How can you share your findings inclusively? Think beyond the neurotypical.
  • How will you use the evidence – for tactical and strategic purposes going forward?

Useful links:

Andy Kirk Visualising Data

How to create infographics with impact

Have you found anything else that’s helped you with your evaluation? Drop me a line and let me know.