If you’re used to collecting data for evaluation face to face, the last year has probably come as a bit of a shock in terms of how to effectively evaluate public engagement activities that have (unexpectedly, and rather rapidly in many cases) moved online.
And if we’re putting down our survey clipboards, abandoning our feedback walls, and mothballing our mood trees – what do we use instead?
What to do first
Remember evaluation is about assessment – it should be a ‘warts and all’ review of whether you’ve achieved what you set out to do. That won’t have changed. But based on my experience over the last year, if your activity shifts online I’d recommend that you:
- Review your evaluation framework. Make any necessary changes to your inputs, activities, outcomes, outputs and indicators.
- Review your data collection tools. These should originally have been informed by the outputs and outcomes you need to measure. Consider:
  - Will the tools still collect the data you need?
  - What might you need to change?
  - Do you have any new outcomes or outputs to think about measuring because of the shift to online engagement? How will you evidence these?
  - Which is going to be the best tool to collect data in the digital space you're using?
  - How will you ensure everyone can take part (making sure your evaluation is inclusive and accessible)?
  - Will all your data collection move online, or will you still use some offline methods too?
  - Will your timetable or roles and responsibilities for data collection need to change?
Using online qualitative tools to collect your data
‘Online qual’ has been around for a long time and is a field in its own right. It’s something I’ve used in my role as independent evaluator for various museums, libraries, galleries, archives and heritage sites. And market research professionals have used various asynchronous (research takes place over a series of days) and synchronous (real-time) methods for many years. But I don’t often see similar approaches discussed or used in the cultural sector.
The most common online qualitative tools used in the wider market research industry include online focus groups, live chat, digital diaries, mobile ethnography, depth or one-to-one online interviews, online communities, bulletin boards, and even chat bots! Often technology can help rather than hinder, providing insight and depth that you just don’t get with traditional methods. For example, asynchronous online qual can give greater depth and more considered responses as the engagement takes place over a longer period of time, with participants able to share their responses through text, pictures and video.
Here’s just a handful of online qual examples which could replace some of the approaches you typically use in a face-to-face scenario.
|Typical face-to-face qualitative tool|Example online qualitative alternative|
|---|---|
|Face-to-face vox pops/focus groups|Real-time webcam (online depths or focus groups – either using standard video conferencing platforms or specialist market research software for greater functionality and analysis); asynchronous discussion (text/mobile)|
|Post-its or feedback cards and feedback walls|Off-platform collaborative whiteboarding and mind-mapping tools like Mentimeter, Padlet, Slido, MindMeister, Group Map or Google Jamboard to gather feedback using virtual ‘post-its’ (I’m not a fan of the inbuilt ones like on Zoom, mainly due to user experience onboarding difficulties). You can also use the chatbox to collect feedback in response to questions, but remember this is likely to be attributable rather than anonymous|
|Paper diaries or journals|Free digital journals like Penzu, or before/after one-to-one real-time webcam (online depths); asynchronous discussion (text/mobile)|
|Participatory journey mapping|Online facilitation tools and collaborative whiteboards such as Trello, Miro, Mural or Google Jamboard. Small groups or pairs can work on their whiteboards separately, then bring them to an online focus group for discussion together.|
Which you use will depend on what you need to measure, your overarching objectives and – in certain cases – your budget. Above all, you’ll need to make sure your approach and sample are representative, ethical and accessible for those taking part.
What about online quantitative tools?
Over the last year I’ve mainly needed to think about analytics and online surveys or polls. So let’s take each of those in turn.
Analytics
This useful article and downloadable Excel spreadsheet from the Culture is Digital project can help you work out which digital metrics are useful to measure, and how to capture them. Take these ideas as a guide and think about them when you’re reviewing your evaluation plan.
If you need to evidence the number of people engaging in your online activities, it’ll depend on how you’re delivering them. For example, if you’re using Zoom, at the most basic level you can see the number of participants engaging through the participants tab. Or if you have a business, education or enterprise priced package, you can find out the number of people attending your Zoom sessions, and other statistics, on your account dashboard.
You can view Instagram analytics on a business account and access Facebook Insights if you’re delivering activity on social media channels. And if your activity is hosted on your website (for example, an online exhibition or a digital download) you’ll want to track your website users through something like Google Analytics. If you don’t know where to start with any of that, I’d recommend taking a look at the free support and tutorial videos available from the Digital Culture Network or Google Garage.
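If you do go down the Google Analytics route, adding the tracking tag to your website is a small copy-and-paste job. As a rough guide, the standard gtag.js snippet looks like the following – note that `G-XXXXXXX` is a placeholder, which you’d swap for your own property’s Measurement ID from your Analytics account:

```html
<!-- Standard Google Analytics (gtag.js) tag: paste into the <head> of each page -->
<!-- G-XXXXXXX is a placeholder – replace it with your own Measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXX');
</script>
```

If your site runs on a platform like WordPress or Squarespace, you may be able to paste just the ID into a settings field instead – the tutorials linked above walk through both routes.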
Surveys and polls
There are various options for creating new online surveys (or converting your paper-based feedback forms). This helpful article from quantitative data expert Adam Pearson will give you the basics to consider, but my recommendation is the same – try online survey platforms out before spending any money, make sure they do what you want them to do (for analysis as much as question formats), and pilot them. More than once. I’ve been writing online surveys for many years and I will always find something wrong with them during piloting.
Think about how you’ll share your survey online – this will depend on where your activity takes place. If you’re hosting an online exhibition you might want to embed it in your website content (or have it as a pop-up). If you’re running an event and have already collected emails (within GDPR!) you might want to email it to participants afterwards, or simply share the link in the chat box. If you’re using Zoom and have a paid-for account that allows a vanity URL, you’ll be able to automatically redirect participants to your online survey when they ‘leave’ the meeting (you can do this through your account management settings).
Whichever you choose, the ‘ask’ (i.e. why should someone complete it) is more important than ever before – survey fatigue is running high. Think about what’s in it for them, what makes you so special that they should spend time completing it?
If you’re used to including ‘light touch’ voting or rating activities in your activities (beans in jars, voting lights or voting pins on walls), you can use the online sticker functionality on whiteboards.
I’ve learned a lot about quantitative online data collection from cultural sector support organisations this year (thank you!) and there are also various online guides which can help you get to grips with the basics.
- Don’t abandon your evaluation plan if your activity is moving online – whilst your data collection tools may be different, the same evaluation principles apply, i.e. remember your evaluation aims and target beneficiaries: what do you need to evidence, what is the best way to collect that evidence, and what are the right questions to ask? Keep measuring what matters!
- Don’t get obsessed with the technology – think about what you need to measure first, the questions you need to ask and then work out which is the best data collection tool to use. If you’re designing online data collection tools for the first time, test them out on colleagues or willing family members!
- If you don’t think existing data collection tools will capture the information you need, design something yourself – just prototype and test it. Don’t be afraid to throw out the rule book, as long as your approach is ethically sound and works within best practice guidelines (the MRS code of conduct, for example).
- Build in time before, during and at the end of your online activity for the evaluation tasks you’d like to do (and in my experience, you’ll need to allocate more time than you would ordinarily in a face to face environment).
- Think about accessibility – are you being inclusive with your evaluation? How can you make sure everyone who’s been involved in the activity has an equal chance to participate in your evaluation? For example, if someone is visually impaired, are they able to access your evaluation tool satisfactorily? What about digital poverty – are there any considerations to take into account? For example, someone without broadband may have enough data to watch a workshop, but will they have any left to take part in a 60-minute focus group?
- If you need to evidence ‘distance travelled’ (i.e. progress against your outputs and outcomes between before and after the experience), think about how you can do this using online data collection tools, and when you need to do it.
- Consider how you can create a welcoming environment just like you’d normally do in a face-to-face scenario. For example, send goody packs in the post with snacks, coffees/teas; or use music and icebreakers.
- Make sure your participants are ‘onboarded’ if you expect them to use any of the additional features of virtual spaces such as whiteboards. Will they need 1-1 support beforehand, or will a guidance sheet sent in advance suffice?
- Conduct your online evaluation ethically – think about online evaluation in the same way as you would do in a face-to-face scenario e.g. in terms of requiring consent, and/or working with vulnerable groups. Do you want people to share their feedback with you privately or openly in a group with everyone? Have you asked for consent to do this? Have you given them instructions on how to give their feedback privately/to everyone? How will you respond publicly to any negative feedback? How will you record evaluation activities, and get permission to do this? Use the updated Covid-19 MRS guidelines to help you plan and risk assess (this also includes advice for any face-to-face activity during Covid-19), and draw upon good practice guidelines such as these from The Evaluation Society in your online evaluation work.
Hopefully this post gives you a few pointers and ideas to think about. But if you’d like to talk to me about my experience adapting evaluation frameworks and online tools over the last year, or have a challenge with your online qualitative evaluation that you need help with, please consider booking a power-hour with me and I’ll do my best to help.