by Dougie Lonie


The role of evaluation in music projects… some thoughts from a road trip

Returning from a very stimulating (and just a little bit exhausting) trip around the country, working with 200 people or so over the past two weeks, I felt it was a good time to collect and share some initial thoughts and reactions to where we are with all this evaluation and outcomes stuff.

One is a slight blurriness as to where we are – which I am OK with. In every workshop we never quite nailed down the specific role of ‘evaluation’ – or, more specifically, of ‘gathering and using data in projects’. I think that’s probably because the role changes quite often, and because everyone’s needs are a bit different.

Evaluation or Action Research?

Are we trying to judge whether a project has been successful overall and why? Or, are we trying to make sense of what is happening in a project and change it as we go along so that everyone benefits?  As with most pesky questions, the answer is a bit of both.

I think we can apply action research principles to our evaluation approaches. Most would agree that research should be participatory, bottom-up, and negotiated with participants throughout, with findings continually fed back to improve interventions. This is plainly not ‘make it up as you go along’: evaluation frameworks must still be rigorously designed to establish what is practical, doable, and ultimately good enough to judge whether things are going to plan.

I think it might be helpful for us, as organisations, to think about the design of our methods, analysis, and reflection as action research – seeking to understand and improve projects, and accepting failure – rather than as evaluation, with its semantic associations of cost, worth, assessment, and judgement.

Being evidence-led organisations

The other big thing that occurred to me is that having strong research methods, making time to analyse data, and reflecting on the findings must happen at an organisational level. When applying for project funding (and inevitably designing evaluation frameworks), it is essential to discuss how the project will help achieve your organisational aims or mission, and how you will know whether this has happened. If you can’t see the link, then it’s not worth wasting time on.

I have seen organisations make evidence-based breakthroughs in recent years (and become much more financially stable as a result). They are the ones with an outcomes framework embedded at the organisational level, with associated indicators and methods for exploring impact at all levels. More often than not, this has also required the CEO or senior management team to admit that they need a bit of help in this area.

One small way forward is to get your boss (whoever that is – including a CEO or Chair) to commit to developing frameworks, methods and tools with you, to encourage them to challenge your analysis of the data, and to ask how any evaluation or research outputs will be used by the organisation. Small and gentle challenges will engage them, and they will begin to see the value of being a learning, evidence-led organisation.

Bring in the bodies

While one intention of these workshops was to support organisations in self-evaluation, I’m not suggesting that everyone should or could become an expert in social research. It is obviously useful to bring in external expertise when designing and doing evaluation, whether in business planning, research and evaluation, or topic-specific knowledge.

However, we need to know how to brief these people properly and to judge whether their proposals are any good. We also need to manage them and ensure we are comfortable with the quality of their work. We can only do all this if we understand how and why we need to generate and analyse data as organisations, and how this links to our organisational outcomes.

Funder as partner, not parent

Lastly, as a funder all we want is an honest account of success and failure, an account of how projects have developed in response to research and evaluation findings, and a range of evidence, honestly interpreted and fairly presented. If we all do this, we can start to make better sense of how projects achieve positive change, where people run into problems, and how we can all get round them. First and foremost, your data, analysis, findings and outputs are for you as an organisation, to help you better understand what you do and achieve. Everyone else, funders included, comes after.

The Youth Music funding programme is based on the reports that people send us, and we can create the most responsive programme and make the best funding decisions if people send in decent evidence. As we have set out in this year’s learning report and in various other resources over the years, we see our role as supporting organisations in their work by sharing knowledge and practice, whether they are funded by us or not.

We don’t always get it right, but we are slowly bringing evidence to the centre of how we work as an organisation and want everyone else to do the same.

The move towards being more evidence-based is a slow process and it’s not always easy, but I think we’re all heading in the right direction – if only I had a clear set of indicators and methods to be sure…