by sammaliksq1

Case Study: Evaluating our Evaluations – A Youth Music Funded Project

Project Background

The project was a music programme divided into two subprojects: 1) BAND SCHOOL: teaching young people skills in live music, instrument playing and performing. 2) HOLIDAY HARMONIES: a boot-camp project during school holidays engaging emerging and aspiring young musicians.

Our monitoring and evaluation plans

When we initially set out our monitoring and evaluation plans for the project, let's just say we weren't shy with our choice of activities. Our intention was to get more accurate indicators of young people's needs whilst seeing the changes in their journeys. At the same time, we wanted to improve the quality of our workshops, upskill facilitators and develop our project management skills as a result.

Our monitoring activities were set out in two areas:

1. For participants

  • Attendance Register: tracking attendance
  • Project Tracking Document: tracking tasks, outputs and assignments
  • Running Records: journaling what happened session to session
  • Portfolio: a collection of music, films, images and written work
  • Individual Learning Plans: setting goals and outcomes for the project
  • Notes: from group discussions and one-to-one meetings
  • Questionnaires: from young people, parents, carers and referrers

2. For the workforce

  • Development Plans: documenting personal and professional goals
  • Supervision Reports: discussions, actions and tasks
  • Self-Reflection Reports: completed by the team after attending external training
  • Staff Training Matrix: a record of what training was done and when
  • Case Studies: reports and case studies like this one

Evaluating Our Evaluations

Initially it started off well: we captured useful information, and plenty of it. We started getting a picture of what we had set out to achieve, and we were impressed by the volume too.

When it came to collating this information into a report, a document for us to review, we realised we were faced with a substantial workload. We did persevere, but there was a prolonged gap between collecting the information (sometimes three times in four sessions) and reviewing it (four to five weeks later). Once we did review the great work in front of us, we noticed that we were collecting the same, or at least similar, information more than once, in different ways. As well as creating additional workload, this caused confusion: responses to the similar indicators and questions we asked differed, so we weren't getting a clear indication of needs, goals and journeys.

One step back – two steps forward

Once we noticed our flaws, we spent time reflecting and were able to come up with a solution to streamline our methods. We went back to our original plans and made tweaks to how we collected and reviewed data.

We combined our project tracking document and running records into one. This way we were tracking who was doing what, when, and how it was going, session by session, in a single document, which also made it somewhat reviewable at the same time.

We opted to use only the musical development scale provided by Youth Music's Evaluation Building Tool as a standard monitoring document. We tweaked the questions and also took out the middle option; we found respondents would sometimes select it simply to mean 'I don't know' or 'I can't think'. Whilst there's a time and a place for that, we felt that removing it encouraged participants to think and respond more specifically.

We also replaced some team meetings with a weekly email thread. We shared views and briefed actions via email just as we used to in meetings, eliminating the need for some meetings (and for writing minutes). This way we were able to do both at once and save some time and work.

Finally, we went back to being 'human'. Whilst trying to minimise our influence on the procedures, we noticed we had dehumanised the process as a result. There was too much focus on box ticking, numbers and stats, and we nearly lost sight of how participants felt. So we started talking and playing more to get an understanding of their feelings and emotions. This came through more regular group discussions and one-to-ones, and even our Facebook group chat contributed too.

Conclusion

As a result of going through this journey we have been able to streamline our approach and develop efficient processes that capture information and make it presentable at the same time. We have also become more human in doing so. For us this was the most important thing: regardless of the processes, procedures or methods we chose, being human makes the work more empathetic.