One week to go...

Just one week to go until the end of the trial!

A short message from director John Knell...

Your Value Story – Quality Metrics Case Studies

We would love to hear more about how you’re interpreting your data. The website was set up as a resource hub for you, and we will start to include case studies about your evaluation stories in a series we’re calling #MyValueStory. To put this together we need your help. If you are happy to share and would like us to feature your case study on our website, please send your thoughts over to

You can share as much or as little as you like. Here are some ideas of what to include:

·       A summary of you and your team

·       Details of the event

·       What you were expecting to get out of the trial

·       How you collected your responses

·       What was the most challenging element of the process?

·       What was your most interesting finding?

·       Which finding shocked you the most?

·       What will you do differently next time?

·       Quotes from your team and your audience

·       Images from the event

As well as downloading each of the charts in your dashboard as images, there are lots of simple ways to manipulate your raw data. For some ideas to get you started, visit our Resources section (scroll down to Evaluating & Sharing your results), which is regularly updated with new walk-through videos.

Remember, three evaluations isn’t an upper limit – you are welcome to continue using the platform until the end of July.

As ever, if you have any questions about your evaluations and how to make the most of them please do not hesitate to get in touch with any of the Culture Counts team.


An update on the Quality Metrics National Test

With over 14,000 public respondents and a network of over 1,500 self- and peer-assessors across England, we are seeing some high-quality evaluations take place as part of the Quality Metrics National Test.

Most organisations have run one or two events so far and have more planned (the majority of organisations have signed up to evaluate three events with the quality metrics before the end of May 2016).

Over the next few months we will be in touch with organisations about their data and what value and insight can be extracted from it. We are also in the process of collecting metadata such as the artform and location of specific events. This will enable us to get the most value out of the anonymised aggregate dataset, results of which are to be published as part of a public report due in July. All organisations in the trial are invited to our next Quality Metrics events; if you haven’t received your invitation yet, please let us know.

There are still nine weeks left for evaluating work. If you would like assistance in interpreting the quality metrics data or have any questions about using the Culture Counts dashboard, please contact us or browse ideas in our Resources page.

Participatory Metrics Development: update

The participatory strand was more popular than anticipated, with 24 organisations taking part. As many organisations in the ACE-funded portfolio cover participatory work through their programme of activity, there is much interest in measuring the quality of participatory work effectively. Organisations were invited based upon existing relationships; knowledge of the work of organisations that had expressed interest in the main quality metrics trial; a judgement on which organisations had the capacity to absorb both the main trial and this participatory metrics strand; and specific recommendations from ACE. Two arts organisations that sit outside the portfolio, Fun Palaces and Brighter Sound, were invited to take part because of their expertise in participatory work.


Two inception workshops have been run to date: one in London in December and one in Manchester in January. The conversations at the inception meetings clearly demonstrated the rich expertise and processes that go into producing participatory work within the attending organisations.


Facilitated testing is underway and is managed in a similar way to the quality metrics trial. All organisations in the participatory metrics strand can see a template participatory evaluation in the Culture Counts dashboard, containing the long list of participatory metrics. They can then edit the metrics, remove those they wish, and add others.


In other words, the existing long list of participatory metrics, produced in the original pilot phase led by Re:Bourne / New Adventures, is being actively refined: participating organisations are encouraged to choose freely from the list (so we will see which dimensions are chosen most frequently), to edit and refine the metrics, or to add new ones. The evaluation data and the shared interpretations of the cohort of participating NPOs should produce rich insights into how best to refine and develop the participatory metrics.



Some emerging themes have already started to take shape from the work so far:

·      The need to adapt individual dimensions into different metric statements for different contexts.


·      Tracking behaviour change. Whilst the length of the trial militates against any substantive tracking of participants, there is significant interest in how far a platform like Culture Counts will allow organisations to track participants across the course of a participatory experience and thereafter. We will explore with some of the organisations (the Book Trust) what is possible within the confines of this national test period.


·      Some of the participating organisations raised the possibility of developing both summative and formative approaches. Summative assessment takes place after the fact, post event. Formative assessment can take place at any time (before, during and after) and can be used to inform and critically reflect on how best to continue within a process.


·      Organisations expressed the need for us to be clear about the distinction between quality assurance and outcome measures. 


·      The need to explore the subtleties between interaction, participation, and learning.


·      There seems to be a clear gap in the current participatory metrics – where they align with CYP Principle 1 – around excellence.


·      How might we need to expand and refine the respondent types in participatory work – i.e. going beyond the self/peer/participant frame to include the practitioner and the commissioner/producer? How well defined are the roles in the activity (artist, participant, etc.)? The participating organisations have observed that in some instances of participatory work the roles are more clearly defined than in others.


·      What is the vision of the participatory work? Is it deliberate and thought through? How can this intention be captured by the self assessment? Is the vision that of the producer or the practitioner, and how well has that vision been communicated? This is linked to deliberate innovation in different contexts: for example, how far is the artist repeating content irrespective of the purpose of the participatory event, and how far are they innovating in new ways to meet clear goals?


·      The artist’s own experience as a participant can contribute insight – for example, some of the best work occurs when the artist is stretched alongside participants.


·      What criteria are used for selecting participatory artists? How can that align with these metrics when considering the quality of the participatory experience? Is the artist inspiring? Do they have expertise?


·      What other quality indicators can be used to capture other objectives of participatory work such as process, production, social change, collaborative understanding? How can the optimal conditions for learning and engaging be measured and included?


·      Formative and long-term impact - repeat testing of metrics with time tracking would be useful over longer participatory experiences. Some participatory work extends to alumni relationships.


·      What are the development options to facilitate access? We might consider images, cultural differences, languages and disabilities.


The draft long list of participatory metrics can be found here.



Another chance to trial the Quality Metrics and Culture Counts

Prior to this Quality Metrics National Test Phase, the Quality Metrics project received support from the Digital R&D Fund for Arts and Culture, which was a joint initiative between Nesta, the Arts & Humanities Research Council, and Arts Council England.


Under the terms of our Digital R&D award, we had to develop the Quality Metrics and the Culture Counts platform so that we could offer a ‘free trial’ to Arts Council England National Portfolio Organisations and Major Partner Museums. The terms of the trial are that participating organisations will use the Culture Counts platform on a self-supporting basis with minimum levels of support. Their data will not be shared with other organisations or analysed as part of the Quality Metrics National Test phase.


We are now providing those NPOs and MPMs not taking part in the Quality Metrics National Test phase the chance to take up that ‘free trial’ offer made possible by our Digital R&D Award. Participating NPOs and MPMs will gain access to the Culture Counts platform between April and July 2016.


If you are interested in this opportunity, please get in touch with us by March 1st 2016.

The Quality Metrics National Test is now well underway...

The Quality Metrics National Test is now well underway in England. Currently 150 National Portfolio Organisations and Major Partner Museums are using Culture Counts to help measure the quality of arts and cultural work. The national test provides an opportunity to trial and assess the widespread use and application of the quality metrics across a range of organisations and art forms, including dance, theatre, visual arts, music, museums, literature and combined arts.


This website has been set up for those participating organisations, and the wider sector, to learn more about the project and to share findings and insights.


Alongside the main trial, we are also working with a number of organisations to further develop a set of participatory metrics, building on previous work undertaken in collaboration with Re:Bourne. Read the Developing Participatory Metrics report here.


As with the Quality Metrics, we hope the diverse group of organisations working on this strand will generate insights that are of significant interest across the cultural and heritage sectors.