We need to talk about Arts Council England

Before I get on my soapbox, I want to start this blog post by saying there are LOADS of good people who work at Arts Council England (ACE). I have a lot of friends and sector colleagues who work there, all of whom I like and respect a great deal. In no way is this blog post directed at any individual, and if any of you are in direct contact with anyone at ACE at the moment, please remember to be kind.

I’m writing this because I genuinely care about data – specifically arts and cultural data – and helping organisations understand it. Data is essential for unbiased, objective decision-making – it can help you understand your audiences and inform business strategies and audience development plans – but to do that, it needs to be robust. I’ve worked on ACE data collections throughout my career and, let’s face it, they are never straightforward. However, if you are an NPO-funded organisation, you need to take part in various mandatory collections as part of your funding agreement. I’m going to talk about two of those collections and some of the issues and challenges they have recently raised.

First up, the ACE NPO Annual Return. My first direct question to ACE is… How hard is it to hit spell check before you issue a mandatory Excel data collection template?

Figure: typos contained in the 2023 ACE NPO Annual Return spreadsheet.

All of those words appeared at least once – and often multiple times – throughout the Excel data collection template. Grantium, ACE’s online portal, wasn’t much better. To be honest, by the time I came to input this year’s data, I’d stopped looking for spelling errors – mainly because I was distracted by missing questions!

The questions

A – Workforce – was missing questions A32 & A33.

C – Activity – there was an error on C18.3 which meant I could not submit the responses and move on to the next section. 

D – Audiences – was missing question D14.

C18.3 in particular made me see red. Do you know why? Because I reported the exact same error last year. ACE has had a whole year to fix it and it is still wrong! All of those answers then had to be submitted manually. That can’t be easy to manage at ACE’s end – accurately mapping answers against the relevant NPO submission. There are A LOT of submissions. Which brings me to my second direct question to ACE… Who proofs and/or tests your data collections before they are shared nationally?

Proofread!

For years, I was responsible for all marketing-related print at the venue I used to work for, whether it was something we had produced internally or something we had been sent for approval. As someone who used to proofread every day, I have the following bits of advice.

  1. Hit spell check on the document you want to proof / are sending to design.
  2. Proof it on screen – how does it look? Is there anything obviously wrong?
  3. Print it off.
  4. Print off the original text – i.e., the Word document or whatever file was used to create what you are proofing – and, this is crucial, compare it against what you are proofing.
  5. Get a colourful pen (I always liked red; it made me feel like a teacher).
  6. Physically tick off amends as they are made and keep the physical proof as a record.
  7. Proof it again. Is there anything you’ve missed? Repeat steps 5 and 6 as needed.
  8. On the final draft, make sure all previous amends are still in place, then give it another once over.
  9. Get someone else to give it a second look.
  10. Proof it one last time and sign off with confidence.

Why is all this important? Because ACE could have avoided these errors and issues had they simply spell-checked, proofed, and tested the documents before they were shared across the sector. These are simple, basic tasks that should always be completed for anything an organisation is rolling out. Not doing so shows a general lack of care, attention to detail and professionalism. Rolling out a national collection full of typos and inconsistencies gives the impression that it isn’t very important to ACE. They say they need this data to help advocate for the sector with government, but they don’t seem to care about collecting it rigorously. It doesn’t inspire confidence, and we risk it becoming a box-ticking exercise.

Illuminate

I was genuinely excited for the new data collection to be unveiled, especially as PricewaterhouseCoopers (PWC) were involved – they have a stellar reputation, and I was expecting great things. However, when the launch came, what the sector got was *another* data collection template that contained basic errors, and a reporting platform that would not be ready for another six weeks. The design of some of the questions also raised concerns. For example, the demographic questions did not map against the Census, making comparison with historical Audience Finder data and valuable openly available Census data difficult or, in some cases, impossible. It seemed nonsensical. And then there were the new requirements on sample size, which had jumped from 380 in Audience Finder days to 1,000 (if you were an organisation with an audience of 10,000 plus). That’s 620 additional responses. It is unrealistic and unnecessary.
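For context on why that jump feels disproportionate, here is a back-of-the-envelope sanity check – my own illustration, not ACE’s or PWC’s published methodology. The old 380 figure is roughly what the standard sample-size calculation gives you for a large audience at 95% confidence and a ±5% margin of error; raising the requirement to 1,000 only narrows that margin to around ±3%. A minimal Python sketch, assuming Cochran’s formula with a finite population correction:

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Cochran's sample-size formula with a finite population correction.

    population: size of the audience you are sampling from
    margin:     desired margin of error (0.05 = +/-5%)
    z:          z-score for the confidence level (1.96 = 95%)
    p:          assumed response proportion (0.5 is the most conservative)
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)        # adjust for the finite audience
    return math.ceil(n)

print(sample_size(10_000, margin=0.05))  # ~370 - close to the old Audience Finder target
print(sample_size(10_000, margin=0.03))  # ~965 - roughly 1,000 responses buys you +/-3%
```

In other words, the extra 620 responses buy a couple of percentage points of precision – at a very real cost in administration for every NPO collecting them.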

The whole thing was frustrating and made me angry. I immediately emailed ACE and began to share my thoughts on LinkedIn, tagging both ACE and PWC in each post I made on the subject. To their credit, ACE responded to my emails and explained their rationale behind certain decisions. They also took on board some of the requested changes to the mandatory question set and made them. 

What ACE didn’t do was communicate when these changes had been made. Yes, the template’s name was updated to include the date of the update, and they listed the changes at the end of the document, but I didn’t see any social media posts or emails alerting organisations to the changes. Maybe I’m being unfair and I’m just not on the right mailing list, but I heard everything through the grapevine. That meant organisations that were getting on with collecting data in an alternative platform, while they waited for the Illuminate platform to become available, were potentially collecting inconsistent data that might not be accepted in the first manual upload.

My concerns

In case it is helpful, I am going to share the concerns I have about the Illuminate collection and what issues I think it has raised. Whilst some of my points have already been addressed, I think it’s important to have them all listed in one place. I would have preferred my list to finish on a nice, round, even number like 30, but I have run out of steam so it’s 27. There might be things I have not spotted or considered yet. If you think something is missing, please do share it in the comments. It’s important that we learn from each other and keep documenting this.

Some of these points could be useful if you are planning to make a response to ACE, or, even better, a joint response where you team up with other organisations who share your concerns. It is so important that you are vocal on this. If we do not sort this out now, the sector is going to be left with a flawed data collection – and that is simply not good enough. It is going to set everyone back.

Before I start, it is important to acknowledge how difficult it is to design a data collection that suits everyone. In reality, you probably can’t, and I don’t envy ACE the task. However, I do think we – and by we, I mean the sector – should expect a higher standard than what we have been given. At the moment, the only two benefits I can see to the new collection are that the data is not going to be monetised and that you are not obligated to use the Illuminate platform.

In short

  1. My main gripe is that the collection was clearly not proofed or tested properly before it was released.
  2. It is not the handful of questions that you were promised. 
  3. The Illuminate platform was not ready at the time of launch.
  4. The sample sizes are unrealistic for most. ACE has since introduced a minimum target. Full details of sample sizes can be found here.
  5. The Illuminate platform does not have the capability to process email addresses. Thinking of including a consent-based data collection within your survey? You can’t. Want to incentivise your survey to increase responses? Sorry, no. It’s not helpful, is it?
  6. It brings an administrative burden. Currently, only a few ticketing systems integrate with Illuminate, meaning quarterly manual uploads for a lot of NPOs.
  7. The surveys also bring an administrative burden as you have to survey per event – you cannot have one survey that serves you across the year. Yes, you can create a template which you can use going forward, but you still need to repeat that step for each event.
  8. And if you’re surveying outside of the Illuminate platform, get ready for your quarterly uploads.
  9. Speaking of the survey – it is no longer anonymous. There are so many specific questions about the event itself, as well as postcode and demographic data, that you could likely map that information back to an individual.
  10. Also, not everyone has access to an iPad or tablet. Where is the paper version? 
  11. I know it must be difficult to design a collection that suits all NPOs, but it was missing the General Admission experience. That seems pretty fundamental.
  12. What about participant data? E.g., should a ten-week participation project be counted as one occurrence or ten? What if sessions are at different locations? Should the data be uploaded each quarter (if the project is on-going) or should you wait until it is completed? Clarity was needed. 
  13. Want to add a specific question to the survey? You must submit it to ACE, and it needs to be relevant to enough organisations to warrant inclusion in the question bank. If it doesn’t, tough luck.
  14. I’ll say what you’re all thinking – the demographic questions are intrusive and will likely not sit well with your audiences.  
  15. Not all the demographic questions map against the Census.
  16. The age question read 60-64 then 64-69. It was corrected to 65-69.
  17. In the ethnicity question, you could only be British if you were White. The full category labels were amended to Asian/Asian British, Black/Black British, and White/White British.
  18. Ethnicity was further amended to bring it more into line with the Census collection, with the Latin American full category being removed and instead included as a sub-category under ‘Any other ethnic group’ (a full category that was previously missing).
  19. Ethnicity was amended a third time – with ‘self-describe’ options within each sub-category being removed and an ‘Any other…background’ option being added to each sub-category. The format was also updated to reflect how data will need to be uploaded into the Illuminate platform.
  20. For the question “What is your country of residence (if not UK)?”, ACE provided a List of Countries csv file which is downloadable from their website. ‘United Kingdom of Great Britain and Northern Ireland’ has been included on the list as an answer option when it shouldn’t have been. It has now been removed.
  21. The sex and gender questions do not map against the Census. ACE designed these questions in consultation with the LGBTQ+ Foundation and they are used across all of their collections.
  22. In the socio-economic question, you can be long-term unemployed (i.e., more than a year), but if you’re short-term unemployed (i.e., less than a year), you are ‘Not Applicable’. Seems harsh.
  23. The experience question ‘What was the name of the activity/event that you attended?’ was amended to include the General Admission experience.
  24. The previous visits question ‘When did you last attend an [NPO] event?’ was amended to include the organisation as well, again to reflect General Admission experience.
  25. In the same question, the mathematical symbols ( > and < ) were replaced with an en dash ( – ).
  26. Also, for the record, ‘I have never visited before’ should not be the last option; it should be the first. It should also read ‘This is my first visit.’
  27. What segmentation model will replace Audience Spectrum? And when will it be available? For the record for those using Audience Spectrum – it hasn’t been retired and is still available.

In Summary

That’s a pretty long list, and yes some of the points have already been addressed or fixed, but all of these things should have been ironed out during the consultation, development and testing process before Illuminate was released as a mandatory national collection. ACE is a publicly funded organisation, and they need to do better. You deserve better. I think it is time for NPOs to hold ACE to account. Organisations have enough challenges in the current economic climate to deal with without having to bust their guts administering and delivering a flawed mandatory data collection. 

So, what can you do? 

  • If your CEO isn’t aware of the issues yet, get them up to speed.
  • Consider linking up with other organisations to form a joint response.
  • Speak to your Relationship Manager. If they can’t answer your questions, ask to be put in touch with the person at ACE who can. Please remember that not all Relationship Managers have been given the necessary information, so they may not be 100% up to speed.
  • If you are collecting surveys outside the Illuminate platform, consider adding some introductory text to the demographics section. In it, clearly state that ACE requires you to collect this data as part of your funding agreement. Put the onus on them.
  • If you get complaints, acknowledge them, reiterate that you are required to collect that data on behalf of ACE, and then put the complainant in direct touch with ACE.

Finally, the data protection person inside me can’t help but say…

  1. Make sure the Data Processing Agreement is signed and you know where it is.
  2. Update your Privacy Policy – specifically the section around data you share with third parties.
  3. Update your Records of Processing Activities (ROPA) documents to reflect the processing.

What do you think?

I honestly would love to know what you think. I get that people are a bit nervous about saying anything publicly, especially if they work at an NPO – everyone is scared of upsetting ACE. If that’s the case, please feel free to email me or direct message me on LinkedIn and I promise I will treat anything you say in confidence. For general updates, please follow my page, Kate Fitzgerald Consulting Limited, on LinkedIn.

Other resources recommended to me on this topic, subsequent to publishing:

Marge Ainsley, Consultant – How Much Do You Care?

Marge Ainsley, Consultant – Google Doc detailing Illuminate queries that have been raised with ACE and/or PWC.

2 responses to “We need to talk about Arts Council England”

  1. Daniel Bernstein

    Thanks Kate. Wow – 27 and I think I could add a few more! Interesting and important article that I hope gets wider readership (especially at ACE & PWC…)
    The main one I would add is that I think we will be distorting the data that is gathered towards ticket-buying audiences (from box offices)… who are not representative of the culture-consuming public (free outdoor events etc). We get live audiences of over 1 million/annum. The only advice we get from PWC is do 1-2-1 audience surveys or a QR code. The first is really time-consuming; QR codes assume you have a smartphone, data and the will to do it. We’ve found the responses are predominantly female and not representative of our audiences.

  2. We had planned on starting to use Illuminate surveys for our general visitors next week (on an iPad on-site). However, after many hours spent setting them up, I’ve just realised that the survey link only works once per device, which makes it completely unusable for us. Have raised the issue and requested an urgent fix!
