Our recent online seminar for trustees on diversity, equity and inclusion in evaluation, held in partnership with the Clothworkers’ Company, explored what ‘good evidence’ looks like, the barriers charities are likely to face in their existing evaluation practices, and the practical changes they can make.

The event was chaired by NPC’s Amina Ali, Principal for Data & Learning, who was joined by three speakers: Lisa Raftery, Trustee of the Rosa Fund for Women and Head of Grants at Social Investment Business; Sanjukta Moorthy, Director and Planning, Measurement, Evaluation and Learning Consultant at The SMC Group; and Tony McKenzie, Trustee of Groundswell and Reconnection Tour Manager at Engage Britain.

The scope and purpose of evaluation have been evolving over the years, but how evidence is gathered and used is less often addressed. This matters from a diversity, equity and inclusion (DEI) perspective because evaluation done badly can be exclusionary and further disadvantage marginalised groups.

Ideally, evaluation should be culturally responsive and participatory, with a focus on equity and an awareness of systemic biases and inequalities. In Britain, this includes examining the sector and organisations’ roles and histories, particularly their complicity in centuries of colonial exploitation. It is important for us to understand how deeply embedded this history is, to call it out when we see it, and to sit with the discomfort. For this to happen, there needs to be a culture shift within organisations and a rethinking of the principles behind each piece of work.

Trustees can promote DEI by acting as thought partners, encouraging deeper critical thinking about your organisation’s roles, history, practices, and approaches to data, measurement and evaluation. What’s more, people in an organisation need to be able to talk freely and challenge the status quo. We hope this blog can stimulate discussion about why DEI approaches are important to evaluation practice.

Relationships with your communities

It’s important to ‘work with’, rather than ‘do things for’, people. Organisations should make their communities part of the decision-making process, rather than paternalistically handing down plans from on high. Evaluation will be more effective if it has buy-in from the people involved in the services. This can begin at the programme design phase, where communities are central to the framing of a project and determine what success would look like to them. This shift in accountability is not only respectful of your communities, it is also likely to make programmes more effective, impactful, and relevant.

Understanding the different layers of people’s identities will give you a better sense of what is appropriate for an evaluation to measure. For example, in certain communities, publishing data about the gender of participants could be a threat to their security due to prevailing attitudes towards women and a culture of victim-blaming.

Evaluation is not solely about setting the direction for people at the top, including funders, but about meeting the needs and answering the questions of people in these communities. Relationship building in communities is important and takes time, and Tony McKenzie notes that funders may have to be patient if they are eager to see tangible results as soon as possible.

Centring lived experience

Lived experience is ‘the experience(s) of people on whom a social issue, or combination of issues, has had a direct impact’.

As discussed at our recent event, people with lived experience are well-placed to inform solutions because they understand the problems you are trying to tackle. It’s possible to design a service and conduct an evaluation, but then receive responses that are hard to interpret (from mixed feelings to people not using the service)—and that might be because of a lack of lived experience in the service design or in the setting of questions in the evaluation.

Make sure the people you’re co-producing and evaluating with have the support they need, because lived experience is something that stays with a person; indeed, they may still be living through difficult experiences. To take this into account, there are three main areas to consider: the preparation (how you bring people in); the evaluation process itself; and what you do afterwards (what the aftercare is like). Don’t make people regurgitate their trauma unnecessarily; make sure they’re looked after; and show them that their contribution is valued.

You should also talk to grassroots organisations who are closer to the issues if you’re not close to them yourself. This is what Rosa did for their Covid-19 Emergency Fund for BME Women’s Organisations; Rosa went out and spoke to the grassroots women’s organisations it supports, which led to stronger insight into the needs of Black and minoritised women during the pandemic.

It’s a good idea to consider the language and tools you use and how they are understood by participants. Is your language empowering or disempowering? For example, calling people ‘beneficiaries’ or ‘service users’ may be too clinical or demeaning. It’s important to be mindful of the power dynamics at play while evaluators are collecting data from participants.

On a related point, you need to be specific about the demographic groups you’re trying to reach and how people self-identify—if you’re using inaccurate terminology, it can put people off getting involved in your evaluation or even offend them.

The practical side of evaluation practice

The questions you actually ask in your evaluation can make a huge difference, as some issues might only become visible if you ask a particular question. For example, when Lisa Raftery first started working for Homeless Link, statistics on rough sleeping were gender-blind, meaning there were no figures on how many women in particular were sleeping rough. This became a problem when organisations found growing numbers of women with complex needs coming through their services and didn’t feel equipped to help them. When problems described anecdotally are backed up with data, organisations have concrete knowledge they can use to better equip themselves for future problems.

During the data collection stage, try to minimise overwhelm; work out which data is essential to gather and which is nice to have. To avoid interfering too much with people’s days, Sanjukta Moorthy recommends minimising site visits and trying to do as many things as possible in one trip. When visiting your communities to take photos or videos for social media, ensure that you are travelling with an ethical photographer and that the images are respectful of the people. Telling their stories using their own words (as quotes) helps demonstrate your respect for them, and ensures that communities are not seen as promotional products for your programmes.

You may wish to invest in database software to assist with your analysis; Lisa shared that having a database makes it easier to capture the data, run reports and start to see patterns. You can break down the data in order to make (in)equality more visible. For example, when assessing a funding stream, you could find out how many applicant organisations were minority-led and how many of these applications were successful.

When you find gaps in your data—for example, minority-led charities being less likely to reach a certain stage in the funding application process—interrogate them. Ask why this gap is happening and what makes this situation different to the others. The more you do it, the more you get used to it and the more you stay alert to possible inequalities. Crucially here, openness between the board and the team is essential for being able to talk about and address these gaps.

Next steps

NPC is partnering with the Charity Evaluation Working Group and the Social Investment Consultancy to see how the sector can advance more equitable evaluation practices. The working title for this partnership is the Equitable Evaluation Collective (EEC). Through the facilitative work of the EEC, we’re aiming to influence, generate energy and momentum, and build a community of practice around this work. If you’re interested in learning more, you can fill out this Google Form.