
6. Analysing and using data

Making sense of the data you have collected, finding patterns and themes, so you can learn and improve.

In our guide to Understanding impact, we explore how to use your theory of change to build a measurement and evaluation framework. In this closer look we explain how to stay on top of data entry, analysis and reporting.

Data entry

You will need a system for entering questionnaire data into a spreadsheet. If you’ve used a paper survey, this will probably have to be done by hand. Keeping your questionnaire short and clearly laid out will help the data entry process. It’s a good idea to build your database in advance and to test your draft questionnaire with the person assigned to data entry. Remember to factor data entry time into your evaluation planning.

For most charities, Excel is the best tool for entering, storing and analysing data. It is the easiest to use, most widely available and most familiar. Excel can do what most charities need in terms of analysis. Over time, you may find that you have more advanced analysis needs and will need more sophisticated software such as SPSS, SAS or R. These are harder to use so we’d encourage you to start with Excel.

Think carefully about how you set up your database. Each respondent should be a row and each variable a column, with the variable name in the top row of each column. Ordinal questions and single-code nominal questions can each be recorded in a single column, which makes them easier to analyse, but nominal ‘multiple choice’ questions will need a separate column for each possible answer.
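
The sketch below illustrates that layout in Python with the pandas library, purely as an illustration; the column and option names are hypothetical, and the same structure applies in an Excel sheet: one row per respondent, one column per ordinal or single-code question, and a separate 0/1 column for each option of a multiple-choice question.

    # A minimal sketch of the recommended layout. Column names are hypothetical.
    # Each row is one respondent; each column is one variable.
    import pandas as pd

    survey = pd.DataFrame({
        "respondent_id": [1, 2, 3],                # unique reference number per respondent
        "age_group": ["18-24", "25-34", "18-24"],  # single-code nominal question: one column
        "satisfaction": [4, 5, 3],                 # ordinal question (1-5 scale): one column
        # Multiple-choice question "Which services did you use?": one column per option
        "used_advice": [1, 0, 1],
        "used_training": [0, 1, 1],
        "used_mentoring": [1, 1, 0],
    })

    print(survey)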

Use Excel’s pivot table function to summarise data, filter different subgroups, conduct cross-break analysis and create charts, without affecting the underlying spreadsheet. Microsoft provides plenty of tips on how to do this and what else you can do with Excel.
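
If you later move your analysis into a scripting tool, the same cross-break idea carries over. Below is a minimal sketch in Python with pandas, assuming a hypothetical export file ‘survey_responses.xlsx’ laid out as described above; pandas’ pivot_table plays the same role as Excel’s pivot tables.

    # A minimal sketch of a cross-break outside Excel. The file name and
    # column names are hypothetical assumptions, not part of the guide.
    import pandas as pd

    survey = pd.read_excel("survey_responses.xlsx")

    # Average satisfaction by age group, split by whether the advice service was used.
    summary = survey.pivot_table(
        values="satisfaction",
        index="age_group",
        columns="used_advice",
        aggfunc="mean",
    )
    print(summary)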

From our experience, we’d recommend that you:

  • Never split or merge cells in your database. This can damage it irreparably.
  • Keep a master database of the raw data only. Do your analysis and data manipulation in a copy of that database. This means that if you damage the data you can always go back to the original.
  • Give each respondent a unique reference number and take any personal data out of your analysis spreadsheet to be stored separately and securely.
  • Use Excel’s data validation feature to create drop-down lists in the spreadsheet; this will make data entry quicker and reduce errors.
  • Consider storing open-ended response data separately as this can make a database unwieldy. Remember to keep the respondent’s unique reference number associated with each response so that you can merge it back in later if you need to (see the sketch below).
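
If you do keep open-ended responses in a separate sheet, the unique reference number is what lets you join them back to the main data later. A minimal sketch in Python with pandas, using hypothetical file and column names:

    # A minimal sketch of merging open-ended responses back into the main data
    # by the respondent's unique reference number. File and column names are hypothetical.
    import pandas as pd

    closed = pd.read_excel("survey_responses.xlsx")           # one row per respondent
    open_ended = pd.read_excel("open_ended_responses.xlsx")   # respondent_id plus free-text answers

    combined = closed.merge(open_ended, on="respondent_id", how="left")
    print(combined.head())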

Data analysis

Data analysis is the process of making sense of the data you have collected: examining it to find patterns and themes, and drawing conclusions about what it is telling you, so you can learn and improve.

We’ve found that many charities feel pushed to over-claim and say that they have found definitive ‘proof’ that their intervention is effective. In reality, evidence can rarely provide a completely definitive answer to our questions. By its very nature, evidence is ‘partial, provisional and conditional’. It may only be relevant to a particular context or time, however rigorous the methodology or well thought out the design.

How to analyse your data

Data analysis should start with what you need to know. Go back to your priority areas and evaluation questions, and work your way through the five types of data framework. Your main aim should be to compare the data you have collected against your theory of change. This is sometimes called contribution analysis.

The key questions to consider when reviewing your evidence are:

  • Assess congruence: Do the results match the theory?
  • Disaggregate results: Look at patterns of outcomes among different users. It can be particularly valuable to compare different cohorts (those who fully engage and those who dip in and out), different sequencing of your interventions, and ‘dosage’ (the difference between those who meet a key worker weekly or monthly). Try to think beyond ‘what works?’ and ask ‘what works for whom in what circumstances?’.
  • Consider other explanations: Could something else apart from your project be causing the change?
  • Ask participants and facilitators: What are their interpretations of the results?
  • Make counterfactual comparisons: Try to estimate what would have happened if the activity had not been run.
  • Analyse costs and benefits: All evaluations should show how much a project has cost.
  • Consider what you’ve learned: What advice would you give to someone delivering a similar intervention?

Remember the difference between attribution and contribution. Your work exists within a wider system and other factors will contribute to any changes you see. You should ask yourself who and what else is involved in supporting this person or community. Consider their contribution and challenge yourself on how much of the change you see is down to you or whether there are other plausible explanations.

How do your results compare with similar activities or organisations?

Compare your results with others to understand how you are doing in relation to your peers. This can offer insight into the success you can expect from your work and suggest ways you might improve.


Reporting

Your reporting should be impartial and transparent. Evaluation is not intended to justify projects; rather, it should be an impartial analysis of a project’s strengths and weaknesses. We’re increasingly seeing funders who are more interested in whether projects have been properly assessed than in what the actual results say. Therefore, if your evaluation is going to be credible, you will need to show that you have been impartial in:

  • Designing your research: Your theory of change will help with this.
  • Collecting evidence: Credibility here comes from the quality of your evidence collection and from combining a wide range of sources.
  • Analysing evidence: This comes from the quality and rigour of your write-up. You should give a full description of your methods and include relevant data in your appendices.

Don’t be afraid to be honest about failure. Although it may be difficult, learning from things that have not worked is the best way to improve services. Furthermore, if we are not sure a service will work, it is better to implement it on a small scale and learn quickly to minimise cost.

 

Read our full guide to using your theory of change to build a measurement and evaluation framework.
