Data quality assessment to support initiatives focussed on youth most at risk

Client: Ministry of Business, Innovation and Employment

The context

High rates of young people not in employment, education or training (NEET) have become a concern internationally. Research suggests disengaged youth are more likely to experience social exclusion, increased psychological distress, and poor quality of life,[1] undermining individuals’ ability to lead fulfilling lives with purpose, balance and meaning. Beyond wellbeing, the costs[2] related to NEETs are significant:[3] for instance, the estimated per capita cost of each NEET rangatahi in New Zealand was $26,847 over the 2012–2015 period. Investing in NEET rangatahi could benefit employers by enabling them to access prospective employees who match their needs, while enabling young people to find jobs, train for better jobs, improve their incomes, support their whānau, and engage with and contribute to their communities.

He Poutama Rangatahi/Youth Employment Pathways (HPR) Strategy was launched in January 2018 as an investment in people (not programmes). The Strategy focuses on a sub-group of young people, in particular those 15- to 24-year-olds currently NEET who face the greatest challenges in gaining sustained employment.

The HPR monitoring and evaluation plan focused on addressing the Strategy’s accountability requirements over time. Knowing that achieving and evaluating[5] the goal of sustained employment would take time, the focus was first to monitor quality and progress across the individual initiatives and rangatahi over the initial 36 months. A monitoring approach and reporting system were established to estimate progress systematically across HPR providers against the expected achievements – providers first target and enrol those most at risk, then engage and support these individuals, and ultimately help them make progress on the pathway towards sustained employment. Specific measures and reporting tools were created to help MBIE and the providers delivering the initiatives monitor this progress along a theoretical pathway.

The project

To guide effective decision making, government required data that represent progress accurately, providing useful and up-to-date information about HPR providers and rangatahi. Providers collect and report statistics to MBIE quarterly. Data quality was assessed according to five quality dimensions – completeness, consistency, timeliness, validity, and uniqueness – for each dataset supplied to MBIE by the different providers. The strengths and weaknesses of the different datasets, as well as an overall quality score, were summarised so Ministry officials could understand the quality of the data and, ultimately, how they could use these data. The quality review was also used to determine whether other activities were required to improve data quality and reporting among the providers.
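As a sketch of how such a review can be operationalised, each dimension can be scored as the fraction of records that pass its check. The field names, status values, and reporting deadline below are illustrative assumptions, not MBIE’s actual reporting schema:

```python
from datetime import date

# Illustrative quarterly provider records (hypothetical schema, not MBIE's).
records = [
    {"id": "R001", "age": 19, "status": "enrolled", "reported": date(2019, 4, 2)},
    {"id": "R002", "age": 23, "status": "engaged",  "reported": date(2019, 4, 2)},
    {"id": "R002", "age": 23, "status": "engaged",  "reported": date(2019, 4, 2)},   # duplicate row
    {"id": "R003", "age": None, "status": "employed", "reported": date(2019, 5, 20)},  # missing age, late
]

DEADLINE = date(2019, 4, 30)          # assumed quarterly reporting deadline
VALID_STATUSES = {"enrolled", "engaged", "employed"}

def quality_scores(rows):
    """Score a dataset on each quality dimension as the share of rows that pass."""
    n = len(rows)
    return {
        # Completeness: every field is populated.
        "completeness": sum(all(v is not None for v in r.values()) for r in rows) / n,
        # Uniqueness: no duplicate participant identifiers.
        "uniqueness": len({r["id"] for r in rows}) / n,
        # Validity: status comes from the agreed set of codes.
        "validity": sum(r["status"] in VALID_STATUSES for r in rows) / n,
        # Timeliness: the record was reported by the quarterly deadline.
        "timeliness": sum(r["reported"] <= DEADLINE for r in rows) / n,
        # Consistency: HPR targets 15- to 24-year-olds, so age must fall in range.
        "consistency": sum(r["age"] is not None and 15 <= r["age"] <= 24 for r in rows) / n,
    }

scores = quality_scores(records)
overall = sum(scores.values()) / len(scores)   # simple unweighted overall score
```

An overall score like this is one simple way to combine the dimensions; in practice the weighting would reflect which dimensions matter most for the intended use of the data.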

[1] Stain, H., et al. (2019). Study protocol: a randomised controlled trial of a telephone delivered social wellbeing and engaged living (SWEL) psychological intervention for disengaged youth. BMC Psychiatry, 19(1), 136.

[2] The 2016 OECD report (Society at a Glance) defines NEET costs as the gross labour income NEET could command if they were employed, measured as the gross labour cost (including social security contributions).

[3] Estimated loss of income totalled between 0.9% – 1.5% of the OECD GDP. OECD (2016). Society at a Glance 2016: OECD Social Indicators. Paris: OECD Publishing.

[4] McGirr, M. (2019). Not just about NEETs: A rapid review of evidence on what works for youth at risk of limited employment. Ministry of Education.

[5] The impact evaluation was designed to estimate what was achieved for local youth and also whether this achievement would have occurred without HPR support. The difference the activities made would represent the value-add of HPR; the evaluation would use Statistics New Zealand’s Integrated Data Infrastructure for this purpose.


Evaluation of the Alert programme pilot

The context

The Alert Program® (ALERT) was developed by occupational therapists, Williams and Shellenberger, in the United States. ALERT promotes a shared language to communicate arousal levels in children, and provides tools and strategies to strengthen their self-regulation skills. The Ministries of Health and Education worked together with two primary schools in the Wellington region to pilot ALERT during Terms 3 and 4 of the 2019 school year. The pilot adapted ALERT as a school-wide approach and provided school staff with training and support. As staff learn and embed the skills, students are expected to improve their ability to self-regulate, and both teachers and students to improve their overall wellbeing.

The evaluation

Standard of Proof evaluated the pilot ALERT programme and assessed its effectiveness in the New Zealand context. We used a mixed-methods approach, pulling together and then synthesising different types of data from various sources. We also used the pilot as an opportunity to test evaluation measures, data collection approaches and the overall systems required for an eventual impact evaluation, while increasing the evidence base over time.

We thoroughly enjoyed working closely with the project working group and the pilot schools, and are proud to have delivered purposeful evidence that is informing decisions on future investments in the programme.

Improving the Health Promoting Schools self-assessment tool

Client: Ministry of Health

Evaluation is about finding out what works well and what doesn’t. An effective evaluation will help you celebrate your successes and improve your programme. This begins with collecting the right data and providing accurate information – and that means having the right measurement tool.

In collaboration with our partners Cognition Education, Standard of Proof tested whether the Health Promoting Schools (HPS) school self-assessment tool was providing accurate indications of school health and wellbeing so that the school workforce can trust the results and make decisions about the next steps in their delivery approach.

We applied the Rasch measurement model to the evidence collected from the self-assessment tool. This technique provided a robust assessment of the tool and identified useful modifications to the self-assessment – making sure it provides reliable evidence.

The analysis examined:

  • Does the assessment tool have clear questions and response categories so it can provide reliable measures of all groups?
  • Are the range of questions and response categories both easy enough to accommodate schools with low ability, and hard enough to accommodate schools with high ability?
  • Are the scaled response categories able to provide accurate measures of growth over time?
  • Does the overall assessment tool provide a single measure of school health and wellbeing, which can reliably test any relationship with student achievement?
The results were presented at a national hui with the HPS workforce, and the findings are being used by the Working Group to further improve the value of the self-assessment tool for school communities.
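The intuition behind these Rasch checks can be sketched in a few lines of Python. The item difficulties below are hypothetical illustrations, not the actual HPS items: in the Rasch model, the probability of endorsing an item depends only on the gap between a school’s ability and the item’s difficulty, and a well-spread set of difficulties yields a total score that rises monotonically with ability – a single measure suitable for tracking growth:

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability that a school at ability theta
    endorses an item of difficulty b (both expressed in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical difficulties spread to cover low- and high-ability schools.
item_difficulties = [-2.0, -1.0, 0.0, 1.0, 2.0]

def expected_score(theta):
    """Test characteristic curve: expected total score at ability theta.
    It is monotonic in theta, so the total acts as a single measure of growth."""
    return sum(rasch_p(theta, b) for b in item_difficulties)

# When ability exactly matches difficulty, endorsement probability is 0.5 -
# items "targeted" at a school's level are the most informative about it.
p_matched = rasch_p(0.0, 0.0)

# A school one logit higher is more likely to endorse every item,
# so its expected total score is strictly higher.
growth = expected_score(1.0) - expected_score(0.0)
```

In a real Rasch analysis, item difficulties and response-category thresholds are estimated from the data, and misfitting questions or disordered categories flag the kinds of modifications described above.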

Measures for the He Poutama Rangatahi initiative

Client: Ministry of Business, Innovation and Employment

Psychometric analysis ensures young people can get the support they need

Standard of Proof helped a government agency develop a questionnaire to help organisations understand the strengths, needs and progress of each young person in sustaining employment over time. The factors that could help these organisations identify needs and progress towards their long-term goal – sustained employment – included aspects of individuals’ foundational needs, employability and opportunity.

The intended use of the questionnaire highlights the importance of providing accurate information. If the data misrepresent a young person’s ability to sustain work, inappropriate decisions might be made that could have negative effects on the person. For example, the young person may be placed into work too early, before they are able to sustain a job, which may have a lasting negative impact on their future employment opportunities. Inaccurate information may be worse than having no information at all.

Standard of Proof, with the support of the University of Western Australia, applied the Rasch unidimensional measurement model to determine if the organisations and young people could use and interpret the data as intended, and if government organisations could trust the results. Of even greater importance at this early stage, the technique produced diagnostic information necessary to improve the questionnaire, making it easier for young people to respond to and providing information that everyone can rely on in order to inform the ‘right’ next steps.

Good measurement ensures an efficient approach and accurate data

Client: Ministry of Education


Standard of Proof recently provided support for a national randomised trial in New Zealand. Interestingly, in this case the evaluation randomly allocated “clusters” of people (rather than individuals) into control and treatment groups. Why is this important? Because your sample size needs to be significantly larger in such evaluation designs.

Let’s take an example. A class includes a group of students (ie, a “cluster”). If your evaluation and activity are being delivered across multiple schools, how can you randomise students easily and efficiently? One approach more common in clinical trials is to embed a clustered design element in your evaluation; in this example, we would randomise by class – our “cluster” – rather than by student. Although this may be a simpler allocation approach, it may not be as efficient for data collection and analysis.

Clustered data have a pre-existing group structure. Using this class example again, we would reasonably assume that children’s achievement scores are in some way influenced by their cluster (i.e., the teacher, their class, the characteristics of having enrolled in this specific school, etc.). This “relatedness” within the cluster has a significant effect on your sample size requirements. If measuring progress at the student level, what may have been a study requiring 200 participants may now require 2000.
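This inflation is captured by the standard design effect for cluster-randomised trials, 1 + (m − 1)ρ, where m is the cluster size and ρ is the intraclass correlation (the “relatedness” above). The numbers below are purely illustrative, chosen to show how a requirement of 200 can become 2000; they are not values from the trial described:

```python
def design_effect(cluster_size, icc):
    """Inflation factor of a cluster-randomised design over individual
    randomisation: deff = 1 + (m - 1) * rho."""
    return 1 + (cluster_size - 1) * icc

n_individual = 200           # sample needed if students were randomised individually
m, rho = 31, 0.30            # illustrative: classes of 31, intraclass correlation 0.30
deff = design_effect(m, rho)           # 1 + 30 * 0.30 = 10
n_clustered = n_individual * deff      # ten times as many students are needed
```

Even modest intraclass correlations inflate the required sample considerably when clusters are large, which is why these considerations need to be settled at the design stage, before data collection begins.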

Such considerations are useful to know up-front. Given the right balance, you can certainly design an efficient and practical randomised trial.