Improving lives through data insights and evidence-based approaches

At Maudsley Learning, we are committed to advancing mental health education and practice through our expertise and dedication to research. Our multidisciplinary team of clinical, psychology, technical, and education experts shares a commitment to engaging with and advancing evidence-based practice to improve mental health and wellbeing for all.

Our in-house research and development team brings together mental health and research expertise to deliver on this commitment, both through our established training evaluation processes and through innovative research in this field. The team has built a strong reputation through the successful publication of applied research, underpinned by multidisciplinary approaches and theory that mirror the research-practitioner roles that make up the team. We are motivated to deliver research that is innovative, creative, and collaborative, with the goal of sharing the world-leading expertise held within our team, organisation, and partnerships to improve the lives of the clients and communities we serve.

Theoretical background

Our research and evaluation methodologies are informed by a combination of biopsychosocial-spiritual, educational, and organisational psychology theories. We integrate mixed-methods approaches with pragmatic research design to explore the complex nature of healthcare systems and the diverse lived experiences of mental health and wellbeing across varying contexts.

Our methods and approaches remain flexible and adaptable, tailored to the specific setting of each research and evaluation project. Our overarching approach to research and evaluation aims to examine the underlying mechanisms behind specific outcomes, seeking to understand which interventions are effective, for whom, how, why, and in what context (Pawson & Tilley, 1997), whilst simultaneously considering economics and return on investment.

At the core of our approach lies Nielsen and Abildgaard's (2013) Process and Effect Evaluation Framework. This framework emphasises the need to consider both contextual factors, particularly the intended design of the program, and the program's effects, including how these effects manifest in different contexts. It encompasses process and effect elements, exploring the strengths and limitations of a program, its perceived benefits, and how these relate to the unique context in which the program operates.

We also draw upon Parlett and Hamilton's (1972) illuminative evaluation framework, which is particularly suited to gleaning insights and evidence from learning and development processes such as staff engagement and consultation, employing appreciative inquiry (Srivastva & Cooperrider, 1987) and illuminative evaluation techniques.

The third model guiding our evaluation approach is the 'clear box' model from Scriven's (1994) three-box evaluation model. This model makes explicit the numerous processes and contextual factors that may influence the implementation, experience, and success of an educational program, seeking to understand how and why the training was effective, for whom, and in what setting.

Lastly, anchored in classical education evaluation theory, our approach is shaped by Kirkpatrick's four levels of training evaluation (Kirkpatrick & Kirkpatrick, 2006). This model offers a comprehensive framework for assessing training programs. The Reaction level gauges participants' immediate thoughts and feelings towards the training, offering insights into their engagement and motivation. At the Learning level, the focus shifts to assessing knowledge acquisition and skill development through various assessments. The Behaviour level evaluates whether participants apply learned skills in real-world settings, which is crucial for practical effectiveness. Finally, the Results level measures the broader impact of the training on organisational goals, such as increased productivity or improved quality of work, providing an overarching view of the training's contribution to organisational success.

Grounded in these theoretical frameworks, our methodological approach aims to deliver research and evaluation of a high standard: highlighting operational dynamics and their impact, generating recommendations for future program development, and identifying key areas for further research.

References

Kirkpatrick, D., & Kirkpatrick, J. (2006). Evaluating training programs: The four levels. Berrett-Koehler Publishers.

Nielsen, K., & Abildgaard, J. S. (2013). Organizational interventions: A research-based framework for the evaluation of both process and effects. Work & Stress, 27(3), 278-297.

Parlett, M., & Hamilton, D. (1972). Evaluation as illumination: A new approach to the study of innovatory programs. Occasional Paper.

Pawson, R., & Tilley, N. (1997). An introduction to scientific realist evaluation. In Evaluation for the 21st century: A handbook (pp. 405-418).

Srivastva, S., & Cooperrider, D. L. (1987). Appreciative inquiry into organizational life. Research in Organizational Change and Development, 1(1), 129-169.