Authors: Judith F. Fynn, Karen Milton, Wendy Hardeman, Andy P. Jones
Abstract: The use of multi-agency partnerships, including research-practice partnerships, to facilitate the development, implementation and evaluation of public health interventions has expanded in recent years. However, gaps remain in the understanding of influences on partnership working, partnerships' capacity to facilitate and use evaluation, and the characteristics that lead to partnership effectiveness. We applied qualitative methods to explore the experiences of stakeholders who were involved in partnerships to deliver and evaluate a national physical activity programme. We combined thematic and network analysis, and drew on concepts of evaluation use, knowledge exchange and organisational systems to interpret our findings and develop a conceptual model of the relationships between partnership characteristics and processes. Our model identifies key partnership characteristics such as high levels of engagement, regular communication and continuity. Furthermore, it highlights the importance of implementing organisational structures and systems to support effective partnership working, knowledge exchange and capacity building.
Citation: Evaluation, Ahead of Print. PubDate: 2022-05-04. DOI: 10.1177/13563890221096178
Authors: Elliot Stern
First page: 139
Citation: Evaluation, Ahead of Print.
First page: 143
Citation: Evaluation, Ahead of Print.
Authors: Hans Bruyninckx
First page: 144
Citation: Evaluation, Ahead of Print.
Authors: Cobi Calyx, Summer May Finlay
First page: 150
Abstract: This article proposes improvements to an open framework for evaluating participatory science, including projects framed as citizen science. The original proposed framework, while valuable in its comprehensiveness, used problematic language that makes it unworkable in many international contexts. In countries like Australia, where Indigenous data sovereignty matters profoundly, language about ‘target groups’ and ‘easing access’ to knowledge can harmfully perpetuate colonial discourses. The original framework is sufficiently useful that it is worth constructively revising, so the critique in this article is aimed at the collaborative progression of an open framework more suitable for international use. As well as replacing ‘target groups’ with partnership approaches, we argue that ‘easing access’ to knowledge for exploitation is a frame that perpetuates the colonial doctrine of discovery, and we propose recovery as an alternative aligned with several international movements for social justice and sustainability.
Citation: Evaluation, Ahead of Print. PubDate: 2022-04-27. DOI: 10.1177/13563890221085996
Authors: Chris Bonell, Emily Warren, GJ Melendez-Torres
First page: 166
Abstract: We reflect on how qualitative research can be used to develop or refine theories about how the mechanisms triggered by intervention enactment might generate outcomes, referring to examples from a ‘realist trial’ of a whole-school health intervention. Qualitative research can explore mechanisms directly, by asking participants how they think interventions work, or indirectly, by exploring participants' experiences of intervention-related actions to understand the conditions and consequences of those actions. Both approaches can inform theorisation of how mechanisms are triggered and generate outcomes, and how this is contingent on context. We discuss methods for sampling, data collection and data analysis, and recommend dimensional analysis as a means of analysing qualitative data on mechanisms. We then consider how to draw on qualitative research to inform hypotheses to be tested statistically.
Citation: Evaluation, Ahead of Print. PubDate: 2022-04-02. DOI: 10.1177/13563890221086309
Authors: Marthe Hurteau, Caroline Gagnon
First page: 182
Abstract: Evaluators often find themselves on ‘rough ground’ as they try to do the right thing and do it well. They face unanticipated ethical issues requiring decisions that are subtle and nuanced, and they do not always find the expected guidance in current ethics guidelines. In this article, the authors offer practical wisdom as an alternative to the deontological approach. First, we develop this complex concept and illustrate its contribution to addressing and resolving ethical issues. Second, we present a study, and the model that emerged from it, to illustrate how practical wisdom contributes to finding solutions to ethical dilemmas and taking the necessary action.
Citation: Evaluation, Ahead of Print. PubDate: 2022-04-01. DOI: 10.1177/13563890221086306
Authors: Seweryn Krupnik, Maciej Koniewski
First page: 192
Abstract: Qualitative comparative analysis is increasingly popular as a methodological option in the evaluator's toolkit. However, evaluators who want to apply it face inconsistent suggestions regarding the choice of the ‘solution term’. These inconsistent suggestions reflect an ongoing, broad debate between proponents of two approaches to qualitative comparative analysis: the first focuses on substantial interpretability, the second on redundancy-free results. We offer three questions to guide the choice of a solution term in the context of multi-method impact evaluation research, relating to the intended use of the findings, the goals of the analysis and the regularity theory of causality. Finally, we illustrate the guidelines through three potential applications of qualitative comparative analysis. The guiding questions would almost always lead to choosing the substantial-interpretability approach; however, the redundancy-free approach should not be disregarded. Whatever the choice, researchers should be aware of the assumptions on which each approach is based and the risks involved.
Citation: Evaluation, Ahead of Print. PubDate: 2022-04-19. DOI: 10.1177/13563890221088015
Authors: Matilda Miljand, Katarina Eckerberg
First page: 210
Abstract: There is a demand for scientific knowledge to inform decisions in environmental policy. This study examines expectations of knowledge use, and how knowledge stemming from systematic reviews (SRs) is used, through an analytical framework that distinguishes between instrumental, conceptual and legitimising evaluation use, as well as between process and product use. Empirically, we investigate knowledge generated from six SRs conducted through the Mistra Council for Evidence-based Environmental Management, from the perspectives of those carrying out the SRs and their targeted stakeholders. Our study reveals ways in which SRs are used, along with some characteristics that improve their usefulness and some that hamper it. While the systematic method and the comprehensiveness of the SRs contribute positively to their usefulness, we found that the SRs produced were simultaneously too focused (lacking multiple perspectives) and too general (providing evidence on the effects of an intervention only at a general level), thereby restricting their usefulness. The time and resources it takes to produce an SR can also affect its usefulness compared with a traditional review.
Citation: Evaluation, Ahead of Print. PubDate: 2022-02-26. DOI: 10.1177/13563890221076540
Authors: Sarah D. Klier, Raphael J. Nawrotzki, Nataly Salas-Rodríguez, Sven Harten, Charles B. Keating, Polinpapilinho F. Katina
First page: 231
Abstract: While “systemic thinking” is popular in the context of capacity development and evaluation, there is currently a lack of understanding of the benefits of employing systems theory in evaluation capacity development. Systems theory provides a useful orientation to the work involved in complex systems (e.g. national evaluation systems). This article illustrates how evaluation capacity development practitioners can use systems theory as a conceptual tool to gain a better understanding of the functional aspects and interrelationships present within a given evaluation system. Specifically, the systems theory perspective can help elucidate the reasons for the success or failure of a given evaluation capacity development program or activity. With the goal of motivating evaluation capacity development practitioners to use systems theory in their work, this article presents a systems theory framework for evaluation capacity development and offers practical examples of how it can be adopted.
Citation: Evaluation, Ahead of Print. PubDate: 2022-04-05. DOI: 10.1177/13563890221088871
First page: 252
Citation: Evaluation, Ahead of Print.
First page: 255
Citation: Evaluation, Ahead of Print.