Monday, July 25 • 2:30pm - 3:00pm
Developing a Systemic Framework for Evaluation Models and their Applications

The following paper presents the development of a systemic framework for the classification of evaluation models, based on the reflective process that takes place when selecting an evaluation model and on the study of processes of marginalization. For this purpose, several classifications proposed by various authors for systemic methodologies are taken into account.

We should begin by stressing the importance of the concept of assessment or evaluation: it allows us to make judgments about the performance of organizations, projects, programs, staff, and activities at different levels, enabling the implementation of actions that reduce the gap between a system's current state and its desired state. These activities not only seek gap reduction but are also oriented toward the sustainability of processes and human groups through the achievement of best practices that bring long-term benefits.

When selecting an evaluation model, the evaluator usually relies on its best-known features, such as the methods it uses, the research questions it follows, and the kinds of problems it can target. However, because evaluation is entirely based on judgments, each evaluation model necessarily carries a set of underlying values that are rarely taken into account and that should be aligned not only with the purpose for which the evaluation is done but also with the moral characterization of the problems it tackles. This judgmental nature implies that any judgment must rest on a set of guiding principles, standards, or ideals that determine the position of the evaluated object with respect to those values. The individual, in this case the evaluator, must carry out a reflective process to establish this set of elements.
For this reason, this paper describes the development of a systemic framework that seeks to classify the various models for evaluating projects, policies, and programs according to the values underlying each of them, considering their deontological and methodological bases. In this paper, deontology comprises the ethics and principles underlying the evaluation profession and, specifically, the evaluation process conducted, while methodology is seen as the basis that validates a set of procedures and tools. For the development of this framework, we took into account the frameworks for the classification of systemic methodologies proposed by authors such as Banathy and Burrell & Morgan, as well as the theory of "knowledge-constitutive interests" proposed by Jürgen Habermas and the classification of a problem's context. Such a classification allows the individual conducting the evaluation to select an appropriate and accurate methodology in accordance with the purpose for which the assessment will be carried out.
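To make the idea of classifying evaluation models along deontological and methodological dimensions concrete, the sketch below shows one way such a framework could be represented as a data structure. The model names, category labels, and catalogue entries are illustrative assumptions, not the classification actually developed in the paper; only the dimension names (Habermas's knowledge-constitutive interests, deontology, methodology) come from the abstract.

```python
from dataclasses import dataclass

@dataclass
class EvaluationModel:
    """One entry in a hypothetical catalogue of evaluation models.

    The category values below are assumptions for illustration only.
    """
    name: str
    interest: str      # Habermas: "technical", "practical", or "emancipatory"
    deontology: str    # guiding ethical stance assumed for the model
    methodology: str   # methodological basis assumed for the model

# Illustrative catalogue; the real framework's entries would come from
# the paper's analysis of existing evaluation models.
CATALOGUE = [
    EvaluationModel("goal-based", "technical",
                    "accountability", "quantitative"),
    EvaluationModel("responsive", "practical",
                    "stakeholder dialogue", "qualitative"),
    EvaluationModel("empowerment", "emancipatory",
                    "inclusion of marginalized groups", "participatory"),
]

def select_models(interest: str) -> list:
    """Return catalogued models aligned with a knowledge-constitutive interest."""
    return [m.name for m in CATALOGUE if m.interest == interest]
```

An evaluator reflecting on the purpose of an assessment could then filter the catalogue, e.g. `select_models("emancipatory")` to surface models whose underlying values fit an emancipatory purpose.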

Chairs

Jennifer Wilby

Vice-President Administration, International Society for the System Sciences
Vice President Administration (2011-2016), Trustee and Vice President (2008/9) of the International Society for the Systems Sciences. SIG Chair: Critical Systems Thinking and Practice. Jennifer Wilby is an emeritus senior researcher in management systems and sciences in The Business School, University of Hull. Her research interests include developing systems resilience and flexibility in the management of complex systems...

Monday July 25, 2016 2:30pm - 3:00pm
ECCR 1B55