Saturday, November 19, 2011

Quality

Reading Cooksy & Caracelli (2005) on Quality, Context, and Use: Issues in Achieving the Goals of Metaevaluation.

There are many defined standards of quality, and they basically relate to the evaluation models used in a study. Each model has a number of criteria against which a study can be evaluated, and therefore any metaevaluation must take these quality standards into account.

That brings me to a question about my own study: what quality standards are being employed? I have created a set of criteria based on a number of different models (i.e. Scriven, Patton, Stufflebeam, Owen, Chesterton and Cummings). Should I be meeting with stakeholders to find out what they consider constitutes quality in our context? This may in turn help ensure the evaluation findings are put to better use (Johnson et al., 2005).

Another finding in this study referred to the lack of transparent information in the final reports. I'm finding the same thing in my search of the identified data (final project reports): a lack of detailed data for a metaevaluation. The reports' intended audience also differs between the internal projects and the external projects.

The finding is that an organisation should identify what it may need from a metaevaluation and then ensure that all evaluations conducted can later be metaevaluated. This would include, for example, defining the methodology in detail.


Saturday, November 12, 2011

What are we evaluating?

Some (many) of the project reports I have been reading seem to blur the line of evaluation. Every project has an outcome, and most have a set of resources so that learning can be taken from the project and new knowledge and understanding created. However, when evaluation is discussed in a report, it tends to be the evaluation of these by-products of the project rather than of the project's processes.

In addition, many of the projects discuss feedback on products (such as resources or programs that come as deliverables of the project) under the heading of evaluation. Is this the same? Different?

Monday, November 7, 2011

Form for me

This form is to allow me to enter the data I'm gathering from the reports.

Friday, October 21, 2011

evaluation or research

Leading on from last week's ramblings on the differences between research and evaluation, Owen (2006, p. 64) says that the inclusion of the planning and communicating stages is what differentiates evaluation from social research. The 'middle section' is the research, which uses similar ranges of data collection and analysis techniques.

I want to concentrate on this difference and try to show that concentrating on the first and third sections of the pie (picture) will help improve evaluation practices, i.e. that the interactive instrument helps with these sections and helps with making time to do them.

Need more readings around time taken for each of these sections. Need to continue reading Owen.

Update: Nov 2011

Alkin & Taut (2003) write that the goal of research is generalisable knowledge, but the purpose of evaluation is context-specific (p. 3). They quote Cronbach & Suppes (1969) on conclusion-oriented research vs. decision-oriented evaluation.

update March 2012

Alkin (2011, Evaluation Essentials: From A to Z) states on p. 8 that research seeks conclusions while evaluation leads to decisions. Researchers ask their own questions in order to seek conclusions they can use to add to the knowledge bank. Evaluation answers questions that are important to a particular person: the stakeholder or client, say.

Alkin talks briefly about definitions of evaluation (which are goal oriented, around merit and worth), but he directs the reader's focus to the processes that allow one to reach the point of being ready to judge merit and worth.

Update August 2012

Reading Mackenzie, N. M. & Ling, L. M. (2009). The research journey: A Lonely Planet approach. Issues in Educational Research, 19(1), 48-60. http://www.iier.org.au/iier19/mackenzie.html

They quote Mertens (Mertens, D. M. (2005). Research methods in education and psychology: Integrating diversity with quantitative and qualitative approaches (2nd ed.). Thousand Oaks, CA: SAGE.), though I can't give a page number for this as it's in HTML (open access):


"This highlights the decisions we take as researchers when we aim for convergent outcomes or divergent outcomes. If our research is so prescribed and directed as to push us towards particular desired outcomes, it is convergent and in fact, may potentially not be research at all. Mertens (2005) makes a distinction between research and other forms of activity such as evaluation.
The relationship between research and evaluation is not simplistic. Much of evaluation can look remarkably like research and vice versa. Both make use of systemic inquiry methods to collect, analyse, interpret and use data to understand, describe, predict, control or empower. Evaluation is more typically associated with the need for information for decision making in a specific setting, and research is more typically associated with generating new knowledge that can be transferred to other settings. (p.2)
In fact, much of the prescribed and funded so-called research we undertake for convergent outcomes which fit the agenda of funding bodies is probably more akin to evaluation than research. Research which does not work towards pre-determined or prescribed outcomes and thus can produce divergent outcomes more in the spirit of what we understand as true research."

some words

empirical: derived from or relating to experiment and observation rather than theory
epistemology: the branch of philosophy that studies the nature of knowledge, its presuppositions and foundations, and its extent and validity.

Thursday, October 13, 2011

Accepted

Wow, amazed that I didn't have to make any modifications to my proposal! After a few technical hiccups, I received the following email:


Dear Elaine,

I am pleased to advise you that you have been offered a place in the PhD in Education program for Semester 2, 2011 at Macquarie University.

And then:

Dear Elaine,

I am pleased to advise you that you have now been enrolled as at 30 September 2011 in the Doctor of Philosophy in Education for Semester 2, 2011 as a Part Time On-site Domestic candidate. Your expected date of completion will be 30 September 2019.

wow that's a loooong time away!

Sunday, October 9, 2011

More mq projects

I revisited the MQ Grants 'previous winners' webpage to find it updated with many more final reports. There are now 29 in total from across the competitive and strategic grants categories.

I've been thinking about how best to make use of these reports. I could compare each final report with the initial application (where available; when not, I could approach the awardee and ask them for a copy). This may highlight changes: things they said they would do but haven't.

I've been grappling with the 'criteria': should I use the excellent questions provided by Chesterton and Cummings and supplement them with Datta etc.? Stufflebeam has an excellent metaevaluation checklist, but it is too detailed, with 300 items to be checked. If we could use it, it would provide excellent quantitative data, but I think we cannot.

The main issue for me at the moment is that my first question is 'what evaluation forms and approaches were used in this project?', and I have a feeling my answer for the majority is 'none'. Maybe it's about terminology. Some things covered by the word evaluation are things like data collection; isn't this research, though? Now I'm struggling with the difference between research and evaluation.....
Some more searches required, methinks.

An article by Julia Coffman, Consultant, HFRP based on Scriven's work:

http://www.hfrp.org/evaluation/the-evaluation-exchange/issue-archive/reflecting-on-the-past-and-future-of-evaluation/michael-scriven-on-the-differences-between-evaluation-and-social-science-research


How are evaluation and social science research different?
Evaluation determines the merit, worth, or value of things. The evaluation process identifies relevant values or standards that apply to what is being evaluated, performs empirical investigation using techniques from the social sciences, and then integrates conclusions with the standards into an overall evaluation or set of evaluations (Scriven, 1991).
Social science research, by contrast, does not aim for or achieve evaluative conclusions. It is restricted to empirical (rather than evaluative) research, and bases its conclusions only on factual results—that is, observed, measured, or calculated data. Social science research does not establish standards or values and then integrate them with factual results to reach evaluative conclusions. In fact, the dominant social science doctrine for many decades prided itself on being value free. So for the moment, social science research excludes evaluation.¹
However, in deference to social science research, it must be stressed again that without using social science methods, little evaluation can be done. One cannot say, however, that evaluation is the application of social science methods to solve social problems. It is much more than that.

Sunday, September 25, 2011

Lots of Links - more on meta-evaluation


Wondering whether I should relook at the MQ projects and contact leaders anyway to see what they wrote in the evaluation methods section of their applications. But then I would need to ask Barb if I can get a copy of those applications, as they would not be publicly available. And then, do I tell people that I have looked at their application before I interview them? Would need to disclose this.

Need to add the word metaevaluation to the title of the PhD.

Mark, Henry and Julnes (2000) give reasons for evaluating. This would be more relevant than the reference I used in my proposal, which was from 1998 and was only editors' notes... maybe?

Add the links to the Western Michigan University Evaluation Center checklists website to my blog for future reference. http://www.wmich.edu/evalctr/checklists/about-checklists/

Usability evaluation report: useful for my usability testing, with some nice references and a way to lay out the findings etc.

Papers to be read
Metaevaluation, Stufflebeam
Evaluation bias and its control, Scriven
Metaevaluation in practice: selection and application of criteria, Cooksy & Caracelli
Metaevaluation revisited, Scriven
Metaevaluation as a means of examining evaluation influence, Oliver
A basis for determining the adequacy of evaluation designs, Sanders & Nafziger
Mayhew, F. (2011, May 9). Mandated Evaluation: Integrating the Funder-Fundee Relationship into a Model of Evaluation Utilization. Journal of MultiDisciplinary Evaluation [Online] 7:16. Available: http://survey.ate.wmich.edu/jmde/index.php/jmde_1/article/view/315/315

Monday, September 12, 2011

this week's thoughts

Today I have been wondering if I am on the right track. It seems that all ALTC projects have to be externally evaluated. So am I looking to see what was done with those evaluations, and therefore to apply the findings to MQ L&T projects?
If yes, then I need to ensure the questions I ask are relevant to this purpose.
I'm trying to help MQ applicants run an evaluation that is not time consuming and.....

Saturday, September 10, 2011

new search criteria

I need to do a new lit search with the word metaevaluation. I also need to search through evaluation journals.

evaluation checklists

Today I decided to start compiling the data and building the questions to be used to conduct the metaevaluation (a new word I think I may use in my title).

Found The Evaluation Center on Western Michigan University's website. A plethora of resources. Patton and Scriven publish there.

They also have an online journal, the Journal of MultiDisciplinary Evaluation: http://survey.ate.wmich.edu/jmde/index.php/jmde_1/about/submissions#authorGuidelines

I'm wondering if I can publish my proposal there? Is that something you do, publish your proposal? Maybe I can rewrite the proposal as a series of thought-provoking sections which lead into the phases. Will check with Marina.

Still no news from the faculty on whether I can actually do this thing yet!

Saturday, August 27, 2011

Evaluating reports

Have started this weekend looking at the evaluation sections of the final reports to see what was carried out. It's strange. I'm sure the grant applications say you have to evaluate, and many of the reports say they did, but it seems they are evaluating resources made by or coming from the project rather than the project itself. Many use an external evaluator, but so far what they do seems disparate, not structured or based on any research or framework etc. Maybe I need to get hold of those external evaluation reports to find out. In fact, this is something I hadn't considered when I wrote the proposal: the need to contact external evaluators etc.
Need to revisit the ALTC site to check, and also revisit the Chesterton & Cummings framework to remind myself what it is they are recommending.
I wonder whether their recent review found the same thing. Am I just replicating what they have already done?

Reading Owen to remind myself of forms and approaches.
Proactive
Clarificative
Interactive
Monitoring
Impact
Question: can I apply these to projects, given they were written for programs?
Can I read each final report and see if any of the evaluations fall under one of those forms? And if they don't, could I suggest that they ought to?
If that were the case, then the interactive instrument could ask questions and suggest one of these approaches in the future.

To read

There is a website that happened to come out of MQ, with resources for candidates and supervisors, and I am just assessing its ALTC report as part of my first exercise. Timely!

Need to ask for access details from the Dean of Research, though, which means I have to officially be a candidate, but I haven't heard a peep yet. How long will it take? It's been 3 weeks so far.

1. Here's the URL as a reminder anyway:
http://www.first.edu.au/public/mcontacts/index.html#mac


2. E. Jane Davidson’s (2004) Evaluation Methodology Basics.

3. Right First Time: How to Ensure the Success of Evaluation Projects
Gordon, Greta
Commerce and Administration, Victoria University of Wellington
March, 2009
http://survey.ate.wmich.edu/jmde/index.php/jmde_1/thesis/view/15

Update September 2016:
Just re-reading this, and found that the resource for supervisors is really interesting:

http://first.edu.au/?page_id=68

Ethics

Started to fill in the 30-page application. Wondering whether I should do one per phase and then extend it as I move into each new phase; it's too hard to fill in otherwise. Will ask Margot for advice.

Update: Submitted the ethics application on Nov 3rd. Hopefully it will make the deadline and get through before ethics closes (end of Nov.). Have only applied for phase 1, as it got too complicated to try and write it for phases 1 & 2.

Sunday, August 7, 2011

Philosophical assumptions

Someone asked me 'what is the theoretical framework around which your PhD is centred?' I was stumped. Even though I had spent weeks thinking and reading on this, and had written a nice little half page under that heading in my proposal, I was unable to actually verbalise any coherent response! I prattled on about pragmatism, but what exactly does that mean in the context of my PhD?

I am now reading chapter 2 of Creswell and Plano Clark's Designing and Conducting Mixed Methods Research, and relooking at the pragmatist worldview:

  • consequences of actions
  • problem centred
  • pluralistic
  • real-world practice oriented.
What is my worldview?

Evaluation of a project is left to the end; that is why it is often lacking.
No one wants to admit that their project has not achieved its goals, and so they will skimp on the evaluation or massage the results of the evaluation to make the project appear successful.

How is feedback from any evaluation fed back into the project? Is there ever time or money to realistically do this?

Asking the different people involved in the project what their views are will ensure different perspectives are heard.
Epistemology: collect data from available reports and interview MQ staff on those projects, as this is easiest and 'works'.

Combine data (methodology): use a checklist but also do interviews.

Casual discussions and add quotes when reporting on results.

The sticking point for me is the theoretical lens.

 Theoretical Framework
A theoretical orientation for a mixed methods study would be the use of an explanatory framework from the social sciences that predicts and shapes the direction of a research study.

For example, I could use a change management or leadership lens.

Which social science theory will I use? Is developmental evaluation (Patton) or MERI my framework?


To read (check if in library):

Mertens, D.M. (2009) Transformative research and evaluation. New York: Guilford Press.

Friday, August 5, 2011

commencement

Well I thought at the beginning of the year that I would be starting my PhD in semester 1 (at the latest), but here we are at the end of week 1, semester 2, 2011 and I just submitted my proposal!
Was so happy to receive this email:


Dear Elaine,

This email is to acknowledge receipt of your application for the PhD in Education for commencement in Semester 2, 2011 at Macquarie University.

You will receive further correspondence from us once the initial assessment of your documentation has been completed.