Deborah Southwell, Deanne Gannaway, Janice Orrell, Denise Chalmers & Catherine Abraham (2010): Strategies for effective dissemination of the outcomes of teaching and learning projects, Journal of Higher Education Policy and Management, 32:1, 55-67. http://dx.doi.org/10.1080/13600800903440550
Came across this article and found some useful information in it, along with its related ALTC (2005) report of the same name. The report was reviewed in 2011 and is now in the OLT repository.
Summary:
The paper looks at a range of Australian and international funding bodies that support L&T projects and asks the question: 'How can a national institute for learning and teaching in higher education maximise the likelihood of achieving large-scale change in teaching and learning across the Australian higher education sector, especially through its grants program and resource repository and clearing house?' (p.58)
Now whilst this covers national schemes, there are some important findings that could be applied to smaller-scale projects and schemes such as the one I'm looking at, and there are also some interesting findings on evaluation. Furthermore, the ALTC report identifies some other international funding bodies which I could follow up on in terms of evaluation requirements, etc.
Questions from the project team's guiding framework are included, and a couple are relevant to my study:
'What are the influences on and factors affecting a teacher's decision to make use of a project or process that is being scaled up? What local and external factors facilitate or create barriers affecting the teacher's decisions? What is the relationship between the development of local capacity and the quality of external reform structures?' (p.59)
The projects which were selected came from a range of locations, but the focus was more on projects that sought to change teaching and learning processes, practices and beliefs rather than on projects that focussed on developing products (p.60).
Some Findings:
Two items of note were:
1. Initiation of an innovation - Intended users of an innovation need to be engaged very early in the planning stages of the innovation in an endeavour to ensure adoption and take-up of ideas later on (p.61). This can be extrapolated to my project in terms of getting stakeholders involved from the early stages and getting uptake of evaluation findings.
2. Implementation, embedding and upscaling of innovation - One influence on this comes from the personal conception of teaching (p.62). I'm suggesting that the same is true for evaluation. Also, academics with teaching qualifications appear to be more open to investigating alternative curriculum and teaching approaches (Lueddeke, 2003). It would be interesting to test whether this is the case with evaluation, i.e. does the qualified teacher value evaluation mechanisms more and therefore employ them in their projects, in comparison with academics without teaching qualifications? And if so, is that because they have been 'taught' evaluation methods and skills and therefore their conception is different?
Conditions for successful dissemination:
- effective, multi-level leadership & management
- climate of readiness for change
- availability of resources
- comprehensive systems in institutions and funding bodies
- funding design
This last point is of interest - it mentions that expectations about the approaches to projects and activities that ought to be adopted are taken from the funding design. So I could say that if there is no real expectation of evaluation requirements, then of course the project applicants are not going to be that stringent - I can link to this in the analysis of phase one.
One final mention in the conclusion (p.65) says: 'An important aspect of this study was to identify and to recommend that learning and teaching grant recipients must be supported and provided with access to experts in educational innovation and evaluation.'
The 2005 Carrick report goes into much more detail, and evaluation is mentioned throughout. It also gives recommended strategies for each of the five 'conditions' at the national, institution and discipline level. When I read through, it looks like they were all implemented at MQ, e.g. recommending standard requirements for applications, including a description of which evaluation strategies will be used.
For item 3, the report recommends providing project teams with access to specialist expertise - this could be for evaluation, or at the least evaluation resources. Findings on p.55 state that 'those responsible for the project may require assistance in designing an appropriate evaluation process'. There is also information on this on p.45 (emerging themes).
For item 4, it mentions that support for quality processes, particularly monitoring and evaluation, ought to be supplied, and that evaluation should be reported within an evaluation framework. Also, on p.58 there are findings stating that projects at institutions that allocated funding AFTER the projects were finished were evaluated well and regularly and were eventually embedded within the institution. 'Generally, however, experiences quoted in the literature and in case studies evidenced poor quality of evaluation if done at all.' The report then went on to explain that dissemination across institutions frequently occurred before it was apparent that there was any value or improvement in student learning, and therefore impact analysis was vital.
The 2011 review of this project didn't seem to add much more with regard to evaluation, other than a recommendation that external evaluation reports be provided publicly. There was a reference to Dow (2008) which I should follow up, as it could provide some useful data:
Dow (2008). An evaluation of the Australian Learning and Teaching Council 2005-2008. Sydney: Australian Learning and Teaching Council.