Thursday, March 15, 2012

Visit to the OLT

Today we visited the Office for Learning and Teaching (OLT) in Sydney. We met with Siobhan Lenihan to discuss plans for my PhD project and to get feedback and ideas. Took the following notes:

The ALTC changed its policy on evaluating projects in 2009 - it would be interesting to look at the difference before and after this policy change, i.e. what was done differently. Apparently they went from requiring summative evaluation to requiring both formative and summative mechanisms.

Contact C&C to ask for the results of their interviews - they should be willing to share.

Is there scope to find out what other HE institutions are doing? Look at the websites of QUT, Griffith, Monash and possibly Deakin.

Longer-term impact - is there scope to do a follow-up on projects down the line, to report on change and impact? I have found this an interesting outcome of many of my interviews - people have expressed a desire to talk about their findings etc. and have been pleased that there has been an interest in their work.


**James Dalziel has a teaching fellowship; his evaluator is Grainne Conole, who is reviewing the outcomes at each stage of the fellowship.**

Visit the OLT website and request some of the evaluation reports to read - these can be sent.

National networks that could be used for dissemination opportunities - contact Melissa???

Promoting Excellence Networks (PEN) - ask if they have knowledge about internal grant processes etc. These are state-based groups with a uni rep on each.

Lesley Wilcoxsen - Sally Kift took over the project "the whole of university experience"

Siobhan to send me (look out for) the upcoming report: Carmel McNought - Hong Kong Uni.

Large first-year STEM classes doing team work: Glen Lawrie - project leader.

The C&C framework has been tweaked slightly - I have checked the changes, and they are very minimal. Wondering if I should follow up on their recommended references/links and summarise some of these. They look like an exhaustive collection of evaluation materials.

Saturday, March 10, 2012

Looking through a different lens

I decided this week that it would be a good idea to also interview the selection panel members who sit on the application review committee for the internal grants at MQ. This would give the perspective of what's required etc., and find out what value they put on evaluation.
Have written an amendment to the ethics application and reworded the questions to suit. Will discuss with Marina.

Monday, March 5, 2012

A meta-evaluation of a different kind

Means, Toyama, Murphy, Bakia, & Jones (2010). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies.

I’m interested in this article for the methodology of how they went about their meta-evaluation (although I have to say the content is really interesting too and lends itself to the FLaMe program).

They specify the type of studies they were evaluating: these studies had to have stringent designs (random-assignment or controlled quasi-experimental) and examine effects on objective measures (of student learning). In relation to my project, I have not really specified the sample in this way and have used a sample of convenience - will this hold up to rigorous testing?

Furthermore, I am not looking at quantitative studies; evaluations tend to be more qualitative in nature.

There was a main finding from the literature review - few rigorous studies in the area of interest had been published. In fact, none had been found in the original years of the search, and so the search was expanded by two years.

There were key findings from the meta-analysis and then there were findings from the narrative review. This review came from analysing the studies that could not be included in the meta-analysis because they lacked a particular control condition (p. xii).

Interesting comments on the potential for bias stemming from study authors' dual roles as experimenters and instructors.

The lit review and meta-analysis were guided by four research questions. Context for the meta-analysis was given, explaining who commissioned the study (the stakeholders) and the overall goal of the study. Then they described a conceptual framework for the topic [online learning], which included three key components.

Methodology

First define the topic [online learning], state what is included and what is not. Use this to define the categories to be searched for (three in this study).
List data sources and search strategies. Give the years searched (from and to) and note where the date ranges differ for particular source types, e.g. theses.
List the databases that were used and the list of keywords (in the appendix).
Additional search activities - included a review of articles cited in recent meta-analyses and narrative syntheses of similar topics.
Key journals were identified and abstracts from each of these in a given period were manually reviewed.
A Google Scholar search was used with a series of keywords, and any article abstracts retrieved were reviewed to ensure no duplication with previous searches (a quick sketch of what that de-duplication step might look like is below).
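Note to self: if I ever compile search results from multiple sources like this, the de-duplication step could be as simple as the following minimal sketch. The record structure and field names are placeholders I've made up, not anything from the Means et al. study.

```python
import re

def normalise(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical titles match."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records):
    """Keep the first record seen for each normalised title."""
    seen = set()
    unique = []
    for rec in records:
        key = normalise(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Toy example: the Google Scholar pass only keeps records not already retrieved.
db_hits = [{"title": "Online Learning: A Meta-Analysis."}]
scholar_hits = [{"title": "online learning - a meta-analysis"},
                {"title": "A Different Study"}]
print(deduplicate(db_hits + scholar_hits))  # the duplicate title is dropped
```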

Screening was then carried out in two stages. First, abstracts were reviewed, giving studies the 'benefit of the doubt' (p. 11) when judging whether they met the inclusion criteria. The number of papers included and the number rejected were stated (as percentages), along with reasons for rejection.

Full-text screen - the next pass had two parts: studies had to meet content relevance criteria (listed) as well as basic quality (method) criteria (also listed). A table of the primary reasons for exclusion was included, detailing numbers and percentages for each reason (p. 13).

The coding of the studies was then detailed, covering study features and study quality, followed by a paragraph explaining how interrater reliability was checked. Finally, the data analysis was explained; this is completely statistical, since the chosen studies were all quantitative, and meta-analysis software was used for the computations.
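To remind myself what checking interrater reliability actually involves in practice, here is a minimal sketch of Cohen's kappa for two coders. The data below is a toy example I made up; the paper doesn't say which agreement statistic they used, so this is just one common choice.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders assigning categorical codes to the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: proportion of items where the two coders match.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    # Kappa corrects observed agreement for agreement expected by chance.
    return (p_o - p_e) / (1 - p_e)

# Made-up data: two coders rating study quality on the same six studies.
a = ["high", "high", "low", "low", "high", "low"]
b = ["high", "low", "low", "low", "high", "low"]
print(round(cohens_kappa(a, b), 3))  # 0.667: substantial but imperfect agreement
```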