Saturday, July 21, 2012

Analysis of Phase One

The research questions for this phase are:
  1. What evaluation forms and approaches have been used in Macquarie funded learning and teaching projects? [easy to answer from the interview data.]
  2. What are the issues and challenges in evaluating learning and teaching projects?
    [This was the original Q11 in the interviews. Briefly, items would include:
    lack of skills - guidelines and support/resources needed
    initial plans too ambitious
insufficient use of stakeholders
    insufficient money/budget - to pay for extra help or input when needed i.e. admin support
    no feedback at any stage in the project
    lack of time
    • to plan
    • for reflection/learning
    • too busy with teaching and other demands]
  3. What is understood by evaluation? [can look at the misuse or confusion in terminology.
    For example Evaluation vs. Research and Evaluation vs. Project in terms of both planning and results. There is also some misinterpretation between evaluation and feedback. Also look at Q12 from the interviews.]
  4. How does perception influence how evaluation is carried out in practice? Or, what influences how evaluation is carried out in practice? [For this I think I need to go through the data and pull out all examples where perception is discussed, albeit implicitly most times. Also need to look for other papers that have done this and see how they have done it. Need to refer back to the theoretical approach of the project as well as the realism paradigm.]






Friday, July 20, 2012

Phase 2

There are so many competing actions needed right now I can't seem to move forward at all.
So this is my attempt to list what needs to be done and so form a plan of action.

  1. write paper 1 - lit review identifying gaps and hence need for this project
  2. thematic analysis of data from phase 1 - write a paper on findings - need to address the research questions identified in proposal
  3. use findings from analysis to inform the development of an evaluation framework to be tested in phase 2
  4. identify the projects to be used as case study in phase 2
  5. write a plan of action for the case study approach - something to give to project leaders
  6. have initial meetings with project leaders of two projects. Get ethics info and consent forms ready.

Action plan - phase 2
"The investigator will act in the role of Participant-as-Observer (Gold, 1958) of the evaluation and project process(es), actively participating with the project members and documentation and providing a depth to the research which would not be possible with an observer-only role (Babchuk, 1962)." (proposal, June, 2012)


  1. Meet with the project team. Show the list of questions which will be used as part of the data gathering instrument. Answer any of their questions.
  2. First 'interview' for follow-up on answers. [Need to develop some more questions, but maybe use those from the 'notes' column in the table below.]
  3. Explain that I will attend all of their project meetings (where possible) and take notes which I will use in my reflections. Act as participant-as-observer.
  4. Then I will meet two more times for 'interview', once after the progress report is due and again at the end of the project - perhaps after the final report is submitted. Each time, there will be a set of questions to be answered and then I will follow up on these in more detail in the interviews.
So in total there will be three interview stages, though at each stage a number of people may be interviewed.

Questions for first interview - Phase 2. (Time required – approx. 1hr)


1. Project clarification
Questions:
  • What is the nature of the project?
  • What is the focus of the project?
  • What is the scope of the project?
  • What are the intended outcomes?
  • What (if any) are the project outputs?
  • What are the operational processes developed to achieve the outcomes?
  • What is the conceptual and theoretical framework underpinning the project?
  • What is the context of the project?
  • Are there any identified risks?
  • What key values drive the project?
Notes:
  • Are there sufficient resources/admin for this project?
  • Are the plans too ambitious?
  • What may happen to delay the project?
  • Has teaching time been factored in?

2. Evaluation purpose and scope
Questions:
  • What are you evaluating?
  • Why is the evaluation being done?
  • Are you basing the evaluation on any particular method, framework or approach?
  • How will the information be used?
  • Who will evaluate this project? Are they suitably skilled?
  • What value will the evaluation process add to the project?
Notes:
  • Do they need training? Do they need some support resources?

3. Project stakeholders and study audiences
Questions:
  • Who are the stakeholders for the project and the audiences for the evaluation information?
  • Stakeholders - who has an interest or stake in the project and/or its outcomes, and in the evaluation of the project?
  • Audiences - who will be interested in the results of the study, and what types of information do they expect from the evaluation?
  • How should competing interests be prioritised?
  • Have you asked for feedback on the project?
Notes:
  • Use these to your advantage - to help guide the project.
  • Be clear about the difference between the two groups.

4. Key evaluation questions
Questions:
  • What are the KEQs? Some examples could be:
    • What processes were planned and what were actually put in place for the project?
    • Were there any variations from the processes that were initially proposed, and if so, why?
    • How might the project be improved?
    • What were the observable short-term outcomes?
    • To what extent have the intended outcomes been achieved?
    • Were there any unintended outcomes?
    • What factors helped and hindered the achievement of the outcomes?
    • What measures, if any, have been put in place to promote sustainability of the project’s focus and outcomes?
    • What lessons have been learned from this project, and how might these be of assistance to other institutions?

5. Data collection methods
Questions:
  • How will the information be collected and analysed? What/who are the data sources?
  • What types of data are most appropriate?
  • What are the most appropriate methods of data collection?
  • How will the data be analysed and presented in order to address the key evaluation questions?
  • What ethical issues are involved in the evaluation and how will they be addressed?

6. Dissemination of findings
Questions:
  • How will the evaluation findings be disseminated? Who are the audiences for reports on the evaluation and what are their particular needs and interests?
  • What are the functions of reporting?
  • What reporting strategies will be used?
  • When will reporting take place?
  • What kinds of information will be included in evaluation reports?
Notes:
  • Are the stakeholders involved in dissemination plans?

7. Evaluation plan
Questions:
  • What does your evaluation timeline and activity schedule look like?
  • What measures do you have in place to ensure you don’t run out of time for the evaluation to take place as planned?
  • Who will you ask to review your evaluation plan?
  • Has time for reflection been built into the plan?





Saturday, July 14, 2012

Thematic Analysis

Braun and Clarke (2006) offer a complete and in-depth paper on what it is, guidelines for using it and pitfalls to avoid. They state that it can be considered a method in its own right - contrary to other authors who state it is not. It is compatible with constructionist paradigms, and they stress its flexibility of use, which can sometimes cause it to be framed as a realist/experimental method (though they don't particularly go along with this).
Importance is placed on explaining 'how' the analysis is conducted (as it's often omitted), as well as describing what they are doing and why. Terminology is defined - data corpus, set, item and extract [i.e. all interviews; answers or themes; one interview; quotes, respectively].
" Thematic analysis is a method for identifying, analysing and reporting patterns (themes) within data." p.79
We shouldn't say that themes 'emerge', because that implies they reside in the data - in actual fact they reside in the head of the researcher, who plays an active role in identifying, selecting and reporting them (Ely et al., 1997; Taylor & Ussher, 2001).
How do we say a theme exists? There are no hard and fast rules - though prevalence is important. A theme could be prevalent in one data item or across the whole corpus. And it may (or may not) be present in every data item, or it may be present to only a small extent.
You should think about how a theme captures something of importance to the overall research question(s). This may make it 'key'. Then the question lies in how to report it, i.e. 'most participants', 'some...', 'a number of...' etc.
You need to ask yourself whether you want to provide a rich thematic description of the whole data set or do you want to provide a detailed account of just one aspect of the data.
Next, will you provide an inductive analysis - whereby the themes are strongly linked to the data themselves - or a theoretical analysis - whereby the analysis is driven by your research questions?
Semantic vs. latent themes, i.e. surface-level description vs. deeper analysis of the underlying causes, assumptions and conceptualizations - the latter leads towards a more constructivist approach.
Back to the paradigm wars - if a constructivist paradigm is used, then the themes are built from the sociocultural contexts which enable individual accounts. In comparison, the realism framework allows a simpler explanation to develop, since meaning, experience and language are unidirectional, i.e. language is used to describe experience and provide meaning.

The paper then goes on to describe 6 steps in thematic analysis - to be used flexibly and as a guide.
1. Familiarize yourself with your data: jot down notes for coding schemas that you will come back to in subsequent phases.
2. Generate initial codes: work systematically through the data set and identify interesting aspects in the data items that may form the basis of repeated patterns.
3. Search for themes: sort the potential codes into themes - a broader analysis of the whole data set. Use concept mapping.
4. Review themes: this is a two-level process. Level 1 consists of looking at the extracts for each theme and deciding if they really fit that theme and are coherent; if not, reassess the theme and perhaps discard extracts if necessary. Level 2 will depend on your theoretical approach and requires revisiting the whole data set to consider the validity of each of the themes.
5. Define and name the themes: look at the data extracts for each theme and organise them into a coherent account with an accompanying narrative. Look for sub-themes within a theme and, finally, use clear (short) names for the themes so that the reader understands quickly what each means.
6. Produce the report: more than just describing the data, tell the story by using the extracts to make an argument in relation to the research questions.
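The coding and theme-searching phases (2-4) can be sketched as a toy example. Everything here is invented for illustration - the transcript snippets, the keyword rules standing in for manual coding, and the theme groupings - since real coding is an interpretive judgement, not a keyword match:

```python
from collections import Counter

# Toy data corpus: each data item is one interview transcript (invented snippets).
corpus = {
    "interview_1": "We ran out of time and had no admin support for the evaluation.",
    "interview_2": "The plan was too ambitious; teaching left no time for reflection.",
    "interview_3": "We confused evaluation with research and got no feedback.",
}

# Phase 2: initial codes - crude keyword rules standing in for manual coding.
code_rules = {
    "lack_of_time": ["time", "busy"],
    "insufficient_support": ["admin", "support", "resources"],
    "over_ambitious_plans": ["ambitious"],
    "terminology_confusion": ["confused", "research"],
    "no_feedback": ["feedback"],
}

def code_item(text):
    """Return the set of codes whose keywords appear in a data item."""
    lowered = text.lower()
    return {code for code, kws in code_rules.items()
            if any(kw in lowered for kw in kws)}

coded = {item: code_item(text) for item, text in corpus.items()}

# Phase 3: sort codes into candidate themes (the grouping is the analyst's call).
themes = {
    "resourcing_pressures": {"lack_of_time", "insufficient_support",
                             "over_ambitious_plans"},
    "understanding_of_evaluation": {"terminology_confusion", "no_feedback"},
}

# Phase 4 (level 1), roughly: check prevalence - in how many data items does
# each candidate theme actually appear?
prevalence = Counter()
for codes in coded.values():
    for theme, members in themes.items():
        if codes & members:
            prevalence[theme] += 1

for theme, n in prevalence.items():
    print(f"{theme}: present in {n} of {len(corpus)} data items")
```

A tally like this also bears on the 'how to report prevalence' question above - e.g. whether a theme warrants 'most participants' or only 'a few'.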

Some questions to ask towards the end of the analysis:
What does this theme mean?
What are the assumptions underpinning it?
What are the implications of this theme?
What conditions are likely to have given rise to it?
Why do people talk about this thing in this particular way?
What is the overall story the different themes reveal about the topic?

Potential pitfalls are described:
1. no real analysis is done and the 'analysis' is just a string of extracts with little analytic narrative.
2. uses the data collection questions as the themes.
3. no coherence around an idea or concept in all aspects of the theme.
4. no consideration of alternative explanations or variations of the data
5. mismatch between the interpretation of the data and the theoretical framework.



Sunday, July 8, 2012

Evidence and gaps in the evaluation literature

"Given the poor quality of evaluation performance in education, and the lack of a research base to guide evaluators, it seems urgent to contrive ways of defining, assuring, and documenting the quality of evaluation work." (Stufflebeam, 2011)

Saturday, July 7, 2012

Bazeley 2009

This article is written about the analysis of qualitative data, and specifically about the use of themes.

Short introduction on the difference between categories, concepts and themes (often used interchangeably). This author uses 'category' for the descriptive and 'concept' for the more abstract. Other authors use 'concept' as the lowest level and 'category' for a group of concepts (Strauss & Corbin, 1998).

The author states that producing themes as a goal of research is of little use or interest. Just describing them and using some quotes from the literature to support them is not enough to be convincing.

Suggestions are made to share some portions of data with a colleague to get a fresh perspective and alternative avenues to pursue. The author writes that if describing themes is what you are doing, then you need to connect them to the literature and contextualise them: 'Data must be challenged, extended, supported and linked to reveal their full value' (p.8).
Are emergent themes really emergent? If you asked questions which produced these themes, then the findings are shallow and unsubstantiated. Talks about the 'garden path analysis' (Lyn Richards) - just stating what you see. Move towards a Describe-Compare-Relate approach instead.
As a starting point, describe: how did people talk about this theme, how many, and what's not included. [NOTE: in my case could look at the sentiment lens in Leximancer possibly?]
Then compare differences in characteristics and boundaries for that theme. [In my study could compare different ability levels, academic levels, experience of project management, internal vs. external etc.]
Then relate to themes others have already written about.
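The 'compare' step of Describe-Compare-Relate can be illustrated with a small sketch - tallying how often a theme appears per participant group. The groups, IDs and theme flags below are all invented:

```python
# Each record is one participant with the themes coded in their responses.
participants = [
    {"id": "P1", "group": "novice",      "themes": {"needs_support", "time_pressure"}},
    {"id": "P2", "group": "novice",      "themes": {"needs_support"}},
    {"id": "P3", "group": "experienced", "themes": {"time_pressure"}},
    {"id": "P4", "group": "experienced", "themes": {"needs_support", "time_pressure"}},
]

def theme_by_group(participants, theme):
    """Proportion of each group whose responses touched on the theme."""
    counts, totals = {}, {}
    for p in participants:
        g = p["group"]
        totals[g] = totals.get(g, 0) + 1
        counts[g] = counts.get(g, 0) + (theme in p["themes"])
    return {g: counts[g] / totals[g] for g in totals}

print(theme_by_group(participants, "needs_support"))
```

The same grouping could be swapped for any of the comparisons mentioned above (ability level, project-management experience, internal vs. external).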

A section on creating and using displays of the data extols their value for developing an understanding and presenting conclusions.
Matrix displays - for detecting patterns, for facilitating comparative analysis, and sometimes for presenting conclusions.
Flow charts and models - for presenting conclusions.
Typologies - used as a working tool and can become a final presentation tool.

And finally - avoid reliance on quotes for evidence, as this encourages superficial reporting of themes. Try not to write to the sources, voices or methods. Build a coherent argument using evidence and then '...add illustrative quotes to add interest and clarity for the reader' (p.20).

Friday, July 6, 2012

Leximancer

Searching for articles which have used Leximancer to see how best to utilise the concept maps I am producing.
One study (Grimbeeck, Bartlett & Lote, 2004) analysed interview transcripts with a student, looking at the words/concepts produced in the student's responses, in the interviewer's questions, and then in both, to see where overlaps occurred. This got me thinking it may be useful to see if I can answer my question about how individuals' concepts of evaluation are influenced, and by what, by looking at some of the text - the parts which are not specifically answering questions but going off on tangents.

It's also true that I have noticed from the interviews the theme that the meaning of evaluation is often interpreted differently and confused with research - could this also be a theme I could pull out using Leximancer?

A paper by Cretchley (2010) reported on a study of the dynamics in conversations between carers and patients. They used Leximancer to analyse the contributions, determine the concepts and determine their functions in the discourse. In their analysis they grouped together different conditions to look at patterns of behaviour. I could perhaps do this by grouping the different 'levels', i.e. novice vs. experienced, or those that did evaluate vs. those that didn't. Or even external and internal projects, say.

When quoting/discussing the software, refer to Smith & Humphreys (2006): Smith, A. E., & Humphreys, M. S. (2006). Evaluation of unsupervised semantic mapping of natural language with Leximancer concept mapping. Behavior Research Methods, 38(2), 262-279.
Cretchley (p.1616) states: "Here, it enabled us to take an exploratory approach, letting the list of concepts emerge automatically from the text. Other qualitative content analysis techniques (e.g., NVivo) require the analyst to derive the list of codes and rules for attaching these to the data, and are thus researcher driven. As a result, these methods require checks of reliability and validity. In this research, we set up the Leximancer projects in a way that allowed the intergroup dynamics to be depicted with minimal manual intervention. This approach, which is strongly grounded in the text, permits a level of reliability that is an advantage over other methods." In Crofts & Bisman (2010) the use of Leximancer is defended as it avoids the researcher-imposed coding schemes that can be inherently biased. Quotes Atkinson (1992): Atkinson, P. (1992), "The ethnography of a medical setting: reading, writing, and rhetoric", Qualitative Health Research, Vol. 2 No. 4, pp. 451-74.
Includes a good description of how Leximancer works (with citations), and of how they utilised the software (p.188): "For our purposes, Leximancer provided a means for generating and recognising themes, including themes which might otherwise have been missed or overlooked had we manually coded the data. Using the themes derived by the software, the researchers then went back to engaging directly with the data in order to further explore, and interpret, the meanings of the text."

David Rooney, Knowledge, economy, technology and society: The politics of discourse, Telematics and Informatics, Volume 22, Issue 4, November 2005, Pages 405-422
This article is very in-depth on knowledge discourse but uses Leximancer for thematic analysis. Has a table of the top 20 ranked concepts, then uses the concept maps to show how concepts are clumped into themes. It first uses the top 8% of concepts, as 'This is the lowest level at which a number of discernable semantic clusters has formed'.
Then it looks at the clusters that are relationally close and makes inferences about the 'distant' clusters. The second concept map it uses is set at 54% of concepts, which then shows other themes and how they link to the four main identified themes.

Watson, Glenice and Jamieson-Proctor, Romina and Finger, Glenn (2005) Teachers talk about measuring ICT curriculum integration. Australian Educational Computing, 20 (2). pp. 27-34. ISSN 0816-9020
Leximancer is a software package for identifying the salient dimensions of discourse by analysing the frequency of use of terms, and the spatial proximity between those terms. The Leximancer package uses a grounded theory approach (Glaser & Strauss, 1967; Strauss & Corbin, 1990) to data analysis. It computes the frequency with which each term is used, after discarding text items of no research relevance (such as 'a' or 'the'), but does not include every available word in the final plotted list. Constraints include the number of words selected per block of text as well as the relative frequency with which terms are used.
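As a rough sketch of the kind of analysis described in that passage - term frequency after discarding stopwords, plus co-occurrence of terms within the same block of text standing in for spatial proximity - the following toy example uses invented text and is not how Leximancer itself is implemented:

```python
import re
from collections import Counter
from itertools import combinations

# Invented text blocks; in practice these would be chunks of transcript.
STOPWORDS = {"a", "the", "of", "and", "to", "is", "in", "we", "for",
             "was", "with", "were"}

text_blocks = [
    "evaluation of the project was confused with research",
    "the evaluation plan needs time and support",
    "research outcomes and evaluation outcomes were confused",
]

def tokens(block):
    """Lowercased word tokens with stopwords discarded."""
    return [w for w in re.findall(r"[a-z]+", block.lower())
            if w not in STOPWORDS]

# Term frequency across the whole corpus.
freq = Counter(w for block in text_blocks for w in tokens(block))

# Co-occurrence: pairs of distinct terms appearing in the same block.
cooc = Counter()
for block in text_blocks:
    for pair in combinations(sorted(set(tokens(block))), 2):
        cooc[pair] += 1

print(freq.most_common(3))
print(cooc[("confused", "evaluation")])  # number of blocks where both appear
```

Terms that co-occur often would sit close together on a concept map; the thresholds mentioned in Rooney (top 8% or 54% of concepts) correspond to cutting a ranked list like this at different depths.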


Sunday, July 1, 2012

Q13 - other comments

This was an interesting final question. There were a number of themes that came from each interviewee's final responses.

A number of people talked about the need for support and resources on evaluation, both at the time of application and during the project. Suggested ideas included templates and guidelines, as well as information on different frameworks and their benefits.

A few people mentioned having time to look back at the project - to revisit it and look at impact - but I guess that would depend on whether the project produced a 'product' that could be evaluated for impact.

A few people mentioned the importance of incorporating the evaluation into the research cycle. And there was mention of the importance of receiving constructive feedback from the university. This could be viewed as a need for identifying study audiences and stakeholders, which was another theme that came out, re its importance if one requires some traction in implementing the outcomes of a project.

Another theme was the forms - make evaluation compulsory and give it importance by having a section on the application, but more importantly on the reporting pro-forma.

And finally, the theme of networking was also evident. Participants mentioned the benefit of sharing findings with people who were interested, i.e. in L&T-related research etc.

Q12 - the value of doing evaluation

This question asked participants 'what value did evaluation add to your project?' Even though there were a number of projects that didn't conduct an evaluation, most people attempted to answer by saying hypothetically what they felt rather than what they observed.
The majority of responses mentioned the learning that takes place when one looks back and reflects. A couple of responses mentioned learning from mistakes and valuing that. Evaluation was also seen as a mechanism for keeping the project focussed.
One participant was concerned about the fine line between feeling like you are being checked on and being supported. Another participant agreed that there has to be some amount of accountability.

Question 11 - challenges to conducting the evaluation

I'm now looking at the question 'were there any challenges to conducting the evaluation?'

An emerging theme is that there is a lot of interference from contextual events. Running a project is something that is done at the same time as teaching, research and marking. Then there are the institutional requirements that occur simultaneously - the playing out of major projects and change procedures that also impact on the time available and possibly the outcomes. Now these all impact on how the project is run, not necessarily on the evaluation directly, but as a consequence evaluation gets sidelined or is often not as rigorous as perhaps was initially hoped.

Some other themes that emerged:
  • Time
  • Money
  • Administration/resources
  • Confusion in answering - talking about the challenges of conducting the research and not the evaluation.
When asked what could be done to overcome some of these issues, the following themes emerged:
  • More money
  • More time
  • Better planning 
  • More help with evaluation
  • Developmental Evaluation

This last dot point arose because a few people talked about using evaluation for checks and measures along the way, to refocus; and also about having evaluation incorporated into the approach, so it becomes part of what you do and not something extra you have to make time for.