Saturday, May 20, 2017

Canadian Evaluation Conference

A Storify of my tweets and favourites from others


My favourite quotes and comments from the conference:

evaluation is about problem solving not fault finding @AccessAlliance


Challenging assumptions is difficult when hierarchies are involved - people are afraid to be truthful, says @Mark Stiles

evaluators need to think beyond the report @Nancy Snow

so much more learning can occur when eval introduced early. @TolgaYalkin 

Blog about evaluation findings as a way to share info and show value @Nicholas_Falvo

Role of evaluation - “evaluation is essential for learning” (Uitto et al., 2017) @Astrid Brousselle

The single biggest problem about communication is the illusion that it has occurred @G_KayeP

Funding does not lead to impact. Funding leads to knowledge, which (once applied) leads to impact. @jwmcconnell

Lesson learned in DE: don’t assume that, just because people come together for a project, that they have the same understanding! @StrongRoots_SK

Building eval capacity is as messy as learning, to be transformational we need to help them understand time needed @carolynhoessler

Evaluators work in the space between the thinkers/doubters and the doers/faithful @a_Kallos

Intimacy = in 2 me see @Paul LaCerte


Penny Hawkins summarised the panel presentations (second day's keynote): misaligned expectations; learning vs. accountability; valuing evaluation. This is a reaffirmation of the main points discussed in my thesis.

This online tool, developed by @eval_station, can be used to assess organisational evaluation capacity. It is similar to a benchmarking instrument and is meant to be used in group mode, not individually. "The conversation is often of greater value than the answers to the questions."


Met Dr Justin Jagosh, the founder of CARES (Centre for Advancement in Realist Evaluation and Synthesis): https://realistmethodology-cares.org

Through the website I was able to make contact with Dr Prashanth N S, who maintains a reading list of articles that show how realism is used in practice, which I wanted to read to help with the methodology section of my thesis. A great connection.

https://www.mendeley.com/community/critical-realism-and-realist-evaluation/


~~~~~~~~~~~~~~~~~~

3 Questions I was asked about the poster:

1. Who is the audience?
2. What is innovative about it?
3. What impact will it have?

These questions really made me think about the design of the poster/project, and I hope they will help me when writing it all up.

1. The findings from this study consisted of a set of recommendations, some aimed at those carrying out the evaluation and others aimed at the funders of these small projects, usually at the institution level or even the Faculty or School level. The former group can use some of the strategies to assist their evaluative efforts and grow their evaluative skills. The latter group may learn more about the needs of the grant awardees and be able to modify expectations and behaviours.
These two groups make up the audience for this project. However, I believe the findings and recommendations could be transferable to other sectors that offer small-scale grants for introducing innovations.

2. I'm not sure I would describe my research as innovative - but here we go. The evaluation framework, which was developed through action research cycles and resulted in an online interactive tool, was a great output from this study. A need for such a resource was identified, and the format of the final product is quite innovative in its simplicity.

3. I'm hoping that the impact of this study will come about when people (the identified audiences) start to evaluate their work better, through thoughtful planning and an understanding of the available options and requirements. When these small innovations and projects are evaluated, the findings need to be disseminated so that others can learn from and improve on them, leading to an improved learning experience for our students.




Saturday, October 11, 2014

analysis notes - phase 2

Starting a second cycle of coding on the first of three sets of data. Initial coding, interspersed with some In Vivo coding and Versus coding, was used in the first cycle.

Codes were transposed into a spreadsheet and then colour coded using a focused coding (Charmaz, 2006) approach.

The categories thus far are:
  • people (those connected with a project, such as the steering group, audience, etc.)
  • changing nature of projects; contextual factors
  • project management information
  • issues or challenges
  • time taken or timing
  • types of evaluation or evaluand
  • perceptions, affective language, emotions, conceptions
  • communications
  • quality
Another option for second cycle coding is Elaborative coding (Auerbach & Silverstein, 2003). In this approach, findings from previous research can be supported, strengthened, modified or disconfirmed (Saldaña, 2013). If I use this, I can work with the first phase findings, which unveiled four themes across the 15 completed L&T projects: conflation between research and evaluation; capability building in evaluation; resources (time and money); and an action approach to evaluation.

At this point I will start with focused coding and perhaps simultaneously note whether any of the previous themes appear in the current data set.

Once I complete this with the first set of data, I can separately work on the second and third data sets (projects/case studies). Then, using case study methods, I can compare and contrast findings from each case.
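As a rough sketch of that cross-checking step (purely hypothetical code: it assumes the spreadsheet of second cycle codes has been exported to a CSV file with one code per row in a "code" column, and the keyword lists are only my shorthand for the phase 1 themes):

```python
import csv
from collections import defaultdict

# Phase 1 themes, each with a few hypothetical keywords that might signal it
PHASE1_THEMES = {
    "conflation of research and evaluation": ["research", "conflation", "study"],
    "capability building in evaluation": ["capability", "skills", "training"],
    "resources": ["time", "money", "funding", "workload"],
    "action approach to evaluation": ["action", "cycle", "iterative"],
}

def phase1_themes_in_codes(csv_path):
    """Flag second-cycle codes that echo a phase 1 theme (keyword matching)."""
    hits = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            code = row["code"].lower()
            for theme, keywords in PHASE1_THEMES.items():
                if any(kw in code for kw in keywords):
                    hits[theme].append(row["code"])
    return hits

if __name__ == "__main__":
    for theme, codes in phase1_themes_in_codes("phase2_codes.csv").items():
        print(f"{theme}: {len(codes)} codes, e.g. {codes[:3]}")
```

Keyword matching like this can only flag candidates to look at; it is no substitute for rereading the data.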

Saturday, August 18, 2012

More thoughts on phase 1 interviews

I've been reading through each of the interviews and marking up the themes that I have, and I've come across another theme that is partially related to evaluation but also to grants and projects.

It seems that more than a few people I interviewed complained that even though their project created or produced some wonderful outputs, not everyone was willing to take up these new ideas. For example, redesigning some units as online units with chunked-up content meant there were now more opportunities to interact with the material, but the students felt this was too much work, and so did the tutors (marking). This leads to the point that if you don't fully involve your stakeholders, the project outcomes can't really meet their needs.

On the other hand, another participant talked about this and said that there has to be a balance, because when you try to please all of the stakeholders in this way you end up with a substandard product, i.e. you 'dumb it down' to keep everyone happy, and some of those novel and 'out there' ideas are lost in translation. This participant said it is often a case of timing: if no one takes up your new idea/product as you intended, it may be because they are just not ready yet.

That in turn leads to the conclusion I keep coming back to: evaluation has to be done later down the track (as well as formatively and summatively within the terms of the project). Often it is too early to say whether a project has been successful or not. You can say whether you met your outcomes, but you cannot say whether those outcomes have had impact. Nor do people report what didn't work, or why their outputs may not be being taken up.

I think perhaps there is also some confusion between steering groups and stakeholders. A steering group may well rein you in, but stakeholders will be sure to tell you what they want and why/how they want it. It's one thing to listen to their needs and then make an informed decision on how you are going to design your product (say). It is another to present it to a steering group, because then you pretty much have to make the changes they request.

Saturday, July 14, 2012

Thematic Analysis

Braun and Clarke (2006) offer a complete and in-depth paper on what thematic analysis is, guidelines for using it, and pitfalls to avoid. They state that it can be considered a method in its own right, contrary to other authors who state it is not. It is compatible with constructionist paradigms, and they stress the flexible nature of its use, which can sometimes cause it to be framed as a realist/experimental method (though they don't particularly go along with this).
Importance is placed on explaining 'how' the analysis is conducted (as this is often omitted), as well as describing what they are doing and why they are doing it that way. Terminology is defined: data corpus, data set, data item and data extract [all interviews; the answers or themes being analysed; one interview; quotes].
"Thematic analysis is a method for identifying, analysing and reporting patterns (themes) within data." (p. 79)
We shouldn't say that themes 'emerge', because that implies they reside in the data; in fact they reside in the head of the researcher, who plays an active role in identifying, selecting and reporting them (Ely et al., 1997; Taylor & Ussher, 2001).
How do we say a theme exists? There are no hard and fast rules, though prevalence is important. A theme could be prevalent in one data item or across the whole corpus. It may (or may not) appear in every data item, and it may be present to only a small extent.
You should think about how a theme captures something of importance to the overall research question(s); this may make it 'key'. The question then lies in how to report its prevalence, i.e. 'most participants', 'some participants', 'a number of participants', etc.
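As a toy illustration of that reporting decision (the cut-offs below are my own hypothetical choices; Braun and Clarke deliberately give no fixed rules):

```python
def prevalence_qualifier(n_with_theme: int, n_total: int) -> str:
    """Map a theme's prevalence to a reporting phrase (hypothetical cut-offs)."""
    proportion = n_with_theme / n_total
    if proportion > 0.75:
        return "most participants"
    if proportion > 0.4:
        return "a number of participants"
    if proportion > 0.1:
        return "some participants"
    return "a few participants"

# e.g. a theme coded in 11 of 15 interviews
print(prevalence_qualifier(11, 15))  # -> most participants
```

Whatever phrasing is chosen, the point is to decide on it consciously and apply it consistently across the report.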
You need to ask yourself whether you want to provide a rich thematic description of the whole data set, or a detailed account of just one aspect of the data.
Next, will you provide an inductive analysis, where the themes are strongly linked to the data themselves (coded without trying to fit a pre-existing frame, so the specific research questions can evolve through the coding), or a theoretical analysis, driven by your own theoretical or analytic interest in the area?
Semantic vs. latent themes, i.e. surface-level description vs. deeper analysis of the underlying causes, assumptions and conceptualisations; the latter leads towards a more constructionist approach.
Back to the paradigm wars: if a constructionist paradigm is used, the themes are built from the sociocultural contexts that enable individual accounts. In comparison, a realist framework allows a simpler explanation to develop, since meaning, experience and language are unidirectional, i.e. language is used to describe experience and provide meaning.

The paper then goes on to describe six steps in thematic analysis, to be used flexibly and as a guide.
1. Familiarise yourself with your data: jot down notes for coding schemas that you will come back to in subsequent phases.
2. Generate initial codes: work systematically through the data set and identify interesting aspects of the data items that may form the basis of repeated patterns.
3. Search for themes: sort the potential codes into themes - a broader analysis of the whole data set. Use concept mapping.
4. Review themes: this is a two-step process. Level 1 consists of looking at the extracts for each theme and deciding whether they really fit that theme and are coherent; if not, reassess the theme and perhaps discard extracts where necessary. Level 2 depends on your theoretical approach and requires revisiting the whole data set to consider the validity of each theme.
5. Define and name the themes: look at the data extracts for each theme and organise them into a coherent account with an accompanying narrative. Look for sub-themes within a theme and, finally, use clear (short) names for the themes so that the reader quickly understands what each one means.
6. Produce the report: more than just describing the data, tell the story by using the extracts to make an argument in relation to the research questions.
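To make steps 2-4 concrete, here is a minimal sketch (hypothetical data and theme names, not from my study) of collating coded extracts under candidate themes and checking their prevalence across data items:

```python
from collections import defaultdict

# Hypothetical coded extracts: (data item, code, extract)
coded_extracts = [
    ("interview_01", "lack of time", "we just ran out of time to evaluate"),
    ("interview_02", "lack of time", "no one had hours to spare for this"),
    ("interview_02", "eval/research conflation", "we wrote it up as a study"),
    ("interview_03", "lack of time", "the timing was against us"),
]

# Hypothetical sorting of codes into candidate themes (step 3)
code_to_theme = {
    "lack of time": "resources",
    "eval/research conflation": "conflation of research and evaluation",
}

# Collate extracts under each candidate theme, then report prevalence (step 4)
themes = defaultdict(list)
for item, code, extract in coded_extracts:
    themes[code_to_theme[code]].append((item, extract))

n_items = len({item for item, _, _ in coded_extracts})
for theme, extracts in themes.items():
    items = {item for item, _ in extracts}
    print(f"{theme}: {len(items)}/{n_items} data items, {len(extracts)} extracts")
```

The real analytic work is in the reading and judging, not the counting; a tally like this only helps show where the extracts for a theme are thin.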

Some questions to ask towards the end of the analysis:
What does this theme mean?
What are the assumptions underpinning it?
What are the implications of this theme?
What conditions are likely to have given rise to it?
Why do people talk about this thing in this particular way?
What is the overall story the different themes reveal about the topic?

Potential pitfalls are described:
1. No real analysis is done; the write-up is just a string of extracts with little analytic narrative.
2. The data collection questions are used as the themes.
3. There is no coherence around an idea or concept across all aspects of the theme.
4. Alternative explanations or variations in the data are not considered.
5. There is a mismatch between the interpretation of the data and the theoretical framework.