Friday, July 6, 2012

Leximancer

Searching for articles which have used Leximancer, to see how best to utilise the concept maps I am producing.
One study (Grimbeeck, Bartlett & Lote, 2004) analysed interview transcripts with a student, looking at the words/concepts produced in the student's responses, in the interviewer's questions, and then in both together to see where overlaps occurred. This got me thinking it may be useful to see if I can answer my question about how individuals' concepts of evaluation are influenced, and by what, by looking at some of the text - the parts which are not specifically answering questions but going off on tangents.

It's also true that I have noticed a theme from the interviews: the meaning of evaluation is often interpreted differently and confused with research. Could this also be a theme I could pull out using Leximancer?

A paper by Cretchley (2010) reported on a study of the dynamics in conversations between carers and patients. They used Leximancer to analyse the contributions, determine the concepts, and establish the concepts' functions in the discourse. In their analysis they grouped together different conditions to look at patterns of behaviour. I could perhaps do this by grouping the different 'levels', i.e. novice vs experienced, or those that did evaluate vs those that didn't. Or even, say, external vs internal projects.
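Comparing groups like this is easiest if the transcripts are organised by group before loading them. A rough sketch of what I mean (assuming, as I understand it, that Leximancer can tag documents by the folder they sit in - and note that the file names and attribute table below are invented):

```python
# Sketch: sort transcript files into folders by participant attribute so the
# groups can be compared on one map. The attribute table and paths are
# hypothetical; folder-name tagging is my assumption about the workflow.
from pathlib import Path
import shutil

attributes = {              # hypothetical participant metadata
    "p01.txt": "novice",
    "p02.txt": "experienced",
    "p03.txt": "novice",
}

src = Path("transcripts")
for filename, group in attributes.items():
    dest = Path("grouped") / group
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copy(src / filename, dest / filename)
```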

When quoting/discussing the software, refer to Smith & Humphreys (2006): Smith, A. E., & Humphreys, M. S. (2006). Evaluation of unsupervised semantic mapping of natural language with Leximancer concept mapping. Behavior Research Methods, 38(2), 262-279.
Cretchley (p. 1616) states: "Here, it enabled us to take an exploratory approach, letting the list of concepts emerge automatically from the text. Other qualitative content analysis techniques (e.g., NVivo) require the analyst to derive the list of codes and rules for attaching these to the data, and are thus researcher driven. As a result, these methods require checks of reliability and validity. In this research, we set up the Leximancer projects in a way that allowed the intergroup dynamics to be depicted with minimal manual intervention. This approach, which is strongly grounded in the text, permits a level of reliability that is an advantage over other methods."

In Crofts & Bisman (2010) the use of Leximancer is defended because it avoids the researcher-imposed coding schemes that can be inherently biased. They quote Atkinson (1992): Atkinson, P. (1992). "The ethnography of a medical setting: reading, writing, and rhetoric", Qualitative Health Research, Vol. 2 No. 4, pp. 451-74.
Crofts & Bisman also includes a good description of how Leximancer works (with citations), and of how they utilised the software (p. 188): "For our purposes, Leximancer provided a means for generating and recognising themes, including themes which might otherwise have been missed or overlooked had we manually coded the data. Using the themes derived by the software, the researchers then went back to engaging directly with the data in order to further explore, and interpret, the meanings of the text."

David Rooney, Knowledge, economy, technology and society: The politics of discourse, Telematics and Informatics, Volume 22, Issue 4, November 2005, Pages 405-422
This article is a very in-depth treatment of knowledge discourse, but it uses Leximancer for thematic analysis. It has a table of the top 20 ranked concepts, then uses the concept maps to show how concepts are clumped into themes. It first uses a map showing the top 8% of concepts, as 'This is the lowest level at which a number of discernable semantic clusters has formed'.
Then it looks at the clusters that are relationally close and makes inferences about the 'distant' cluster. The second concept map is set at 54% of concepts, which reveals other themes and shows how they link to the four main identified themes.
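To get a feel for what those percentages do, here is a toy sketch: the map setting effectively keeps only the top X% of the ranked concept list. I am assuming a simple frequency ranking here (Leximancer's own relevance weighting is more sophisticated), and the concept list is invented:

```python
# Toy illustration of varying the "percentage of concepts shown" on a map:
# keep the top X% of an already-ranked concept list. Ranking by simple
# frequency is my assumption; the concept list below is invented.

def top_concepts(ranked_concepts, percent):
    """Return the top `percent`% of a ranked concept list (at least one)."""
    cutoff = max(1, round(len(ranked_concepts) * percent / 100))
    return ranked_concepts[:cutoff]

concepts = ["knowledge", "economy", "technology", "society", "policy",
            "discourse", "innovation", "power", "culture", "education"]

print(top_concepts(concepts, 8))   # sparse map: only the strongest clusters
print(top_concepts(concepts, 54))  # denser map: secondary themes appear
```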

Watson, Glenice and Jamieson-Proctor, Romina and Finger, Glenn (2005) Teachers talk about measuring ICT curriculum integration. Australian Educational Computing, 20 (2). pp. 27-34. ISSN 0816-9020
"Leximancer is a software package for identifying the salient dimensions of discourse by analysing the frequency of use of terms, and the spatial proximity between those terms. The Leximancer package uses a grounded theory approach (Glaser & Strauss, 1967; Strauss & Corbin, 1990) to data analysis. It computes the frequency with which each term is used, after discarding text items of no research relevance (such as 'a' or 'the'), but does not include every available word in the final plotted list. Constraints include the number of words selected per block of text as well as the relative frequency with which terms are used."
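The mechanics behind that description can be pictured in a few lines of code. This is only a minimal sketch of the general idea - stopword removal, term frequency, and within-block co-occurrence as a crude stand-in for spatial proximity. None of it is Leximancer's actual algorithm or API, and the stopword list and sample text blocks are invented:

```python
# Minimal sketch of the counting the quote describes: term frequency after
# discarding stopwords, plus co-occurrence of terms within the same block of
# text (a crude stand-in for Leximancer's proximity measure). Names and data
# are illustrative only.
from collections import Counter
from itertools import combinations
import re

STOPWORDS = {"a", "the", "of", "and", "to", "in", "is", "it", "for", "on"}

def tokenise(text):
    return [w for w in re.findall(r"[a-z]+", text.lower())
            if w not in STOPWORDS]

def term_and_pair_counts(blocks):
    """Count term frequency and within-block term co-occurrence."""
    terms, pairs = Counter(), Counter()
    for block in blocks:
        words = set(tokenise(block))   # presence per block, not raw count
        terms.update(words)
        pairs.update(combinations(sorted(words), 2))
    return terms, pairs

blocks = [
    "Evaluation is often confused with research in the interviews.",
    "The evaluation of external projects differs from internal research.",
]
terms, pairs = term_and_pair_counts(blocks)
print(terms.most_common(5))   # most frequent concepts
print(pairs.most_common(5))   # concept pairs that co-occur in a block
```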


1 comment:

  1. Thanks, I've been trying to decide between Leximancer and NVivo for analyzing judicial decisions and came to this post. Found it very helpful but haven't made up my mind yet.

