Friday, October 30, 2015

Evaluation Planning Instrument - next steps

Presented the development of the tool at ISSOTL this week. Great feedback - comments from the audience indicated that they had been waiting for this interactive tool and would like to see examples included.

Interest from RMIT - could I go and present to seed grant holders (in Science) on what to do around project evaluation.
Interest from UBC - they need help with this and have some skills to exchange - need to follow up with them to find out if they mean programming skills!

Next steps:

  • update the steps to include feedback from AES and focus group 2.
  • harvest the examples collected from focus groups
  • mind map how the online form could work - including branching (a rough sketch of that logic follows this list)
  • plot out in Excel using simple logic
  • contact a programmer and think about what this might look like in an online version
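Before going to a programmer, a minimal sketch of how the branching could work, just to think it through. The step names, the 'project_size' answer and the skip rule below are all hypothetical placeholders, not a committed design:

# Sketch of branching logic for the online form (placeholder steps only).
steps = {
    "purpose": {
        "question": "What is the purpose and scope of the evaluation?",
        "next": lambda answers: "stakeholders",
    },
    "stakeholders": {
        "question": "Who are the stakeholders of the project and evaluation?",
        # Hypothetical branch: small projects skip the KEQ step.
        "next": lambda answers: "data" if answers.get("project_size") == "small" else "keq",
    },
    "keq": {
        "question": "What are the Key Evaluation Questions?",
        "next": lambda answers: "data",
    },
    "data": {
        "question": "What data and evidence will be collected and analysed?",
        "next": lambda answers: None,  # end of the sketch
    },
}

def walk(start, answers):
    # Follow the branching rules from a starting step, printing each question.
    step = start
    while step is not None:
        print(steps[step]["question"])
        step = steps[step]["next"](answers)

walk("purpose", {"project_size": "small"})

Handily, walk() takes a starting step, which also fits the focus group feedback about letting users begin at any point. The same step/next-step structure maps directly onto an Excel sheet with IF logic for the first plot-through.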

Friday, October 23, 2015

2nd focus group for phase 3

After the first focus group I revisited the planning instrument and took on some of the feedback to rethink the content and structure.

The main concern from the focus group was that it was too lengthy for a small project. So some steps were combined and others removed. The resultant checklist has 6 steps:


1. What is the purpose and scope of the evaluation? Consider also how the information will be used. (i, ii, iii, iv, v, vi)
2. Who are the stakeholders of the project and of the evaluation? Are they also part of the study audience? (i, v, vi)
3. What are the Key Evaluation Questions? (i)
4. What data and evidence will be collected and how will it be analysed? (i, ii, v)
5. What are the criteria for judgment? (i, ii)
6. What dissemination strategies will be used and how will this help you? (i, vi)


Key to literature:

i. Chesterton, P., & Cummings, R. (2011). OLT Grants Scheme - Evaluating Projects.
ii. Owen, J. M. (2006). Program Evaluation: Forms and Approaches. Crows Nest: Allen and Unwin.
iii. Patton, M. (1994). Developmental Evaluation. American Journal of Evaluation, 15(3), 311–319.
iv. Scriven, M. (1991). Evaluation Thesaurus. Newbury Park, CA: Sage Publications.
v. Saunders, M. (2000). Beginning an evaluation with RUFDATA: Theorising a practical approach to evaluation planning. Evaluation, 6(1), 7–21.
vi. Stufflebeam, D. L. (2011). Meta-Evaluation. Journal of MultiDisciplinary Evaluation, 7(15).

Some key points to come from this focus group included:
  • allow users to begin at any point as some felt step 1 (outcomes) was not their logical starting point.
  • use the language of progress rather than achievement.
  • add back in a question about recommendations for the future - for people who want to pick up where this project leaves off.
  • an iterative approach to the steps would also be useful (feedback loops) - once you start your project you usually revisit these steps, and answers may change as you go.
  • collecting evidence - not only what but also how (many people forget ethics approval).




Friday, April 24, 2015

Phase 3 planning - focus group

After a few false starts with running a workshop on the planning framework/instrument, I finally got 10 people volunteering to attend a focus group.

The aim of the focus group was to test the waters with the framework. Get academics with small L&T projects in a room. Talk them through the different stages and get them to give examples for each step. Then find out what they think of the framework.

I was really excited at the prospect of 10 participants, as two previous attempts to run this at MQ had resulted in only 2 and 1 respondents respectively. However, luck was not on my side again - the torrential rain and storms in Sydney on the day prevented three from attending. The focus group went ahead regardless with 7 participants.

2 male, 5 female. The four faculties covered were:
Arts and Social Sciences (Journalism; Communication; Education)
Business (Accounting)
Engineering & IT (Civil and Environmental Engineering)
Health (Nursing)

Two of the participants had had large scale (OLT type) L&T grants, one was new to the L&T grant space and others had received numerous small scale L&T grants.

General Observer Comments:

There was a general openness to talking about evaluation and a positive vibe in the room during the discussions, particularly when participants talked about their projects.

A diagram would have helped when describing evaluation and research synergies.

There was also a general understanding of the value of evaluation and a willingness to embrace it in their projects.

However some people felt overwhelmed by the info/framework. Too many steps. This led to comments regarding 'having to' complete it, and negativity that it would take too much time to complete and, if required, would put them off doing evaluation. --> this kind of missed the point (I thought), as this is meant as a resource to help them formulate their evaluation plan.


I asked for definitions of Evaluation (with respect to L&T projects).


  • quality control - monitoring level of proficiency
  • feedback - monitoring and improving learning outcomes
  • an assessment of a particular project, either qualitative or quantitative
  • feedback re: effectiveness in student learning and efficiency/efficacy in delivery of content and development of student directed expectation
  • a process to determine if the stated project's aims and objectives were achieved and, if not, what we can learn from it
  • finding out if it meets the intended purpose of the project
  • identify what works/needs review
  • comparison of outcome against objectives (designed at beginning of project)


I also asked them to say how they 'felt' about evaluation. Words and phrases included:

  • useful to gather "lessons learned"
  • essential part of project feedback loop
  • must be multidimensional
  • necessary useful tool
  • useful if done halfway through project rather than just at the end
  • intrusive on time
  • useful at critical stages for modification of delivery
  • great! I love to see how it's evolved, turned out, even if it's a catastrophe!
  • it's required to improve subject quality
  • I welcome evaluation as long as it's not an unwieldy or unnecessarily complex process
  • I see it as a critical and integral part of any project


Then at the end of the session I asked them to reconsider evaluation and write whether their thoughts or feelings about evaluation had changed since the beginning.

  • two people said no change
  • four stated they were now more aware of different evaluation foci/purposes
  • one stated they were more 'dispirited' because of the quantity of work required to complete the framework


Next steps:
1. Transcribe the audio and analyse data.
2. Transpose examples given by participants in the 'workshop' section to the framework document
3. Think about which steps could be reduced by either combining or removing.
4. Run the session again with the revised framework.



Saturday, April 18, 2015

first quarter 2015 update

Already I feel like I have gotten off to a better start to the year than last.
I had tried to run a focus group to get feedback on the development of the framework before moving on to constructing some kind of interactive one that I could trial. However it proved impossible to get anyone along to it at MQ.

Checked in with ethics at UTS and, provided I did an amendment at MQ to include a UTS staff cohort, they were happy for me to go ahead and run the focus group at UTS (new employer).

Approval was also required from the DVC (Shirley). Once done, the invite went out and the focus group/workshop is now scheduled for Wednesday 22nd April. 10 participants have RSVP'd, each with a current L&T grant. More on that later.

Also, just in is feedback from IJEM that paper 3 (details of phase 1) has been accepted with minor revisions. So I am now looking back at data and analysis notes from 2012(!) It's dusty up there.... in here.... so I am scouring this blog in the hope of finding something useful, since the reviewers want to see more detail of the analysis in the results and appendices.

Tuesday, February 3, 2015

more thoughts on analysis

Have finished coding all of the interview transcripts and ended up with 19 codes.


  1. others who have a connection to a project (steering committee, audience or reference group)
  2. changing nature of projects; contextual info
  3. project management lingo
  4. issues or challenges
  5. time taken or timing
  6. types of evaluation or evaluands
  7. perceptions or emotions or conceptions
  8. communications, dissemination activities
  9. quality
  10. feedback
  11. experience
  12. relationships, connections
  13. influence
  14. support (needing)
  15. sustainability
  16. impact
  17. findings
  18. where to from here - what's next, how is this meaningful
  19. research


These were combined into 8 themes.


Compiling frequency counts of codes and themes (as percentages) and looking for patterns. Comparing PL and PM, and then across projects. Going to use radar charts to see if there is a particular pattern for the role - a rough sketch of that plot is below.
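A minimal sketch in Python of what that radar comparison might look like. The theme labels and percentages here are made-up placeholders (the real 8 themes and frequencies come from the coding above), so this only shows the shape of the plot, not actual results.

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical theme labels and frequency counts (as % of coded segments).
# Placeholder values only - swap in the real 8 themes and frequencies.
themes = ["stakeholders", "context", "challenges", "timing",
          "dissemination", "quality", "sustainability", "impact"]
pl = [18, 12, 10, 14, 15, 9, 11, 11]   # project leader profile
pm = [10, 16, 14, 20, 8, 12, 10, 10]   # project manager profile

# One spoke per theme; repeat the first point to close each polygon.
angles = np.linspace(0, 2 * np.pi, len(themes), endpoint=False).tolist()
angles += angles[:1]
pl += pl[:1]
pm += pm[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(angles, pl, label="PL")
ax.fill(angles, pl, alpha=0.2)
ax.plot(angles, pm, label="PM")
ax.fill(angles, pm, alpha=0.2)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(themes)
ax.legend(loc="upper right")
plt.show()

If the two roles produce distinctly shaped polygons, that would suggest a role-specific pattern worth following up across projects.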