Thursday, June 19, 2008

Evaluation Summary amendment

Hi Everyone

Following on from my previous post, it seems that the link to the Student Questionnaire was not working. I have now put this into a different format, so it should be available now. Apologies to Helga and anyone else who has had difficulty accessing it. Bronwyn, I have shared this with you again.

Cheers

Hilary

Wednesday, June 18, 2008

Evaluation Summary

It seems ages since I posted a blog entry but, like everybody else, I have been busy obtaining and sorting the feedback for my evaluation study. I managed to obtain 22 student questionnaires, 6 course facilitator questionnaires, 3 student observations and 3 interviews with programme management, all of which took about 14 days. Below is a summary of the findings:


Evaluation Summary
The following is a summary of an effectiveness evaluation carried out over a period of 14 days.
The rationale behind the evaluation was to identify and compare students’ reactions to online learning within a blended learning environment. The overarching questions were directed towards two specific areas: firstly, whether the structure of the online learning is adequate to encourage self-direction for a new user and whether the knowledge learned is transferable to real-life situations; and secondly, whether or not students gain sufficient confidence on completion to contemplate higher-level learning using this method.

The focus of the evaluation was a unit standard which is currently delivered within my working environment as part of the National Certificate in Computing Level 2. The evaluation sought to employ triangulation of data by obtaining feedback through questionnaires completed by students and classroom facilitators, observations of students new to online learning, and interviews with members of programme management seeking their views on the level and type of support given to students completing this unit.


Evaluation Survey
I was able to obtain sufficient numbers as per the survey sample outlined in my evaluation plan, and I was particularly pleased by the response of past students, which was far greater than I had anticipated and gave a more even mix of current and past students completing the survey. In total I received 22 completed student questionnaires. A short questionnaire was also completed by 6 classroom facilitators, who gave their views regarding student support and the use of online learning tools. The observations also went well, with three students new to online learning attending the classroom during the period of the study for guidance with this module. Interviews were conducted with three members of the programme management team who have direct contact with the students, namely the Programme Leader, the Student Support Adviser and the Course Assessor.


The overall results of the student survey revealed the following:


The questionnaire handed to the students covered four areas: reaction, navigation, interaction and course outcome. Observation of the three students also sought to assess their initial reaction to the module and to identify any problems vis-à-vis navigation and interaction with the course content. Six classroom facilitators from six different classrooms also completed a short questionnaire, which focused on the level and type of support given to students when completing this unit.


Reaction
Once they had identified that there was an online learning requirement for this unit, the majority of students managed quite well to complete the module. Initial guidance within the course content seemed to be lacking a little; this was evidenced by the three interviews and the facilitator questionnaires, which indicated that most of the support was given at the outset. All six facilitators stated that they gave initial guidance on how to access the module and find the course material. All except one of the facilitators stated that, once the students had become familiar with this style of learning, the level of support they gave was no greater than for other modules.


Navigation
Only 1 student indicated that they had difficulty locating the course material, and this small number is probably due in part to the amount of guidance given by the classroom facilitators prior to commencement of the module. Both the Programme Leader and the Student Support Adviser mentioned in interview that, despite the fact that students are told where to find the module and there is a designated button located on the site, students who chose to study at home often needed step-by-step guidance over the telephone to help them find it. 22% of the students experienced navigation problems involving following instructions and moving between the different sections of the course materials. One student summed up the experience of navigation as “a challenge”. However, 17 (77%) of the 22 students surveyed rated the overall navigation features of this module as easy. The observations revealed a number of navigation problems; in particular, as there was no ‘home’ button provided on the site to return to the content list, students often lost their way or found themselves going into the same link twice.
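For anyone curious how figures like these are derived, here is a minimal sketch (not part of the study itself) of turning raw questionnaire counts for 22 respondents into percentages. Only the 1/22 and 17/22 figures come from the survey summary above; the remaining count is hypothetical and marked as such.

```python
# Minimal sketch (not from the study): converting raw survey counts
# for 22 respondents into the percentages quoted in the summary.

TOTAL_RESPONDENTS = 22

navigation_counts = {
    "had difficulty locating the course material": 1,   # from the survey summary
    "rated overall navigation as easy": 17,             # from the survey summary
    "experienced navigation problems": 5,                # hypothetical count (~22% of 22)
}

for outcome, count in navigation_counts.items():
    percentage = 100 * count / TOTAL_RESPONDENTS
    print(f"{outcome}: {count}/{TOTAL_RESPONDENTS} ({percentage:.1f}%)")
```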


Interaction
Students appeared to be happy to complete the online quizzes, and the majority considered that the course material provided sufficient knowledge to enable them to answer the questions. They liked the self-assessment facility but, as the solutions were not available online, feedback on incorrect answers was not instant; this point was echoed in comments from both the questionnaires and the observations.

Some 15 (68%) of the 22 students surveyed made use of the discussion board. Of these, 9 indicated that they found it useful whereas 6 maintained that it did not aid their learning. Some negative comments were received regarding the discussion board, and these all centred on its content structure rather than its use as a vehicle for learning. On a positive note, all of these 15 students found the feedback from the tutor useful, and only 3 would not consider using the discussion board again. Of the 7 students who did not use the discussion board, the main reasons appeared to be either a lack of confidence or a perception that it was a non-essential part of the course. This was further backed up by the Course Assessor, who stated that, compared with other online tools which are more results-focused, the abstract nature of the discussion board within this module did not invite much student input and consequently did little to promote student-to-student interaction.

Course Outcome
There was a very positive result for this section, with an overwhelming 21 of the 22 students stating that they have been able to apply the skills and knowledge learned from this module to everyday use. The survey was further broken down into areas of use and students were asked to tick as many as applied; the results are as follows (a small tallying sketch follows the table):


Area of use                  Percentage
Home                         44%
Workplace                    31%
Education                    15%
Community/Voluntary Work      8%
Other                         3%
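As a side note on how a "tick all that apply" question like this can be tallied, here is a minimal sketch. The tick counts below are hypothetical and are not the study's data; the sketch simply assumes each percentage is a share of all ticks rather than of the 22 respondents, since students could tick more than one area.

```python
# Minimal sketch (hypothetical tick counts, not the study's data): tallying
# a "tick all that apply" question as a share of all ticks, since each
# student could tick several areas.

ticks = {
    "Home": 23,
    "Workplace": 16,
    "Education": 8,
    "Community/Voluntary Work": 4,
    "Other": 2,
}

total_ticks = sum(ticks.values())
for area, count in ticks.items():
    print(f"{area}: {100 * count / total_ticks:.0f}%")
```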

19 of the students indicated that they would be happy to use online learning for further study at tertiary level, and 18 showed a preference for online learning over other methods. This is a positive result, confirming, in this survey at least, that this module does provide students with sufficient confidence to consider further online learning. Moreover, students completing this unit can acquire transferable knowledge and skills that can be applied to everyday use.
Only one student indicated on the questionnaire that they had not been able to apply the knowledge learned from this module to everyday use. However, a comment from this same student clarified the answer by admitting that they already had extensive internet skills and knowledge and consequently did not feel that they had learned a great deal.

To summarise: overall I feel that the response received has been very positive, and the online learning module does appear to be effective in its structure, providing students with a pedagogical direction towards student-centred learning and transferable knowledge. The fact that a large proportion of the students surveyed indicated that they would consider online learning in the future also suggests that the module is effective in providing sufficient knowledge, confidence and skills to promote staircasing to a higher level.

I have listed below an initial analysis of the study and would welcome any comments and feedback on the style of presentation and how it could perhaps be improved.


Student Questionnaire

Interview Results

Observations

Facilitator Questionnaires



Thanks very much



Hilary

Sunday, May 25, 2008

Evaluation Presentation

I have had a very frustrating Sunday afternoon trying to get my presentation onto MyPlick. Firstly, I had a lot of trouble getting the audio recorded - not all of the free downloads seemed to work. I then had several attempts to record the audio as I kept losing it somehow. I then had a problem uploading to MyPlick but eventually got that sorted (and Gordon made it sound so simple)! Unlike Yvonne, I couldn't find a way to edit out all the errs and urms, so they are still there I'm afraid. Also, for some reason the quality of the recording is extremely poor, and I think this can only be attributed to the equipment I am using. My voice is very muffled in places and tends to go up and down, so it sounds worse than normal. No amount of adjustment seemed to correct it. Apologies for this; I will put it down to experience and hope to do better 'next time'. I haven't been put off though - I think it was a good challenge for me.

The link is below and once more apologies for the quality.

Hilary's presentation

Wednesday, May 21, 2008

Evaluation Plan

After some guidance from Bronwyn (thanks again for that) I have now re-written my original evaluation plan draft towards an effectiveness evaluation instead of an impact evaluation. This type of evaluation matches more closely what I wish to measure: whether or not students' first-time experience with online learning via a particular Unit Standard equips them with sufficient knowledge, skills and confidence to consider higher-level learning using this method.

I hope to post my summary soon. All comments appreciated.

Thanks

Here is the plan

Sunday, May 18, 2008

Evaluation Plan Update

Hi Guys

I realise that I have been a bit quiet recently but I have been trying to get my evaluation plan on a proper track. Unlike Rika I haven't hit any technical hitches (not recently anyway). Basically, what I thought was an impact evaluation is more an effectiveness evaluation and, as such, I wish to tie up the loose ends so that my (almost) final plan will reflect what I am doing. Hope that makes sense and I look forward to sharing my plan with you soon. Thank you for all your help so far.

Hilary

Sunday, May 4, 2008

Weeks 7 & 8 Evaluation Plan

Here is my draft evaluation plan. I have published in Google Docs, so I hope the link works.


http://docs.google.com/Doc?id=d9k9zcq_2dd7ckvkm

Thursday, April 17, 2008

Evaluation Project Thoughts

I have had to rethink my original ideas for my project due to restrictions within the area I work. So I have had another ‘head scratching’ session and picked something else that will be more suitable and not create the same barriers.


For my project, I will focus on an impact evaluation of an online module that we provide within a blended learning course. This module is often a first-time experience with online learning for our students and, as such, is quite often met with little enthusiasm. I made reference in my previous blog post to a model used by Phillips (2005) which illustrates a development cycle into which various stages of evaluation fit, namely: Design, Production, Implementation and Maintenance. I wish to look at the third phase of evaluation and identify whether students are able to apply what they are learning and, in particular, whether or not they are motivated to look at further study using the same methodology.


I feel that my previous thoughts regarding paradigm and models are still appropriate to my new project, namely the Eclectic-Mixed Methods-Pragmatic Paradigm, utilising either Stake's Responsive or the Multiple Methods evaluation model. However, I have had to revisit the guidelines I will use, as the previous ones posted do not appear appropriate. I have therefore selected the following, which I have adjusted to suit my ideas (as suggested by Bronwyn - thanks).


TD3 Does the e-learning encourage a realistic progression towards self direction? Does it recognise varied starting points in confidence and motivation?


SD5 Do students acquire the learning skills that promote staircasing to higher learning?
S08 Do students get guidance on study skills for the e-learning environment?


S010 Do students get an explanation of any differences to the e-learning modules compared to a more familiar approach?


I feel that all of the above guidelines follow a similar pattern and I will probably look at paring them down to perhaps two when I have formulated my plan a little further.

Sunday, April 6, 2008

Week 5 & 6 Evaluation Methods

Although not clearly determined yet, my evaluation project will most likely focus on methodology within an existing structure and, as such, I have been looking at articles that encompass summative evaluation with monitoring or integrative evaluation. I am not sure at this stage that formative evaluation is relevant, but I am open-minded about methods under this category if further development could be considered practicable within my area.

I found the article by Rob Phillips – “We can’t evaluate e-learning if we don’t know what we mean by evaluating e-learning” raised several interesting points.

In particular, Phillips states that in order to study the effectiveness of e-learning products, a mixture of evaluation and research needs to be employed. This to me would concur with the multiple methods model associated with the eclectic-mixed methods-pragmatic paradigm. In this article, Phillips illustrates a model derived from work by Alexander & Hedberg (1994) and Reeves & Hedberg (2002) and proposed by Bain (1999), which identifies the various stages of the evaluation process from analysis and design evaluation (front-end analysis) to institutionalisation (monitoring or integrative evaluation). In order to gain meaningful evaluation results within an e-learning environment, it would appear that a variety of techniques need to be employed, depending on whether the focus is on qualitative or quantitative data, in order to meet the needs of the research. Phillips refers to three independent studies carried out to research students' use of e-learning, each using one of the recognised educational paradigms: analytic-empirical-positivist-quantitative, constructivist-hermeneutic-interpretivist-qualitative and critical theory-neomarxist-postmodern-praxis. The findings provided relevant insights, but each had weaknesses which Phillips suggests could have been addressed by the use of a mixed methods approach.

Whilst the Multiple Methods Evaluation Model would appear to be a good match, I feel consideration should also be given to Stake's Responsive Evaluation Model, which recognises the need to provide more focus on evaluation processes and methods that promote continuous communication, thus allowing questions to emerge during the evaluation process. I feel that both models support an 'open mind' approach which would accommodate the constantly moving target that evaluation seems to be.

References:
Alexander, S., & Hedberg, J. G. (1994). Evaluating technology-based learning: Which model? In K. Beattie, C. McNaught & S. Wills (Eds.), Multimedia in Higher Education: Designing for Change in Teaching and Learning. Amsterdam: Elsevier.
Bain, J. (1999). Introduction to the special issue on learning-centred evaluation of innovation in higher education. Higher Education Research and Development, 18(2), 165-172.
Phillips, R. (2005). We can't evaluate e-learning if we don't know what we mean by evaluating e-learning! Interact, 30, 3-6. Learning Technology Support, University of Bristol.
Reeves, T. C., & Hedberg, J. G. (2002). Interactive Learning Systems Evaluation. Educational Technology Press.

Another article that I found to be appropriate to my research was the following:
"Learning through online discussion: A case of triangulation in research" by Michael Hammond and Mongkolchai Wiriyapinit

This article focuses on online discussion and its role within e-learning and distance learning programmes. It identifies that a commitment to student-student and student-tutor interaction is an important feature, which concurs with the constructivist approach to e-learning. The studies carried out focus on triangulation, and the findings from each of the methods used were examined for consistency (where there was a match between findings) and contrast (where findings were contradictory). It was, however, identified during the study that some of the surveys carried out did not adequately measure the likely variables. Analysis of the data therefore points to ensuring that the evaluation methods chosen adequately reflect the purpose of the research. Notwithstanding the strengths and weaknesses of the study, it:

“Reinforced the case for triangulation and showed three major advantages.
  • There were some perspectives which could only be accessed via one method e.g. students’ management of time, their engagement with reading and approaches to composing messages only emerged clearly during interviews.
  • Findings from one method could be put in a wider perspective through comparison with those from other methods, e.g. students’ accounts of their online activity could be compared to the objective data concerning frequency of message postings.
  • Consistency between findings gave a greater authority in reporting, e.g. the claim that students valued the module and adopted a task focused approach to group work is credible”.

If my area for evaluation and proposed project centres on an existing e-learning structure, then I believe triangulation and bracketing approaches would merit strong consideration.


At this point, I think I need to 'scratch my head' and identify an evaluation project that will justify further research and help me identify a relevant and appropriate structure.
Reference
Hammond, M., & Wiriyapinit, M. (2005). Learning through online discussion: A case of triangulation in research. Australasian Journal of Educational Technology, 21(3), 283-302.

Wednesday, March 26, 2008

Week 4 - Evaluation Paradigms and Models

Quite a bit to take in this week but I found Gordon's table to be a good aid. When comparing two paradigms that I would associate with my area, I feel the following theories could apply:

Constructivist Hermeneutic Interpretivist Qualitative

This paradigm appears to be parallel with the constructivist theory that learning is constructed from previous knowledge or experience and that it is through the research of ideas that knowledge is obtained. This paradigm would lend itself I feel to the research and investigation elements required for many of the educational areas in which we operate.

Eclectic-Mixed Methods-Pragmatic Paradigm

This paradigm seems to have the advantage over some of the others in that it mixes techniques associated with other paradigms and appears to encompass most methods of achieving an outcome. I can particularly relate to the 'pragmatic' aspect, which reflects the fact that nothing is ever perfect in education but can be improved through design and recognition of possible flaws. The Eclectic-Mixed Methods-Pragmatic Paradigm almost appears to be a hybrid of the other paradigms in that it does not rely on one particular belief.

Saturday, March 15, 2008

Week 3 - eLearning guidelines for Quality

Below are three of the guidelines that would be appropriate to my working area. Many eLearning issues could possibly fit into more than one guideline or even overlap depending on the category of guideline.

TD4 What makes for an effective online discussion?

Despite efforts to introduce the discussion board to students at level 2, they are really only required to use it at level 3. It has been found that when tasks or dialogue on the discussion board are optional, very little activity is evidenced. This is primarily because no-one actually facilitates the discussion board at level 2 and, as such, students are left to organise themselves. The fact that no-one appears to be monitoring the discussion board generates low motivation in this area and few students actually participate. Those who do use the discussion forum tend to publish meaningless comments or inappropriate material. If a discussion board facility is to be utilised in a meaningful way, then some feedback needs to be provided together with appropriate activities to keep students motivated and on track. The discussion board is a tool that enables distance learning students to make contact and share thoughts using an asynchronous medium. However, without guidance and supervision its value is sadly diminished.


TD10 Should students present work using online discussion tools?


Following on from the above (TD4), Level 3 students are expected to submit work for summative assessment using the discussion board. This applies to one module only and is facilitated by the assessor. Two or three activities are provided with the coursework, which involves a certain amount of research and prepares the student for their final submission. Students are encouraged to post their findings on the discussion forum and share ideas with their group. Each intake is spaced one month apart and students in each intake are grouped accordingly. However, there is some resistance to submitting by this means. Students feel that, due to the high numbers passing through this course, plagiarism between student postings will occur. Many students have already been picked up for 'copying and pasting' directly from the net with no referencing and little acknowledgement of their own work. Comments from the assessor frequently appear on the discussion board to this end. The answer here may be to set specific tasks for discussion and formative assessment, with the final assignment submitted using a more secure method, thus promoting sharing without the inherent plagiarism where it counts.

TT3 Is there evidence of timely, accurate and well targeted feedback to students?

Feedback is adequate considering the volume of students. Some online assessment material is in the form of multiple choice questions, which provide automatic feedback and enable the student to see immediately how many questions they have got right or wrong. Students are encouraged to attempt the quiz as many times as they wish until they achieve 100%. The only problem here is that some questions can be quite ambiguous (often for a purpose), resulting in students seeking help through other channels to find the answer, thus taking away the independent nature of online learning. The turnaround of feedback used to be quite poor because it was carried out by only one or two people and the burden was quite onerous; as such, positive feedback was not always feasible. However, this has now been addressed by the hiring of more assessors, who are able to take a more proactive approach to monitoring and feedback. The result is that students are now able to progress through the modules more quickly and most students are happy with the turnaround of work submitted.

TO4 Are retention rates reviewed and evaluation done on why students did not complete the course?

It is recognised that in any structured course that contains several modules and/or is self-paced, the highest drop-out will occur within the first few weeks. The drop-out on our level 2 courses used to be huge, and this has been reduced greatly by the introduction of monitoring systems. The majority of the students can be classed as 'mature' and, as such, work and family commitments impinge on their ability to commit to the number of hours required. Students who appear to be inactive are telephoned periodically to ensure that they are still on target and intend to continue with their studies. This has been welcomed by many of the students as it demonstrates our desire for them to succeed and continue their studies while at the same time giving them the opportunity to air any concerns on a personal level.

Sunday, March 9, 2008

Week 2 - The Importance of Evaluation

Sorry to be so late with this post....... but briefly:

Why is Evaluation Important to me and how do I define it?

I would define evaluation as a means to measure the effectiveness of learning, and to check that goals are being achieved and, if not, why not.

In the past when I have designed and delivered courses, most of the evaluation I have carried out has been merely to measure the effectiveness of the material and how it was received by the students. The feedback obtained has enabled me to go back and tweak various aspects, improving on previous ideas and adapting accordingly, which has proved to be invaluable. A boring but necessary topic, for example, can be improved on and measured at the formative stage before it falls at the last hurdle. But that is easy when ownership of a course sits in the lap of one person, which of course is quite often not the case. Institutions are primarily seeking results and achievements, and I believe evaluation is an important part of ensuring success, especially within elearning environments where important feedback to tutors is delayed; the results of this, unfortunately, are quite often evidenced by drop-out or non-achievement.


What Sort of Evaluations Mentioned on the Presentation Are Familiar to You Already and Why?

My experience within various organisations is that evaluation is more often than not done at the end - when frustration and disappointment with a course have already set in. The key here, it would seem, is not only to learn from the feedback for the future, but to put into practice an ongoing evaluation process that addresses this problem.

The model demonstrated in the presentation would be a good guideline, I think. Quite often, though, analysing feedback only leads to published statistics, which become both meaningless and historic.

Why is Quality Important in eLearning?

I have learned from previous papers on this course that transferring a face-to-face course to an online learning environment takes a lot of time and effort. Attempting to 'sell' the benefits of elearning can be difficult and is often met with resistance from many. Ensuring quality within elearning would overcome some of the difficulties and promote this type of learning for the future. Development time put aside for this type of learning should, I feel, include evaluation as well as testing of resources.

Tuesday, February 26, 2008

Hello

Hi Everyone

I am thrilled to be active at last on this blog as it is my first attempt at setting up my own blog. Having said that, my first posting was very succinct - it merely had a heading, so task 1 was to find a 'delete' button!

I work for the Manukau Institute of Technology, delivering various levels of IT within the community learning programme, which is part of the Faculty of Business following our restructuring. We are classroom-based but much of the course material is slowly moving online, although I suppose you could say that at the moment we have a blended learning environment.

On a personal note, I have lived in New Zealand for 6 years now, having made the move from the U.K. I live in Howick (a suburb of Auckland) with my husband and daughter who is currently 'studying' (although I see little evidence of this) at uni. We also have a Chinese girl living with us who is not a student but more like a lodger as she is a permanent resident here and has a full time job.

I am pleased to be back on the elearning course, having had to endure a forced six-month break before I could rejoin and start this semester. This is my 5th paper and I have learned so much on this course that I often wonder, had I known at the beginning how little I knew, whether I would have enrolled at all! All part of the learning curve I suppose. As far as this paper is concerned, I can probably pick out one or two areas within my work that could potentially be suitable for an evaluation project, but I suppose I will identify them more fully as we progress through the course.

I look forward to sharing the learning with you all.

Ciao

Hilary