Wednesday, June 18, 2008

Evaluation Summary

It seems ages since I posted a blog entry but, like everybody else, I have been busy obtaining and sorting the feedback for my evaluation study. I managed to obtain 22 student questionnaires, 6 course facilitator questionnaires, 3 student observations and 3 interviews with programme management, all of which took about 14 days. Below is a summary of the findings:


Evaluation Summary
The following is a summary of an effectiveness evaluation carried out over a period of 14 days.
The rationale behind the evaluation was to identify and compare students’ reactions to online learning within a blended learning environment. The overarching questions were directed towards two specific areas: firstly, whether the structure of the online learning is adequate to encourage self-direction for a new user and whether the knowledge learned is transferable to real-life situations; and secondly, whether students gain sufficient confidence on completion to contemplate higher-level learning using this method.

The focus of the evaluation was a unit standard which is currently delivered within my working environment as part of the National Certificate in Computing Level 2. The evaluation sought to employ triangulation of data by obtaining feedback using questionnaires completed by students and classroom facilitators; observations carried out on students new to online learning as well as interviews with members of programme management in order to seek their views on the level and type of support given to students completing this unit.


Evaluation Survey
I was able to obtain sufficient numbers as per the survey sample outlined in my evaluation plan and was particularly pleased by the response of past students, which was far greater than I had anticipated and gave a more even mix of current and past students completing the survey. In total I received 22 completed student questionnaires. A short questionnaire was also completed by 6 classroom facilitators, who gave their views regarding student support and use of online learning tools. The observations also went well, as three students new to online learning attended the classroom during the period of the study for guidance with this module. Interviews were conducted with three members of the programme management team who have direct contact with the students, namely the Programme Leader, Student Support Adviser and Course Assessor.


The overall results of the student survey revealed the following:


The questionnaire handed to the students covered four areas: reaction, navigation, interaction and course outcome. Observation of the three students also sought to assess their initial reaction to the module and identify any problems vis-à-vis navigation and interaction with the course content. Six classroom facilitators from six different classrooms also completed a short questionnaire which focused on the level and type of support given to students when completing this unit.


Reaction
Once they had identified that there was an online learning requirement for this unit, the majority of students managed quite well to complete the module. However, initial guidance seemed to be somewhat lacking within the course content, as evidenced by the three interviews and the facilitator questionnaires, which indicated that most of the support was given at the outset. All six facilitators stated that they gave initial guidance on how to access the module and find the course material. All except one of the facilitators stated that once the students had become familiar with this style of learning, the level of support they gave was no greater than for other modules.


Navigation
Only 1 student indicated that they had difficulty locating the course material, and this small number is probably due in part to the amount of guidance given by the classroom facilitators prior to commencement of the module. Both the Programme Leader and the Student Support Adviser mentioned in interview that, despite the fact that students are told where to find the module and there is a designated button on the site, students who chose to study at home often needed step-by-step guidance over the telephone to help them find it. 22% of the students experienced navigation problems involving following instructions and moving between the different sections of the course materials. One student summed up the experience of navigation as “a challenge”. However, 17 (77%) of the 22 students surveyed rated the overall navigation features of this module as easy. The observations revealed a number of navigation problems; in particular, as there was no ‘home’ button provided on the site to return to the content list, students often lost their way or found themselves going into the same link twice.


Interaction
Students appeared happy to complete the online quizzes, and the majority considered that the course material provided sufficient knowledge to enable them to answer the questions. They liked the self-assessment facility, but as the solutions were not available online, feedback on incorrect answers was not instant; this was echoed in comments in both the questionnaire and the observations.

Some 15 (68%) of the 22 students surveyed made use of the discussion board. Of these, 9 indicated that they found it useful, whereas 6 maintained that it did not aid their learning. Some negative comments were received regarding the discussion board, and these all centred on its content structure rather than its use as a vehicle for learning. On a positive note, all of these 15 students found the feedback from the tutor useful, and only 3 would not consider using the discussion board again. Of the 7 students who did not use the discussion board, the main reasons appeared to be either a lack of confidence or that it appeared to be a non-essential part of the course. This was further backed up by the Course Assessor, who stated that, when compared with other online tools which are more results focused, the abstract nature of the discussion board within this module did not invite a huge student input and consequently did little to promote student-to-student interaction.

Course Outcome
There was a very positive result for this section, with an overwhelming 21 of the 22 students stating that they have been able to apply the skills and knowledge learned from this module to everyday use. The survey was further broken down into areas of use, with students asked to tick as many as applied. The results were as follows:


Home: 44%
Workplace: 31%
Education: 15%
Community/Voluntary Work: 8%
Other: 3%

19 of the students indicated that they would be happy to use online learning for further study at tertiary level, and 18 showed a preference for online learning over other methods. This is a positive result, confirming, in this survey at least, that this module does provide students with sufficient confidence to consider further online learning. Moreover, students completing this unit can acquire transferable knowledge and skills that can be applied to everyday use.
Only one student indicated on the questionnaire that they had not been able to apply the knowledge learned from this module to everyday use. However, a comment from this same student clarified the answer by admitting that they already had extensive internet skills and knowledge and consequently did not feel that they had learned a great deal.

To summarise, overall I feel that the response received has been very positive and the online learning module does appear to be effective in its structure, providing students with a pedagogical direction towards student-centred learning and transferable knowledge. The fact that a large portion of the students surveyed indicated that they would consider online learning in the future also suggests that the module is effective in providing sufficient knowledge, confidence and skills to promote staircasing to a higher level.

I have listed below an initial analysis of the study and would welcome any comments and feedback on the style of presentation and how it could perhaps be improved.


Student Questionnaire

Interview Results

Observations

Facilitator Questionnaires



Thanks very much



Hilary

4 comments:

Helga said...

Hi Hilary,

great to get such positive responses. It is always good to know that you are on the right track. Unfortunately I was not able to access any of the documents on Google Docs (I seem to struggle with it all the time) so no feedback on that at this time.

I was wondering if you also got information on suggestions for change, especially in the areas of accessing the module and navigation. Except for the home button, is there anything else that they suggested?

Your initial outcome was very good to read, to the point without too much detail. I should have a look at my doc as well as it took me about 20 pages to get to initial results....

Good luck with writing your report.

Hilary said...

Hi Helga

I think one of the greatest navigation problems is that there are too many links to course material and links within links. For example, if you wish to go to the assignment, there can be 4 or 5 links to click through before you can eventually access what you want. As for accessing the module, this is an extraordinary problem and in some ways very much like the intuitive problems mentioned in your blog! I think maybe the button needs to be changed in some way, as the comments were that they could not see it - 'wood and trees' springs to mind here :)

Hilary

Yvonne said...

Hi Hilary

I've not had much access to a computer lately so apologies for the late response.

As I worked with you and know all about your evaluation theme, this has been a really good read.

The overall result appears to be that the unit is a successful one but needs tweaking - and I guess this is possibly what you thought before you got started.

Your observations certainly show that students were having difficulty finding material and this was confirmed in the questionnaires and by the programme team. A simplification of the unit navigation seems to be in order there.

As for the discussion board, it appears that the activities do not encourage 'interaction', and some improvements in the tasks are perhaps needed. Linking participation in a discussion with an assessment or assignment task may be helpful here. I remember starting this course in e-learning and wondering what I was going to contribute to a discussion where it had all been said before. What developed was a discussion about that issue, and from there came a more comfortable ability to comment on what others had already said. The process of discussing online makes discussing online more accessible! If that makes sense!

Your study has identified areas for improvement and has confirmed areas that are strong. Great stuff.

Also, I found your data presentation easy and clear to follow.

Cheers

Yvonne

Gordon said...

Hilary, Some comments and thoughts from me to add to the others. Sorry they are a bit late.

You have uncovered some issues up front, with the initial guidance lacking in the course content. I found it of great interest that the facilitators said the level of support for the e-learning module, once students were familiarised, was the same as for any other module. I guess we all want to reassure our staff about potential future workloads in the e-environment.

The navigation issue was quite polarised. 22% experienced problems navigating between the various sections, but the other 77% rated it as easy. Any data as to the demographic of those who struggled, or were there genuine issues?

A common theme seems to be that the self-assessments were valued, but this depends on instant feedback; otherwise the exercise is more summative than formative.

It seems like students need educating on how to use the discussion board for optimum effect. Why did the 9 find it useful, for communication with others (i.e. building the community) or for getting answers/information from the others?

How long was the module? How many hours of e-learning? Did the 44% who accessed it at home actually stay at home instead of turning up to classes, or was this additional study done at convenient times?

Being a numbers person I have a couple of thoughts of how you could present your reaction data as follows:

1. If you assign values to your Likert scores, e.g. 1=1, 2=2, 3=3, 4=4 (though some discussion has been occurring in the blogs [https://www.blogger.com/comment.g?blogID=1296622255826493833&postID=6201199593795997011] about whether this could be reversed 1=4 …4=1 for ‘first’, which I think is counter intuitive – I like the higher the score, the better the rating) then for each question you can provide a total score.
As an example, let’s consider question 1 - I was aware at the outset of this course…. The score for this question would be 0x1 + 2x2 + 11x3 + 9x4 = 73. If you do this for each question you could establish which questions were scored the highest and perhaps determine the stronger and weaker aspects of the module.
2. You could turn this into a mean score for each question by dividing by the number of respondents, i.e. 73/22 = 3.3
3. You could plot a graph showing the scores for each question to represent this visually.
4. You could average out the values within each category (reaction, navigation, interaction, online tools etc ) to provide a general overview. This could also be plotted.

5. I quite like working with percentages as I think that people relate to 75% better than ¾. So if you want to do this you need to bear in mind that for your 4 possible responses, 1=0%, 2=33%, 3=66% and 4=100% so to convert your 3.3 mean value in para 2 above you would use the following formula:
(3.3 -1)/3 = 77% or generally: %age = (mean -1)/3

6. Using either mean or percentage values there are then all sorts of analyses and graphs you can do, such as ranking to find the 5 highest or 5 lowest questions, or looking for trends or links between those that have a common element. I have also compared the mean with the median values (i.e. the middle number when the data set is written down sequentially) and plotted this graph, as I had an obvious outlier data set which skewed the mean.
7. I also quite liked the frequency graph in the resources, which would lend itself to your data quite well. I used an Excel spreadsheet to create and colour in the blocks as I couldn’t seem to generate it any other way – here: http://www.utdc.vuw.ac.nz/research/emm/documents/workshop/LecturingReport.pdf
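If you want to automate this, the scoring in steps 1, 2 and 5 can be sketched in a few lines of Python. The response counts below are just the worked example from step 1 (not real survey data), and the variable names are my own:

```python
# Likert scoring sketch using the worked example above:
# 0 students answered 1, 2 answered 2, 11 answered 3, 9 answered 4.
counts = {1: 0, 2: 2, 3: 11, 4: 9}

total = sum(score * n for score, n in counts.items())  # weighted total: 73
respondents = sum(counts.values())                     # 22
mean = total / respondents                             # mean score, about 3.3
percentage = (mean - 1) / 3 * 100                      # rescale 1..4 onto 0..100%

print(f"total={total}, mean={mean:.1f}, percentage={percentage:.0f}%")
# prints total=73, mean=3.3, percentage=77%
```

Running this for every question would give you the per-question totals, means and percentages in one go, ready for ranking or plotting.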

I hope this all makes sense, sorry if I have gone on a bit! I have taken Bronwyn’s advice and kept it simple.

Gordon