Recently I've been asked to evaluate an online conferencing system, with a view to getting it set up on an IT Services server for teaching next semester.
As a software engineer, I am really one of the last people who should be doing software evaluation. Not only am I too expert a user of a wide variety of software, but I also understand and work round all the compromises and short-cuts that developers make; after all, I've made them all myself. However, I do sometimes have to evaluate software as part of my job. Most of the time this is quite straightforward, as most of the software we deal with consists of fairly simple web applications, and checking that they work usually just means checking them on a variety of browsers, including older versions of Internet Explorer.
When asked to evaluate software such as virtual conferencing systems, checking a few different browsers is not sufficient. Browser-based systems depend on plug-ins such as Flash and/or Java, and different variants of the operating system, different graphics cards and different sound cards can all have an impact, so testing with just one version of Windows on the standard University Dell hardware is really not enough.
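The combinatorics here are easy to underestimate. A quick sketch makes the point (the platform, browser and plug-in names below are purely illustrative, not an actual test plan):

```python
from itertools import product

# Hypothetical test matrix: even a modest set of variants multiplies out fast.
operating_systems = ["Windows XP", "Windows 7", "Mac OS X", "Linux"]
browsers = ["IE 7", "IE 8", "IE 9", "Firefox", "Chrome", "Safari"]
plugins = ["Flash", "Java applet"]

# Every combination is a configuration that could, in principle, fail differently.
configurations = list(product(operating_systems, browsers, plugins))
print(len(configurations))  # prints 48
```

And that is before graphics cards, sound cards and network conditions are factored in, none of which appear in the matrix at all.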
The truth is that we are not really set up for serious software testing in the Learning Technology Unit. We have the standard desktop computers that we use for our day-to-day jobs, a mix of high-specification Windows 7 and Mac OS X workstations with a lot of different software installed. We also have an elderly pair of tablets – one Android and one iPad – available for testing, and our own more modern tablets and phones also get called in when necessary. Realistically we can't say we're certain something is good enough across the wide variety of systems our students use, but we can sometimes say it is not.
Quite apart from the problems we face doing a proper evaluation, a better approach would have been to come to us with a requirement, and ask what would be a sensible solution. Between us in the LTU we've got a lot of experience doing distance collaboration, as students, teachers and researchers. In my own experience (with IMS working groups, and as an OU student) I've found that the simple solutions work best. Telephone conferencing systems are reliable and have low latency, so are excellent for audio. Telephone conferencing with a shared desktop using VNC worked well in the QTI working group, but a simple shared browsing experience might be more appropriate than VNC with a less technical group. When many of our collaborators were not native English speakers, we found a simple text messaging system worked, and avoided the problem of the native speakers talking too fast in incomprehensible regional accents.
As it is, I don't know what the requirements are other than a rather vague 'online tutorials', I don't know what the students have been told the IT requirements for the course will be, and I don't know what sort of internet access they'll have. In the end, though, all these things are probably irrelevant because, rather worryingly, although I've been asked to evaluate this conferencing system, it seems that saying it has failed the evaluation is not an option…
Also see Sarah's blog post on Vicarious Learning