HERDSA2008 Paper



What are the key factors that lead to effective adoption and support of e-learning by institutions?

Stephen Marshall
Victoria University of Wellington, Wellington, New Zealand
Stephen.Marshall@vuw.ac.nz

The E-Learning Maturity Model (eMM) provides a means by which institutions can assess their capability to sustainably develop, deploy and support e-learning. The eMM assesses capability in a number of processes that have been selected through a combination of research and consultation with groups internationally. This paper presents the results of that consultation on the key institutional practices needed for successful e-learning. Because the eMM depends on ongoing engagement with the e-learning community if it is to reflect current practice, this paper is also being provided via a wiki so that experts in the design, development, delivery and support of e-learning can reflect on and update the analysis. The final version of this work presented at the conference will include that commentary and reflect on the process of engagement and the ongoing development of a shared understanding of the factors influencing successful use of technology by educators.

Keywords: guidelines, publishing formats, procedures

Introduction
A variety of measures of educational institution performance have been proposed over the last thirty years, serving a range of purposes (Clift et al., 1987; Sizer, 1988; Cave et al., 1991; Alexander, 2000; Kuh, 2007). A rising culture of accountability in public life has seen a significant increase in the range of indicators being collected and reported upon. Many of these measures have been criticized for their focus on ranking and ease of measurement rather than on fundamental factors relating to successful outcomes for students, communities and institutions.

The E-Learning Maturity Model (eMM, Marshall & Mitchell, 2007) has been designed as a quality improvement framework by which institutions can assess and compare their capability to sustainably develop, deploy and support e-learning. The eMM is based on the Capability Maturity Model (CMM, Paulk et al., 1993) and SPICE (Software Process Improvement and Capability dEtermination, El Emam et al., 1998; SPICE, 2002). The assessment of capability by the eMM depends on a set of key e-learning processes and practices which have been identified by analysis of the literature and in consultation with practitioners (Marshall 2005).

The rapid growth in the technologies being used, the ways that they are being applied across an ever widening group of academic disciplines and the evolving skills and experience of teachers and students means that e-learning is a moving target.

This continuous change means that the aspects of institutional e-learning performance measured by the eMM must themselves evolve. Identifying the key measures is a significant challenge, as they need to reflect the rapid development of new technologies as well as the changing understanding of how technology is used to support education.

In practice, the research literature tends to trail current understanding in the field of e-learning, and well-founded empirical studies tend to be produced well after current practice has moved on. Accurate and timely information on the effective use of technology to support education thus lies primarily in the heads and daily activities of those engaged in the design, development, delivery and support of e-learning.

In an attempt to capture that knowledge and use it to evolve the eMM, a series of workshops was held in Australia and the United Kingdom. These workshops generated a set of statements of expert opinion, or heuristics, which have been used to inform the development of the current eMM process set. To further test and update this set, the analysis presented in this paper has been formatted as a wiki and made available to the workshop participants. Their commentary will then be used to further refine the analysis and update the eMM process set.

Conduct of the international workshops
Three workshops were held internationally: in Melbourne in association with an ACODE (Australasian Council on Open and Distance Education, http://www.acode.edu.au/) meeting (attended by 23 people), at the 2005 ASCILITE (Australasian Society for Computers in Learning in Tertiary Education) conference in Brisbane (attended by 6 people), and at the University of Manchester, UK (attended by 15 people).

The workshops provided participants with an introduction to the eMM and the associated concepts and an opportunity to collectively brainstorm the ways by which institutions can ensure that e-learning is effective and high quality. The physical results of the workshop brainstorming sessions were a number of A3 sheets of paper containing the collected and organised ideas of the participants on post-it notes. These were transcribed and aggregated by process area in order to capture the individual ideas for further analysis.

Ethical approvals were obtained from the Victoria University of Wellington Human Ethics Committee to conduct these workshops and use the collected information.

Workshop outcomes
Summaries of the three workshops conducted are presented below. The responses are considered separately as the participants came from quite different backgrounds and have consequently focused their contributions on different aspects of the institutional e-learning environment.

The detailed tables listing all of the workshop results, aligned with the eMM processes, are available online as part of the discussion wiki. This alignment was not done as part of the workshops, but rather was conducted as part of the evolution of the current version of the eMM, so every item has been matched with a corresponding process. It should be noted that the alignment allowed for multiple matches, so the same item may appear in more than one process and dimension (as noted at the end of the item text).

An example of the detailed alignment information is given in Table 1. This shows the workshop items that were associated with a given process, grouped by similar concept, which is then linked with the current practice statement within a dimension of the process.

Table 1: Example of Workshop Item Alignment with eMM Version 2.3 Process D1

ACODE Workshop Outcomes

 * ACODE workshop data

Table 2 shows a summary of the number of items identified in the ACODE Workshop brainstorming exercise broken down by eMM process area and dimension. The Support area received the most attention from participants, followed by the Organisation and Development areas. The Learning and Evaluation areas in contrast received relatively little attention.

Table 2: Summary of ACODE Workshop Item Alignment

In terms of dimension, the majority of the items were placed in the first two dimensions, Delivery and Planning. This appears to reflect the summary approach adopted by the participants in describing the items, and it is likely that in many cases the intent of an item would include aspects addressed by the higher dimensions of the eMM.

Only one item in the Optimisation dimension was specifically identified: “Student evaluation feedback used for [course improvement]”. A relatively high proportion of the items in the Organisation area were placed in the Definition dimension, potentially a consequence of that dimension’s definition overlapping with the concerns of the Organisation processes.

Learning
Looking at the results for the Learning process area in detail, the main focus of the participants mapped to process L3 (Students are provided with e-learning skill development). Many of these also mapped to processes in the Support area, consistent with the overall focus on support issues noted above. The two items in the Definition dimension related to detection of plagiarism and collusion, and the need for rules governing deadlines for assessment.

The scarcity of items in the Learning process area may reflect a sense that many of these processes are the responsibility of individual teachers, a sense among participants that learning is already well addressed, or simply the participants’ focus on organisational aspects. Nevertheless, given the frequent calls to focus on learning when discussing e-learning, it is interesting that more items were not identified.

Development
The Development area received much more attention from the participants. Process D1 (Teaching staff are provided with design and development support when engaging in e-learning) had the most items, but as with process L3, this appears to reflect in part the focus on support issues, as many of the items overlapped with staff support processes. Key items mentioned by a number of participants were the need for “adequate ongoing staff support” provided by specialist staff organised in a coherent or centralized way (see Table 1 above). The “support for and recognition of the need for evidence-based practice” was also noted, as was the dependence on an “interoperable, seamless learning environment” able to evolve with staff and student needs.

Support
Support was easily the most popular area; however, this in part reflected the expression of items in general terms that covered all or most of the processes in this area, such as “technical support for students” and “library support for learners”. There were also a large number of duplicate or similar items, reflecting a shared set of concerns among the participants. The provision of “at elbow” professional development support was the most noted idea, with 22 items reflecting this theme. Group workshops as a form of staff development were clearly not well regarded, and there was a clear preference for a range of support options provided over extended hours. The need for “reward and celebration of good teaching” was also noted in many items.

Evaluation
In contrast to Support, the Evaluation process area was almost ignored. Very few items were noted, and although these matched the higher dimensions disproportionately, this appears to be an artifact of the overlap between the definitions of the dimensions and the processes.

Organisation
Given the composition of the workshop it is perhaps not surprising that a large number of items were identified in the Organisation process area. Processes O2 (Institutional learning and teaching policy and strategy explicitly address e-learning) and O9 (The provision of e-learning is guided by formal business management and strategy) dominated the results, and there were also a large number of items in the Definition and Management dimensions, probably reflecting the overlap between the definitions of the dimensions and the processes. Key ideas included that “there must be a clear institutional view about what the purpose of e-learning opportunities might be”, supported by clear definitions and policy, and the strongly expressed need for a “shared vision about what it means to the institution and what it will do”. The need for “appropriate and timely funding” was also noted.

Interestingly, many of the operational aspects that are part of this process area received little attention (for example O8: Course administration information communicated to students prior to starting courses). As with the Evaluation area, this may reflect the sense that existing administrative procedures and systems are well equipped to address e-learning or simply the composition of the workshop attendees.

ASCILITE Workshop Outcomes

 * ASCILITE workshop data

Table 3 shows a summary of the number of items identified in the ASCILITE workshop. As with the ACODE workshop the Support area received the most attention from participants, followed by the Organisation and Development areas. The Learning and Evaluation areas in contrast received relatively little attention.

Table 3: Summary of ASCILITE Workshop Item Alignment

As with the ACODE workshop, the majority of the items were placed in the first two dimensions, Delivery and Planning. The relatively high number of items in the Management dimension of the Evaluation area reflected a strongly expressed desire that evaluation and feedback information be widely shared, rather than kept confidential to management or technical areas.

Learning
Very few items were identified in the Learning area, focusing primarily on L3 (Students are provided with e-learning skill development) and L10 (Courses are designed to support diverse learning styles and learner capabilities).

Development
The Development area received more attention from the participants. Process D1 (Teaching staff are provided with design and development support when engaging in e-learning) had the most items, followed by process D5 (All elements of the physical e-learning infrastructure are reliable, robust and sufficient). It is interesting to note that the items mapping to process D5 were all in the Delivery dimension and seem to express a distrust of the reliability and robustness of e-learning systems: “The technology ‘just works.’” The recognition of the need to involve staff with a range of skills, not just academics, was also apparent: “Integrated teams (academic, support & library) to develop courses”.

Support
As with the ACODE workshop, Support was easily the most popular area. However, unlike the ACODE results, this was a consequence of a large number of items in only two processes, S5 (Teaching staff are provided with e-learning pedagogical support and professional development) and S6 (Teaching staff are provided with technical support in using digital information created by students), with 21 of the 23 items mapping to these two processes. Most of these items focused on the availability of support and assistance, although the need for evidence-based practice was clearly noted in items such as “Establish exemplars of best practice of e-learning”, “Research led L&T encouraged” and “Teachers are given time to think, do, evaluate and reflect”.

Evaluation
In contrast to the ACODE workshop, the Evaluation process area was relatively well covered, with a number of items identified across the first four dimensions. Interestingly, these were not just general items about collecting feedback (“E-learning and other teaching are evaluated in a truly integrated way”); there was also a strong emphasis on quality improvement and on engaging staff and students in an effective feedback loop: “Obtain & circulate student feedback/concerns/needs from multiple courses to devise common directions for improvement”. The need to use feedback information both to inform “Evaluation of innovations & changes to understand their impact” and to celebrate effective teaching (“Celebrating teaching forums”) was also clear.

Organisation
Items in the Organisation process area fell mainly in processes O2 (Institutional learning and teaching policy and strategy explicitly address e-learning) and O9 (E-learning initiatives are guided by institutional strategies and operational plans). However, the noticeable emphasis was on the need for well-informed leadership (“Managers admit they need to learn about this and stop doing ‘nothing’ because it feels safer”) and clear ownership (“Get executive ownership of e-learning direction at an institutional level”) with a clear academic focus (“Stop [educational] decisions being made by managers[/IT experts]”). Interestingly, the items covering student information and preparation needs (O6-O8) were only expressed in terms of actions by the students.

Manchester Workshop Outcomes

 * Manchester workshop data

Table 4 shows a summary of the number of items identified in the Manchester workshop. The Learning and Evaluation areas received the least attention, while Development and Support received the most. As with the ACODE workshop, the dominance of Support reflected at least in part the large number of items expressed in general terms and thus listed in multiple processes; the effect of this is apparent in the low number of unique items.

Table 4: Summary of Manchester Workshop Item Alignment

A much greater proportion of the items were placed in the Delivery dimension compared to the other workshops. The items in the Management dimension were dominated by generic comments about the need for quality control and measurement mechanisms.

Learning
The items in the Learning process area were spread relatively evenly amongst the processes, although there was an emphasis on the need to communicate both learning objectives (“Clear learning objectives, personal transferable outcomes & academic skill outcomes”) and expectations (“Expectations should be communicated to students regarding their participation online”). The need for “Undertaking thorough surveys of student needs both for knowledge aspects and delivery aspects” was also noted in the Management dimension.

Development
The Development area contained items mapped to the first four dimensions. Apart from processes D4 (Courses are designed to support disabled students) and D7 (Resources created are designed and managed to maximise reuse), items were distributed evenly amongst the processes, with multiple participants identifying similar concerns. Notably, items for processes D5 (All elements of the physical e-learning infrastructure are reliable, robust and sufficient) and D6 (All elements of the physical e-learning infrastructure are integrated using defined standards) were predominantly expressed at the first dimension (Delivery), for example “e-learning system must be reliable”, while those for the first three processes were expressed over a range of dimensions, including the Management dimension (“Embedded quality enhancement processes”).

The need for effective support (“At elbow training for academics (rapid response team)”) involving a range of skills and perspectives (“Teams – Academic (learning technologists)”) was noted clearly in process D1 (Teaching staff are provided with design and development support when engaging in e-learning), but with a concern that the focus be on academic rather than business outcomes: “Consider how you can sell me ‘real’ story not the ‘efficient’ story corporate affairs would prefer”. The absence of any items for D4 (Courses are designed to support disabled students) is most likely a consequence of the strong UK legislative requirement that such support be provided, making any consideration of it redundant as a benchmarking discriminator.

Support
While not as dominant as in the other workshops, Support was still a popular area, reflecting the general coverage of items such as “Plan student support requirements” and “Students understand what support is available to them, where/how to obtain it, what response times to expect”. Items were spread relatively evenly amongst the processes, across the four lower dimensions.

The largest number of items was found in process S5 (Teaching staff are provided with e-learning pedagogical support and professional development), with items ranging from basic service provision (“Staff development to support e-learning”) through to encouraging and supporting research and reflection (“It must be possible to derive research outcomes from (large scale) e-learning activities”). Items in S6 (Teaching staff are provided with technical support in using digital information created by students) also noted the need for support arising from the empowerment of students: “Programme developments might assume learners will be active producers of own content”.

Evaluation
Items in the Evaluation process area noted the need to gather student feedback (“There is real feedback of aspects of course performance (learning activity, assessment) to planning”) but also to actively engage students in the improvement process (“Developing partnerships with staff and students”). The need for quality enhancement processes was strongly expressed, with several items noting the need for continuous enhancement (“Embedded quality enhancement process”) and for support of innovation (“Quality processes must facilitate innovation in e-learning”).

Organisation
Relatively few items were identified in the Organisation process area. The need for an e-learning strategy dominated the results, along with the need to ensure that strategies and policies are used throughout the organisation: “Policy and policies are disseminated meaningfully to practitioners and practitioners buy-in to policy and policies”. As with the Evaluation process area, the need for support of innovation was noted: “Work towards change. We live in a change culture” and “Allow for flexibility/adaptability of processes/policies/procedures”. This was not, however, seen as solely an academic or managerial concern: “Student experience is seen as the responsibility of all institutional staff”.

Engaging with the ‘e-learning community’
The intention of the workshops was to gather a set of key factors that contribute towards effective use of e-learning technologies and pedagogies by institutions. The information that has been gathered provides a snapshot of factors that people engaged in the day-to-day use of e-learning feel their institutions should be focusing on. This paper has placed these factors into the eMM framework and attempted to analyse possible reasons why some aspects are mentioned often and others are not.

Table 5: Summary of Item Alignments for All Three Workshops

Table 5 shows the relative paucity of items from the workshops addressing the Definition, Management and Optimisation dimensions. This is consistent with the eMM capability assessments that have been performed to date (Marshall & Mitchell, 2007). This may reflect in part the way that e-learning has been undertaken in many institutions; essentially following the ‘innovation’ approach described by Rogers (2003).

By presenting the items from the workshops in this paper and continuing the conversation online, it is hoped that a wider range of perspectives on these dimensions of the eMM processes will be obtained, and consequently a better sense of what is needed for e-learning to have a significant impact on student learning. Accordingly, this paper and the associated detailed analysis have been published on a wiki at http://www.utdc.vuw.ac.nz/emmWiki/index.php/HERDSA2008. The address has been provided to all of the workshop participants and circulated to the ASCILITE mailing list, inviting contributions and comments. The outcome of this online conversation will be presented at the conference and published online as an addendum to this paper.