L3 1 4

=Course activities provide students with opportunities for substantive feedback on their e-learning skills.=

Evidence
Feedback is integral to the learning process (see L5), and this applies to learning e-learning skills as well.

E-assessment makes personalization of learning possible, and it also offers the potential for on-demand summative assessment. The real advantage, however, may be the immediacy of feedback.

Valtonen et al. (2009) study student readiness for e-learning. They report that students are not all equally ready to undertake e-learning: attitudes and beliefs about online learning range from negative through neutral to positive, and students' ideas about the possibilities of e-learning are superficial. Students must be shown the point and possibilities of e-learning if it is to be successful. It is simply not true that students, just because they are digital natives, have transferable skills and a disposition to work in online environments. Many students lack self-efficacy and hence are likely to avoid difficult tasks in a domain. Although the students use ICT tools efficiently in daily life, and differences in ICT skills across students were small, there were marked differences in beliefs about e-learning. Many students do not know enough about the possibilities of e-learning because of 'functional fixedness': they simply aren't used to using the web to learn. It is important to offer students more information about the characteristics and possibilities of online learning.

Resources
Dunn (2002) argues, from a multi-year assessment of information literacy skills at California State University, that it is important to actually assess information skills. A process must be established for assessing students' information and research competence. The resulting data can be used to target information competence instruction. Assessment, however, must be set in a real-world context.

Emmett & Emde (2007) describe how they used an assessment tool based on the ACRL's 'Information Literacy Competency Standards for Higher Education'. These researchers provide methods for assessing information literacy skills by focusing on the development of an assessment tool based on learning outcomes. They append their tool in full to the article.

Mittermeyer & Quirion (2003) use a 20-question quiz to study the information research skills of first-year undergraduates in Quebec. They conclude, 'Despite the limited number of variables in this study, the results indicate that a significant number of students have limited knowledge, or no knowledge, of basic elements characterizing the information research process' (p. 7). They recommend: (1) regular evaluation of the information research literacy of first-year undergraduate students upon entrance to university; (2) participation of a library representative in the various program committees; (3) successful completion of a test to measure information literacy competencies during students' first year of studies; (4) incorporation of information literacy instruction into academic programs at the undergraduate and graduate levels.

E-learning study skills guides exist, e.g.: http://www.amazon.co.uk/E-Learning-Skills-Study-Guides-Clarke/dp/1403917558

A pre-course assessment, such as an e-passport test, could be used to gauge student readiness for e-learning: http://www.e-skills.com/cgi-bin/go.pl/newsletters/passport_issue1_story.html?uid=935 Course activities can then be tailored to students' needs.

http://www.jisc.ac.uk/media/documents/themes/elearning/effpraceassess.pdf

http://www.jisc.ac.uk/uploaded_documents/RM%20Final%20report%20Vfv.doc

Also, the Department of Nursing at New Mexico State University offers a computer competency evaluation for distance education courses.