Digital Literacy Is Key in Modern Higher Education

Authentic assessment allows educational institutions to verify not only that their students have the knowledge necessary to solve critical real-world issues, but also that they are able to put that knowledge to practical use.


Literacy has, for most of history, had a fairly straightforward definition: the ability to read and write. And while the term has met attempts at redefinition from multiple angles, pointing out that communicating and creating meaning can be accomplished through means other than decoding constellations of letters and performing exercises in penmanship, the traditional definition still stands strong – at least in an analogue context.

Yet even though the definition of literacy in the traditional sense is somewhat unyielding, the digital age is forcing an expansion of the term: digital literacy. The nature of communication, the way in which we find, manage and express information, is changing, and so are the instruments with which we do it. And digital literacy is not a skill reserved for a small elite of the population. It is not about possessing IT expertise but simply about being able to function – to live and work – in an increasingly digitised society. Digital literacy is also about being able to learn within the framework of modern education.

Many digital tools and methods are already a staple at educational institutions and interwoven into many processes in higher education; from the delivery of course materials to the way we conduct research, technology resources play a pivotal part. Virtual learning environments, e-portfolio platforms, student portals and the like are standard equipment at most higher education institutions, and students typically need to consult most of them on a weekly basis.

In short, students are already exposed to and expected to engage with technology from the moment they set foot on university grounds, which places digital literacy at the forefront of indispensable skills in higher education. But how digitally literate are students really?

The Myth of the Digital Native

The term ‘digital native’ (coined by Marc Prensky in 2001) has been a popular label for people born sometime after the mid-eighties, ascribing to this age bracket a familiarity with and fluency in using digital tools. The expression has little empirical proof to lean on but has nonetheless gained much traction in the public debate, as this particular demographic has been the one entering higher education over the past decade.

There is no doubt that digital exposure is more widespread in the younger generation, but the notion that children know how to use technology by instinct is unsupported by both science and common sense: putting fingers on an iPad to play Minecraft might be both entertaining and educational in some way, but it is hardly indicative of an ability to understand and fluently navigate the digital landscape in its entirety.

The theory of digital natives possessing inherent digital capabilities and a far better ability to multitask has been debunked repeatedly. Paul A. Kirschner and Pedro De Bruyckere’s article “The myths of the digital native and the multitasker” neatly anthologises a wide variety of scientific publications on the subject, and the authors base their conclusion on these: “As has been shown, there is quite a large body of evidence showing that the digital native does not exist nor that people, regardless of their age, can multitask” (Kirschner & De Bruyckere, 2017).

This leaves us with increasing digitisation and a student body that might possess less developed digital capabilities than popularly assumed, so developing and strengthening digital literacy becomes a necessary part of being a student in modern education. Digital literacy also benefits the educational institutions themselves, as digital developments have a monumental impact on both the institutions and the academic tradition.

1: A Change in Information Search Strategies

The classic movie montage of a student researching a topic by frantically leafing through piles of dusty books in the library might be the stereotypical image of higher education, but those days are past. A more fitting montage today would replace the books with a laptop and a stream of Google searches following one another.

This change in information searching has spawned an abundance of new sources where students can seek answers, not only to simple questions but also to highly complex, composite problems. And while this creates fantastic opportunities for students to access vast amounts of information, not all of these sources are equally valid. By simply typing a question into a search engine, students are exposed to a plethora of possible answers and sources of information, with little clue how to distinguish between what is relevant and what is not.

For many students, the search for information might overtake the process of actually understanding it, thereby impeding their learning. The act of Googling alone does not facilitate learning, as students are not necessarily approaching the subject with the same critical mindset they are taught to use elsewhere. Critical thinking is an invaluable skill in higher education and a necessity for higher-order thinking. Taking John Biggs’ SOLO taxonomy as an example, it would be impossible to move from the unistructural or multistructural level to the relational level (and beyond) without being able to objectively approach, analyse and evaluate a given issue.

But discerning between valid and invalid sources and thinking critically online requires digital literacy: “User awareness in making these decisions largely determines the quality of the conclusions, positions, opinions, or models constructed from the information. In the absence of effective mechanisms for information evaluation, how can learners decide which of the infinite and conflicting bits of information to choose, and which to doubt?” (Eshet, 2004). Knowing how to tell viable academic sources apart from biased or unreliable ones, such as private blogs, marketing articles or user-generated wikis, is a necessary skill for modern students, as is judging how up to date information is and how many other sources treat it as credible by linking to it.

With the increasing amount of ‘fake news’ currently spreading across the web, the ability to sort through the noise is incredibly valuable for students. A helpful tool in this endeavour is the aptly named ‘CRAP test’, which cites currency, reliability, authority and purpose as key markers when trying to establish the credibility of a source.
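As a purely illustrative way of operationalising such a checklist, the sketch below encodes the four CRAP criteria as yes/no questions about a source. The class name, the questions and the all-or-nothing verdict are our own assumptions for demonstration; the test itself is a set of guiding questions, not an algorithm.

```python
# A hypothetical, simplified encoding of the CRAP test as a checklist.
# The questions and the all-or-nothing verdict are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class SourceCheck:
    currency: bool     # Is the information recent, or recently updated?
    reliability: bool  # Does it cite evidence, and do other sources link to it?
    authority: bool    # Is the author or publisher a recognised expert or institution?
    purpose: bool      # Is it written to inform rather than to sell or persuade?

    def looks_credible(self) -> bool:
        """Crude heuristic: treat the source as credible only if all four criteria hold."""
        return all((self.currency, self.reliability, self.authority, self.purpose))

# Example: a recent marketing article fails on reliability, authority and purpose.
marketing_article = SourceCheck(currency=True, reliability=False,
                                authority=False, purpose=False)
print(marketing_article.looks_credible())  # False
```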

The increasing number of educational activities that take place online, or that involve students using devices with internet access at some point in the process, also necessitates a change in strategy for educational institutions, especially when it comes to academic integrity during exams. Today, tools like lockdown browsers and plagiarism detectors are among the preventative measures taken against academic misconduct in online exams.
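To make the idea behind plagiarism detection a little more concrete, here is a toy text-overlap check in Python. It is purely illustrative and not how any particular product works: commercial detectors compare submissions against vast corpora and use far more sophisticated matching, and flagged pairs are always left to human judgement.

```python
# A toy illustration of the kind of text-overlap measure plagiarism detectors
# build on; real tools use large corpora and far more advanced techniques.

def ngrams(text: str, n: int = 5) -> set:
    """Return the set of word n-grams in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str, n: int = 5) -> float:
    """Jaccard similarity of word n-grams: 0.0 = no shared phrasing, 1.0 = identical."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    return len(ga & gb) / len(ga | gb) if ga and gb else 0.0

submission_a = "digital literacy is a necessary part of being a student within modern education"
submission_b = "digital literacy is a necessary part of being a student in the digital age"

# Pairs above a chosen threshold would be flagged for human review, not judged automatically.
if overlap(submission_a, submission_b) > 0.3:
    print("High textual overlap - flag for review")
```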

2: The Possibility of Improving Learning Technology

While 2019 is the year in which 1982’s dystopian tech-noir Blade Runner is set, real-world technology has not yet caught up to that point of advancement (fortunately, I might add, as murderous bioengineered androids seem an altogether awful idea). But in some areas we are getting there: last year saw leaps towards AI-powered personal assistants with Siri, Alexa and Google Assistant, lab-grown meat went from theoretically feasible to actual business plans, and researchers at the University of Cambridge created artificial embryos from stem cells.

Advanced technology has also made its entry into the halls of higher education. Augmented analytics, AI, blockchain technology, AR/VR and more are just some of the innovations expected to make an impact on education in the very near future. But just as in other areas, much of the technology intended for education is still being researched, and many of the existing platforms and solutions are in continual development. This means that academics and educational institutions that wish to inspire, innovate and have an impact on this technological development are in many instances able to help shape educational technology in cooperation with its developers.

But this, too, requires that the educational practitioners willing to participate and leave their mark on digital education are digitally literate. The pedagogical and practical contributions of educational practitioners can be invaluable to edtech companies, but such insights can only come from practitioners who are able to see the learning potential in technology and understand how to actually use it in their courses.

An example is our OMAP project (Online Massive Assessment Platform), which we have worked on with the Centre for Teaching and Learning at Aarhus University and the Korean company WeDu Communications. The project has aimed to create high-end authentication and learning analytics functionality for WISEflow, helping educational institutions create safer exams and gain more actionable insight from student exam activity. The results of this project will, among other things, include:

  • Facial Recognition: In the future, facial recognition technology will be introduced in WISEflow, making authentication fast, easy and more secure. This makes it even easier to conduct secure on-site digital exams for a massive number of students.

  • Research-based Learning Analytics: Student exam activity generates a large amount of data, and we have been working hard on how to utilise this information to the greatest benefit of students and educational institutions. One of the approaches we have undertaken is to look into learning strategies and their relevance for exams. Based on data about students’ study habits and preparation before exams, the exam formats used and the results achieved, we are currently investigating the relevance of such data for both individuals and institutions, as well as the kind of feedback it can inform. This is done together with the Centre for Teaching and Learning, Aarhus University; a simplified sketch of this kind of analysis follows below.
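As a concrete but greatly simplified illustration of this kind of analysis, the sketch below correlates invented study-preparation data with exam scores in Python. Both the data and the interpretation are assumptions of ours; it is not the project’s actual methodology.

```python
# A hypothetical, minimal sketch of one learning-analytics question: do reported
# preparation hours relate to exam results? The data is invented for illustration.
from statistics import correlation  # available from Python 3.10

# Anonymised, made-up records: (hours of exam preparation, exam score 0-100)
records = [(5, 52), (12, 64), (20, 71), (8, 58), (25, 83), (15, 69)]

hours = [h for h, _ in records]
scores = [s for _, s in records]

r = correlation(hours, scores)  # Pearson correlation coefficient
print(f"Correlation between preparation time and exam score: {r:.2f}")

# A strong positive value might inform feedback on study strategies;
# a weak one would suggest other factors dominate.
```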

3: Better Employment Outcomes for Graduates

When graduates go into industry, they have a vast number of areas they can work within, and while a university graduate might not be hired for programming jobs or other technically complicated tasks, IT is still bound to play a large part in their professional lives. According to the European Commission report “ICT for Work: Digital Skills in the Workplace”, 93% of European workplaces use computers, 98% of workplaces require basic digital skills from their management, and 90% also require these skills from professionals within science, engineering, health, teaching, business and administration, information and communications technology, legal, social and cultural fields, which encompasses the majority of the fields that graduates are heading towards.

For graduate jobs in both the private and public sector, digital tools are used in some form or other for almost any task imaginable:

  • Company materials are typically stored and shared through document sharing services like Dropbox, OneDrive or Google Drive.

  • Internal communication is often managed through chat functions in online collaboration tools, while external communication is managed through email clients, social media etc.

  • Production tools, such as the Office suite, are used on the computer for tasks like making presentations, building spreadsheets and word processing.

The importance of digital literacy in relation to graduate employment also has a social dimension. We record a massive amount of information about our personal lives on social media, which for many of us is openly accessible to the public. Learning to manage your online persona and being aware of how your online behaviour reflects on you is especially important, as recruiters are known to use social media to vet potential applicants without informing them. And while the practice is restricted in much of the world, for example in several US states where legislation specific to social media checking has been passed, and in Europe through the privacy compliance framework GDPR, it still takes place: according to a study by CareerBuilder, 70% of employers used social media during the hiring process, and 48% used it to check up on current employees.

Bridging the Gap: Authentic Assessment for Digital Students

Several educational institutions already have digital literacy policies, projects or courses in place, and multiple frameworks for understanding, promoting and teaching digital literacy exist, making it easier for institutions to approach the subject.

This benefits the many students who will perform the majority of their work in a digital environment. Whether they end up working as engineers, computer scientists or lawyers, computers will be the tool of their trade. But before these digitally literate students enter a digital job market, many of them have to overcome an obstacle on the way to employment.

During their courses, students have plenty of opportunity to flex their academic muscles within a learning environment that supports their specific skillset. Computer science students can design, code and test different types of software to develop solutions to problems, and mechanical engineering students can use CAD software to create and validate designs. But when the time comes to put their prowess to the test, many educational institutions are unable to replicate these learning scenarios adequately within their exam and assessment process, as that process is still based on pen and paper.

To counteract this contrast between digital learning and analogue exams, a digital assessment platform can provide a more authentic exam and assessment scenario that allows students to perform tasks mirroring the real-world application of their theoretical knowledge and skillset. For the above examples, this would mean giving students an exam platform where they can use the same tools and methods to solve tasks that they would use during their course or when working in industry: a computer, CAD software, a code editor, advanced calculators and other subject-specific software.

[Video: Dr Simon Kent, Director of Learning and Teaching at Brunel University London, on authentic exams]

Authentic assessment allows educational institutions to verify not only that their students have the knowledge necessary to solve critical real-world issues, but also that they are able to put that knowledge to practical use.

References

Eshet, Yoram. “Digital literacy: A conceptual framework for survival skills in the digital era.” Journal of Educational Multimedia and Hypermedia 13.1 (2004): 93-106.

Kirschner, Paul A., and Pedro De Bruyckere. “The myths of the digital native and the multitasker.” Teaching and Teacher Education 67 (2017): 135-142.
