There is a whole range of underpinnings to the ICT4D/E movement (Information and Communication Technologies for Development/Education) – if one wants to use the term ‘movement’ here. One underpinning that is frequently stressed, particularly by NGOs and development agencies, is the idea of empowerment – or, as Jens Karberg from the Swedish International Development Cooperation Agency (SIDA) put it at our conference on ICT for Development (ICT4D) in China last autumn: “Information and communication is power.”
This claim raises a panoply of issues, which scholars and activists alike have debated, such as:
- Do ICT give more power to some than to others? What are the implications of the fact that some individuals or groups are more digitally literate than others, or simply more likely to exert digital power?
- Is it ethically appropriate to advance this sort of empowerment when we know that it might increase digital inequalities – for instance, by teaching some groups to promote their causes and thus catch the World Wide Web’s attention, at the expense of those who cannot make themselves heard but are equally or even more marginalized?
- What is the relationship between digital empowerment and the real world? Can changes in the digital world transform physical, solid structures of power and inequality? Or are we rather witnessing a ‘feel-good nothing-happens’ development, where many people can speak out in all kinds of digital ways, but hardly anyone listens to them (let alone changes their situation)?
Another underpinning connected to ICT4D/E is a very old idea – indeed, the twin brother of education and development: modernization. It is – often explicitly, sometimes implicitly – assumed that ICT4D/E will lead a society – or certain groups therein – into bright and modern times.
Here, other problems come to the fore, such as:
- The alleged ‘abuse’ or unintended use of ICT, when ICT are not used for straightforward educational purposes (or purposes defined as ‘educational’) but for entertainment, such as social networking, games, and gambling. Often, this emerges from a divide between official policy goals and on-the-ground conceptions of what technologies are ‘good’ for.
- The ‘non-use’ of ICT in developing countries, a frequently debated topic at this year’s Annual Conference of the Comparative & International Education Society: computers are crippled by dust and dirt or by computer viruses; knocked out by a lack of electricity; or kept under covers around the clock – tokens of the modernized classroom, never to be used by teachers and students.
These problems touch upon the question of how much technology teaching and communication actually need. Very often, especially when ICT4E is cast into policy programs and goals, technology becomes an end in itself: specific designs and developments of ICT infrastructure, hours of ICT training, and so on – in short, things whose success and failure can be easily measured. On paper, these goals may look very good, but they lack a sustainable scheme for successfully and permanently integrating the use of ICT into teaching and learning activities. One example of this sort of ICT4E/D activity in rural China is the set of projects within UNESCO’s Beijing-based International Research and Training Centre for Rural Education (INRULED).
Rather than making teaching and learning easier and better, these new technologies often become a burden in themselves – they present new goals to reach for teachers and local communities, while these agents are still struggling with more basic challenges, such as keeping children at school, finding qualified teachers etc. Who says that more traditional media couldn’t serve these more overarching goals of schooling and learning just as well, and in some cases even more thoroughly?
The sometimes blind belief in the effectiveness of new technologies may be grounded in the tacit assumption that modern ICT, just by virtue of their modern nature, are able to transform their users and user environments into modern entities as well.
This view of ICT as some sort of magic trick is shared by policy makers in many so-called developed regions/nations as well – though there the aim is less to modernize the populace than to make it more creative.
Creativity is the buzzword of the 21st-century knowledge economy, considered an essential key skill for dealing with contemporary challenges. China’s Ten Year Development Plan for the Informatization of Education (2011-2020) mentions ‘creative’/’creativity’ numerous times, yet without specifying how ICT are to generate this creative boom.
Similarly, and arguably as a model for other Asian nations, South Korea has embarked on the enormous project of SMART education, where SMART stands for Self-directed, Motivated, Adaptive, Resource-enriched, and Technology-embedded. While South Korea regularly excels in international student assessment studies like PISA, critics point to the persisting problem that – just as in China and other exam-oriented societies – students rely heavily on rote learning and the teacher’s authority, while lacking crucial skills such as innovative and critical thinking. SMART education is meant to change this picture.
(To get a glimpse of SMART education’s Brave New World, take a look at this report. Various companies make a huge profit from this new hype about smart education, claiming that this new type of education makes learning more efficient, stressing factors such as new interfaces of administration and learning, irrelevance of place, customized learning, new possibilities of benchmark comparison, integration of various providers of skills and knowledge etc.)
While the designers of these ‘smart’ worlds of learning may be given some credit for their creativity, it is unclear why a student using a tablet or smartphone should be regarded as more creative than one using paper and pen. Digital users they are, yes – but more innovative or creative learners?
Creativity comes in when students can actually make a computer do something, instead of just using pre-programmed surfaces – creativity comes with the ability to code. It is like writing a story, creating a piece of art, building your own little machine – as opposed to just consuming ready-made objects.
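To make the contrast concrete, here is a minimal sketch (the function name and example are my own, purely illustrative) of the kind of first program a beginning student might write – not consuming a ready-made app, but building a tiny ‘machine’ of their own:

```python
def make_banner(name):
    """Frame a greeting in a box of asterisks - a first 'own little machine'."""
    message = f"Hello, {name}!"
    border = "*" * (len(message) + 4)  # box is 4 characters wider than the text
    return "\n".join([border, f"* {message} *", border])

print(make_banner("Mei"))
```

A few lines like these already involve the creative moves the paragraph describes: deciding what the machine should do, composing its parts, and watching one’s own design come to life on screen.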
Coding is also beginning to become part of the school curriculum in many developed nations, which brings us full circle: while coding is increasingly considered a much-needed skill in the 21st century, it is not envisioned as part of ICT4E/D projects in developing nations. There, learners are expected first to master the basics of digital literacy before moving on to more complex operations like coding.
But the term ‘basics’ is confusing here: isn’t the ability to code a profoundly basic – if not the basic – operation of digital literacy? If we keep coding out of the curriculum – isn’t this like teaching students how to read, but not to write? Isn’t this outlook on ‘basic’ vs. ‘advanced’ users creating new kinds of digital divides?
Thus – coming back to this blog’s title – “to the basics” can mean two things: first, non-digital forms of teaching and communication may serve certain purposes (such as learning and development goals) just as well as digital forms – if not even better. Second, if we are to take ‘digital literacy’ seriously, coding should be an integral part of that concept, lest we produce generations of users who blindly trust those operating systems that increasingly determine and steer their living environments – irrespective of whether users live in the so-called developed or developing world.