What's wrong with traditional IT thinking in education?

New job, new challenges, and I'm finding myself drawn into more and more educational tech procurement. Not a task I exactly relish, but needs must. It has, however, sparked some thoughts about this process which have probably been bubbling away for many years now. My research has allowed me to colour these with some underlying theoretical concepts, which hopefully put some meat on my perspectives.

This post contrasts the outlook of a traditional IT department with my take on what's needed from an educational technology perspective. It suggests three reasons why traditional IT thinking is failing academics and students in terms of supporting their teaching and learning, which I've listed under:

  • One size does not fit all
  • There are no 'types' of technology
  • Digital tech provides a place, not just a tool

One size does not fit all

The 'solution' mentality. It seems deeply embedded within IT services that for any educational need there will be a 'solution'. Multiple companies will be offering this solution, and all that is necessary is to pick one of them and the job will be done. Now that's not an easy thing to do, obviously - all sorts of comparisons, tests and checks will be necessary - but the basic premise is very simple, very engineering. Work out what's offered, make a big checklist of what you want, and then compare the two. Make sure the numbers add up, and hey presto, one solution. All very simple.

Now if there's one thing everyone can agree on about education, it's that it's not simple - in fact it's downright messy. And the reason it's messy is because it's personal - personal for the teacher, but more critically personal for the learner. The leading constructivist theories of education hold that learning is individually constructed by the learner, in a very personal and unique fashion: new experiences are assimilated into an individual's existing thinking.

When trying to theorise technology, two competing positions stand out for me - technological determinism and interpretive flexibility. The two can be thought of as a continuum, with technological determinism (TD) at one end and interpretive flexibility (IF) at the other.
  • Technological determinism basically states that it's the technology that shapes the user, and you can see this in action every day. Take our cars, for example: they shape our relationship to the world around us by insulating us against it, forming a barrier in time and space from the places we move through within them. This has a direct impact on how society works and interacts around roads and road spaces.
  • Interpretive flexibility, on the other hand, suggests that we tend to use technology in unexpected ways that were not intended by the designer. SMS is a great example: the texting service was never designed to be a form of digital chatter, and it surprised its developers and the telephone companies with its dramatic take-up across the world.
Now back to the way people learn, that they construct individual meaning based on their experiences in the world. No prizes for guessing which end of the TD/IF spectrum is best for learners when it comes to technology: the interpretive flexibility implicit in some technologies is ideally placed to allow them to find the specific value in a technology that matches their learning needs.

Now back to the one solution. The trouble with this approach is that it lies right at the other end of the continuum: it chooses one technology that is supposed to fit all, and technological determinism prevails. Instead of individuals being able to match the tech with their own needs, they're all pressed into the same position - they all have to use the same tech for their very different learning. Very little chance for individual construction = bad for learning. We need to move away from the one size fits all mentality.

There are no 'types' of technology

I think I've lost track of how many times I've created lists of technologies under headings. Those headings might be 'information gateways', perhaps 'media tools', or then again 'video creation'. They can be all shapes and sizes; they can be based on adjectives, nouns or verbs; they can be educational, technological, perhaps even philosophical; but they all have one thing in common. They group technologies into specific sets. They attempt to 'type' technologies.

I'm as guilty as the next person of this, or at least I was until recently. I have numerous lists of techs and how they fit under different headings. But a recent project changed all that: it gave me the time and space to reflect on this approach and to conclude it simply doesn't work. It's down to that interpretive flexibility again - even your everyday mundane technology, this Blogger site for example, can be used in a multitude of different ways according to what people want to achieve with it. For instance, on a recent project I managed we used Blogger as a video wall, not a traditional blog at all.

Perhaps we have a natural tendency to sort the things we experience into types; it helps us to have a basic understanding of the value of something new without actually needing to know an awful lot about it. If you tell me something is a knife, for example, then I know I can cut things with it. Perhaps there was a time when digital technologies were also so simple that they could easily be listed under headings, but things have moved on, and tech has broadened and deepened so much that such a simple taxonomic approach no longer works. Which kind of leads me on to my third point.

Digital tech provides a place, not just a tool

Almost every technological roll-out there ever is comes without enough support for those who are supposed to use the technology. People are just supposed to read the odd download, perhaps attend a one-off workshop, or go on a brief training programme. The UK roll-out of interactive whiteboards is perhaps the best local large-scale demonstration of this in practice, and that's been widely discussed, but it's something that's been knocking about for years. Initial guidance is often woeful, and ongoing support non-existent. But why does this keep happening? Or perhaps more pertinently, why should we need this support at all? I argue it's all down to the nature of the technology - or rather, to a misconception of what technology actually is.

Technology is an artfully deceiving word - in fact it's an artful word full stop. Its origin is in art and craft, but like many words it has come to mean so much more than its original definition. When things as disparate as the latest smartphone from Apple and a simple book can arguably be described using the same word, it's just possible that word isn't actually being very helpful any more. Language exists as a framework within which we can have a shared experience of the world, and find and share meaning; when words become as generalist as 'technology' has now become, they don't help that exchange of meaning very much.

My preferred argument is that digital technologies actually create new digital spaces that we can perceive, move within and - provided it's been allowed by the designer of the space - change. The iPhone, to take a modern example, is not so much a piece of technology as it is a portal into a digital space - in actual fact a series of interconnected digital places, some of which are constrained to the physical space of the phone itself, but some of which extend much further into other digital spaces well beyond the phone's physical dimensions. The word 'portal' is key here: when you peer into the screen you are looking through a window into another world, a digital space full of opportunity.

Why is this relevant? It's quite simple: spaces and places take time to get used to; they need exploring if you want to find what might be useful within them. And they change too - people come and go, furniture gets moved around, occasionally things get spruced up and repainted. A place, as distinct from an object, is a complex and changing entity that consequently needs more guidance. It's fine to issue a leaflet with a new shaver, perhaps, or a coffee machine - these technological objects are static and unchanging once purchased. Places, on the other hand, generally come with entire guidebooks, media-laden and structured around the kinds of things you might want to achieve in those places - walks, food, entertainment, etc. They tend to be issued annually, to reflect the changes in those places. They often have real people as guides, giving real-time tours of the ins and outs of those places and allowing individuals to discover personal meanings.

Thinking of digital technologies as places helps to prompt new thinking about how people use those places, and how they might discover meaning within them.