The hidden danger of systems, or why we're missing the good stuff

I've been meaning for a while now to post about something that's been bugging me: what I see as the hidden danger of systems, or if you like "systematisation". Now generally speaking I'm a big fan of off-loading cognition into virtual spaces. Not only does my job demand that I create the capacity for this day-to-day, but I'm also a big consumer of third-party offerings such as Google Alerts, Twine, etc. That said, there seems to be an increasingly common and dangerous tendency to treat a system as the finished product rather than as a tool for achieving one. The "if I use this tool I no longer have to think about it" mentality.

To explain what I'm on about I first need to nip back in time to when I was a Psychology student and first discovered the normal distribution curve. For those who don't know it, this is a deceptively simple graph that explains, well, pretty much everything about life to my mind. It goes something like this:

[Image: the bell-shaped curve of the normal distribution]
What you're looking at can explain many things in our day-to-day lives. Take individual wealth in the UK for example, with the horizontal axis indicating wealth and the vertical the number of people at each level. A very few people have lots of money. A similarly small number have practically nothing. As you move in from these extremes you find more and more people, with the great bulk clustered around the middle. No great breakthrough you might think, more common sense, but that's where the deceptively simple becomes powerful.

Now look at the graph again but think of cow fertility across the planet (weird I know, but bear with me). Take the horizontal axis as fertility and the vertical as the number of cows across the world. Again you find some cows very fertile, some very infertile, but the majority sit in the middle.

I'm not trying to make a point about cows here; I'm trying to make a point about this graph, and that point is that this type of distribution appears throughout nature, and in artificial systems as well. Technically speaking it may become skewed here and there according to local variables, but that's a lesson for the statistics class, not for this blog.
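To make the shape concrete, here's a minimal sketch in Python (the trait, mean and standard deviation are arbitrary, invented values, not real data): sample a normally distributed quantity a few thousand times, count how many samples fall into each band, and the crude text histogram that falls out is the bell, thick in the middle and nearly empty at both ends.

```python
import random

# Arbitrary, illustrative parameters: a "trait score" with mean 100 and
# standard deviation 15 (nothing here is real data).
random.seed(42)
samples = [random.gauss(100, 15) for _ in range(10_000)]

# Count the samples falling into bands of width 10.
bands = {}
for s in samples:
    band = int(s // 10) * 10
    bands[band] = bands.get(band, 0) + 1

# Crude text histogram: one '#' per 50 samples. The middle bands dominate;
# the extreme bands at either end are nearly empty -- that's the bell.
for band in sorted(bands):
    count = bands[band]
    print(f"{band:4d}-{band + 9:<4d} {'#' * (count // 50)} ({count})")
```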

So what's all this got to do with systematisation? It's the ends ...

What systems do (and now I'm thinking about computer systems) is encode certain chunks of our lives into code, and that code is represented in virtual spaces which we can wander through to achieve various tasks. I'm talking here about anything from checking in for a flight to taking an online archaeology course. But what's easy to miss is that the encoding is a very limited affair, hugely dependent on the authority, capacity and empathy of the encoder. And even in the very best encoding possible, some things will be left out. And do you know what's left out? The ends of the graph.

We lose the terrible, i.e. the far left of the curve, but we also lose the genius, i.e. the far right. Systematisation then, and by that I mean the application of generic systems of action to replicate a process, by its very nature removes the capacity for critical failure, but it also removes the capacity for critical success. In effect it could even be seen as a way of dulling life, of taking away that spark which can sometimes be just what you need to make life worth living.
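To see the clipping in miniature, here's a toy sketch (everything in it is invented for illustration, it isn't any real system): a process whose encoded rules only recognise outcomes inside a fixed band. Feed the whole curve through it and both tails come out flattened.

```python
# A made-up example: the encoder decided that only outcomes between a
# floor and a ceiling are acceptable. The floor stops critical failure
# ever reaching the output -- but the ceiling clips critical success
# just as surely.
def systematise(raw_outcome: float, floor: float = 40.0, ceiling: float = 90.0) -> float:
    """Clamp a raw outcome into the band the system was encoded to allow."""
    return max(floor, min(ceiling, raw_outcome))

outcomes = [3.0, 55.0, 71.0, 99.0]   # terrible, ordinary, ordinary, genius
print([systematise(o) for o in outcomes])
# -> [40.0, 55.0, 71.0, 90.0]: the middle passes through untouched, while
#    both ends of the curve are quietly removed.
```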

Yes, there are times and places where you need to remove the possibility of critical failure, and so have to accept that you also remove the possibility of critical success. But I think far too often systems are applied without this effect even being acknowledged in the first place - and the consequence is that we miss the good stuff.

I don't know about you, but personally I like the good stuff ... I'm happy to take the rough with the smooth; you only get one chance at this life lark, after all.
