Business

Ghosts In The Machine

Issue 83

"Automate" is a key cry issued by those scaling their businesses. But let's not lose sight of the fact that the absence of the "ghost in the machine" may come to haunt your organisation, as Dr David Cliff reflects:

As we live in a technological world, many decisions that intimately affect our lives are down to automated processes. Daily, they decide our consumer preferences, traffic flow and call handling. Increasingly, algorithms reach into the most intimate parts of our lives, including our health. A Coroner’s recent judgement on a 90-year-old left waiting so long for an ambulance that she died found the delay was down to the over-application of algorithmic technology, devoid of human judgement at the time.

Automation can create great ways of getting things done quickly and without the emotional ‘contamination’ of human operators. These systems are, however, without feelings: impartial, and simply searching for sameness and difference against set criteria. Despite the growth of AI, they lack the “ghost in the machine” that is involved in human judgement. Bureaucracy loves algorithms. The sociologist Max Weber ultimately regarded bureaucracy as something that would result in fairness of distribution to all, and so algorithms could be argued to move towards this. However, ‘fairness’ is a subjective value not satisfied by simple ‘sameness’. Consequently, automated systems are far from perfect and easily duped. Consider how the family of the 90-year-old mentioned above may feel, especially when some of the priority calls at the relevant time were misusing NHS services, and when human need was also being balanced against organisational risk in establishing who to serve first.

Equally, our consumer habits are regularly reduced to automated processes. Notice how hard it is to contact a human being when you’re dealing with any form of customer service, whether it’s banks, online shopping or just about anything else! Somehow our transactional needs are handled by a process system that lets us complete transactions. However, when complications occur, speaking to a real human being is often the only way to restore faith. The problem is that when you do speak to the many human beings in these systems, the very automation they use operationally increasingly prevents them from offering the one thing that is needed to complete the customer experience – discretion.

Discretion may be an old, idiosyncratic thing, but it produces a mutuality between humans that affirms the recipient as an individual. It can nuance customer experiences so that people feel they are seen as unique in their own right. This matters to people in their buying experience, but when the transactions involved go beyond simple consumerism into health care and the like, it creates confidence and hope. Discretion injects something valuable into a system – ‘necessary diversity’ – a systems concept that is hardly ever spoken of these days. It prevents the creation of ‘wooden’ responses – those where the transaction is a technical success, but the person is left feeling dehumanised and usually must just get on with it, because the system is too complex to change and/or the opportunity cost of complaining or going to a rival is just too great. Look at banks, for example: they have to pay you to switch and do the legwork for you to encourage you to change to them.

When something goes wrong on the human front, people “learn the lesson”, but humans running organisations with automated processes in key customer-facing areas are not necessarily learning; they are simply tweaking code within the same system. That does not produce the real seeds of change.

Most automated systems are fine when processing very simple transactions – buying online and the like – but when trying to deal with the uniqueness of human need, we must remember they only partially satisfy the customer.

We lose sight of people as individuals at our peril. They can tolerate impersonal, “distributive” types of service for a long time – some forever – but for most, a time of reassertion occurs in which their individuality must be both accurately listened to and responded to. Otherwise, you can get cataclysmic switching that leaves an organisation haemorrhaging customers, whether that’s Sainsbury’s customers switching to ALDI, NHS patients switching to private healthcare, or swing voters in politics where AI-based polling and forecasting systems fail to understand the “Volksgeist” of the people! Systems, like ideologies, are often great servants and poor masters…
