
Technology is a great servant but a terrible master, says futurist Gerd Leonhard, who believes that its rollout entails ethical choices.

Technology, including sensors, algorithms, cobots, cloud, automation and AI, is becoming increasingly widespread and spectacularly accelerating the transformation of the way we live, produce and consume. And the process is just getting under way. “Humanity will change more in the next 20 years than in the previous 300 years,” says Gerd Leonhard.

In his books and lectures, the German futurist stresses the idea that technology is as much a threat as an opportunity for humanity, which it can equally serve or enslave. The main point he makes in his book Technology vs. Humanity is that we need to make choices, and we need to make them fast. Unafraid of hyperbole, he warns that “Now is our last chance to question the nature of these coming challenges.”

As he sees it, the issue is not so much the use of technology itself as the ever-deeper integration of technology in human life. From artificial intelligence to human genome editing, “striking a balance will be key,” says Gerd Leonhard. The futurist provides a starter set of humanist-inspired ideas to help us reap the benefits of technology without forfeiting what makes us human – first and foremost our free will.

“Androrithms” versus algorithms

In his book, Gerd Leonhard lists the transformations now taking place. “Megashifts” – facets of the transformation currently under way – include digitisation of everything that can be digitised, mobility, disintermediation, automation, virtualisation, and robotisation. Each of these aspects is examined from the dual perspective of its risks and its expected benefits.

“What cannot be digitised and/or automated could become extremely valuable.”

Gerd Leonhard believes that “What cannot be digitised and/or automated could become extremely valuable.” He means essential human qualities such as emotions, compassion, ethics, happiness and creativity – “the things that make us uniquely human,” he writes – contrasting these “androrithms” with algorithms.

Connectivity, the ability to exchange data anytime and anywhere, is the “new oxygen”, but it is accompanied by two further phenomena, mediatisation and disintermediation, which force us to use platforms to access services previously provided by human beings such as doctors, teachers and bankers.

Virtualisation and anticipation

Other megashifts that are both promising and threatening include what he calls “intelligisation” – connected objects becoming “intelligent” – and virtualisation, in which physical objects such as books and even communication networks are replaced with their electronic counterparts.

Virtualisation, says Gerd Leonhard, will be “a driving force in the conflict between technology and humanity,” causing loss of jobs, making it likely that “software will soon eat biology,” and driving the increasing temptation to virtualise humans via brain-uploading or “cyborgism”.

Anticipation is another aspect of digital transformation, and perhaps one of the best illustrations of the tension between man and machine. Fed with its “master’s” data, an AI assistant will be able to respond to an event by rescheduling a meeting or calling a taxi, which is an undeniable convenience. But “prediction” by an algorithm of a future crime in the city is eerily reminiscent of the “precogs” in the film “Minority Report”, with individuals singled out and receiving a visit from a social worker or a police officer before they have done anything.

Ethics to the rescue

Gerd Leonhard says that to address the nature of the coming challenges, we must examine technology from the vantage point of ethics. Since “technology has no ethics,” its imminent entry into our most private lives and biological processes must be negotiated as a top civic and corporate priority.

The issues to be addressed are dependency, confusion, loss of control and abdication, says the futurist. Dependency lurks when we are tempted to let software do our thinking. Confusion arises when we cannot tell whether a decision was made by ourselves or by the AI assistant. And having no way of knowing whether the AI’s anticipation was correct amounts to a loss of control.

Abdication is the temptation to hand over more tasks to systems than necessary, whether coordinating personal schedules or answering simple emails.

Society must use its ethical compass, says Gerd Leonhard, and the decisions it takes must be reflected in regulation, notably of the new “data oil companies”.

12/12/2019