This space takes inspiration from Gary Snyder's advice:
Stay together/Learn the flowers/Go light

Thursday 27 May 2021

Where technology does NOT empower humanity

Photo source: Netflix's Black Mirror
The battle for recognition of human dignity continues into the 21st century. It is a battle because some in the political and corporate elites, and especially in academia, would have the masses keep their heads down and their mouths shut while those controlling our social structures smooth the path for new technological forms to take over.

In a discussion posted on YouTube in early May between political philosopher Michael Sandel of Harvard Law School and historian Yuval Noah Harari of the Hebrew University of Jerusalem, the question arose of how likely the "project" of improving society through a greater appreciation of values relating to human dignity is to succeed, compared with the transformation of social structures through technological development in which free will yields to mechanism, as Michel Houellebecq envisages it.

Sandel:

Some years ago I was teaching a course on ethics and biotechnology … and we were discussing various aspects of the project of re-engineering human nature…. One day we invited James Watson… who had won the Nobel Prize for describing the structure of DNA, and he was talking about cognitive enhancement through genetic alteration, and he was very much in favor of it…

I asked him: “Do you consider having a low IQ to be a disease in need of a cure?” He said, “Yes, of course, because people with low IQs live very difficult lives. They have trouble making a living and so on.”

A student raised her hand and asked: “Well, given that's the case, why don't we try to reform the economy and societies so that people with low IQs don't live such hard lives?”

Watson's reply was: “We're never going to be able to change society. That's way too hard. That's why we need to use genetic engineering to solve this problem.”

Sandel continues: 

And I found that a revealing but chilling answer, not only because of its eugenic sensibility, but also because it seemed to concede so readily the project of moral and political improvement, as if to say that human agency is impotent in the face of that project. Therefore, better to repair ourselves to fit the world, the social roles, that are beyond human repair or reform. That's the worry that I think represents the fundamental concession to the moral and political disempowerment of humanity.

Harari takes up the issue:

…There are many successful attempts to better human society, not through the invention of some new tools, some new technology, but by changing the values, the stories, the structure of society itself. 

One of the biggest achievements of humankind has been the drastic reduction of violence over the last few generations and even though it owes something to a technical invention – the nuclear bomb – to a large extent it was done by changing human values in society. … [Political parties] also had enormous success, and when it comes to racial inequality, to gender inequality, they really managed to improve things and not by inventing a new technology. 

So this fixation that the answer to any problem we have is just to invent a new technology…. That's extremely dangerous, first of all, because it gives up so many other things that we can do and, secondly, because it ignores the main problem that, okay, you invent the technology but then the decision what to do with it is not in the hands of the tool, it's in the hands of the same society. So if you have done nothing to change the society and its values, you just invented a new tool. Then if the society has evil values, it's now just more powerful to do its evil things. You have done nothing; you just made things worse.

Harari points out that artificial intelligence machines may try to force humans to comply with their wishes, to which Sandel comments:

I suppose that we could conclude by agreeing that even the smartest smart machines can't tell us how they should be used. That is ultimately for us as democratic citizens, which suggests a project not of manipulation but of education and of persuasion. 

Harari:

Yeah … but I would just say that maybe part of what is fueling the political crisis we are seeing around us is this deep sense that time is running out, that if humans don't exercise their agency in the near future they will lose that agency.

I mean all previous technological inventions in human history … empowered humanity, but the current wave of technological inventions for the first time … endangers human agency [as] we see a shift in power from humans to algorithms.

Therefore, it is imperative that people everywhere strive to maintain an expansive sense of the dignity of the human person, building on the successes in banning slavery and in the declaration of human rights, while looking ahead to the banning of the death penalty, the banning of euthanasia and abortion based on the ultimate value of human life, and the restructuring of economic principles to end the scandalous economic inequality worldwide.

One final thought: It's surprising that Harari continues to use the deep term "evil" in speaking of the values a society may hold. Given his evolutionary humanism, in which ethics is simply a set of rules people think will make life easier, you would expect him to use "negative" or some such neutered term.

Go to my Peace and Mind newsletter on Substack and subscribe (it's free!) to get notice of each new post.
