This space takes inspiration from Gary Snyder's advice:
Stay together/Learn the flowers/Go light

Saturday 20 July 2013

Christian view on evolution

Mainstream Christians accept evolution as a fact. However, there are limits to the conclusions that can be drawn from the findings of scientific endeavour. Some clarity as to what Christians - for the most part - accept about human beings' evolutionary past is found in a critique by the US Catholic Bishops Conference of a book by Elizabeth A. Johnson, Distinguished Professor of Theology at Fordham University, a Jesuit institution in New York City. The book, Quest for the Living God: Mapping Frontiers in the Theology of God, was published in 2007, and the bishops' doctrine committee offered two formal critiques, with its 2011 statement providing the insight into Catholic teaching on evolution that follows.

The bishops state:
Human beings necessarily [are] part of the material universe. Bodily existence is an intrinsic part of human nature. Consequently, scientific investigation has a great deal to teach us about the human person and human society. At the same time, there is something about the human person that transcends material realities and that escapes the grasp of scientific investigation. There must be another, a non-material explanation for the existence of this aspect of the human person. There is a range of philosophical attempts to provide an explanation. The Catholic Church teaches that the human soul is not the result of material forces, such as the bodies of the parents, but is created immediately by God.
The main issue is about "what can be explained in scientific terms and what cannot be explained in scientific terms". The bishops' stance is that "science by its very nature has no other way of looking at the evolution of human beings than as the result of the interplay of material forces" but they say the author goes too far when she asserts that "Matter evolves to life and then to consciousness and then to self-consciousness, and this can be accounted for without positing divine intervention, scientifically speaking" [author's emphasis].

The counter-argument from the bishops is this:
Science could account for life, consciousness, and self-consciousness, however, only if these were wholly the result of the interplay of material forces. While an adherent of a materialist philosophy would readily agree that material factors account for all reality, this accords neither with Catholic teaching, nor with sound philosophical argumentation.
Although a scientific explanation of life in purely material terms already presents considerable difficulties that could be discussed, the crucial issue is that of self-consciousness. Simply put, human self-consciousness cannot be wholly explained as the result of material causes. The multiple neurons of the physical brain cannot account for the unitary self-consciousness of the human person. The functioning of the brain cannot of itself explain human acts of knowing and willing. This has been amply demonstrated by various philosophical arguments. There is, therefore, one stage in evolution that cannot be fully accounted for by scientific explanation, that of the appearance of self-conscious intelligence and free will.
It is here that metaphysics again comes to the fore in exploring the full nature of the human being. Studies of the brain's functioning are providing information that enables doctors to repair damage or prevent conditions that would handicap an otherwise capable person, but even in an era of remarkable computerised robotics, the final hurdle remains: duplicating the human's "self-conscious intelligence and free will". Though some studies have been hyped to suggest that the higher orders of animals approach humans in such attributes, it is clear these distinctive qualities set humans apart from the purely natural world. Therefore, scientific findings need to feed into the metaphysical (philosophical and theological) considerations of the nature of the human person to lay a foundation for recognising the dignity that each person deserves.

Tuesday 4 June 2013

Suicide makes losers of us all

Legalizing assisted suicide – though often termed assisted dying – is unnecessary and disrespectful. In countries where this has become an issue, medical staff with palliative care expertise generally do not support such measures. It is a hot topic in Britain at present, and debate has been fuelled by Amour, as it was in 2004 by the sports drama Million Dollar Baby. The French film's take on the difficulties of old age went beyond the adage, “Old age is not for sissies”, taking us to the point where the character played by Emmanuelle Riva had reached the stage where death was the natural next step in her life. Yet Jean-Louis Trintignant’s character, strangely, did not seek medical help to allow his wife to die with respect. That help could have been as simple as arranging intravenous nourishment and hydration to allow a peaceful departure from this life in the old woman’s own time. That it was time to stop offering solid food was obvious. The film therefore had a false climax, one unconvincing to anyone who has paid the slightest attention to what medical services now routinely provide to those near death, whether at home or in a hospice environment.
Amour portrayed a case of what should have been “care of the dying”, not “assisted dying”. The first term upholds the dignity of the person and the rhythm of life – Riva’s old woman wanted death to come; it was time to die. Yes, we – the reasonable, ordinary, serious-minded person – can see that this is the right decision given the woman’s time of life and state of health. We also know that a person’s life does not have to be extended at all costs, or we would not honour those who give their life for a justifiable cause, such as to save another person. However, it is acknowledged that the care given to the dying sometimes has the double effect of bringing death sooner as a result of trying to remove pain. This is not splitting hairs; it is recognising that we have in our care a human being whose dignity must be safeguarded. We must be sensitive to the delicate nature of these precious moments. Such a storyline would have made an enthralling film.
Clint Eastwood’s Million Dollar Baby was more convincing because his character knew it was morally wrong to kill Swank’s “Baby”. This “assisted suicide” was an act of murder plain and simple. The loving father figure should have been guided more by his moral principles, which he allowed his emotions to override. But this scenario – a young person severely handicapped but nowhere near death – offered another option, one that would have meant a decision of a more uplifting kind for “Baby”. The decision could have been one that highlights where so much current moral thinking leads: to what is easy, to what avoids inconvenience, to what, in fact, removes the challenge – even adventure – of embarking on a journey that would certainly be painful, but might allow a magnificent outcome, an outcome that would be a personal achievement of calm acceptance and, not least, a heightened regard for the welfare of others, born of performing at the supreme level that is possible for a human.
Eastwood’s character should likewise have focused on the goals his “baby” could achieve in her life despite the constricting circumstances. A morally sound person would understand the value of suffering – in the face of those who reject the reality that suffering can be ennobling – from both their own life experience and the testimony of survivors of instances of hell on earth such as Hitler’s death camps. He would have encouraged a sense of adventure, helping the victim make courage the stairway to a greatness of spirit and to an integrity higher than most of us can attain, we who are physically whole but morally weak as a mouse. The companion in this case could share the suffering of the bedridden, but also bring the world to the bedside – the world of much horror and mayhem, as well as of the many who struggle to raise themselves from hopelessness.
Many people do not see how weak and shallow they have decided to be when they decide to kill themselves as they grow old or are overtaken by illness. This is where their upright friends and family need to step in with forceful counsel. The unhappy thing is that such people say something of this kind: “I’ve always been an active person – I simply couldn’t bear being feeble.” Why do they limit themselves in such an extreme way? Where is their sense of adventure? Why don’t they see that the courage they think they have is merely a display of common-or-garden fear – fear of pain, or fear of the unknown, where before they considered themselves in control? Even when dementia, for example, has seemingly stripped a person of dignity, that, too, is part of that person’s journey in life. Also, doctors acknowledge that we must reserve judgment on when a person has lost all awareness of self, let alone reached the point of death.
Therefore, the argument for assisted dying/suicide, as seen surrounding much of the legislation in Europe and envisaged for England and Wales, is not a step forward in the development of our awareness of human dignity. Rather, it reflects a general decline in moral strength, especially because it undermines respect for the human person. It also undermines the ability of an aware but ailing person to find support in making the effort to shape their life story in a personal way. The common good would be better served by the firm espousal of hope for one’s life ahead, and trust in the goodness of carers, than by caving in to the dismay of those around them, who (correctly) see inconvenience and financial difficulty for themselves. But that positive view requires the ability to convey a more complex reality than these two films achieved.
[For an insight into the ethical argument advanced here go to this resource]
   

Sunday 24 February 2013

eLearning: Achieving Human Goals

Posthumanists are those who put no restrictions on the “constitution and boundaries” of what a human is (Hayles 1999). Futurists in this school of thought are especially inspired by the burgeoning possibilities for the shaping of a person by “digital, genetic, cybernetic and biomedical technologies” (Graham 2002). Education of the technological kind is implicated in this. The enthusiasm to recreate the human person extends to human mental and physical powers, but does not recognize the notion of human nature or a spiritual dimension for humans – the qualities that warrant the corresponding notions of human dignity and the rights of the person. Certainly we need always to be delving into the essence of what it is to be human, but the associated questions are whether the future will become a playing field for aliens of our own design, and whether these will pose a threat to what has been achieved over eons of natural social and physical development, and divine intervention. Human progress is not a constant.

The novel became a 'science fiction' film
Already we can see in the field of genetic engineering instances where a second, lower class of human is “manufactured” (unethically, and unnecessarily, according to many scientists) because stem cells from members of that “saddled” class can enhance the health of members of the “unsaddled” group. Those are key descriptive terms for Fukuyama as he battles transhumanism, as in his 2004 paper in Foreign Policy. His prediction is portrayed in the novel Never Let Me Go by British author Kazuo Ishiguro (2005). The narrator tells of what it is to be a member of a new group in society: young people without parents who search instead for a possible genetic source, accepting that they can never have a family or children, growing up knowing their purpose in life is solely to provide body organs for the dominant group in society and, as soon as that is done, to “complete” – that is, die – even while conscious of being stripped of all remaining usable body parts. In a discussion with a protective “full” human, the narrator is told:
For a long time you were kept in the shadows and people did their best not to think about you. And if they did, they tried to convince themselves you weren’t really like us. That you were less than human, so it didn’t matter. … There [will] always be a barrier against your being seen as properly human.
That friend of the “use and discard” group goes on: “I saw a new world coming. More scientific, efficient, yes. More cures for the old sicknesses. Very good. But a harsh, cruel world.” Ishiguro sets all this in the present. In truth, there is wide fascination with the ways the human can be “remodeled”. But I expressed my fears here.
The future has also arrived with regard to the ability of those with economic or political clout to control the lives of others, or at least to act with disregard for personal privacy, a fundamental human right. Recent reports make clear that digital learning, along with social networks, could easily become part of a “Big Brother” environment.

Just as the “ethically challenged” may set the pace in the posthumanist world, it may be the “pedagogically challenged” who will drive the “enhancement” of those seeking an education. The technology used in education, ranging from pencil and book to keyboard and video, has an important role in determining the dominant teaching/learning style and hence the character of the thinking skills that result. British Education Secretary Michael Gove certainly makes his view clear:
From radio to television, computers and the internet, each new technological advance has changed our world and changed us, too.
However, which role-player ultimately has the upper hand in educational technology is the crucial question that Lowell Monke, an American university professor of education, raised in 2004 in a despairing view of “disruptive innovation” and the “technological ideology” (his words) assaulting the learning process:
We may deliver our children into a world of tremendous technical power, but it is rarely with a well-developed sense of human purpose to guide its use.
Nigel Thrift
Another with a skeptical view of new technology in education, specifically the race to use a MOOC in some form, is Nigel Thrift, vice-chancellor and president of the University of Warwick, England. Though his university has joined a MOOC grouping – identifying a human purpose in doing so – he posits four reasons for the latest technological “obsession” in the worlds of education and the media. First, “It is based on the idea that higher education is the next sector in line for the high-volume, low-margin information-technology treatment after finance, retail, and the media.” In other words, self-interested parties see a business opportunity. Second, parents have high hopes of it because of the rising costs of education; other observers see school administrators focusing on this aspect. Third, “nations are searching for ways of reducing higher-education spending, and MOOCs can look like a silver bullet, making it all so much easier to cut and still feel good about it.” Fourth, “it makes sense to look at ways of teaching more people more efficiently”.


Professor Michael Wesch  in 2011. Photo: Kansas State U
How technology should be applied to education is a matter I discuss here. As the enthusiasts breathlessly push for a “dramatic disruption” in education, the introduction of digital learning technologies and the often associated competency-based learning system could replace the old “bums on seats” factory education system with a “shiny new and socially acceptable model” in which there is “a naked grab for power over classroom teachers and … the ascendancy of corporate education providers” (see the full comment). Therefore, it is wise to take Thrift’s advice, which is for everyone to “calm down”, and to accept the need to experiment with various possibilities, implementing on a large scale only what research has shown to be fruitful. Even such a prominent proponent of teaching with technology as Michael Wesch has had to “re-boot”, as other teachers found his approach as useful as a flat tyre during a grand prix when trying to engage students in an authentic learning experience. On that matter, Wesch states that he now points out to educators the greater need to foster a sense of wonder among young people. That compensatory imperative in education is one that Monke espouses in a 2007 article, where he declares:
 The efforts to label and sort children while constantly seeking technical means to accelerate, enhance, and otherwise tinker with their intellectual, emotional, and physical development are acts of mechanistic abuse (there is really no other name for it) committed against children’s nature. There is no more critical task for schools than to counter this unfolding tragedy. Schools can make headway simply by patiently honoring and nurturing each child’s internally timed, naturally unfolding developmental growth, by abandoning anxious efforts to hurry children toward adulthood, and by giving these young souls time to heal from the wounds inflicted by a culture that shows no respect for childhood innocence.
This view of education is certainly counter-cultural with regard to technology's role in a place of learning, but it makes plain educators' responsibility for the well-being of the human person in their charge, whether that person is at K-12 level or a young adult. Only on that basis will educational technologies serve not dystopia but their essential human purpose of enabling new generations to live in tune with an increasingly fragile world.

Friday 15 February 2013

High-tech hazards - #edcmooc

In viewing the new world of big data and the power to use and abuse information technology, some experiences are giving rise to a great deal of wariness. Dan Yakir, the chief legal counsel of the Association for Civil Rights in Israel, has said that government power to keep secrets or to disregard personal rights can be a dangerous matter: "Without a doubt, the power is disproportionately on the side of the state and there is a fear that they can take advantage of this power."
The issue of abuse of power by state or corporation when it comes to collecting information about people, or controlling the flow of information available to them, has also been highlighted by the revelation that Raytheon has developed software that mines social network use to track users' behavior and contacts. The software is available not only to governments but also to businesses.
The Guardian in Britain has done a service by revealing how this technology, developed by a "defence" company, might be used for nefarious purposes. In the accompanying video, copied here, we see how little remains "private" in the age of social networks.
Though social networks also benefit social groups by breaking down barriers that prevent the openness vital for a healthy society – a factor cited in the Israeli secrecy case referred to above – the risks are great that a distracted population will be manipulated by a government or business, the scenario behind the Homeland TV series from the United States.
In a second piece, a Guardian commentator says that it's not okay to accept the invasion of privacy because the information is being sifted to catch the "bad guys":
Any [tracking software] algorithm will generate hundreds if not thousands of false positives (innocent people who hit a red flag). Given how rare, say, terrorism is, the vast majority of people bothered by these systems will be ordinary people facing previously unbelievable intrusion.

Second, these systems and techniques are just as useful to draconian governments around the world – as demonstrated in the Middle East uprisings, and time and again with China's internet monitoring and censorship.
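The force of the first point lies in base-rate arithmetic, and it is easy to check for oneself. Below is a minimal sketch in Python; the population size, prevalence and error rates are illustrative assumptions of mine, not figures from the Guardian piece:

```python
# Base-rate arithmetic behind mass-screening false positives.
# All numbers below are illustrative assumptions, not figures from the article.

population = 60_000_000      # roughly the population of a country like Britain
true_threats = 600           # assume 1 in 100,000 people is a genuine "bad guy"
sensitivity = 0.99           # the system flags 99% of genuine threats
false_positive_rate = 0.001  # and wrongly flags 0.1% of innocent people

flagged_threats = true_threats * sensitivity
flagged_innocents = (population - true_threats) * false_positive_rate
total_flagged = flagged_threats + flagged_innocents

print(f"Genuine threats flagged: {flagged_threats:,.0f}")    # ~594
print(f"Innocent people flagged: {flagged_innocents:,.0f}")  # ~59,999
print(f"Share of flags that are innocent: {flagged_innocents / total_flagged:.1%}")  # ~99.0%
```

The precise numbers matter less than the structure of the problem: when what is being screened for is as rare as terrorism, even a highly accurate algorithm produces flags that are overwhelmingly false positives – exactly the "hundreds if not thousands" of innocent people the commentator warns of.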
The commentary concludes by sounding the alarm at the dire prospects for us even now, and more definitely in the decades ahead, when the software becomes more sophisticated and therefore less respectful of human rights:
 Surveillance is getting cheaper and easier by the day, which in turn proves almost irresistible – for those with good and bad intentions – to make more use of it.

The only way to prevent such a shift is to group together, raise funds, and lobby hard for real legal safeguards, fast, before the culture shift is irreversible. Anything less is acquiescence.

Sunday 3 February 2013

IT makes waves in education - #edcmooc

A washing machine and vacuum cleaner as educational technology is a stretch, but the way anything can be a useful technology is brought out in the following short film. In it, two shoppers find their "networked" shopping bags link them in a special way, and they jump at the opportunity to turn the process to furthering their relationship.

The reference to household appliances as support for personal education arises in a 2002 speech by Unesco's then assistant director-general, who recalled a researcher's comment that Tanzanian women who wanted to further their education needed, most of all, aids that would free them from the demands of housework.

Unesco's John Daniel (1) offers guidelines for deciding which technology should be used in education: first, focus on the needs of the learner; then on the practicality of the technology; next on its cost; and fourth on the "quality of the teaching that can be delivered". He poses as the "central challenge" for education this century "how to increase access, raise quality and cut costs".

More antagonistic towards technology in higher education is Noble (2), who had a bitter experience of an attempted top-down imposition of IT against staff and student wishes. He sees the issue leading to "the commercial development and exploitation of online education". This criticism comes from a 1998 article, 'Digital Diploma Mills: The Automation of Higher Education', which is insightful and still true in many ways, but which now lacks the picture of how many universities and lower-level institutions are tentatively absorbing IT into their learning-teaching processes. Often it is left to teachers to make the running, with the institution giving technical support and encouragement. The prodding of staff to employ IT comes about because at stake are the institution's image, its effectiveness in teaching and, for sure, its cost structure.

Below is a video promoting the image of one Australian university that is working its way through this hazardous new territory according to the theme: Online learning will change universities by degrees (3).      

From such informative-while-promotional efforts at communication with staff, prospective students and their parents, it is clear that Dahlberg's paper (4), on the need to take a wide view of the causes and effects of the internet, is more accurate than accounts that are narrowly "deterministic" in interpreting the changes that lead to, or result from, the introduction of 21st-century forms of information technology (Chandler 5). In the rapidly changing education sector particularly, a fundamental interplay between people and technology is apparent, and time is needed before the jury can decide on the success of the experimentation being undertaken by the public, educational administrators and practitioners, commercial interests and the political wing of society.
  
1. Daniel J, 2002, 'Technology is the Answer: What was the Question?', speech at Institut du Monde Arabe, Paris, May 27-29. Text updated October 17, 2002.
2. Noble DF, 1998, 'Digital Diploma Mills: The Automation of Higher Education', First Monday, Vol 3, No 1, January 5.
3. Gardner M, 2012, 'Online learning will change universities by degree', The Conversation (theconversation.edu.au), viewed February 3, 2013.
4. Dahlberg L, 2004, 'Internet Research Tracings: Towards Non-Reductionist Methodology', Journal of Computer-Mediated Communication, 9(3).
5. Chandler D, 2002, 'Technological Determinism', web essay, Media and Communication Studies, University of Aberystwyth. PDF.