In the sixties, with the so-called liberation from state psychiatric institutions, there was a successful movement to shut down the farms associated with these hospitals. It was felt that the patients were being taken advantage of by having to work on those farms without being properly remunerated. But surprise, surprise: a few years later someone came up with a brand new progressive idea called work therapy, where patients suffering from disabling psychiatric illnesses were taught marketable skills with the goal of moving on to full-time employment. I have witnessed some of my own patients achieve dramatic improvements from participation in such a program, despite receiving only token pay.
WHO STARTED THIS WORK THING?
This change would seem to confirm the opinion espoused by my grandmother: “Work is good for the soul.” Yet it raises the question of whether this need to work is conditioned by social pressures or is encoded in our DNA. Presumably, this insatiable drive to produce was necessary for the survival of prehistoric man. The provision of food, shelter, and protection from predators must have required a great deal of hard work. These hunter-gatherers soon learned that the sexes complemented each other. The upper-body strength of the male made him a better choice as hunter, while the superior manual dexterity of the female suited her to the gathering and preparation of food and the making of clothing. Her need to nurse her children would also have been an impediment to hunting. Now that technological advancements do the heavy work in many jobs, the door has opened for women to participate in a far wider range of occupations.
NOT FUN THEN, EITHER
These people had larger brains than other critters and decided to use them to develop labor-saving devices, an activity which continues to the present day with no sign of slowing. Initially, they learned to shape stones into tools such as spear points, knives, and axes.
Meanwhile, the lady of the house–or cave in some instances–would learn to use stones to grind grain into something edible. After learning to control fire, they were able to make pots in which they could transport, store, and cook essentials. The ability to make a vessel in which to carry and store water would have brought about a revolutionary change in their lives, as it would no longer be necessary for them to live near a source of water.
IT NEVER ENDS
Eventually, man would learn to smelt metals, which could be used to make more efficient and durable tools. Thus, he entered the Bronze Age, followed some two thousand years later by the Iron Age, whence the pace of change toward modernity accelerated. As mentioned in my previous blog, nothing has affected the nature of work and, indeed, the whole of society, more than the invention of the computer. One could rightfully argue that such things as aviation, automobiles, modes of communication, electricity, and space exploration were agents of change; however, they pale in comparison to the power unleashed by the digital age. Computerization has become such a routine part of our daily lives that it is taken for granted as much as indoor plumbing.
JUST LIKE ROCKY
As was the case with Rocky–our Stone Age ancestor who made life easier for himself by fashioning an axe from a piece of stone–we continue in the same manner to develop tools to lessen our workload. Unfortunately, these same tools, both then and now, have also enhanced our ability to kill one another more efficiently, ever since Rocky learned he could use his axe to eliminate the competition. Although there are still many crappy jobs out there, technology appears to be moving in the direction of eliminating jobs for humans, an issue previously brought to my mind by the sight of an automated garbage truck in action.
NOT ALL GOOD
This labor-saving business obviously is very attractive, but when we are able to develop machines that can think, we enter a whole new dimension. Stephen Hawking, the renowned theoretical physicist, opined in a 2014 interview with the BBC that the development of Artificial Intelligence was “the greatest event in human history.” He went on to say it could also be the last event for humanity, warning that it could lead to the extinction of the human race if it goes unchecked. Elon Musk, the founder of Tesla and SpaceX, describes artificial intelligence as “our greatest existential threat.” Bill Gates, the computer guy, echoes these concerns and wonders why others don’t share his worries. They all seem to be saying that some of our sci-fi flicks and literature could be prophetic. Artificial Intelligence has already been used to develop weapons systems such as smart bombs, drones, and who knows what else? Boston Dynamics has a prototype soldier robot called Atlas that is currently being taught to act independently.
Having robots do our killing for us sounds like a great idea, in the same way it sounds like a great idea to have drone operators do our killing from the comfort of an office, insulated from the horrors of war. In his interview, Musk noted his concerns about the economic and psychological effects of job losses resulting from the application of artificial intelligence. In my previous blog, I speculated that we were moving toward a push-button society, but with the application of Artificial Intelligence, we won’t even need to push the button. In such a case, the question is: would we be able to retain control?
As with many other pseudo-scholars, I get much of my information from television, and on Sunday, in the midst of writing these musings, I took a break and tuned in to Fareed Zakaria’s show on CNN. Since his broadcasts are nearly as important as Ohio State football, I always try to catch his program. Talk about serendipity: he was in the process of interviewing Ginni Rometty, the CEO of IBM, about Watson, IBM’s supercomputer. I find it interesting that there appears to be an attempt to humanize these machines by giving them names. Not surprisingly, she was extolling the virtues of Watson, and indeed there are many. Her main purpose was to convince us that Artificial Intelligence could never replace us, only assist us. This supercomputer has received the most press for winning the game show Jeopardy and beating geniuses at chess; however, its more important feats are those which could only have been dreamed of a few years ago. It has been said that information is power, and if that is true, Watson is one powerful dude.
Archeologists agree that one of early man’s major achievements was the development of language. This must have had its beginnings when he learned to make distinctive sounds to communicate. The ability to transmit complex bits of information verbally would have made it possible for Rocky to teach Rocky Jr. how to make his own axe, how to stalk prey, which vegetation to avoid, how to start a fire, and to stop teasing his sister. As subsequent generations continued to collect knowledge, symbols were used to represent words, and a visual means to communicate and store information was born.
Such writings soon outgrew the capacity of cave walls, and a few thousand years later we have huge buildings full of books, the content of which far exceeds the capacity of the human brain. One of the largest of these is the Library of Congress, which contains more than 16 million books housed in three buildings. The folks at IBM insist that Watson has the capacity to store every page of all those books in his memory and, unlike us, never forget anything. It is predicted that eventually all the information ever written will be stored, sorted as to its veracity, and made instantly available. As previously mentioned, knowledge is power, but Watson is able to independently access that knowledge, analyze it, and make decisions. He is even able to learn from his own experiences. Watson turns out to be much more than just a memory bank, and this Artificial Intelligence business is starting to sound less artificial.
WATSON BOOSTERS EVERYWHERE
IBM must be a bit anxious about all these Artificial Intelligence naysayers, for later in the day, after watching my guru Fareed, 60 Minutes featured Watson’s work in the diagnosis and treatment of cancer, which was indeed impressive. Oncologists from leading academic cancer centers were in love with Watson, who was programmed to commit to memory all the scientific literature ever written about cancer. The computer learned to interpret laboratory reports, read scans, interpret symptoms, and return a diagnosis along with recommendations for treatment, all with fewer mistakes than when done by conventional methods. With Watson on the job, who needs doctors? This could allow cancer doctors to play golf full time.
The contributions of Artificial Intelligence have enormous potential, not only for treating cancer, but in all of medicine. The unraveling of the human genome has opened up the ability to find, and in some cases correct, mutations that are responsible for many illnesses, including some types of cancer. Although there have been marvelous achievements in medicine, future generations will undoubtedly look back on our efforts as crude and ineffective, much as we look down on the bloodletting our predecessors practiced as treatment.
THE GOOD STUFF
The list of benefits which could be provided by Artificial Intelligence is nearly endless. Self-driving vehicles would eliminate the problem of human error and would undoubtedly greatly reduce the number of traffic accidents. Dangerous or unhealthy jobs could be eliminated. Information could not only be provided, but also sorted, verified, and implemented. Robots could learn from their own experiences, and thus they would be able to make rational decisions. We are already at a point where we use and take this ultra-high-tech stuff for granted. I rely on my good friend Siri to tell me how to get from point A to point B, how long it will take, and where I can stop for a hamburger. Soon, she will be able to take me where I want to go while I read a book or take a nap. The more one thinks about this stuff, the weirder it becomes, and there seems to be no end to the possibilities. There must be grist here for a hundred sci-fi movie scripts.
NOT ALL GOOD
Our technological advancement has already provided the means by which we can destroy human life on this planet. Nuclear war is the most obvious, but climate change could also be fatal, especially if it continues to be widely ignored. As I pointed out in a previous blog, we are also vulnerable to pandemic diseases and do little to prevent them. Now, here we go with more progress, which some pretty smart guys tell us can do good stuff, but it also has the potential to eliminate the human race.
Hawking has predicted that our survival depends on how we handle these existential threats over the next one hundred years. He acknowledges that Artificial Intelligence is here to stay and that further development could not be halted even if we wanted it to be. He does suggest there is great urgency in regulating the process. One phenomenon that caught my eye was an article in Scientific American about a group that is attempting to teach robots to feel emotions. It occurred to me that I would not like that robot, Atlas, to be pissed off at me, so maybe it would be a good idea to cool it with the emotion stuff. It also makes sense that we follow Hawking’s recommendation that there be a ban on the production of military robots, as there was with poison gas. In a future post, I hope to expand more on the effect Artificial Intelligence might have on work.
This essay has violated both tenets of the Maggie rule about blogging: keep it short and write about subjects you know something about. I have solved the problem of dealing with that rule by firing Maggie and hiring Caroline, who I am hoping will be more permissive than her mother. Maggie took her dismissal well, but then jobs without pay are probably easier to leave. I am looking forward to working with Caroline who is very intelligent, and I believe could give Watson a run for his money.