The speaker claims that people who are the most firmly committed to an idea or policy are the same people who are most critical of that idea or policy. While I find this claim paradoxical on its face, the paradox is explainable, and the explanation is well supported empirically. Nevertheless, the claim is an unfair generalization in that it fails to account for other empirical evidence serving to discredit it.
A threshold problem with the speaker's claim is that its internal logic is questionable. At first impression it would seem that firm commitment to an idea or policy necessarily requires the utmost confidence in it, and yet one cannot have a great deal of confidence in an idea or policy if one recognizes its flaws, drawbacks, or other problems. Thus commitment and criticism would seem to be mutually exclusive. But are they? One possible explanation for the paradox is that individuals most firmly committed to an idea or policy are often the same people who are most knowledgeable on the subject, and therefore are in the best position to understand and appreciate the problems with the idea or policy.
Lending credence to this explanation for the paradoxical nature of the speaker's claim are the many historical cases of uneasy marriages between commitment to and criticism of the same idea or policy. For example, J. Robert Oppenheimer, the so-called "father of the atomic bomb," was firmly committed to America's policy of gaining military superiority over the Japanese and the Germans; yet at the same time he attempted fervently to dissuade the U.S. military from employing his technology for destruction, while becoming a visible advocate for various peaceful and productive applications of atomic energy. Another example is George Washington, who was quoted as saying that all the world's denizens "should abhor war wherever they may find it." Yet this was the same military general who played a key role in the Revolutionary War between Britain and the American colonies. A third example is Einstein, who, while committed to the mathematical soundness of his theories of relativity, could not reconcile them with the equally compelling quantum theory that emerged later in his life. In fact, Einstein spent the last twenty years of his life scrutinizing his own theories and struggling to reconcile them with the newer theory.
In the face of historical examples supporting the speaker's claim are innumerable influential individuals who were zealously committed to certain ideas and policies but who were not critical of them, at least not outwardly. Could anyone honestly claim, for instance, that Elizabeth Cady Stanton and Susan B. Anthony, who in the late 19th century paved the way for the women's rights movement by way of their fervent advocacy, were at the same time highly critical or suspicious of the notion that women deserve equal rights under the law? Also, would it not be absurd to claim that Mahatma Gandhi and Martin Luther King, history's two leading advocates of civil disobedience as a means to social reform, had serious doubts about the ideals to which they were so demonstrably committed? Finally, consider the two ideologues and revolutionaries Lenin and Mussolini. Is it even plausible that their demonstrated commitment to their own Communist and Fascist policies, respectively, belied some deep personal suspicion about the merits of these policies? To my knowledge no private writing of any of these historical figures lends any support to the claim that these leaders were particularly critical of their own ideas or policies.
To sum up, while at first glance a deep commitment to and incisive criticism of the same idea or policy would seem mutually exclusive, it appears they are not. Thus the speaker's claim has some merit. Nevertheless, for every historical case supporting the speaker's claim there are many others serving to refute it. In the final analysis, then, the correctness of the speaker's assertion must be determined on a case-by-case basis.
Must we choose between tradition and modernization, as the speaker contends? I agree that in certain cases the two are mutually exclusive. For the most part, however, modernization does not reject tradition; in fact, in many cases the former can and does embrace the latter.
In the first place, oftentimes so-called "modernization" is actually an extension or new iteration of tradition, or a variation on it. This is especially true in language and in law. The modern English language, in spite of its many words that are unique to modern Western culture, is derived from, and builds upon, a variety of linguistic traditions and ultimately from the ancient Greek and Latin languages. Were we to insist on rejecting traditional language in favour of the purely modern, we would have essentially nothing to say. Perhaps an even more striking marriage of modernization and tradition is our system of laws in the U.S., which is deeply rooted in English common-law principles of equity and justice. Our system requires that new, so-called "modern" laws be consistent with, and in fact build upon, those principles.
In other areas modernization departs from tradition in some respects, while embracing it in others. In the visual arts, for example, "modern" designs, forms, and elements are based on certain timeless aesthetic ideals such as symmetry, balance, and harmony. Modern art that violates these principles might hold ephemeral appeal due to its novelty and brashness, but its appeal lacks staying power. An even better example from the arts is modern rock-and-roll music, which upon first listening might seem to bear no resemblance to classical music traditions. Yet, both genres rely on the same twelve-note scale, the same notions of what harmonies are pleasing to the ear, the same forms, the same rhythmic meters, and even many of the same melodies.
I concede that, in certain instances, tradition must yield entirely to the utilitarian needs of modern life. This is true especially when it comes to architectural traditions and the value of historic and archaeological artefacts. A building of great historic value might be located in the only place available to a hospital desperately needing additional parking area. An old school that is a prime example of a certain architectural style might be so structurally unsafe that the only practicable way to remedy the problem would be to raze the building to make way for a modern, structurally sound one. And when it comes to bridges whose structural integrity is paramount to public safety, modernization often requires no less than replacement of the bridge altogether. However, in other such cases architecturally appropriate retrofits can solve structural problems without sacrificing history and tradition, and alternative locations for new buildings and bridges can be found in order to preserve tradition associated with our historic structures. Thus, even in architecture, tradition and modernization are not necessarily mutually exclusive options.
To sum up, in no area of human endeavour need modernization supplant, reject, or otherwise exclude tradition. In fact, in our modern structures, architecture and other art, and especially languages and law, tradition is embraced, not shunned.
The speaker asserts that television and computer connectivity will soon render tourism obsolete. I agree that these technologies might eventually serve to reduce travel for certain purposes other than tourism. However, I strongly disagree that tourism will become obsolete, or that it will even decline, as a result.
As for the claim that television will render tourism obsolete, we already have sufficient empirical evidence that this will simply not happen. For nearly a half-century we have been peering through our television sets at other countries and cultures, yet tourism is as popular today as ever. In fact, tourism has been increasing sharply during the last decade, which has seen the advent of television channels catering exclusively to our interest in other cultures and countries. The more reasonable conclusion is that television has actually served to spark our interest in visiting other places.
It is somewhat more tempting to accept the speaker's further claim that computer connectivity will render tourism obsolete. However, the speaker unfairly assumes that the purpose of tourism is simply to obtain information about other people and places. Were this the case, I would entirely agree that the current information explosion spells the demise of tourism. But tourism is not primarily about gathering information. Instead, it is about sensory experience: seeing and hearing first-hand, even touching and smelling. Could anyone honestly claim that seeing a picture or even an enhanced 3-D movie of the Swiss Alps serves as a suitable substitute for riding a touring motorcycle along narrow roads traversing those mountains? Surely not. The physical world is laden with a host of such delights that we humans are compelled to experience first-hand as tourists.
Moreover, in my view tourism will continue to thrive for the same reason that people still go out for dinner or to the movies: we all need to "get away" from our familiar routines and surroundings from time to time. Will computer connectivity alter this basic need? Certainly not. In short, tourism is a manifestation of a basic human need for variety and for exploration. This basic need is why humans have come to inhabit every corner of the Earth, and will just as surely inhabit other planets of the solar system.
In fact, computer connectivity might actually provide a boon for tourism. The costs of travel and accommodations are likely to decrease due to Internet price competition. Even more significantly, to the extent that the Internet enhances communication among the world's denizens, our level of comfort and trust when it comes to dealing with people from other cultures will only increase. As a result, many people who previously would not have felt safe or secure traveling to strange lands will soon venture abroad with a new sense of confidence. Admittedly, travel for purposes other than tourism might eventually decline, as the business world becomes increasingly dependent on the Internet. Products that can be reduced to digital "bits and bytes" can now be shipped anywhere in the world without any human travel. And the volume of business-related trips will surely decline in the future, as teleconferencing becomes more readily available. To the extent that business travelers "play tourist" during business trips, tourism will decline as a result. Yet it would be absurd to claim that this phenomenon alone will render tourism obsolete.
In sum, while business travel might decline as a result of global connectivity, tourism is likely to increase as a result. Global connectivity, especially the Internet, can only pique our curiosity about other people, cultures, and places. Tourism helps satisfy that curiosity, as well as satisfying a fundamental human need to experience new things first-hand and to explore the world.
Few would deny that since its inception broadcast television has greatly enhanced communication to the masses. The circulation of even the most widely read newspapers pales compared to the number of viewers of popular television news programs. Yet traditional television is a one-way communications medium, affording viewers no opportunity to engage those so-called "talking heads" in dialogue or to respond. Of course, there is nothing inherent about television that prevents us from meaningful and thoughtful communication with each other. In fact, in television's early days it was a fairly common occurrence for a family to gather around the television together for their favorite show, then afterwards discuss among themselves what they had seen and heard. Yet over time television has proven itself to serve primarily as a baby-sitter for busy parents, and as a means of escape for those who wish to avoid communicating with the people around them. Moreover, in the pursuit of profit, network executives have determined over time that the most effective uses of the medium are for fast-paced entertainment and advertising whose messages are neither thoughtful nor meaningful.
Do computers offer greater promise for thoughtful and reflective communication than television? Emphatically, yes. After all, media such as email and the Web are interactive by design. And the opportunity for two-way communication enhances the chances of meaningful and thoughtful communication. Yet their potential raises the question: do these media in fact serve those ends? It is tempting to hasten to answer "yes" with respect to email; we have all heard stories about how email has facilitated reunions of families and old friends, and new long-distance friendships and romances. Moreover, it would seem that two-way written communication requires far more thought and reflection than verbal conversation. Nevertheless, email is often used to avoid face-to-face encounters, and in practice is used as a means of distributing quick memos. Thus on balance it appears that email serves as an impediment, not an aid, to thoughtful and reflective communication.
With respect to Web-based communication, the myriad of educational sites, interactive and otherwise, is strong evidence that the Web tends to enhance, rather than prevent, meaningful communication. Distance-learning courses made possible by the Web lend further credence to this assertion. Nonetheless, by all accounts it appears that the Web will ultimately devolve into a mass medium for entertainment and for e-commerce, just like traditional television. Meaningful personal interactivity is already yielding to advertising, requests for product information, buy-sell orders, and titillating adult-oriented content.
Thus, on balance these high-speed electronic media do indeed tend to prevent rather than facilitate meaningful and thoughtful communication. In the final analysis, any mass medium carries the potential for uplifting us, enlightening us, and helping us to communicate with and understand one another. However, by all accounts, television has not fulfilled that potential; and whether the Web will serve us any better is ultimately up to us as a society.
The speaker actually raises two distinct issues here: (1) whether information can eliminate, or at least help reduce, prejudice; and (2) if not, whether this is because prejudice is rooted in emotion rather than reason. Despite some evidence to the contrary, I fundamentally agree with the speaker's essential claim that prejudice is here to stay because it is firmly rooted in emotion rather than reason.
Regarding the first issue, it would appear at first glance that prejudice is declining as a result of our becoming a more enlightened, or better informed, society. During the past quarter-century, more so than any other period in human history, various voices of reason have been informing us that racial, sexual and other forms of prejudice are unfounded in reason, morally wrong, and harmful to any society. During the 1960s and 1970s such information came from civil-rights and feminist activists; more recently the primary source of this information has been mainstream media, which now affirmatively touts the rights of various racial groups, women, and homosexuals. Moreover, increasing mobility and cultural awareness surely serve to inform people the world over that we are all essentially alike.
It would seem that, as a result of this flood of information, we would be making clear progress toward eliminating prejudice. However, much of this so-called progress is forced upon us legislatively--in the form of anti-discrimination laws in the areas of employment, housing, and education, which now protect all significant minority groups. Without these laws, would we voluntarily refrain from the discriminatory behaviour and other forms of prejudice that the laws prevent? Perhaps not.
Moreover, signs of prejudice are all around us today. Extreme factions still rally around bigoted demagogues; the number of "hate crimes" is increasing alarmingly; and the cultural gap between white Americans and African-Americans seems to be widening as the level of mutual distrust heightens. Besides, what appears to be respect for one another's differences may in fact be an increasing global homogeneity; that is, we are becoming more and more alike. In short, on a societal level an apparent decline of prejudice is actually legislated morality and increasing homogeneity. Accordingly, I find the speaker's threshold assertion, that no amount of information can eliminate prejudice, compelling indeed.
The second issue that the statement raises is whether prejudice is learned or instinctive. If it were learned, then it would seem that by obtaining certain information, or by purging one's mind of certain disinformation, one could learn not to be prejudiced. Despite popular notions that this is possible, I have my doubts: these are age-old theories, yet we see little evidence that prejudice is on the wane. Thus it seems that the root of prejudice lies more in an instinctive, almost primal, sense of fear than in the sort of distrust that is learned and can therefore be "unlearned." Accordingly, I find the speaker's second assertion, that prejudice is rooted in emotion, compelling as well.
In sum, despite a deluge of information debunking our false notions about people who are different than us, as a society it appears we have not reversed our inclination toward prejudice. Therefore, I find convincing the speaker's claim that prejudice is rooted in the sort of emotion that reason cannot override.
Should the only responsibility of a business executive be to maximize business profits, within the bounds of the law? In several respects this position has considerable merit; yet it ignores certain compelling arguments for imposing on businesses additional obligations to the society in which they operate.
On the one hand are two convincing arguments that profit maximization within the bounds of the law should be a business executive's sole responsibility. First, imposing on businesses additional duties to the society in which they operate can, paradoxically, harm that society. Compliance with ethical standards higher than the law requires, such as improving environmental and workplace conditions, adds to business expenses and lowers immediate profits. In turn, lower profits can prevent the socially conscious business from creating more jobs, and from keeping its prices low and the quality of its products and services high. Thus if businesses go further than their legal duties in serving their communities the end result might be a net disservice to those communities.
Secondly, by affirming that profit maximization within legal bounds is the most ethical behaviour possible for business, we encourage private enterprise, and more individuals enter the marketplace in the quest of profits. The inevitable result of increased competition is lower prices and better products, both of which serve the interests of consumers. Moreover, since maximizing profits enhances the wealth of a company's stakeholders, broad participation in private enterprise raises the wealth of a nation, expands its economy, and raises its overall standard of living and quality of life.
On the other hand are three compelling arguments for holding business executives to certain responsibilities in addition to profit maximization and to compliance with the letter of the law. First, a growing percentage of businesses are related to technology, and laws often lag behind advances in technology. As a result, new technology-based products and services might pose potential harm to consumers even though they conform to current laws. For example, Internet commerce is still largely unregulated because our lawmakers are slow to react to the paradigm shift from brick-and-mortar commerce to e-commerce. As a result, unethical marketing practices, privacy invasion, and violations of intellectual-property rights are going unchecked for lack of regulations that would clearly prohibit them.
Secondly, since a nation's laws do not extend beyond its borders, compliance with those laws does not prevent a business from doing harm elsewhere. Consider, for example, the trend among U.S. businesses of exploiting workers in countries where labour laws are virtually non-existent in order to avoid the costs of complying with U.S. labour laws.
Thirdly, a philosophical argument can be made that every business enters into an implied social contract with the community that permits it to do business, and that this social contract, although not legally enforceable, places a moral duty on the business to refrain from acting in ways that will harm that community.
In sum, I agree with the statement insofar as in seeking to maximize profits a business serves not only itself but also its employees, customers, and the overall economy. Yet today's rapidly changing business environment and increasing globalization call for certain affirmative obligations beyond the pursuit of profit and mere compliance with enforceable rules and regulations. Moreover, in the final analysis any business is indebted to the society in which it operates for its very existence, and thus has a moral duty, regardless of any legal obligations, to pay that debt.
The speaker contends that students should be skeptical in their studies, and should not accept passively whatever they are taught. In my view, although undue skepticism might be counterproductive for a young child's education, I strongly agree with the speaker otherwise.
If we were all to accept on blind faith all that we are taught, our society would never progress or evolve.
Skepticism is perhaps most important in the physical sciences. Passive acceptance of prevailing principles quells innovation, invention, and discovery. In fact, the very notion of scientific progress is predicated on rigorous scientific inquiry--in other words, skepticism. And history is replete with examples of students of science who challenged what they had been taught, thereby paving the way for scientific progress. For example, in challenging the notion that the Earth was in a fixed position at the centre of the universe, Copernicus paved the way for the corroborating observations of Galileo a century later, and ultimately for Newton's principles of gravity upon which all modern science is based. The staggering cumulative impact of Copernicus' rejection of what he had been taught is proof enough of the value of skepticism.
The value of skepticism is not limited to the physical sciences, of course. In the fields of sociology and political science, students must think critically about the assumptions underlying the status quo; otherwise, oppression, tyranny and prejudice go unchecked. Similarly, while students of the law must learn to appreciate timeless legal doctrines and principles, they must continually question the fairness and relevance of current laws. Otherwise, our laws would not evolve to reflect changing societal values and to address new legal issues arising from our ever-evolving technologies.
Even in the arts, students must challenge established styles and forms rather than learn to imitate them; otherwise, no genuinely new art would ever emerge. Bebop musicians such as Charlie Parker demonstrated through their wildly innovative harmonies and melodies their skepticism about established rules for harmony and melody. In the area of dance, Balanchine showed by way of his improvisational techniques his skepticism about established rules for choreography. And Germany's Bauhaus School of Architecture, to which modern architecture owes its existence, was rooted in skepticism about the proper objective, and resulting design, of public buildings.
Admittedly, undue skepticism might be counterproductive in educating young children. I am not an expert in developmental psychology; yet observation and common sense inform me that youngsters must first develop a foundation of experiential knowledge before they can begin to think critically about what they are learning. Even so, in my view no student, no matter how young, should be discouraged from asking "Why?" and "Why not?"
To sum up, skepticism is the very stuff that progress is made of, whether in science, sociology, politics, the law, or the arts. Therefore, skepticism should be encouraged at all but the most basic levels of education.
Should parents and communities participate in local education because education is too important to leave to professional educators, as the speaker asserts? It might be tempting to agree with the speaker, based on a parent's legal authority over, familiarity with, and interest in his or her own children. However, a far more compelling argument can be made that, except for major decisions such as choice of school, a child's education is best left to professional educators.
Communities of parents concerned about their children's education rely on three arguments for active parental and community participation in that process. The first argument, and the one expressed most often and most vociferously, is that parents hold the ultimate legal authority to make key decisions about what and how their own children learn, including the choice of curriculum and textbooks, the pace and schedule of learning, and the extent to which their child should learn alongside other children. The second argument is that only a parent can truly know the unique needs of a child, including which educational choices are best suited for the child. The third argument is that parents are more motivated, by pride and ego, than any other person to take whatever measures are needed to ensure their children receive the best possible education.
Careful examination of these three arguments, however, reveals that they are specious at best. As for the first one, were we to allow parents the right to make all major decisions regarding the education of their children, many children would receive little or no education. In a perfect world parents would always make their children's education one of their highest priorities. Yet, in fact, many parents do not. As for the second argument, parents are not necessarily best equipped to know what is best for their child when it comes to education. Although most parents might think they are sufficiently expert by virtue of having gone through formal education themselves, parents lack the specialized training to appreciate which pedagogical methods are most effective, what constitutes a balanced education, and how developmental psychology affects a child's capacity for learning at different levels and at different stages of childhood. Professional educators, by virtue of their specialized training in these areas, are far better able to ensure that a child receives a balanced, properly paced education.
There are two additional compelling arguments against the speaker's contention. First, parents are too subjective to always know what is truly best for their children. For example, many parents try to overcome their own shortcomings and failed self-expectations vicariously through their children's accomplishments. Most of us have known parents who push their child to excel in certain areas to the emotional and psychological detriment of the child. Secondly, if too many parties become involved in making decisions about day-to-day instruction, the end result might be infighting, legal battles, boycotts, and other protests, all of which impede the educational process; and the ultimate victims are the children themselves. Finally, in many jurisdictions parents now have the option of schooling their children at home, as long as certain state requirements are met. In my observation, home schooling allows parents who prefer it great control over a child's education, while allowing the professional educators to discharge their responsibilities as effectively as possible unfettered by gadfly parents who constantly interfere and intervene.
In sum, while parents might seem better able and better motivated to make key decisions about their child's education, in many cases they are not. With the possible exceptions of responsible home-schoolers, a child's intellectual, social, and psychological development is at risk when communities of parents dominate the decision-making process involving education.
The speaker claims that all observation is subjective, coloured by desire and expectation.
While it would be tempting to concede that we all see things differently, careful scrutiny of the speaker's claim reveals that it confuses observation with interpretation. In fact, in the end the speaker's claim relies entirely on the further claim that there is no such thing as truth and that we cannot truly know anything. While this notion might appeal to certain existentialists and epistemologists, it runs against the grain of all scientific discovery and knowledge gained over the last 500 years.
It would be tempting to afford the speaker's claim greater merit than it deserves. After all, our everyday experience as humans informs us that we often disagree about what we observe around us. We've all uttered and heard uttered many times the phrase "That's not the way I see it!" Indeed, everyday observations, for example about whether a football player was out of bounds, or about which car involved in an accident ran the red light, vary depending not only on one's spatial perspective but also on one's expectations or desires. If I'm rooting for one football team, or if the player is well-known for his ability to make great plays while barely staying in bounds, my desires or expectations might influence what I think I observe. Or if I am driving one of the cars in the accident, or if one car is a souped-up sports car, then my desires or expectations will in all likelihood color my perception of the accident's events.
However, these sorts of subjective "observations" are actually subjective "interpretations" of what we observe. Visitors to an art museum might disagree about the beauty of a particular work, or even about which colour predominates in that work. In a court trial several jurors might view the same videotape evidence many times, yet some jurors might "observe" an incident of police brutality, while others "observe" the appropriate use of force to restrain a dangerous individual.
Thus when it comes to making judgments about what we observe and about remembering what we observe, each person's individual perspective, values, and even emotions help form these judgments and recollections. It is crucial to distinguish between interpretations such as these and observation, which is nothing more than a sensory experience. Given the same spatial perspective and sensory acuity and awareness, it seems to me that our observations would all be essentially in accord; that is, observation can be objective.
Lending credence to my position is Francis Bacon's scientific method, according to which we can know only that which we observe, and thus all truth must be based on empirical observation. This profoundly important principle serves to expose and strip away all subjective interpretation of observation, thereby revealing objective scientific truths. For example, up until Bacon's time the Earth was "observed" to lie at the center of the Universe, in accordance with the prevailing religious notion that man (humankind) was the center of God's creation.
Applying Bacon's scientific method, Galileo exposed the biased nature of this claim. Similarly, before Einstein, time and space were assumed to be linear, in accordance with our "observation." Einstein's mathematical formulas suggested otherwise, and his theories have been proven empirically to be true. Thus it was our subjective interpretation of time and space that led to our misguided notions about them. Einstein, like history's other most influential scientists, simply refused to accept conventional interpretations of what we all observe.
In sum, the speaker confuses observation with interpretation and recollection. It is how we make sense of what we observe, not observation itself, that is colored by our perspective, expectations, and desires. The gifted individuals who can set aside their subjectivity and delve deeper into empirical evidence, employing Bacon's scientific method, are the ones who reveal that observation not only can be objective but must be objective if we are to embrace the more fundamental notion that knowledge and truth exist.
This statement actually consists of a series of three related claims: (1) machines are tools of human minds; (2) human minds will always be superior to machines; and (3) it is because machines are human tools that human minds will always be superior to machines. While I concede the first claim, whether I agree with the other two claims depends partly on how one defines "superiority," and partly on how willing one is to humble oneself before unknown future scenarios.
The statement is clearly accurate insofar as machines are tools of human minds. After all, would any machine even exist unless a human being invented it? Of course not. Moreover, I would be hard-pressed to think of any machine that cannot be described as a tool. Even machines designed to entertain or amuse us, such as toy robots, cars, video games, and novelty items, are in fact tools, which their inventors and promoters use for engaging in commerce and the business of entertainment and amusement. And the claim that a machine can be an end in itself, without any purpose or utilitarian function for humans whatsoever, is dubious at best, since I cannot conjure up even a single example of such a machine. Thus when we develop any sort of machine we always have some sort of end in mind: a purpose for that machine.
As for the statement's second claim, in certain respects machines are superior. We have devised machines that perform number-crunching and other rote cerebral tasks with greater accuracy and speed than human minds ever could. In fact, it is because we can devise machines that are superior in these respects that we devise them as our tools to begin with. However, if one defines superiority not in terms of competence in performing rote tasks but rather in other ways, human minds are superior. Machines have no capacity for independent thought, for making judgments based on normative considerations, or for developing emotional responses to intellectual problems.
Up until now, the notion of human-made machines that develop the ability to think on their own, and to develop so-called "emotional intelligence," has been pure fiction. Besides, even in fiction we humans ultimately prevail over such machines, as in the cases of Frankenstein's monster and HAL, the computer in 2001: A Space Odyssey. Yet it seems presumptuous to assert with confidence that humans will always maintain their superior status over their machines. Recent advances in biotechnology, particularly in the area of human genome research, suggest that within the twenty-first century we will witness machines that can learn to think on their own, to repair and nurture themselves, to experience visceral sensations, and so forth. In other words, machines will soon exhibit the traits to which we humans attribute our own superiority.
In sum, because we devise machines in order that they may serve us, it is fair to characterize machines as "tools of human minds." And insofar as humans have the unique capacity for independent thought, subjective judgment, and emotional response, it also seems fair to claim superiority over our machines. Besides, should we ever become so clever a species as to devise machines that can truly think for themselves and look out for their own well-being, one might question whether these machines of the future would be "machines" anymore.