Wednesday, July 07, 2010

By Steven H. VanderLeest

Anthropomorphizing technology – suggesting human characteristics for non-human technological products – is a common pastime.  We attribute motives to our gadgets, particularly malicious intent when our gadgets repeatedly fail at the most inopportune times.  We sometimes slip in a personal pronoun for a device. Our technology is often so advanced and complex that the ordinary user has very little clue about its inner workings.  Arthur C. Clarke’s third “law” reflects this idea: “any sufficiently advanced technology is indistinguishable from magic.”  Even so, I doubt that anyone actually believes that the technology is truly sentient.  Rather, we joke about its seemingly human behavior, knowing full well that the computer, automobile, and space shuttle are all simply objects with no ability to act on their own, though they are sufficiently complex that we cannot always predict their behavior.  There may be an added benefit to this humorous approach to our technology.  Humor about technology can be a comfort because it recognizes that we share a common experience – who hasn’t lost something important when a computer crashed on them?

I would like to see technology help us share experiences and improve human relationships more often, but not just centered on technological failures.  This would honor the design norm of integrity.  This is the principle that a technology design ought to harmonize function and form, ought to integrate all the parts into an aesthetic whole, and ought to promote positive human relationships.  Where technology brings people together, encourages peaceful interaction, and brings out the best of what makes us human, it observes the principle of integrity.  One example would be collaborative document technologies that allow multiple users to edit a document simultaneously (though I think these products are still in their infancy, and I expect many improvements yet).

To “technopomorphize”, if that were actually a word, would be to use technical analogies to understand human concepts and relationships.  Our daily language is full of them:  “switching gears”, “like clockwork”, “grease the skids”, “turn the crank”, “pull the plug”, “really pushed his buttons”, “need to wind down”.  We use these as symbols to explain our own and others’ behavior.  When understood as a simplification, as an abstraction in order to communicate succinctly, these analogies can be helpful.  They reflect our God-given and God-reflecting character as tool makers and tool users.  When mistaken for the behavior itself rather than a symbol or analogy, then I think we do ourselves and our fellow humans an injustice.  Trying to understand the human brain by comparing it to a computer can be a helpful approach; implying that the human brain is simply a biological computer, no more, leaves out the nuance and beauty, the mystery and complexity and soul of the human creature.  Yes, you can push my buttons, but my output is not deterministically dependent on those inputs.

Guns Don’t Kill People

Wednesday, June 30, 2010

By Steven H. VanderLeest

On Monday the U.S. Supreme Court ruled in McDonald v. Chicago that the Second Amendment of the U.S. Constitution provides an individual (not simply collective) right to bear arms. This reminds me of the old saw from gun-rights advocates:  “guns don’t kill people; people kill people”.  This is a succinct argument for the neutrality of technology.  It makes the point that technology (at least in the form of guns) has no responsibility because it has no ability to make a moral choice (i.e., it has no agency).  While I agree with the slogan in its narrow sense (that technology has no ability to act, no agency), I do not agree with the broader implication (that therefore the technology is neutral).  Consider plugging other technologies into this phrase, for example, “exploding gas tanks on Ford Pintos do not kill people; people kill people” or “o-rings on the space shuttle Challenger do not kill people; people kill people” or “crack cocaine does not kill people; people kill people.”  In each case, while it is true that the technology in isolation does not act to kill (it has no intent, no volition), it still feels strange to say that the technology had nothing to do with the deaths.  This is because the technology embodies the will of the designer and amplifies the power of the user.  It is not neutral.  It biases the user towards certain actions.  We tend to use a hammer to pound nails more often than to paint pictures because it is designed to perform pounding functions.  We tend to use dynamite to blow things up rather than to pound nails because it is designed to explode.  Our actions when holding dynamite or a hammer or crack cocaine will tend to proceed in the ways that the technology is biased.

That’s not to say I couldn’t use a hammer to kill someone instead of pounding nails.  But it is to say that we should recognize the bias in technology and use appropriate caution in light of the bias.  Because the technology amplifies my abilities, I should take more care when swinging my arm around while holding a hammer than without.  Some technologies are particularly powerful, like guns or automobiles or thermonuclear weapons.  We must be acutely aware of their power and use it judiciously.  We must recognize that rash decisions can lead to deadly results.

I think the court made a good decision because they balanced the decision, making it clear that while sweeping prohibitions against gun ownership were not constitutional, states and municipalities could use reasonable regulations regarding the sale and ownership of guns.  This recognizes the power of the technology and the resulting need for careful judgment on the part of the user and the community.  All the stakeholders – everyone affected by the technology – must be considered in making decisions about the design, manufacture, distribution, use, and disposal of technology.

TULA = Technology Uses a Lot of Acronyms

Tuesday, June 22, 2010

By Steven H. VanderLeest

I’m at the ASEE (American Society for Engineering Education) conference this week, where LED stands not for Light Emitting Diode, but for Liberal Education Division (and that’s liberal as in “liberal arts” not as in politics).  It reminded me of how many acronyms we tend to use in technical disciplines.  We often use rather unwieldy shorthand, such as PCMCIA for Personal Computer Memory Card International Association (though some say it should really stand for People Can’t Memorize Computer Industry Acronyms).  We even have acronyms that contain acronyms, such as VHDL, which stands for VHSIC Hardware Description Language, where VHSIC stands for Very High-Speed Integrated Circuit.  Of course that one has its origin with the military, also famous for the liberal use of acronyms (liberal, as in generous).  The military also gets the prize for the longest acronym (perhaps mythical), ADCOMSUBORDCOMPHIBSPAC, apparently used by the US Navy to designate Administrative Command: Amphibious Forces, Pacific Fleet, Subordinate Command.

Technology users love acronyms.  They proliferate with every new innovation.  Why do we see so many acronyms when it comes to technology?  Sometimes the limitations of the technology encourage a burgeoning dictionary of short-cut words.  The acronym LOL (Laugh Out Loud) was originally seen on Usenet, a short-hand form that cut down on typing when using slow connections.  Text messages today use LOL to cut down on the number of characters one needs to thumb out on a tiny keyboard.  The World Wide Web is now everywhere called simply “the web”, no more in need of a qualifier than the word “satellite” needs the clarification “artificial”.

Another reason for new acronyms may be that when a new invention appears on the scene, we often need new words to describe it.  The acronyms appear because we require new combinations of words to describe features of the new technology.  These acronyms then make their way into conversation as the technology becomes more familiar.

We love inventing new language.  Teens do it all the time—to hide their meaning from adults and at the same time assert some power over them by knowing something they don’t.  I wonder if we sometimes use technical jargon in the same way, to assert power over those who are less technically savvy than us.  Sometimes we recycle old words, turning a noun into a related verb (have you googled anyone lately?) or hijacking a word with an entirely new meaning.  Sometimes we create new words by combining several shorter ones, or by shortening up a longer one.  All this word play seems to come naturally to humans and I suspect it is part of what makes us human.  Language is a tool in some senses (an instrument for communication), but I’m not sure I’d call it technology.  The definition of technology gets fuzzy in this area.  I think I would include computer programming languages, network protocols, and encryption codes.  Strangely we call these artificial languages, while this thing I am typing and you are reading is called natural language.  But if artificial means man-made, then isn’t even our natural language artificial by that measure? 

Naming something, adding it to our lexicon, has power.  The name is the symbol by which we conjure up the concept and definition and identity of the thing named.  It helps us to communicate our thoughts to others, and indeed can help us think our own thoughts.  Names are inevitably stereotypes that do not sufficiently represent the complexity of the thing they symbolize.  The label is always a summary that abstracts away much of the detail.

I love reading and hearing about new technology.  I revel in the new language and relish the novel terminology.  It is a particular joy to participate in the naming itself, a privilege enjoyed by many engineers, scientists, and inventors.  Perhaps we should use a little more ceremony when christening an innovation or discovery.  Perhaps we should celebrate a little more when a new thing has come into being, a birth that must be recognized because it requires a name.  Yes, let’s take care that we don’t become proud because of our creativity, but rather give glory to God for granting us the gift to unfold His creation.

Estimating Risk in the Face of Technological Complexity:  The BP Oil Spill

Friday, May 28, 2010

By Steven H. VanderLeest

The BP oil spill in the Gulf of Mexico that started with a well blowout on 10 April 2010 now appears to involve a greater volume of oil than was spilled in the 1989 disaster when the Exxon Valdez ran aground on a reef and dumped over 250,000 barrels worth of oil into the waters of Prince William Sound off the coast of Alaska.  This makes it the largest oil spill in US history (though not the largest spill globally).  We do not yet know the full extent of the environmental toll since oil is just starting to wash ashore in the marshlands of Louisiana.  The upcoming hurricane season could exacerbate the problem by driving oil further inland.

Disasters like this are often related to the technological complexity of the system.  This oil rig was a deep water well, extending over 5,000 feet below the surface.  Even though we have extensive knowledge of oil drilling in this country and the company involved was certainly experienced, it appears that once again we were pushing the envelope.  As oil has become harder to find, the wells must go deeper, thus going beyond the current bounds of experience.  For example, one of the methods BP attempted for capping the blown well, lowering a large dome over it, had been used successfully before, but never at such a great depth.  The greater depth resulted in unforeseen problems with methane crystals forming inside the dome, causing it to become clogged.  Complexity also appears in the organizational structures.  BP leased the rig from Transocean.  Perhaps this was a good business arrangement or perhaps it also provided some legal separation to lower risk of liability in the event of an accident.  The US government was involved with regulatory oversight before the spill and environmental cleanup after the spill.  The news media provided informational coverage, resulting in public pressure to limit the damage from the spill quickly and effectively.

Measuring and limiting the risks inherent in complex systems is a difficult task.  It is not merely a science but also an art, as it requires creative imagination so that unforeseen consequences are now foreseen.  We regularly engineer new technologies or adapt old technologies to new situations.  Each time, we face the question of what could go wrong (and how we would deal with such a failure).  The difficulty of estimating the probability of a given failure makes it tempting to forgo the cost of a backup system to deal with the potential failure.  The risk of some types of failures might be so low and the cost of preventing or handling such a failure might be so high that we cannot justify the added protection.

Some argue that no price is too high for safety, but that is not the reality.  We implicitly estimate safety’s value in most technological products, both as individuals and as a society.  For example, we are aware as individuals and as a society that there is a noticeable risk of injury or death from accidents while driving in an automobile.  That risk could be significantly reduced if we required cars to be built like tanks and also severely limited their maximum speed.  But we choose to accept a higher risk in trade for lower cost and greater convenience.  In the present case, each stakeholder accepted some risk (perhaps too much so), including BP, Transocean, the US federal government, and even BP’s customers, all of us who purchase oil via its derivative products such as gasoline.  Thus we all bear some share of responsibility for that risk and the consequences of failures.


Wednesday, May 19, 2010

By Steven H. VanderLeest

Colin Gunton claims that Western communication and technology have helped to disseminate a “unitary and homogeneous public culture” (Colin E. Gunton, The One, The Three and the Many: God, Creation, and the Culture of Modernity, Cambridge: Cambridge University Press, 1993, pp. 33-34).  It is unfortunate that he lumps communication and technology together, since the two are not really comparable concepts.  Rather they overlap with each other. Many forms of communication are technologically based; many are not.  Technology can communicate function via form, but often it obscures intent.  Only in the broad sense that both are cultural activities can we compare the two, but then surely there are many more modern activities that lead to bland uniformity.

Perhaps Gunton means specifically to accuse that overlapping arena, communication via technology, of being the culprit that homogenizes our culture, rather than one or the other in isolation.  Indeed, a consumerist mindset regarding technology can certainly result in a bland similarity, where little stands out and variety is scarce.  Is this a necessary implication of technology or is it a result of our economic system that drives toward low cost by standardizing production?  Mass production has frequently had this effect, from at least the time of Henry Ford: “Any customer can have a car painted any colour that he wants so long as it is black.”  (I was also trying to think of some examples of films that illustrate the deadening influence of consumerism – I’m hoping one of my readers might suggest a couple?)  The state can also drive extreme standardization, enforcing the will of the many to overcome the distinctiveness of the individual, as in the classic 20th century dystopian novels of Huxley (Brave New World) and Orwell (1984).

A central tension in Gunton’s book is between the individual (“the one”) and the community (“the many”).  Although Gunton makes out technology to be an accomplice in much that ills modern society, we need not then despair of using technology at all.  The individual can tame technology, customizing it and personalizing it so that one’s distinct character comes through.  Technology can help us both express our individuality and at the same time continue to honor our humanity as it is defined by our relationships with each other.  For example, social networking technologies such as those provided by Facebook, Twitter, or LinkedIn allow connection to a community while also establishing an individual’s own distinct style and characteristics.  Collaborative tools such as Google shared documents allow individuals to contribute to combined efforts.  Blogs allow individuals to express their thoughts, sharing them with readers who can comment back.  Multiplayer video games can foster teamwork and camaraderie, particularly when they provide collaborative game-play modes that allow players to work together to achieve a goal.

Technology can be a powerful tool to achieve the right balance between individuality and community, but it must be wielded with care and forethought.  All too easily we can slip into isolation as individuals, using technology to shield ourselves and to hide, avoiding human contact and believing we can be self-sufficient.  At the other extreme, all too easily we can be lulled into complacency, following the crowd without thinking for ourselves, mindlessly accepting technology and dampening our own personalities to fit the demands of the machine and the expectations of our social peers.  The cure?  Perhaps we need to be more assertive in customizing and personalizing our tools in ways that also fit into a mosaic of the people around us.

(c) 2013, Steven H. VanderLeest