Thinking about Systems

Wednesday, April 27, 2011

By Steven H. VanderLeest

I recently attended a meeting of the Michigan chapter of the International Council on Systems Engineering (INCOSE). A relatively new professional society (founded in 1990), INCOSE defines its discipline of Systems Engineering as “an interdisciplinary approach and means to enable the realization of successful systems. It focuses on defining customer needs and required functionality early in the development cycle, documenting requirements, then proceeding with design synthesis and system validation while considering the complete problem.”  The society recognizes that engineering problems are not purely technical in nature but also involve much broader constraints such as cost, schedule, and operational issues. 

Systems engineering tries to draw the box large enough to truly solve the problem at hand.  The envelope defining “the system” will always be larger than the science, math, and technical content.  It will range across the whole spectrum of socio-cultural aspects because the problems we solve are human problems and thus multidimensional and complex.  A good design process will attempt to identify all the factors that influence a new technology design and all the variables that may in turn be affected by the introduction of that new product.  This is no trivial task!  The physical factors are perhaps the easiest.  One can perform a statistical analysis of variance (ANOVA) to characterize the most important factors that influence a desired outcome, though some factors may already be well known, such as the influence of automobile weight on gas mileage.  Even here, the engineer or scientist can only characterize the variables that she has named.  Creativity and wisdom are required to tease out all the important influences. 
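As a concrete illustration of the ANOVA idea mentioned above, a one-way F statistic compares how much the group means differ (between-group variation) to how much individual measurements scatter within each group (within-group variation).  The mileage numbers and weight classes below are entirely hypothetical, just a sketch of the technique:

```python
# One-way ANOVA sketch: does car weight class explain variation in
# miles per gallon?  (All data here are hypothetical.)
from statistics import mean

groups = {
    "light":  [34.1, 32.8, 35.0, 33.5],
    "medium": [28.4, 27.9, 29.2, 28.8],
    "heavy":  [22.0, 23.1, 21.5, 22.6],
}

all_values = [x for g in groups.values() for x in g]
grand_mean = mean(all_values)

# Between-group sum of squares: how far each group mean sits from the grand mean
ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups.values())
# Within-group sum of squares: scatter of observations around their own group mean
ss_within = sum((x - mean(g)) ** 2 for g in groups.values() for x in g)

df_between = len(groups) - 1              # number of groups minus one
df_within = len(all_values) - len(groups)  # total observations minus number of groups

f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F = {f_stat:.1f}")  # a large F suggests weight class matters
```

A large F value (compared against the F distribution for these degrees of freedom) indicates that weight class explains far more of the variation than random scatter does, which matches the intuition that heavier cars get worse mileage.  Of course, as the essay notes, the analysis can only rank the variables the analyst thought to name.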

The more challenging factors are not physical but socio-cultural.  The psychology and sociology of driver behavior that influences gas mileage can be complex.  Consider the thrill of acceleration that tempts us to toss mileage to the wind, or the subtle influence of the enclosed and thus isolated interior space of a vehicle that somehow lowers our inhibition against rage on the road. One would think we could predict these effects because they often seem to be common sense outcomes – at least in hindsight.  One would think that we all would have realized that the advent of ubiquitous text messaging among teens would lead to students using this communication technology to cheat during tests.  But many teachers were caught unaware and surprised by this development.

Many engineers and scientists chose their career because of the attraction of discovery and invention.  Christians can certainly participate in this joy because it enables us to appreciate the creative wonder of our God more fully.  While our work can enhance that “vertical” relationship, there is another aspect that is more about the “horizontal” relationships to the rest of creation.  We serve our fellow humans by solving problems with technology.  Making a difference in the world via our technical products requires systems thinking in order to truly solve the whole problem.  Otherwise we risk optimizing locally (for some subset of stakeholders) without realizing that the overall performance of our solution is sub-optimal (perhaps because we aggravated the situation of stakeholders we failed to recognize). 

I’m rather fond of block diagrams as a good way to start the process of describing a problem and proposed solutions, because the diagram also serves as a communication tool.  But drawing a box to define the system inherently neglects whatever is outside that box.  Our finite minds often need to reduce the complexity of a system by focusing on only the most significant factors, so building those fences is often necessary.  However, let’s take care that we don’t neglect real people – our neighbors – and miss the opportunity to be the Good Samaritan with our technological solutions.

Technology Vices

Wednesday, April 20, 2011

By Steven H. VanderLeest

During Holy Week I was thinking about some on-line acquaintances who have given up Facebook for Lent.  This is an interesting twist on the ancient tradition of self-deprivation during this season, an expansion on the contemplative practice of fasting.  Just as fasting does not imply there is anything wrong with food, so too I take it that a short break from a certain technology does not imply anything wrong with that technology.  However, fasting from food or technology does imply that we might sometimes get our priorities wrong, putting too much emphasis on a good thing so that it displaces the ultimate good:  God.  Fasting can also help us to appreciate, by their absence, these good gifts that God has given us.

Technology that takes too high a place in our lives is then by definition an idol.  Even good things can tempt us away from following God’s will, drawing us by their convenience, elegance, beauty, or other allures.  This got me wondering whether specific features of our technical gadgets might insidiously lure us into other sins.  I have explored certain virtues that technology ought to encourage or exhibit, so perhaps it is time to look at the dark side, fitting as we contemplate the darkest day in history on this coming Friday that we ironically call Good. 

No matter how novel and inventive, technology has not introduced any new sins, but merely puts a new twist on old vices.  The vice of lust (excessive desire, particularly of a sexual nature) is certainly easy to spot in modern technology: Internet pornography is well-known to be a common ill that tempts web surfers.  Pornography is not a recently invented evil, but it appears to be more widespread now because of the ease of access over the Internet.  Even more significant, the apparent anonymity of Internet access reduces the perceived threat of getting caught, which likely increases the temptation. 

The vice of gluttony (wasteful overindulgence or excessive consumption, particularly of food) shows up in our societal predilection for rapid adoption of the latest gadget (and junking the previous device).  We have amassed mountains of discarded cell phones in our drive to consume the latest and greatest phone that came out last week, making our year-old phone (supposedly) obsolete.  Technology has made gluttony of food easier too, by pre-packaging foods that are hard to resist because of their carefully concocted compositions of sweet and fatty ingredients that give the perception of satisfaction but degrade our health over the long term. 

Technology often aims for convenience, which easily translates into the vice of sloth.  A TV remote that allows couch potatoes to control their gadgets without lifting more than a finger, or a microwave meal that can be prepared with the push of a button, or a power tool that reduces the job time from hours to seconds – all these devices could produce much good by allowing humans to express their creativity and direct their energies more productively.  Too often, the convenience results in lazy, lethargic behaviors.  Technology as tool should amplify our abilities, ambitions, and strivings, but the vice of sloth too easily tempts us to let the tool do the work so that we can rest and relax instead.

The vice of wrath (undue anger or rage) is an ancient sin, but technology can put too much power into the frail human who does not think through the consequences of angry actions.  Road rage has become all too common, where a small offense (often more perceived than actual) becomes the occasion for rude gestures and then physical aggression.  Without tools, such a brawl might come to blows, but likely the two combatants walk away (bruised, but not permanently injured).  With tools, whether a gun or an automobile, the results can be devastating:  a disproportionate response that often encompasses innocents.  After the anger subsides, the perpetrator often experiences great guilt, wondering how they could have wrought such terrible vengeance for so small an affront.

We moderns are often quick to turn to technological fixes for our technological problems, and I do not fault this approach as part of a solution.  But technology by itself is unlikely to solve a problem that is rooted in our fallen nature.  Vices are not technological at root, but rather a product of human sin.  So when you fight that latest technological temptation, certainly use all means at your disposal to resist.  If your eye causes you to sin, pluck it out.  If your computer causes you to sin, throw it out.  The extreme measures that Jesus suggested were, I think, to clearly wake us to the danger of sin and to the need for intense resistance.  If you keep your eye and your computer, then certainly also put special measures in place.  For example, defy Internet porn through filter software but also through human accountability partners. 

One final danger to consider is the vice of pride, which could arise if we think that our feeble efforts somehow will save us.  We cannot save ourselves – only by grace do we resist our sinful nature.  In this Holy Week, I look to Jesus Christ who stood against temptations and thus was our perfect sacrifice, fully human so that he could take my place and fully divine so that he could bear the weight of God’s just anger against sin for us all.  Blessed be his name.

Is Flying Safe?

Friday, April 08, 2011

By Steven H. VanderLeest

On a recent trip I flew on a number of aircraft: an Airbus A320 narrow-body jetliner and an Embraer ERJ135 regional jet.  This was right after the news of Southwest Airlines flight 812 making an emergency landing in Yuma, Arizona after a five-foot tear ripped open in the fuselage of the Boeing 737 flying at 30,000 feet.  No one was seriously injured, but this event was obviously a cause for serious concern.  Southwest grounded most of its 737s while they performed inspections of the lap joints for evidence of cracking (several planes had some cracking). Following this incident, Boeing issued new guidelines recommending more frequent inspection of 737s to look for fatigue cracks in the lap joints of these aircraft.

We’ve all heard that flying is safer than driving, but it doesn’t feel safer to most of us.  Perhaps it is because we are not in control of the aircraft the way we are of the automobile. Perhaps it is because most of us drive much more frequently than we fly, so that we are more habituated to the risk of car travel.  Perhaps it is the spectacular nature of the rare failure: airline crashes almost always make headlines, while car crashes rarely do. Furthermore, perhaps that adage is not actually true.  In a paper presented at the 2006 Christian Engineering Education Conference, Professor Gayle Ermer notes a “risk level per mile for driving that is in the same range as for flying. In other words, contrary to the popular wisdom used to reassure fearful airplane passengers, it is not safer to fly than to drive on a per mile traveled basis.” (Ermer, Gayle, “Understanding Technological Failure: Finitude, Fallen-ness, and Sinfulness in Engineering Disasters,” Proceedings of the Christian Engineering Education Conference, 2006.)  Her paper looks at some of the causes of technology failures, many of which connect intimately with our human nature.

One possible cause of aircraft failure is a flaw in the design of the avionics hardware or software.  The Federal Aviation Administration (FAA) oversees and certifies aircraft for flight worthiness.  For avionics, the FAA imposes guidelines requiring a rigorous process of testing for robustness against a variety of environmental conditions in the DO-160 standard, a rigorous process of development and testing of digital logic hardware in the DO-254 standard, and a rigorous process of development and testing of software in the DO-178 standard.  Newly engineered technology is subjected to careful peer review of the design and then substantial testing of the implementation.  DO-254 and DO-178 apply stricter requirements to technology that, should it fail, would have a more dire impact.  Design assurance level A is the strictest, for avionics whose failure would be catastrophic (likely causing multiple deaths).  Level E is the lowest level, for technology whose failure would not impact safety in any foreseeable event.  Flight control systems are an example of level A; passenger entertainment systems are level E.  Technology at the highest levels of safety criticality must be designed with redundancy so that no single point of failure in the hardware results in system failure. 
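The benefit of that hardware redundancy can be sketched with a back-of-the-envelope calculation.  The failure probability below is purely hypothetical, and the assumption that channels fail independently is the idealization that makes redundancy attractive in the first place:

```python
# Hypothetical per-flight-hour failure probability of a single avionics channel
p = 1e-4

# Duplex: the system fails only if both (assumed independent) channels fail
duplex = p ** 2

# Triplex with 2-out-of-3 voting: the system fails if two or three channels fail
triplex_2oo3 = 3 * p**2 * (1 - p) + p**3

print(f"single: {p:.0e}, duplex: {duplex:.0e}, 2-out-of-3: {triplex_2oo3:.1e}")
```

With these (made-up) numbers, redundancy drops the failure probability by roughly four orders of magnitude.  Interestingly, the 2-out-of-3 voting arrangement has a slightly higher failure probability than an ideal duplex, but it buys something the duplex cannot: it can outvote a faulty channel without first having to figure out which channel is the faulty one.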

I think back-up systems and redundancy for fault tolerance nicely reflect the virtue of humility, because they recognize that we cannot design perfect systems and must account for potential failures (which hopefully are handled gracefully by redundancy so that no injuries occur).  This works relatively well for hardware, but we have not yet found a similarly strong approach for software.  At one time, multi-version software was in vogue as a supposed redundant approach.  In this method, the requirements for the software were given to several independent development teams.  Each software version was run simultaneously (either on multiple processors or as independent parallel processes in a multitasking system), with a “golden” voter taking the result from each version to determine the actual action taken by the system.  The thought was that independent teams would be unlikely to make the same mistake in the same place; thus, as long as the majority of the versions got it right, any mistakes would be outvoted.  The flaw in this approach is that even very diverse teams often make similar mistakes for similar inputs and system situations.  There are other ways to try to account for design flaws in software (which often show up for unusual boundary conditions that were not anticipated), such as checkpoints and built-in sanity tests that check whether the system is still operating within expected bounds.  Down the road, software designers for safety-critical markets anticipate that mathematical proofs (sometimes called “formal methods”) will be used to verify that software is correct with mathematical certainty.  For now, these methods tend to be too difficult to apply to anything more than small sections of software code.
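A minimal sketch of that multi-version voting scheme may help.  The three “versions” here are toy stand-ins for independently developed implementations of the same specification (squaring a number), with one deliberately buggy:

```python
from collections import Counter

# Three "independently developed" implementations of the same spec: square x.
def version_a(x): return x * x
def version_b(x): return x * x
def version_c(x): return x + x   # bug: addition instead of multiplication

def vote(results):
    """Return the majority value; raise so the system can fail safe if no majority exists."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: fail safe")
    return value

outputs = [f(3) for f in (version_a, version_b, version_c)]
print(vote(outputs))  # the buggy version_c is outvoted; prints 9
```

Note that for the input 2 the buggy version happens to agree with the others (2 + 2 == 2 * 2), a small reminder that identical outputs do not prove correctness; and, as the paragraph above observes, real independent teams tend to stumble on the same hard inputs, which undercuts the independence assumption the voter relies on.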

Whether we are estimating the risk for flying or for other technology, such as the danger of nuclear power plant failures, the impact of energy technologies on climate change, or the risk of eating genetically modified food, I think we must be careful to avoid the trap of believing we can calculate the risk in a completely unbiased, objective way.  In the end, risk assessment is not simply a mathematical formula (though quantitative analysis certainly is part of the process), but is a human decision that requires wisdom and insight.  In a recent article, David Caudill notes various viewpoints on scientific knowledge, particularly with regard to weighing risk.  He describes a perspective that “views all risk assessments as judgment calls.  Even a scientist’s degree of confidence is not a scientific matter, and our assessment of whether a scientific analysis is relatively certain is grounded in pragmatic decisions about what to study, which variables to consider, how accurate our measurements need to be, and how much potential error we’re willing to accept.  When we say something is ‘safe’ or ‘injurious’ or we say that the evidence is ‘ample’ or ‘convincing’ or ‘reasonably certain,’ those words sound scientific but are actually non-scientific judgments.”  (David S. Caudill, “Science in Law: Reliance, Idealization & Some Calvinist Insights,” Pro Rege, March 2010, pp. 1-9)  Caudill then argues for a still more perspectival position that sees culture and worldview not only affecting our assessment of risk and uncertainty (where our values are applied to unbiased facts), but also affecting the way we interpret the facts themselves.  He notes “multiple interpretive frames, which reflect values but which see facts differently… Our selection of facts and values is not so much conscious and voluntary as it is grounded in our cultural assumptions” (p. 7).

A machine cannot do science or engineering – only a human can perform these tasks because they involve more than mindlessly following a recipe or rote formula.  These are creative activities that require sophisticated thinking, insight, and wisdom.  There is truth as well as beauty to be found in these activities, but perhaps like beauty, truth is also partly in the eye of the beholder.  I’m not arguing for relativism here nor strong postmodernism.  I do believe in an absolute truth, but I also believe that no mere mortal has a lock on that truth.  We are all affected by sin.  Even without sin, we are finite created beings and our limitations may prevent us from coming to a common understanding.  Even beyond our finite and fallen nature, we each come with a socio-cultural interpretive framework that gives us slightly different lenses through which we view the world. I also believe in common grace—that all truth is God’s truth, and thus we each may hold a piece of the grand puzzle. When it comes to risk assessment, it is important to discuss risk together and not solely depend on the advice of “experts.”  Those conversations can help tease out our own particular values and worldviews so that we understand one another better and also understand our technology better. 

(c) 2013, Steven H. VanderLeest