OCD Engineer

Wednesday, September 11, 2013

By Steven H. VanderLeest

Monk would make a great engineer.  I don't mean a monk who dedicates his life to quiet solitude in an abbey.  Rather, I mean Adrian Monk, the fictional detective of the eponymous USA Network series.  Monk is a great detective, but his defining characteristic is his Obsessive Compulsive Disorder (OCD for most of us, CDO for those who have it and insist the letters be in alphabetical order).  It might seem odd that a great detective also has a multitude of phobias and neuroses, but this awkward combination of strength and fragility makes for compelling and hilarious episodes.  Great observational powers and OCD are not unrelated.  Monk often solves the mystery by noticing small inconsistencies that others breeze past.  Breaks in a pattern are jarring for him, so they stand out.  Monk is a great detective not in spite of his compulsions, but because of them.

OCD is also a handy characteristic for engineers.  Inconsistency is a telltale sign of a problem, and good engineers have a sharp eye for breaks in the pattern.  When reviewing a design, a number of red flags pop out at us as potential problems because we spot a disparity (the first of these is illustrated in a brief sketch after the list):

  • a measurement outside the norm
  • an unusual combination of characteristics
  • intermittent or odd behavior during testing
  • gaps in the analysis
  • a missing test case
  • parameters out of order
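
To make the first red flag concrete, here is a minimal sketch in Python (my own illustration, not something from the original column; the function, thresholds, and readings are all hypothetical) that flags measurements falling outside an expected norm:

def flag_out_of_norm(measurements, low, high):
    """Return (index, value) pairs for measurements outside [low, high]."""
    return [(i, m) for i, m in enumerate(measurements) if not (low <= m <= high)]

readings = [4.9, 5.1, 5.0, 7.8, 5.2]   # hypothetical sensor readings
suspects = flag_out_of_norm(readings, low=4.5, high=5.5)
print(suspects)   # [(3, 7.8)] -- the break in the pattern that deserves a closer look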

OCD is handy for scientists too.  The most interesting phenomenon is the one that is out of place.  It is the signal that there is more here than meets the eye.  “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ (I found it!) but ‘That’s funny…’” (Isaac Asimov)

An inconsistent design is certainly incorrect.  Observing two inconsistent measurements almost certainly means one or both are wrong.  The converse, however, is not necessarily true.  A consistent design could be consistently incorrect; consistent measurements could be systematically wrong.  In the discipline of systems engineering, this contrast is the key difference between validation and verification.  Validation confirms that we are pursuing the correct requirements and specifications, i.e., solving the real problem.  Validation is "do the right thing".  Verification confirms that we are pursuing that goal in a consistent and correct manner.  Verification is "do the thing right".  Verification without validation leaves us vulnerable to solving the wrong problem.  Validation without verification leaves us vulnerable to incorrectly solving the right problem.
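
A small code sketch may help make the distinction concrete.  This is my own illustration, not part of the original column; the function name and the "specification" are hypothetical.  Verification asks whether the code matches its specification; validation asks whether the specification matches the real need:

# Hypothetical specification: "round fares down to the nearest dollar."
def fare_per_spec(amount):
    """Implements the written specification exactly."""
    return int(amount)  # truncate toward zero, per the (hypothetical) spec

# Verification: does the implementation do what the spec says?  Yes.
assert fare_per_spec(12.75) == 12

# Validation: was truncating fares the right requirement in the first place?
# Perhaps the real need was ordinary rounding -- a question no unit test of
# fare_per_spec can answer; it must be asked of the problem itself.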

Many good engineers and scientists settle for mere verification in their professional lives.  If our solution is elegant and clever, we are satisfied.  We rarely consider whether the solution addresses the correct problem.  It is easy to claim that all science and engineering is morally neutral, so that we need not worry about the ends and goals of our work.  If we do our job correctly, that is enough; if we are simply consistent, that is sufficient.  Unfortunately, this bliss is ignorance.  It is not enough and not sufficient.  When we solve a problem incorrectly, i.e., get verification wrong, we may have made an honest mistake or perhaps might be guilty of negligence.  Verification addresses technical questions of correctness, which may rise to the level of a moral question if we are negligent or, worse, purposely subversive.  Thus, verification may occasionally address moral questions.  In the case of validation, moral questions frequently arise.  When we solve the wrong problem, i.e., get validation wrong, we may have made an honest mistake, not thinking carefully enough about our choice of goals.  However, our selection of a problem is often a moral choice from the start, because choosing which problem we will tackle amounts to assigning values.  It is a matter of prioritization, and thus a matter of worth, when we choose which scientific research program to pursue or which engineering problem to address.

Let me provide one case study to bring this point home.  In the 1930s, IBM was engineering punch card systems to enhance the efficiency of train schedules.  They excelled at verification, ensuring that the machines could quickly and accurately compute the schedules.  Narrowly speaking, they perhaps thought about validation, customizing their general-purpose calculating device to the needs of scheduling a complex network of trains.  Broadly speaking, they did not consider this a moral question, even though their customer was none other than Nazi Germany.  Hitler's Third Reich was using the machines to improve the effectiveness of its program to exterminate the Jews.  Worse, according to at least one published report, IBM knew the end purposes of its customer, yet continued to work closely with it right up to the time of the US entry into World War II (Paul Festa, "Probing IBM's Nazi connection," 28 June 2001, http://news.cnet.com/2009-1082-269157.html).  The engineers and managers at IBM had verified, but not validated, at least not in the broadest and most important sense.

Christians working in technology areas ought to pay attention to both V’s.  Verification is important because we should do exemplary work that is accurate and correct.  “And whatever you do, whether in word or deed, do it all in the name of the Lord Jesus, giving thanks to God the Father through him.” (NIV, Colossians 3:17)  Validation is even more important because we should honor God’s will in the questions we choose to pursue.  “Finally, brothers and sisters, whatever is true, whatever is noble, whatever is right, whatever is pure, whatever is lovely, whatever is admirable—if anything is excellent or praiseworthy—think about such things.”  (NIV, Philippians 4:8)
 

 

(c) 2013, Steven H. VanderLeest