How does one Interpret Unexpected Test Results? by RichGeoffroy
Posted: 10 May 04 (Edited 27 Sep 04)


Rich Geoffroy
Polymer Services Group
POLYSERV@cox.net


Don't Trust the Numbers. If the data doesn't look right, then don't believe it! All too often, laboratory data are taken as gospel: the data come back from the lab, the values are averaged, and the person writing the report "makes up" some plausible, albeit unsupported, explanation to fit the results.

Granted, some pretty unusual and surprising results do occur in science, some of which fly in the face of reasonable, commonly accepted principles --- but these are uncommon occurrences. More often than not, something has gone wrong and the data may not be reliable.

First, check the input data (dimensions, equipment settings, etc.). A simple entry error can alter the test results significantly. Next, recheck the calculations carefully. If the calculations are performed automatically, performing a sample calculation "by hand" might be appropriate. A word of caution here --- it's very easy to make the same mistake over and over again if you're not paying attention to the details.
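As a hypothetical illustration of that hand-check (the specimen dimensions and peak load below are invented, not real data), a few lines of Python show how a single transposed digit in an input propagates directly into the reported strength:

```python
# Hypothetical hand-check of an automated tensile-strength calculation.
# All values are illustrative, not measured data.
width_mm = 12.7        # specimen width as entered
thickness_mm = 3.2     # specimen thickness as entered
peak_load_N = 1850.0   # peak load reported by the test frame

area_mm2 = width_mm * thickness_mm
stress_MPa = peak_load_N / area_mm2   # N/mm^2 is numerically equal to MPa
print(f"computed tensile strength: {stress_MPa:.1f} MPa")

# A single transposed digit in one dimension shifts the result dramatically:
bad_thickness_mm = 2.3   # 3.2 mistyped as 2.3
bad_stress_MPa = peak_load_N / (width_mm * bad_thickness_mm)
print(f"with the entry error:      {bad_stress_MPa:.1f} MPa")
```

An error of this size (roughly 40% here) is exactly the kind of shift that should make an "unexpected" result suspect before any physical explanation is invented.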

One thing I always try to do is go back and inspect the individual specimens (it's a good idea to identify and preserve each individual specimen and all its pieces after testing). A close examination of the specimens may reveal some potential explanation for the unexpected results.

Sampling error can be another source of inaccuracy. We often mistakenly assume that the material we are evaluating is homogeneous and/or isotropic (having the same mechanical properties in every direction). Be alert to specimen orientation, the location of exposed surfaces, skin layers, non-uniform blending, and simple specimen-to-specimen variability. If all else fails, go back and ask the customer exactly what was sent.

Make sure the test equipment is functioning properly, is set to the proper settings, and has been properly calibrated. If possible, perform a cursory check against a known standard --- stick a thermometer in the bath, hang a known weight on the load cell, etc.
 
Sample preparation can also be a source of variability. Voids, contamination, or other discontinuities can act as defects in the specimen. Chips or nicks in the edges of specimens may create stress concentrations that significantly affect the results of a test. Be aware of anisotropic behavior that may arise from machine direction or from filler, reinforcement, or crystalline orientation. Examine the cross-section of the specimen for stratification or even delamination.

Remember, all data contain error --- naturally occurring variability. Some materials exhibit more variability than others. Don't misinterpret normal variability as a significant difference between samples. Examine the data --- not just the averages. Is the average skewed by the results of one or two specimens? Resist the temptation to "throw out" outliers unless there is substantial evidence that the data are erroneous. Outlier data offer a rare view of the uncontrolled variability that can reasonably be expected to occur, although their validity in the context of a controlled test program may require some interpretation. I always prefer to include all test data in a report, identifying outliers and explaining why I chose to include or exclude a value in my interpretation. That allows others to use the data in some other way, either to confirm my conclusion or to support an alternative hypothesis.
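A minimal sketch of that advice, using invented tensile-strength values: compare the mean against the median, and use a simple 1.5×IQR screen to *flag* suspect values for inspection rather than silently discarding them. The specific numbers and the IQR rule are illustrative choices, not a prescribed procedure:

```python
import statistics

# Hypothetical tensile-strength results (MPa) for six specimens;
# one suspiciously low value drags down the average.
results = [46.1, 45.8, 46.4, 45.9, 46.2, 31.5]

mean_all = statistics.mean(results)
median_all = statistics.median(results)
print(f"mean {mean_all:.2f} MPa vs median {median_all:.2f} MPa")

# Simple 1.5*IQR screen to flag (not delete) potential outliers.
q1, _, q3 = statistics.quantiles(results, n=4)
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
flagged = [x for x in results if not (low <= x <= high)]
print("flagged for inspection:", flagged)
```

The gap between mean and median is itself the warning sign that one or two specimens are skewing the average; the flagged value then goes back to the bench for physical examination before any decision to exclude it.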

In this age of computers, most of our test equipment is becoming "computerized" to make testing more reliable and efficient. However, we often unquestioningly rely on the "calculated" output without a clear understanding of how, or under what conditions, the computer actually calculates the data. In some situations, transitions can occur outside the detection range set in the program, which causes the computer to select the "best" point --- as opposed to the correct one. Sometimes filters are used to "average out" data variability --- the results can appear quite different from another test that did not use the computer filtering. Certain electronic instruments in the vicinity of the test equipment can cause a power drain or surge that alters the electronic data being recorded. This electronic noise can be erroneously interpreted by the computer as specific events. There are many advantages to computerization --- but don't blindly accept the results without question. Understand how the instrument performs its computations and on what basis it makes its determinations, and always be alert to conditions that may produce erroneous electronic data.
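To make the filtering point concrete, here is a small sketch (signal values invented): a 5-point moving average applied to a flat trace containing one short, genuine spike --- say, a momentary load drop at a crack --- reports a filtered peak that no longer resembles the raw event:

```python
# Sketch: a moving-average filter can smooth away a short, real event,
# so filtered output differs markedly from the unfiltered raw data.
def moving_average(signal, window):
    """Centered moving average with shrinking windows at the edges."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half): i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Flat signal with one sharp, genuine spike at sample 10.
raw = [10.0] * 20
raw[10] = 25.0

smoothed = moving_average(raw, window=5)
print("raw peak:     ", max(raw))        # the true event amplitude
print("filtered peak:", max(smoothed))   # the spike is largely averaged away
```

Neither trace is "wrong" --- but a report built only on the filtered numbers would miss the event entirely, which is why it pays to know whether, and how, the instrument filters before the value reaches the screen.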

Don't get fooled by the right answer. Review the raw data to ensure that everything looks right. Technicians quickly learn what the acceptable results are expected to be for a particular test --- this talent comes with proficiency. The technician, however, should be aware of the often unconscious compulsion to get the "right" answer. The technician should resist any attempt to interpret the numbers during testing; rather, he or she should simply report the findings, while maintaining a constant vigil for any out-of-the-ordinary occurrences that may later give some insight into unexplained variability.

Anyone can generate data; however, it takes a sound understanding of the test equipment, the process, and the material behavior to properly interpret the results and provide a suitable answer to the problem.



Back to Plastics Engineering general discussion FAQ Index