Don't Trust the Numbers. If the data doesn't look right, then don't believe it! All too often, laboratory data is taken as gospel. The data comes back from the lab, the values are averaged, and the person writing the report "makes up" some explanation, albeit a plausible one, to support the results.
Granted, some pretty unusual and surprising results occur in science, some of which may fly in the face of reasonable, commonly accepted principles --- but these are uncommon occurrences. More often than not, something is wrong, and the data may not be reliable.
First, check the input data (dimensions, equipment settings, etc.). A simple entry error can alter the test results significantly. Next, recheck the calculations carefully. If the calculations are performed automatically, performing a sample calculation "by hand" might be appropriate. A word of caution here --- it's very easy to make the same mistake over and over again if you're not paying attention to the details.
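A hand check of this kind can be as simple as redoing one calculation outside the instrument's software. The sketch below (all load values and specimen dimensions are hypothetical) recomputes a tensile stress from load and cross-section, and shows how a single mistyped digit in a dimension shifts the result:

```python
# Hypothetical hand check of an automatic tensile-stress calculation:
# stress = peak load / cross-sectional area.
load_n = 1250.0                   # recorded peak load, N (hypothetical)
width_mm, thick_mm = 12.7, 3.2    # specimen dimensions as entered (hypothetical)

stress_mpa = load_n / (width_mm * thick_mm)   # N/mm^2 is numerically MPa

# The same calculation with a plausible entry error: thickness 3.2 typed as 2.3.
stress_typo_mpa = load_n / (width_mm * 2.3)

print(f"stress = {stress_mpa:.1f} MPa; with entry error = {stress_typo_mpa:.1f} MPa")
```

A roughly 40% shift from one transposed digit --- exactly the kind of discrepancy that a single "by hand" sample calculation catches before the report goes out.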
One thing I always try to do is to go back and inspect the individual specimens (it's a good idea to identify and preserve each individual specimen and all its pieces after testing). A close examination of the specimens may reveal some potential explanation for the unexpected results.
Sampling error can be another source of inaccuracy. We often mistakenly assume that the material we are evaluating is homogeneous and/or isotropic (having the same mechanical properties in different directions). Be concerned with specimen orientation, location of exposed surfaces, skin layers, non-uniform blending, and simple variability from specimen to specimen. If all else fails, go back and ask the customer exactly what was sent.
Make sure the test equipment is functioning properly, is set at the proper settings, and that the equipment has been properly calibrated. If possible, perform a cursory check with a known standard --- stick a thermometer in the bath, hang a weight on the test cell, etc.
Sample preparation can sometimes be a source of variability. Voids, contamination, or other discontinuities can act as defects in the specimen. Chips or nicks in the edges of specimens may result in stress concentrations which can significantly affect the results of a test. Be aware of anisotropic behavior that may arise from machine direction or from filler, reinforcement, or crystalline orientation. Examine the cross-section of the specimen for stratification or even delamination.
Remember, all data contains error: naturally occurring variability. Some materials exhibit more variability than others. Don't misinterpret normal variability as a significant difference between samples. Examine the data --- not just the averages. Is the average skewed by the results of one or two specimens? Resist the temptation to "throw out" outliers unless there is substantial evidence that the data is erroneous. Outlier data presents a rare view of the uncontrolled variability that can reasonably be expected to occur, although its validity in the context of a controlled test program may require some interpretation. I always prefer to include all test data in a report, identifying outliers and explaining why I chose to include or exclude a value in my interpretation. That allows others to use the data in some other way to either confirm my conclusion or support an alternative hypothesis.
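To illustrate how one or two specimens can skew an average, here is a minimal sketch (the readings are hypothetical, and the 1.5 x IQR screen is just one common rule of thumb, chosen for illustration) comparing the mean, the median, and a flagged value:

```python
import statistics

# Hypothetical tensile-strength readings (MPa); one suspect specimen.
readings = [41.2, 40.8, 41.5, 40.9, 41.1, 29.7]

mean = statistics.mean(readings)      # pulled down by the low reading
median = statistics.median(readings)  # largely unaffected by it

# Simple screen: flag values more than 1.5 IQRs outside the quartiles.
q1, _, q3 = statistics.quantiles(readings, n=4)
iqr = q3 - q1
flagged = [x for x in readings if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]

print(f"mean = {mean:.1f}, median = {median:.1f}, flagged = {flagged}")
```

The screen only identifies the suspect value; whether to exclude it --- and why --- still belongs in the written interpretation, with the raw number left in the report.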
In this age of computers, most of our test equipment is becoming "computerized" to make testing more reliable and efficient. However, we often unquestioningly rely on the "calculated" output without a clear understanding of how, or under what conditions, the computer actually calculates the data. In some situations, transitions can occur outside the detection range set in the program, which causes the computer to select the "best" point --- as opposed to the correct one. Sometimes filters are used to "average out" data variability --- the results can appear quite different from another test that did not use the computer filtering. Certain electronic instruments in the vicinity of the test equipment can cause a power drain or surge that alters the electronic data being recorded. This electronic noise can be erroneously interpreted by the computer as specific events. There are many advantages to computerization --- but don't blindly accept the results without question. Understand how the instrument performs its computations and on what basis it makes its determinations, and always be alert to conditions that may produce erroneous electronic data.
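As a minimal illustration of how filtering changes what gets reported, the sketch below uses a hypothetical signal and a simple 3-point moving average (standing in for whatever smoothing a given instrument might apply); a short spike survives in the raw data but is diluted in the filtered output:

```python
# Hypothetical raw signal with one brief spike (noise or a real event?).
raw = [10.0, 10.1, 9.9, 14.0, 10.0, 10.2, 9.8]

# A 3-point moving average, of the kind some instruments apply
# before reporting values.
window = 3
filtered = [sum(raw[i:i + window]) / window
            for i in range(len(raw) - window + 1)]

# The 14.0 peak is plainly visible raw, but diluted after filtering.
print(f"raw peak = {max(raw):.1f}, filtered peak = {max(filtered):.1f}")
```

Two labs running the "same" test, one with filtering and one without, would report noticeably different peaks --- which is why it pays to know what the software does between the transducer and the printout.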
Don't get fooled by the right answer. Review the raw data to ensure that everything looks right. Technicians quickly learn what the acceptable results are expected to be for a particular test --- this talent comes with proficiency. The technician, however, should be aware of the often unconscious compulsion to get the "right" answer. The technician should resist any attempt to interpret the numbers during the testing; rather, he or she should merely report the findings, while maintaining a constant vigil for any out-of-the-ordinary occurrences that may later give some insight into unexplained variability.
Anyone can develop data; however, it takes a sound understanding of the test equipment, the process, and the material behavior to properly interpret the results and provide a suitable answer to the problem.