Journal Club for April 2023: Material Testing 2.0

Prof. Fabrice Pierron

University of Southampton, UK

MatchID NV, Ghent, Belgium

Digital imaging through CCD/CMOS cameras has been one of the major technological breakthroughs of the 21st century. Such cameras are now everywhere around us: smartphones, CCTV, sports broadcasting, wildlife observation, etc. Their cost has kept decreasing while their performance has increased drastically. These cameras have also started a revolution in the mechanical testing of materials and structures. Through digital image processing algorithms like ‘Digital Image Correlation’ (DIC), it is now possible to track the 3D displacement of a random grey-level pattern attached to a deforming surface with an accuracy down to 1/100th of a pixel. Initiated in the early 1980s, DIC has now largely matured, with commercial software available since the early 2000s. However, intimate integration of this technology into material testing has not happened yet, owing to a combination of limited understanding of and confidence in the technology and a general lack of training, certification and standards. The paradigm change brought by the shift from a few tens of data points to the hundreds of thousands and more provided by DIC needs to be accompanied by a thorough redesign of testing methodologies to deliver the full potential of this technology.
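
To give a flavour of the core idea behind DIC (a toy illustration only, not the algorithm of any particular DIC package), the short Python sketch below generates a synthetic speckle image, shifts it by a known amount and recovers that shift from the peak of an FFT-based cross-correlation. All values are made up; real DIC codes work subset by subset and interpolate grey levels around the correlation optimum to reach the sub-pixel accuracy mentioned above.

```python
import numpy as np
from numpy.fft import fft2, ifft2

# Synthetic speckle "reference" image and a copy rigidly shifted by a known amount.
rng = np.random.default_rng(0)
ref = rng.random((128, 128))
true_shift = (3, 5)                                   # pixels, (rows, cols)
deformed = np.roll(ref, true_shift, axis=(0, 1))

# FFT-based cross-correlation: the peak location gives the in-plane displacement.
xcorr = np.real(ifft2(np.conj(fft2(ref)) * fft2(deformed)))
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
shift = [p if p < n // 2 else p - n for p, n in zip(peak, xcorr.shape)]
print("recovered shift (pixels):", shift)             # expected: [3, 5]
```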

Coupon-based material testing has not changed much since the development of the electrical strain gauge after WWII. Current ISO or ASTM standards mostly rely on simple test configurations (uniaxial tension or 3-pt bending for instance) to extract material parameters like elastic modulus or yield stress. The underpinning concept is to use statically determinate stress solutions so that point strain measurements are enough to identify constitutive parameters. This is in stark contrast with numerical simulation, which has seen spectacular progress in the last decades. There is a growing gap between the two, which has consequences in terms of cost and efficiency. It is therefore essential to develop the next generation of data-rich image-based tests, coined ‘Material Testing 2.0’ (MT2.0) in reference [1]. Figure 1 illustrates this new paradigm. While a uniaxial tensile test provides one data point per load step, MT2.0 tests provide the equivalent of one MT1.0 test for each measurement point, resulting in an immensely richer database from which models can be better and more efficiently calibrated.

Figure 1 - Schematic illustration of the concept of Material Testing 2.0

The two main tools used in MT2.0 are full-field deformation measurements, such as Digital Image Correlation (DIC) or the Grid Method (GM), and inverse identification tools, such as the Virtual Fields Method (VFM) or Finite Element Model Updating (FEMU). Both sets of tools are now mature enough for the development of MT2.0 to become timely. MT2.0 will simplify the test pyramid by reducing the number of tests, improve the formulation and identification of material models, and facilitate the emergence of innovative tailored heterogeneous materials, such as fibre-placement composites or 3D-printed materials, whose spatially variable properties cannot be reliably obtained with current standard tests.
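
To make the inverse identification step more concrete, here is a deliberately minimal Virtual Fields Method sketch (a toy illustration under strong assumptions, not production code). It feeds synthetic, uniform strain maps from a hypothetical tensile coupon, standing in for DIC or grid-method data, into two simple virtual fields and recovers E and ν for a plane-stress isotropic model; all dimensions, loads and material values are invented.

```python
import numpy as np

# Synthetic "measured" strain maps standing in for DIC / grid-method data on a
# hypothetical flat coupon (length L along x, width w, thickness t, axial force F).
L, w, t, F = 0.10, 0.02, 0.002, 5000.0            # m, m, m, N (illustrative values)
E_true, nu_true = 70e9, 0.33                       # aluminium-like values
ny, nx = 200, 400                                  # measurement grid
sigma_xx = F / (w * t)                             # uniform uniaxial stress (toy case)
eps_xx = np.full((ny, nx), sigma_xx / E_true)
eps_yy = np.full((ny, nx), -nu_true * sigma_xx / E_true)
dS = (L / nx) * (w / ny)                           # area of one measurement cell

# Virtual Fields Method with two simple virtual fields (plane-stress isotropic model,
# sigma_xx = Q(eps_xx + nu*eps_yy), sigma_yy = Q(eps_yy + nu*eps_xx), Q = E/(1 - nu^2)):
#   VF1: u* = (x/L, 0) -> eps*_xx = 1/L, external virtual work = F (grip reactions drop out)
#   VF2: u* = (0, y/w) -> eps*_yy = 1/w, external virtual work = 0 (free lateral edges,
#        end tractions assumed purely axial in this toy case)
Ixx = eps_xx.sum() * dS                            # spatial integral of eps_xx
Iyy = eps_yy.sum() * dS                            # spatial integral of eps_yy

nu_id = -Iyy / Ixx                                 # VF2: Q*(Iyy + nu*Ixx) = 0
Q_id = F * L / (t * (Ixx + nu_id * Iyy))           # VF1: (t/L)*Q*(Ixx + nu*Iyy) = F
E_id = Q_id * (1.0 - nu_id**2)
print(f"identified E = {E_id/1e9:.1f} GPa, nu = {nu_id:.3f}")   # ~70.0 GPa, ~0.330
```

With such trivial virtual fields on a uniform strain field the result collapses to the familiar E = F/(w t ε̄xx); the interest of the VFM lies in heterogeneous MT2.0 tests, where different virtual fields extract different parameters from the same full-field measurement.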

The video presentation linked below (keynote at the IDDRG conference in Lorient, France, 2022) explains the general concept of MT2.0 and delves into applications for anisotropic plasticity (sheet metal forming), though the concept can be applied to a wide range of materials. A recently published review article on MT2.0 [2] gives an overview of applications in this field.

 IDDRG 2022 presentation on MT2.0

Presentation on MT2.0 for anisotropic plasticity 

References

[1] Pierron F., Grédiac M., Towards Material Testing 2.0. A review of test design for identification of constitutive parameters from full-field measurements. Strain, vol. 57, no. 1, e12370, 2021. https://doi.org/10.1111/str.12370

[2] Pierron F., Material Testing 2.0: a brief review. Strain, e12434, 2023. https://doi.org/10.1111/str.12434

Comments

Benjamin Cameron

Great overview! A few thoughts:

Regarding the cost of materials testing 2.0 in comparison to previous approaches like tensile testing (materials testing 1.0), I believe it's possible to achieve similar costs when employing methodologies that only require 2D DIC. To achieve this, all that's needed is a camera and a means of applying a DIC pattern (e.g., spray paint). Of course, this would not be applicable to all problems.

A crucial factor to consider when evaluating methodologies is the level of user expertise required. It's essential to develop methodologies and software that allow non-experts to obtain reliable results for a wide range of problems. 

Inverse identification is still an open problem, and ongoing research into various methodologies is likely to considerably enhance the utility of materials testing 2.0. This includes leveraging modern advances in machine learning and optimization, as well as the development of alternative theoretical approaches based on unique sets of assumptions, which could offer significant advantages (I'm currently working on some ideas in this area). Key challenges that seem possible to address include the cost of experimental information (stereo DIC vs. 2D DIC; simpler experimental tests), coverage of a broader range of materials and deformation regimes, accuracy, reliability, and the required user expertise.

Fabrice Pierron

Thanks for this, Ben. I generally agree with your points, though I am a bit doubtful about the use of 2D DIC. Certainly, in the elastic part of the response, the inevitable out-of-plane movements make it nearly impossible to obtain robust elastic constants. Some tricks have been published to overcome this issue, but the added effort required means that stereo-DIC remains the best option in my view. In terms of cost, the extra camera is not an issue, and the additional time required for calibration is not a deciding factor. For speckling, I think we should move away from spray paint, as it does not ensure high-quality measurements and adds a random element to the uncertainty quantification, which would preclude standardization for instance. It is like using a strain gauge with a randomly varying gauge factor.
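
To put a rough number on this (a back-of-the-envelope illustration with assumed values, using the standard pinhole-camera approximation that an out-of-plane motion dz at stand-off distance Z creates a fictitious in-plane strain of about dz/Z):

```python
# Illustrative numbers only.
dz = 0.05e-3                    # 0.05 mm of out-of-plane motion during the test
Z = 0.5                         # 0.5 m camera stand-off distance
false_strain = dz / Z           # fictitious in-plane strain ~ dz/Z for a single camera
elastic_strain = 100e6 / 70e9   # real strain of an aluminium coupon at 100 MPa
print(f"fictitious strain ~ {false_strain * 1e6:.0f} microstrain, "
      f"elastic strain ~ {elastic_strain * 1e6:.0f} microstrain")
# ~100 microstrain of bias on a ~1400 microstrain signal, i.e. several percent
# of error transferred directly onto the identified modulus.
```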


Your comment about the level of expertise is interesting, and I certainly agree with part of it. Let me explain. If we are talking about routine tests for, say, material characterisation, including quality control, we need standards that can be followed by lab technicians with a detailed operational roadmap. On the other hand, outside of this framework, the complexity of DIC is akin to that of finite element simulation, and handling it without thorough training and understanding of the theory is a recipe for disaster. As with numerical simulation, DIC brings a lot of information, but there is a price to pay in terms of training.

As for data inversion, there is indeed scope for progress, but the current tools (FE model updating, the VFM) are already mature enough to establish new MT2.0 test configurations now. Regardless of the technical nitty-gritty, I think a vast space for improvement lies in the very formulation of constitutive material models. For anisotropic plasticity, for instance, models for yield surfaces are closed, i.e., the model predicts the behaviour for any combination of stress components. This is because identification based on statically determinate tests only puts one point in the stress space for each test. With MT2.0, we have a point cloud, so it would be possible to formulate a yield surface using B-spline interpolation for instance, which would provide very robust data in the part of the stress space activated by the test. But this would not provide any information elsewhere. This may be seen as a weakness, but I think it is a strength: if you want data in a particular part of the stress space, you should perform a test that provides data there. Uniaxial tests do not tell you anything about a biaxial stress state. Extrapolating from a certain experimental window to another one where tests are not available is very risky...
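
As a purely illustrative sketch of the point-cloud idea (synthetic yield states scattered around a von Mises locus, invented numbers, SciPy smoothing splines; not data or code from any real MT2.0 campaign), a closed smoothing B-spline can be fitted through the measured stress states, giving a yield locus that is defined only where the test actually probed the stress space:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Synthetic cloud of yield states in the (sigma_xx, sigma_yy) plane, scattered
# around a plane-stress von Mises locus (illustrative stand-in for MT2.0 output).
rng = np.random.default_rng(1)
theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, 80))          # ordered around the locus
sig0 = 300.0                                                 # MPa, assumed yield stress
r = sig0 / np.sqrt(1.0 - 0.5 * np.sin(2.0 * theta))          # von Mises radius in this plane
sxx = r * np.cos(theta) + rng.normal(0.0, 3.0, theta.size)   # add "measurement" scatter
syy = r * np.sin(theta) + rng.normal(0.0, 3.0, theta.size)

# Closed (periodic) smoothing B-spline through the point cloud: a data-driven yield
# locus valid only in the region of stress space actually probed by the test.
tck, _ = splprep([sxx, syy], s=2.0 * theta.size * 3.0**2, per=True)  # smoothing ~ noise level
locus_xx, locus_yy = splev(np.linspace(0.0, 1.0, 400), tck)
print(f"fitted locus spans sigma_xx in [{locus_xx.min():.0f}, {locus_xx.max():.0f}] MPa")
```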

