Research directions in computational mechanics

Dear all,

I just joined this group last week, and I'd like to share some reading material that I found regarding research directions in computational mechanics. The paper, published in 2003, was written by Tinsley Oden, Belytschko, Babuska, and Hughes. It is entitled "Research Directions in Computational Mechanics" (Computer Methods in Applied Mechanics and Engineering, 192, pp. 913-922, 2003). They outlined six areas with significant research opportunities in CM:

1. Virtual design 

In this regard, they mentioned that although great strides have been made in simulation in the past two decades, virtual prototyping is still more of an art than a science. To develop a virtual prototyping capability, many tests must be performed, since many of the physical phenomena cannot be modeled on the basis of first principles today. Instead, models are tuned to tests, and the technology is not applicable to radically new designs. Specific obstacles to virtual prototyping include the inability to simulate problems with multiphysics phenomena, such as burning and change of phase, fracture and spalling, phenomena involving large disparities in scales, and behavior with significant stochastic characteristics.

2. Multi-scale phenomena (bridging of molecular to continuum models)

A major challenge to CM for the future is to model events in which these remarkably varying scales are significant in a single system or phenomenon. It is then necessary to model multi-scale phenomena simultaneously for predictive capability. Analysis of multi-scale phenomena, while apparently beyond the horizon of contemporary capabilities, is one of the most fundamental challenges of research in the next decade and beyond. So-called scale bridging, in which the careful characterization of mechanical phenomena requires that the model "bridge" the representations of events that occur at two or more scales, requires the development of a variety of new techniques and methods. In this area, integration of computational methods and devices with experimental or sensing devices is critical. High-fidelity simulation and computational mechanics must involve innovative and efficient use of a spectrum of imaging modalities, including X-ray tomography, electron microscopy, sonar imaging, and many others. Similarly, in modeling phenomena such as climate change, weather conditions, and the interaction of ocean and atmosphere, satellite-generated data must be incorporated seamlessly into viable computational models to obtain meaningful predictions. Again, the spectrum of computational mechanics must be significantly broadened to include the use of these technologies. Once more, the intrinsically interdisciplinary nature of the subject will be expanded and reinforced.
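To make the idea of "bridging" a bit more concrete, here is a small Python sketch of the simplest possible kind of scale bridging: hierarchical upscaling, where an effective continuum modulus is extracted from a 1D harmonic atomic chain and then reused in a continuum calculation. The bond stiffness k, spacing a, and area A are hypothetical values chosen only for illustration; real scale bridging (concurrent atomistic-continuum coupling, handshake regions, and so on) is of course far richer.

    import numpy as np

    # Toy "scale bridging": upscale a 1D harmonic atomic chain to a continuum bar.
    # Assumed (hypothetical) parameters: bond stiffness k, equilibrium spacing a,
    # nominal cross-sectional area A attributed to each chain.
    k = 15.0      # bond stiffness [N/m]
    a = 2.5e-10   # equilibrium atomic spacing [m]
    A = a * a     # nominal area per chain [m^2]

    # Fine scale -> coarse scale: for a serial chain of identical springs,
    # the effective 1D Young's modulus is E = k * a / A.
    E_eff = k * a / A

    # Coarse scale: use E_eff in a continuum bar under uniaxial stress.
    sigma = 1.0e8            # applied stress [Pa]
    strain = sigma / E_eff   # continuum prediction of strain
    print(f"Effective modulus E = {E_eff:.3e} Pa, strain = {strain:.3e}")

    # Consistency check against the atomistic model: the same stress stretches
    # each bond by f/k with f = sigma * A, so the atomistic strain is f/(k*a).
    strain_atomistic = (sigma * A) / (k * a)
    print(f"Atomistic strain     = {strain_atomistic:.3e}")

By construction the two strains agree here; the interesting (and open) problems appear exactly where such a simple separation of scales breaks down.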

3. Model selection and adaptivity

Model selection is a crucial element in automating engineering analysis, and its applications are unlimited; the subject could conceivably embrace classes of models spanning diverse spatial and temporal scales, enabling the systematic and controlled simulation of events with models ranging from atomistic or molecular descriptions to continuum models. Model selection, model error estimation, and model adaptivity are exciting areas of CM and promise to provide an active area of research for the next decade and beyond.

Areas in which adaptive modeling has great promise include the study and characterization of composite materials, unsteady turbulent flows, multiphase flows of fluids, and so on. Other techniques for model adaptivity involve the use and integration of test and imaging data, feedback from experiments and measurements, and various combinations of these methodologies.
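As a toy illustration of model (rather than mesh) adaptivity, the following Python sketch keeps a cheap linear material model wherever an error indicator relative to a finer nonlinear model stays below a tolerance, and switches models where it does not. The two material laws, the parameters, and the tolerance are all hypothetical, and unlike this toy, practical approaches rely on a posteriori model error estimates rather than evaluating the fine model everywhere.

    import numpy as np

    # Toy model adaptivity: choose, element by element, between a cheap linear
    # material model and a more expensive nonlinear one, driven by an error
    # indicator. All models and numbers here are hypothetical illustrations.

    def stress_linear(eps, E=200e9):
        return E * eps

    def stress_nonlinear(eps, E=200e9, beta=50.0):
        # "Fine" model with a softening nonlinearity.
        return E * eps * (1.0 - beta * np.abs(eps))

    strains = np.linspace(0.0, 0.01, 11)   # strain seen in each "element"
    tol = 0.1                               # relative model-error tolerance

    chosen = []
    for eps in strains:
        s_lin, s_nl = stress_linear(eps), stress_nonlinear(eps)
        # Model error indicator: relative deviation of the cheap model from
        # the fine one (in practice this comes from an a posteriori estimate).
        err = abs(s_lin - s_nl) / max(abs(s_nl), 1e-12)
        chosen.append("nonlinear" if err > tol else "linear")

    for eps, model in zip(strains, chosen):
        print(f"strain = {eps:.4f} -> use {model} model")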

4. Very large-scale parallel computing

One of the most difficult issues facing researchers in CM in the next decade will be a purely conceptual one: the recalibration of their own education, approach, and perceptions to allow them to use efficiently the extraordinary computational tools that will be developed during this period. Today, mechanicians using computational products for engineering analysis and design can routinely develop computational models involving 500,000-10,000,000 degrees of freedom. Problems of this size are being solved on contemporary workstations. Nevertheless, these contemporary models employ rather crude characterizations of materials, geometry, boundary conditions, failure criteria, and many other important features of the system, because the modeler takes it for granted that including these details would result in computational problems so large and complex that they would exceed the capacities of modern computational facilities.

This argument is no longer correct. As the 21st century begins, computational devices capable of delivering five trillion operations per second and storing a thousand trillion bytes of data are in use, and larger machines are being developed. In a decade's time, machines with capabilities an order of magnitude beyond this level may be available. It is probable that such terascale computing capabilities will soon be in the hands of most engineers and mechanicians, thus making possible models with a level of detail and sophistication completely unimaginable only a decade ago.
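For a rough feel of why problems of this size are becoming routine, here is a back-of-envelope sketch in Python. Every number in it (nonzeros per row, iteration count, flops per nonzero) is an assumption rather than a measurement, and it counts only the matrix-vector products of an iterative solver.

    # Rough estimate: a sparse FE system with 10 million DOFs, ~80 nonzeros per
    # row, solved with ~1000 Krylov iterations at ~2 flops per nonzero per
    # iteration, compared with the "five trillion operations per second" figure.
    ndof = 10_000_000
    nnz_per_row = 80
    iterations = 1_000

    nnz = ndof * nnz_per_row
    memory_bytes = nnz * (8 + 4) + 2 * ndof * 8   # matrix values+indices, two vectors
    flops = 2 * nnz * iterations                  # matrix-vector products only

    machine_flops = 5e12                          # five trillion operations per second
    print(f"matrix storage ~ {memory_bytes / 1e9:.1f} GB")
    print(f"solver work    ~ {flops / 1e12:.1f} Tflop "
          f"-> ~{flops / machine_flops:.2f} s at peak on a 5 Tflop/s machine")

Under these crude assumptions the problem needs on the order of 10 GB of memory and a few tenths of a second of peak-rate work, which is why the "too large and complex" presumption no longer holds.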

The proper use of this extraordinary toolkit will itself represent a significant challenge. Included in the challenge is the education of the next generation of engineers and mechanicians, who will be expected to master not only the principles of mechanics but also the use of the computational tools available to them.

These new capabilities, and advances in modeling and parallel computation, will ultimately have a remarkable and irreversible impact in education in science and engineering. Simplified models and approximate theories remain important in developing understanding, but students need no longer rehearse only idealized situations: they can now tackle more realistic models.

High-speed parallel computing, together with the software developments alluded to elsewhere in this document, will create a revolution in engineering analysis and ultimately in the way it is taught in colleges and universities. Less than a decade ago, many feared that access to modern computational methods and machines would breed overconfidence in engineers, at the expense of common sense, judgement, and reasoning. Now, the new concern is one of underestimating the power of modern computational methods and devices, and the danger of their underutilization in important simulations, analysis, and design.

5. Controlling uncertainty: probabilistic methods 

The random nature of many features of physical events is widely recognized by industry and researchers. The natural stimuli that activate physical systems may be completely unpredictable by deterministic models: the randomness of a gust of wind, the characterization of forces in boundary and initial conditions on mechanical systems, random microstructural features of engineering materials, the random fluctuations in temperature, humidity, and other environmental factors, all make the characterizations provided by deterministic models of mechanics less satisfactory with respect to their predictive capabilities. Fortunately, the entire subject of uncertainty can itself be addressed in a scientific and mathematically precise way and the random characteristics of nature can be addressed by computational models. During the next decade, probabilistic modeling of mechanical problems will be a topic of great importance and interest.
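As a very small example of what addressing randomness computationally can look like, here is a minimal Monte Carlo sketch in Python: an uncertain load and Young's modulus are sampled and pushed through the textbook cantilever tip-deflection formula delta = P L^3 / (3 E I) to obtain statistics of the response. The distributions and parameter values are assumptions chosen purely for illustration.

    import numpy as np

    # Minimal Monte Carlo uncertainty propagation for a cantilever tip deflection
    # delta = P L^3 / (3 E I). The load P and modulus E are treated as random;
    # all distributions and values here are hypothetical illustrations.
    rng = np.random.default_rng(0)
    n = 100_000

    L, I = 2.0, 8.0e-6                         # length [m], second moment of area [m^4]
    P = rng.normal(10_000.0, 1_500.0, n)       # random load [N]
    E = rng.lognormal(np.log(200e9), 0.08, n)  # random modulus [Pa]

    delta = P * L**3 / (3.0 * E * I)           # propagate samples through the model

    print(f"mean deflection = {delta.mean()*1e3:.2f} mm")
    print(f"std deviation   = {delta.std()*1e3:.2f} mm")
    print(f"95th percentile = {np.percentile(delta, 95)*1e3:.2f} mm")

Sampling methods like this are only the entry point; the research challenges lie in making such predictions affordable for expensive models and in characterizing the input randomness in the first place.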

6. Biomedical applications

Predictive modeling of bones, nerves, and other biological systems.

Note: Please feel free to add emerging areas in CM that could pose interesting research topics for the next decade.

Markus J. Buehler wrote:

Hello,

I agree with most of your comments posted above.

I wanted to emphasize that studying the mechanical properties of biological materials, in particular those made of proteins, poses a particularly interesting set of problems. These materials are model systems for structures with high chemical complexity (i.e., they are made out of various chemical bonds with varying strengths). To understand the mechanical properties of these materials, it is vital to include a proper description of the underlying chemical reactions. Such chemical reactions can now be accurately modeled using first-principles-based reactive force fields that enable us to treat tens of thousands of atoms with full chemical reactivity, over time scales of nanoseconds.
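For readers who do not work with molecular simulation, here is a deliberately minimal, non-reactive Python sketch of the time-stepping loop such simulations share: velocity-Verlet integration of a single harmonic bond in 1D, in reduced and hypothetical units. Reactive force fields such as ReaxFF differ in the force evaluation, where bonds may break and re-form, not in this basic integration structure.

    import numpy as np

    # Minimal (non-reactive) molecular dynamics sketch: velocity-Verlet
    # integration of a 1D diatomic with a harmonic bond. All parameters are
    # hypothetical and in reduced units.
    k, r0 = 50.0, 1.0            # bond stiffness, equilibrium length
    m = 1.0                      # particle mass
    dt, steps = 0.001, 5000

    x = np.array([0.0, 1.2])     # start with the bond slightly stretched
    v = np.zeros(2)

    def forces(x):
        r = x[1] - x[0]
        f = -k * (r - r0)        # force on particle 1 along +x
        return np.array([-f, f])

    f = forces(x)
    for _ in range(steps):
        v += 0.5 * dt * f / m    # half kick
        x += dt * v              # drift
        f = forces(x)
        v += 0.5 * dt * f / m    # half kick

    print(f"final bond length = {x[1] - x[0]:.3f} (equilibrium {r0})")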

In addition to this chemical component, protein materials are intriguing because of their hierarchical structure. At this interface, 'structure' and 'material' merge and become indistinguishable.

In this area, computational mechanics has a great future, with many outstanding, exciting, and important problems. For example, many biological processes are related to mechanics: the cell's shape and stiffness, adhesion problems, the tunable elasticity of biopolymers, the mechanical properties of tissues, and many other applications.

Here, mechanics can be a true asset for doing very interesting science. Being able to carry out simulations is sometimes advantageous over experiment, since in simulation it is easier to control conditions, to manipulate structures as small as proteins, and to set up studies with different boundary conditions.

See also: http://imechanica.org/node/1010 

Markus
