Comments

Best adult education materials now

Thu, 2020-03-26 13:26

In reply to Freely Downloadable Special Issue of Journal of Applied Mechanics/Century of Fracture Mechanics/John W. Hutchinson's 80th Birthday

Kejie, Pradeep and Yanfei,

Since our “social distance” is more than 600 miles, we should “meet” often and explore advances in fracture mechanics. I just found that cracks in a window facing my backyard need a new theory---the crack branching angle is much larger than the theoretical angle (60 degrees between the two branching cracks). I cannot find any explanation in these PDF papers.

Wonderful work!

Wed, 2020-03-25 15:11

Thanks!

Wed, 2020-03-25 09:30

In reply to Download issue for the free special issue of JAM is now fixed

It works well here. Excellent reading materials when staying at home! 

Download issue for the free special issue of JAM is now fixed

Tue, 2020-03-24 18:33

In reply to Everyone, thanks for your

Several users had trouble downloading the pdf of the special issue of JAM. This has now been fixed by the ASME.

Everyone, thanks for your

Fri, 2020-03-20 18:39

In reply to Me neither

Everyone, thanks for your notes...there does appear to be a problem for some users. I have alerted the editor to check on this.

Objective time derivatives

Fri, 2020-03-20 12:18

In reply to Why rate equations in Nonlinear FE?

Note a constructive introduction of objective time derivatives, based on the formulation of solid mechanics as a simple Lagrangian system, in [1]. This makes it possible to distinguish between deformation rates, which are in principle Lie derivatives, and stress rates, which are actually covariant derivatives along a curve representing a deformation process. In addition, the role of the Daleckii-Krein formula in understanding the theory with generalized strains is highlighted, and special attention is paid to the logarithmic time derivative.

[1] Fiala, Z.: Objective time derivatives revised. ZAMP 71, Article number: 4 (2020)
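As a concrete textbook example of the two kinds of rates (standard definitions, not taken from [1]): with spatial velocity gradient $\boldsymbol{l} = \dot{\mathbf{F}}\mathbf{F}^{-1}$ and spin $\boldsymbol{w} = \tfrac{1}{2}(\boldsymbol{l} - \boldsymbol{l}^{\mathsf T})$, the Lie (Oldroyd) rate and the Jaumann rate of the Kirchhoff stress $\boldsymbol{\tau}$ read

```latex
% Lie (Oldroyd) rate: the Lie derivative of the (contravariant) Kirchhoff stress
\mathcal{L}_{v}\boldsymbol{\tau}
  = \dot{\boldsymbol{\tau}} - \boldsymbol{l}\,\boldsymbol{\tau} - \boldsymbol{\tau}\,\boldsymbol{l}^{\mathsf T}
% Jaumann rate: corotational rate built from the spin w only
\overset{\circ}{\boldsymbol{\tau}}
  = \dot{\boldsymbol{\tau}} - \boldsymbol{w}\,\boldsymbol{\tau} + \boldsymbol{\tau}\,\boldsymbol{w}
```

The first transforms as a Lie derivative of the deformation process; the distinction from covariant derivatives along the process curve is exactly what [1] formalizes.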

The time-derivative of deformation gradient

Fri, 2020-03-20 12:11

In reply to Geometry of Non-Linear Continuum Mechanics

Let me point out that the time derivative of the deformation gradient F(t) is mathematically exactly defined in terms of the material time derivative, expressed via the covariant derivative in the ambient space, as specified in [1] (Def. 4.19 in Ch. 1.4 together with Prop. 2.2 in Ch. 4.2). A more complex derivation, based on the infinite-dimensional Riemannian manifold of all smooth embeddings of a body into the ambient three-dimensional vector space, is presented in [2] and reviewed in [1], Ch. 2.4, Box 4.2. For more, see [3].

[1] Marsden, J.E., Hughes, T.J.R.: Mathematical Foundations of Elasticity. Dover, New York (1993)
[2] Epstein, M., Segev, R.: Differentiable manifolds and the principle of virtual work in continuum mechanics. J. Math. Phys. 21, 1243-1245 (1980)
[3] Fiala, Z.: Objective time derivatives revised. ZAMP 71, Article number: 4 (2020)
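In the familiar coordinate form (Euclidean ambient space, standard notation, not specific to [1]), this definition reduces to

```latex
\dot{\mathbf{F}}(t) = \frac{\partial \mathbf{v}}{\partial \mathbf{x}}\,\mathbf{F}(t)
  = \boldsymbol{l}\,\mathbf{F}(t),
\qquad
\boldsymbol{l} = \operatorname{grad}\mathbf{v},
```

with $\boldsymbol{l}$ the spatial velocity gradient; the covariant-derivative formulation in [1] makes this statement precise when the ambient space is a manifold.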

Me neither

Fri, 2020-03-20 09:17

In reply to Freely Downloadable Special Issue of Journal of Applied Mechanics/Century of Fracture Mechanics/John W. Hutchinson's 80th Birthday

I tried before, when this issue first appeared, but always got an error. With the new post from Pradeep, I checked again, but I still couldn't connect to any PDF. I'll ask our library to help.

Roy, I tried and those papers

Fri, 2020-03-20 00:55

In reply to not able to get free PDF files

Roy, I tried and those papers are downloadable. -Kejie

not able to get free PDF files

Thu, 2020-03-19 17:46

In reply to Freely Downloadable Special Issue of Journal of Applied Mechanics/Century of Fracture Mechanics/John W. Hutchinson's 80th Birthday

Pradeep,

Thanks for your efforts! I cannot download any paper. Did you ask another person to try?

 

Roy

 

The submission deadline has been extended

Tue, 2020-03-17 02:45

In reply to Call for Abstract Submission to SES2020 Symposium "Distinct Element Method Mechanics Across Scales And Domains"

Submission deadline: March 31, 2020

The submission deadline has been extended from March 17, 2020 to March 31, 2020. Below is the list of scientific tracks and the minisymposia titles for abstract submission for the 2020 SES conference.

Please review before entering your abstract submission.

mesh size and analysis effects

Mon, 2020-03-16 01:32

In reply to flexural wave propagation

By choosing the wrong equilibrium conditions or mesh size, some elements fail to resolve the stress wave and are therefore effectively left out of the analysis.

Dear Ajit and Miguel,

Sun, 2020-03-01 05:05

In reply to About training cost for ML algorithms

Dear Ajit and Miguel,

Thank you for your reply. Your comments are very insightful.

 

Chenna

Some videos from this work

Fri, 2020-02-28 21:49

In reply to Fatigue-resistant hydrogel adhesion

https://static-content.springer.com/esm/art%3A10.1038%2Fs41467-020-14871...

Video 1: 90-degree peeling of fatigue-resistant hydrogel adhesion on a glass substrate under cyclic loading. 

https://static-content.springer.com/esm/art%3A10.1038%2Fs41467-020-14871...

Video 2: Fatigue-resistant hydrogel coating on a stainless steel plate sliding against cartilage.

About training cost for ML algorithms

Fri, 2020-02-28 06:52

In reply to Aspects of computational cost in ML algorithms

Dear Chenna,

Your question is very relevant. Here's my two cents on this:

  • Usually, the very large computational costs you are referring to are associated with deep-learning tasks on massive, high-dimensional datasets. If you go to AI conferences, you hear about training times of several weeks on supercomputers with state-of-the-art GPUs. This makes sense, because you have to find billions of weights for complex network architectures that need to learn from enormous datasets. Certainly, there will be problems in Computational Mechanics where addressing this research question is going to be important. Some pointers have been provided by Ajit already, and there are many more ideas out there.
  • A large class of problems in Computational Mechanics does not have access to such large datasets, especially if the data come from low-throughput experiments in a single lab (most common in our community) or from high-fidelity simulations that predict complex behavior for a specific problem (plasticity, fracture, fatigue, fluid-structure interaction, turbulence, etc.). In those cases, what the average mechanics person considers "a lot of data" is actually very little data in the context of Computer Science. For example, training neural networks on datasets of 100,000 points of reasonable dimensionality is usually not a big deal once you get familiar with the tools.
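For instance, here is a quick scikit-learn sketch (the synthetic data, architecture, and hyperparameters are chosen arbitrarily for illustration, not taken from any paper in this thread); fitting a small network to 100,000 points like this takes seconds to minutes on an ordinary laptop CPU:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic "mechanics-style" dataset: 100,000 samples, 5 input features,
# one smooth scalar response (a stand-in for a simulation output).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100_000, 5))
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A modest two-hidden-layer network; no GPU or cluster needed at this scale.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=50, random_state=0)
model.fit(X_tr, y_tr)

print(f"R^2 on held-out data: {model.score(X_te, y_te):.3f}")
```

The point is only scale: at this dataset size the training cost is trivial compared with the deep-learning workloads discussed at AI conferences.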

I hope this directly addresses the question you asked, and I hope more people like Ajit can provide more insight. Best,

Miguel

Computational costs in ML vis-a-vis in traditional techniques

Fri, 2020-02-28 02:51

In reply to Aspects of computational cost in ML algorithms

1. The computational costs involved in training ANNs, especially complex DL networks, can be enormously high. However, once the network is trained, drawing an inference is, comparatively speaking, extremely fast.

2. I haven't run into comprehensive theoretical analyses of the computational complexity of modern (latest) DL networks, especially those using GPGPUs and clusters. For some empirical data, see, e.g., https://arxiv.org/abs/1811.11880 .

3. If you want to use ANNs for engineering simulations, you first have to generate massive "training datasets" from which the networks will learn. These datasets have to be created using the traditional engineering simulation techniques (FEM/FVM/FDM/LBM/SPH/others). This phase of creating the training dataset itself adds further costs (which are absent in many other applications of ANNs, such as those for image, text, or video data).

But once again, these additional costs pertain only to the training phase, certainly not to the inference phase.

One somewhat promising approach in this context (currently very much under development, like all ML techniques) is transfer learning.

4. Inference costs would obviously be negligible when compared with conducting separate simulations using the traditional simulation techniques. However, ANN inferences are probabilistic.

5. So, think of ANN-drawn inferences as approximate solutions to be fed into traditional iterative techniques for solution refinement, thereby reducing the costs of traditional simulations.

Caveat: this idea works only if the network has managed to land the inference in the right solution regime in the first place. Solution regimes are a serious issue for nonlinear problems like those in CFD/FSI, etc.
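To make point 5 concrete, here is a minimal Python sketch (my own illustration; the scalar equation and all names are made up, standing in for a real simulation residual): an approximate "learned" guess is handed to a classical Newton iteration, which refines it cheaply to full precision.

```python
def f(x):
    # Nonlinear residual standing in for a simulation: solve x**3 - 2x - 5 = 0.
    return x**3 - 2.0 * x - 5.0

def fprime(x):
    return 3.0 * x**2 - 2.0

def newton_refine(x0, tol=1e-12, max_iter=50):
    """Classic Newton iteration, started from an approximate solution x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Stand-in for an ANN inference: a rough guess in the right solution regime.
approx_guess = 2.0
root = newton_refine(approx_guess)
print(root, f(root))  # converges to ~2.0945514815423265
```

If the guess lands in the right basin of attraction, Newton converges in a handful of iterations; if it lands in the wrong regime, no amount of refinement rescues it, which is exactly the caveat above.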

For a pop-sci account of the recent progress in this direction (using ANNs for Navier-Stokes/chaos), see https://www.quantamagazine.org/machine-learnings-amazing-ability-to-pred... .

Best,

--Ajit

 

Aspects of computational cost in ML algorithms

Thu, 2020-02-27 10:08

In reply to Journal Club for February 2020: Machine Learning in Mechanics: simple resources, examples & opportunities

Hi Miguel,

Thank you for starting this excellent thread on ML for applications in Computational Mechanics.

I have one question regarding this topic.

One common conclusion that I came across in the majority of papers on ML for Computational Mechanics is that the results from ML algorithms match some reference solution, but the computational cost is enormous, often requiring vast amounts of HPC resources, GPUs in particular.

As far as my understanding of ML (with DNN and other related techniques) goes, this issue of cost is not particular to Computational Mechanics but common in any sufficiently large ML application.

If the computational cost is one of the main bottlenecks (in training the network/model), then what is the solution? I would appreciate any papers that address the cost issues related to ML and how to overcome them.

 

Best,

Chenna

 

Judea Pearl’s answer to: Are you excited about AI?

Sat, 2020-02-22 11:14

In reply to Journal Club for February 2020: Machine Learning in Mechanics: simple resources, examples & opportunities

Judea Pearl's answer to the following question: People are excited about the possibilities for AI. You’re not?

"As much as I look into what’s being done with deep learning, I see they’re all stuck there on the level of associations. Curve fitting. That sounds like sacrilege, to say that all the impressive achievements of deep learning amount to just fitting a curve to data. From the point of view of the mathematical hierarchy, no matter how skillfully you manipulate the data and what you read into the data when you manipulate it, it’s still a curve-fitting exercise, albeit complex and nontrivial."

And, as he eloquently says, it's still very impressive what curve fitting can do for us.

I recommend reading the full interview of Judea Pearl, 2011 Alan Turing awardee (the "Nobel prize" of Computer Science): https://www.quantamagazine.org/to-build-truly-intelligent-machines-teach-them-cause-and-effect-20180515/
