
Artificial intelligence accelerates discovery of metallic glass - rbanffy
https://news.northwestern.edu/stories/2018/april/artificial-intelligence-accelerates-discovery-of-metallic-glass/
======
yazr
This is random-forest supervised learning from a set of 4000 historical
experiments.

(Lots of feature engineering based on domain expertise. This is not end-to-end
DL.)

Do a smaller set of new experiments to explore a small subset of the solution
space.

Retrain the model with these new experiments.

Perform another smaller set of experiments, this time over a more varied
sample of the solution space.

Overall, a 10x improvement in predicting the glass-forming property of an
untested sample (although the entire process is biased toward positive samples).
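The loop described above (train on the historical set, run a small batch of new experiments, retrain, then sample more widely) can be sketched in a few lines. Everything here is a toy stand-in, not anything from the paper: `run_experiment` replaces the lab, and `ToyModel` replaces the random forest.

```python
import random

random.seed(0)

def run_experiment(composition):
    # Stand-in for a real lab measurement: does this mix form a glass?
    # (Toy ground truth: glass forms when the mix ratio is near 0.4.)
    return abs(composition - 0.4) < 0.1

class ToyModel:
    # Trivial stand-in for the random forest: remember the range of
    # positive samples seen so far and predict inside that range.
    def fit(self, X, y):
        pos = [x for x, label in zip(X, y) if label]
        self.lo, self.hi = (min(pos), max(pos)) if pos else (1.0, 0.0)

    def predict(self, x):
        return self.lo <= x <= self.hi

# "Historical" data stands in for the ~4000 prior experiments.
X = [random.random() for _ in range(200)]
y = [run_experiment(x) for x in X]

model = ToyModel()
for rnd in range(3):
    model.fit(X, y)
    # Propose candidates: early rounds search a narrow neighborhood,
    # later rounds a more varied sample of the solution space.
    width = 0.1 * (rnd + 1)
    candidates = [0.4 + random.uniform(-width, width) for _ in range(20)]
    picked = [c for c in candidates if model.predict(c)][:10]
    # Run the new, smaller batch of experiments and retrain on everything.
    X += picked
    y += [run_experiment(c) for c in picked]
```

The point is only the shape of the loop: each retraining pass folds the newest experiments back into the training set before the next, wider round of proposals.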

Conclusion: classical ML still rocks.

I really don't see any reason why this could not have been done 10 or even 20
years ago.

~~~
scep12
> I really don't see any reason why this could not have been done 10 or even 20
> years ago.

The advancements in tooling, infrastructure and accessibility of ML in the
last 3 years alone have made the difference. That seems obvious.

Maybe your point is that the underlying techniques haven't changed, and thus
it would have been possible to have made this discovery decades ago. But isn't
that true of even the greatest inventions? Much of what's created or
discovered is a function of the environment and conditions surrounding it.

In other words, it's not surprising to see a halo effect in other sectors as a
result of tech investment in ML.

~~~
kmax12
I agree that it is exactly this. New tooling has made machine learning easier
to use. As a result, people with deep domain knowledge but less machine
learning expertise are starting to apply ML to the problems they understand
best.

One of the biggest roadblocks to this happening more today is that people
don't know how to perform feature engineering to prepare raw data for existing
machine learning algorithms. If we could automate this step, it would be a lot
easier for subject matter experts to use ML.

For example, I work on an open source Python library called featuretools
([https://github.com/featuretools/featuretools/](https://github.com/featuretools/featuretools/))
that aims to automate feature engineering for relational datasets. We've seen a
lot of non-ML people use it to make their first machine learning models. We also
have demos for people interested in trying it themselves:
[https://www.featuretools.com/demos](https://www.featuretools.com/demos).

I expect to see a lot more work in the automated feature engineering space
going forward.
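To make the idea concrete, here is a stdlib-only sketch of what automated feature engineering does for relational data: given a parent table and a child table linked by a key, mechanically generate aggregate features per parent row. This mimics the spirit of deep feature synthesis but is not the featuretools API; the tables and primitive names are made up.

```python
from statistics import mean

# Toy relational data: customers (parent) and their orders (child).
customers = [{"id": 1}, {"id": 2}]
orders = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": 1, "amount": 30.0},
    {"customer_id": 2, "amount": 5.0},
]

# Aggregation "primitives" applied mechanically to each group of child rows.
PRIMITIVES = {
    "count": len,
    "mean_amount": lambda rows: mean(r["amount"] for r in rows),
    "max_amount": lambda rows: max(r["amount"] for r in rows),
}

def synthesize_features(parents, children, key):
    # For each parent row, gather its child rows and apply every primitive,
    # turning a relational dataset into a flat feature matrix.
    features = []
    for p in parents:
        rows = [c for c in children if c[key] == p["id"]]
        feat = {"id": p["id"]}
        for name, fn in PRIMITIVES.items():
            feat[name] = fn(rows) if rows else 0
        features.append(feat)
    return features

features = synthesize_features(customers, orders, "customer_id")
```

The output is a flat table that a standard ML algorithm can consume directly, which is exactly the step subject matter experts tend to get stuck on.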

~~~
petra
That's interesting! So could we see GUI-based machine learning for non-
programmers becoming a reality soon?

And how close in performance could this get vs. code-based solutions?

~~~
kmax12
Yes, I think so. Featuretools is actually the core of my company's commercial
product.

Performance is a tricky thing to answer. If you care about machine learning
performance such as AUC, RMSE, or F1, then I think the answer would be 80%-90%
of coding. If you care about building a first solution, then I think the
automation would be 5-10x better.

------
plaidfuji
As a researcher in this field I'll just add that in many cases, automating the
mat sci workflows (the sample prep and the characterization) is a massive leap
in and of itself, even without adding machine learning. The benefit of machine
learning in many of these projects is to pick the automated runs optimally
(choose the right neighborhood of composition space), which probably adds a
10-100x speedup on top of the already 100-1000x speedup gained from just not
making and characterizing samples manually. It's truly a synergistic
combination of advancements in both fields and has great potential for
accelerated discovery. /shill

~~~
jv22222
Long live the Singularity! (For which these types of exponentials bode
well...)

------
car
Found the original article more informative:
[https://www6.slac.stanford.edu/news/2018-04-13-scientists-use-machine-learning-speed-discovery-metallic-glass.aspx](https://www6.slac.stanford.edu/news/2018-04-13-scientists-use-machine-learning-speed-discovery-metallic-glass.aspx)

And the actual publication:
[http://advances.sciencemag.org/content/4/4/eaaq1566](http://advances.sciencemag.org/content/4/4/eaaq1566)

 _The paper is the first scientific result associated with a DOE-funded pilot
project where SLAC is working with a Silicon Valley AI company, Citrine
Informatics, to transform the way new materials are discovered and make the
tools for doing that available to scientists everywhere._

[https://citrine.io](https://citrine.io)

------
DrNuke
This field is ebullient! Bad joke, I know, but glorious times for materials
scientists, especially when not grant-constrained and free to go deep into
domain applications. One recent state-of-the-art lit review is here
[https://www.nature.com/articles/s41524-017-0056-5](https://www.nature.com/articles/s41524-017-0056-5)
, outdated here and there already.

------
setquk
Transparent aluminum?

~~~
rbanffy
It's very hard to make a metal (where electrons can easily jump from atom to
atom) transparent.

~~~
hpcjoe
Transparency is a function of wavelength/photon energy as well as the
underlying material. Glass (the silica version) is opaque to portions of the
UV spectrum.

Generally you'll get absorption (opacity) when the photon energy exceeds an
energy gap, allowing valence band electrons to be bumped into conduction
bands, creating a corresponding hole in the valence. In the case of metals,
there is effectively no gap at points in k-space, so there is absorption
throughout the spectrum.
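That gap argument can be put in numbers with E = hc/λ: photons more energetic than the gap get absorbed, so a band gap sets a cutoff wavelength below which the material turns opaque. The ~9 eV gap used for silica here is an approximate literature value, for illustration only.

```python
H_EV_S = 4.135667696e-15   # Planck constant in eV*s
C_M_S = 2.99792458e8       # speed of light in m/s

def cutoff_wavelength_nm(band_gap_ev):
    # Photons with wavelength below this carry more energy than the gap
    # and can bump valence electrons into the conduction band,
    # i.e. they get absorbed.
    return H_EV_S * C_M_S / band_gap_ev * 1e9

# Silica's gap is roughly 9 eV, so it only absorbs in the deep UV
# (cutoff near 138 nm); visible photons (380-750 nm, ~1.7-3.3 eV)
# pass straight through, which is why window glass is transparent.
silica_cutoff = cutoff_wavelength_nm(9.0)
# A metal with effectively no gap absorbs across the whole spectrum.
```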

~~~
rbanffy
I imagine a clever arrangement of atoms where non-metallic regions alternate
with metallic ones could do the trick, as long as the transparent regions line
up enough.

But anything this orderly doesn't look like a glass anymore.

------
beautifulfreak
I read about efforts to discover compounds using random methods back in the
90s, and have been trying to research it lately, to see if the "shake and
bake" method is still a thing. Can anyone point me to relevant research? I was
surprised by estimates of the number of possible compounds, so many
that there would not be enough time in the universe to make and test them all,
even limiting the primary elements to a dozen or so. I guess there are a
multitude of ways that the same atoms can fit together. I've tried to find
research on computer simulations. Apparently, only rough predictions can be
made. My searches have been pretty fruitless, though, and I'd welcome help.

------
ganzuul
Bulk metallic glass based mainly on Fe is very interesting. It could likely be
injection molded into very complex shapes.

------
nl
There's an old Kaggle competition[1] "Predicting Transparent Conductors" which
had a similar objective.

There's a decent discussion of the 5th place entry[2]. Judging by a very quick
read, it looks like performance for methods like this could improve
dramatically with larger amounts of data.

[1] [https://www.kaggle.com/c/nomad2018-predict-transparent-conductors](https://www.kaggle.com/c/nomad2018-predict-transparent-conductors)

[2] [https://www.kaggle.com/c/nomad2018-predict-transparent-conductors/discussion/49903](https://www.kaggle.com/c/nomad2018-predict-transparent-conductors/discussion/49903)

------
p1esk
_The amorphous material’s atoms are arranged every which way, much like the
atoms of the glass in a window. Its glassy nature makes it stronger and
lighter than today’s best steel_

Why would this arrangement of atoms be stronger than a crystal lattice of
steel?

~~~
plaidfuji
From the actual paper: """ For example, the absence of deformation pathways
based on gliding dislocations leads to exceptional yield strength and wear
resistance """

Yield strength means "how much stress a material can handle before it starts
to deform permanently rather than springing back"

In crystalline metals, a crack that forms anywhere can propagate through the
lattice quickly and lead to bulk fracture (see the recent Southwest Airlines
engine failure). In an amorphous material, the deformation caused by a local crack
can be "absorbed" by the surrounding atoms because they're able to reposition
more easily.

~~~
p1esk
How come this does not make regular glass strong?

~~~
pharke
It does, check out its properties compared to other materials. The process of
turning silica into glass makes it much stronger than other materials made of
silica.

[https://en.wikipedia.org/wiki/Strength_of_glass](https://en.wikipedia.org/wiki/Strength_of_glass)

[https://en.wikipedia.org/wiki/Mohs_scale_of_mineral_hardness](https://en.wikipedia.org/wiki/Mohs_scale_of_mineral_hardness)

[https://en.wikipedia.org/wiki/Ultimate_tensile_strength](https://en.wikipedia.org/wiki/Ultimate_tensile_strength)

------
xoroshiro
Maybe a dumb question, but speedier in reference to what? Design of
Experiments?

------
jcims
Wonder if the same thing could be done with high temperature superconductors.

~~~
plaidfuji
definitely:
[https://arxiv.org/abs/1709.02727](https://arxiv.org/abs/1709.02727)

I work kind of adjacent to two of the supervisors on this project at NIST, and
the workflow for these types of projects usually goes:

(1) Build a predictive model of chemical composition -> superconductivity from
past experiments or databases of simulations

(2) Build and program automated sample prep (this is actually the hardest
part, not the machine learning)

(3) Build and program automated structural characterization and
superconductivity measurement

The difficulty is finding a system whose design space can be explored and
measured with automated tools; otherwise the machine learning isn't used
effectively. As others have noted, the models are decades old in some cases.
What's driving this is researchers who know how to automate traditional mat
sci workflows and know enough about machine learning to pick the automated
runs optimally.
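
One generic way to "pick the automated runs optimally" is an acquisition rule such as an upper confidence bound: score each candidate composition by its predicted value plus a bonus for model uncertainty, then run the top scorers. This is a sketch of that idea only, not the method from the linked paper; `pick_runs` and the toy predictor are made up.

```python
def pick_runs(candidates, predict, k=3, kappa=2.0):
    # candidates: candidate compositions; predict: comp -> (mean, std).
    # Upper-confidence-bound score: predicted value + kappa * uncertainty,
    # so a larger kappa weights exploration more heavily.
    scored = [(mu + kappa * sigma, c)
              for c in candidates
              for mu, sigma in [predict(c)]]
    scored.sort(reverse=True)
    return [c for _, c in scored[:k]]

# Toy predictor: the model is confident near 0.5 and uncertain at the
# edges of composition space, so exploration pulls the batch outward.
toy_predict = lambda c: (1 - abs(c - 0.5), abs(c - 0.5))

batch = pick_runs([0.1, 0.3, 0.5, 0.7, 0.9], toy_predict, k=2)
```

With this toy predictor the selected batch lands at the uncertain edges of the space, which is the behavior you want when each automated run is cheap relative to a manual experiment.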

------
fallingfrog
Every time I see "artificial intelligence" in a headline I mentally replace it
with "a sleep deprived computer science grad student". The result is usually
much more accurate.

